Collaborative Robotics Research


 

Principal Investigator - Dr. Thad Roppel
Sensor Fusion and Collaborative Robotics Laboratory
Electrical and Computer Engineering Department
Auburn University
Auburn, Alabama

 

Vision:

We will advance the state of the art in the ability of robots and humans to interact cooperatively in real-world situations. Our systems will provide natural communication modes (e.g., speech, sign language) between humans and robots, and will employ biologically inspired fusion of sensory systems (e.g., vision, hearing, olfaction) to improve the ability of our machines to operate effectively in real-world, dynamic environments.

Goals:

  • Demonstrate cooperative tasks involving teams of robots and humans.
  • Discover underlying principles of cooperation and apply these to human-machine collaboration.
  • Provide an opportunity for graduate students to conduct research in cooperative robotics.
  • Provide research experience for undergraduate students.
  • Conduct outreach to local K-12 schools to stimulate interest in science and technology among those students.
  • Contribute papers to regional, national, and international conferences and journals in the field of cooperative robotics.

Areas of particular interest:

  • RF communication: Investigate the use of adaptive, intelligent wireless communication links among robots and human interfaces (internet stations, etc.) using software radio. Consider the effects of time delay due to distance (e.g., interplanetary collaboration), internet propagation delay, and processing time. A typical scenario might involve a robot team in which the robots are outfitted with different types of communications transceivers. A particular robot, outfitted with a software-based dynamically reconfigurable radio transceiver, might receive a directive from a human controller and then act as a translator, broadcasting the message to the rest of the team using the protocol appropriate for each team member.
  • Natural human interfaces: Investigate the use of speech, sign language, body language, etc. to communicate with robots.
  • Use of vision for ranging and location.
  • Use of sensor fusion (vision, hearing, touch, smell) to improve decision making.
  • Guided autonomy: Typical scenarios involve limited human oversight due to distance or communication lag. Robot teams must function autonomously for extended periods of time, but be able to accommodate updated directives from human controllers.
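The translator-robot scenario above can be sketched in a few lines of Python. This is an illustrative sketch only: the protocol names, frame encodings, and class names are placeholders invented for this example, not the lab's actual radio interfaces, and a real software-defined radio would reconfigure its transceiver rather than call an encoding function.

```python
# Hypothetical sketch of the "translator robot" scenario: one robot
# receives a directive from a human controller and rebroadcasts it to
# each teammate using that teammate's native protocol. All protocol
# names and frame formats below are placeholders, not real standards.

from dataclasses import dataclass


@dataclass
class Teammate:
    name: str
    protocol: str  # e.g., "zigbee", "wifi", "serial-rf" (illustrative)


class TranslatorRobot:
    """Relays one directive to every teammate in its own protocol."""

    def __init__(self, teammates):
        self.teammates = teammates
        # One encoder per protocol. In a software-radio system this
        # table would instead drive dynamic transceiver reconfiguration.
        self.encoders = {
            "zigbee": lambda msg: b"ZB|" + msg.encode(),
            "wifi": lambda msg: b"WF|" + msg.encode(),
            "serial-rf": lambda msg: b"SR|" + msg.encode(),
        }

    def relay(self, directive):
        """Return the per-teammate encoded frames that would be sent."""
        frames = {}
        for mate in self.teammates:
            encode = self.encoders[mate.protocol]
            frames[mate.name] = encode(directive)
        return frames


team = [Teammate("r1", "zigbee"), Teammate("r2", "wifi")]
robot = TranslatorRobot(team)
print(robot.relay("move to waypoint 3"))
```

The point of the sketch is the dispatch structure: the translator holds one encoder (or radio configuration) per protocol and selects it per teammate, so adding a new team member with a different transceiver only adds a table entry.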

Useful Links

 Theses

 

People
 

Graduate Students - Spring 2010:  Nida Bano, Indraneil Gokhale, Xueming (Jennifer) Wang

Undergraduates - Spring 2010:  Kyle Van Riper, Brian Arnberg, Timothy (Andrew) Carnley

  • Grad students (past): Sreekanth Boga, Chris Wilson, Arun Raghunathan, Ratha An, Fraidun Akhi, Rama Narendran, Adam Ray
  • Undergrads (past): Aaron Steiner, Abhishek Davray, Anthony Zhang, Clay Schwendeman, Steven Hawkins, Clay Askew, Joel Hewlett, John Rogers
  • Affiliated Grad students:  Nick Cotton, David Hodo

 

REU Students: Isaac Rieksts (Kutztown U.), Jory Schossau (Linfield College), Chen Lim (Harvey Mudd College)

Foundational Literature, Conferences, and Web Sites:

 

 Interesting robot movies:

  • Video clips of GRR-1 from our lab using sonar and IR for obstacle avoidance and beacon following

Obstacle avoidance and object tracking (14 meg avi)

Tracking (12 meg avi)

Tracking -overhead view (11 meg avi)

 

 


Page created February 10, 2005.
Last updated  Sept. 12, 2014.
Page maintained by Dr. Roppel