Over the past few years, the Robotics, Intelligent Sensing and Control (RISC) Laboratory at UB has supported several faculty research projects, as well as those of more than 300 undergraduate, graduate, and doctoral students in the areas of robotics, automation, manufacturing, and computer vision. RISC is equipped for research in robotic manipulation, distributed autonomous control, machine perception, and mobile robotic devices.
One of Sobh’s current research projects involves the development of user-friendly robotic mobile modular platforms that move either autonomously or semi-autonomously. “It’s a project that I am very proud of,” says Sobh. The design provides a non-expert user with an accessible yet very robust robotic sensory platform on which to add and develop complex robotic functions, actuators, sensing devices and task descriptions, without having to delve deeply into designing complicated software, firmware or hardware.
This “Plug-and-Play design,” Sobh explains, “permits the short-term installation of cameras, lasers and other sensors, in addition to various robotic actuation devices and electromechanical components, and provides for a very simple software tool without the worry of designing communication modules between the software and hardware. It all comes together automatically, with the code and the control strategy generated autonomously, so that a task is performed with the click of a button.”
The goal of the project is to create a modular design for activating platform sensors and task descriptions, and to make that design available for multiple applications. For example, whether the base is a mobile platform or a stationary manipulator, different tasks can be performed, such as painting, navigation, 3-D scene recovery, map-building, welding or clean-up, without requiring the users to be robotics experts.
Sobh explains, “It is going to be like a cell phone application, which performs the tasks you need, without you being an expert in cell phones, coding or software design, or in the configuration required for the task that the application is performing. I can drag cameras, ultrasonic and infrared sensors, lasers, and add them to the platform.
“Then you tell the phone app what task you want to do. If you say ‘paint,’ the platform and its control strategy and code would be generated and run based on the available hardware, motors and sensors, and the robot will just do its job at maximal efficiency.”
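The plug-and-play idea Sobh describes can be sketched in code. The following is a minimal, hypothetical illustration (not the RISC lab's actual software, and all names here are invented): hardware modules announce what they can do when they are plugged in, and a task runner checks the request against whatever is attached and composes a plan automatically.

```python
# Hypothetical sketch of a plug-and-play platform: modules advertise
# capabilities; a task is matched against the attached hardware.

class Module:
    """A sensor or actuator advertising the capabilities it provides."""
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

class Platform:
    def __init__(self):
        self.modules = []

    def plug_in(self, module):
        # "Drag and drop" a camera, laser, or actuator onto the platform.
        self.modules.append(module)

    def available(self):
        # Union of everything the attached hardware can do.
        caps = set()
        for m in self.modules:
            caps |= m.capabilities
        return caps

    def run_task(self, task_name, required):
        """Check the task's needs against the hardware and compose a plan."""
        missing = set(required) - self.available()
        if missing:
            return f"cannot run '{task_name}': missing {sorted(missing)}"
        steps = [f"use {m.name}" for m in self.modules
                 if m.capabilities & set(required)]
        return f"running '{task_name}': " + ", ".join(steps)

# Usage: attach a camera and a spray arm, then ask for "paint" with one call.
robot = Platform()
robot.plug_in(Module("camera", ["vision"]))
robot.plug_in(Module("spray_arm", ["actuation"]))
print(robot.run_task("paint", ["vision", "actuation"]))
```

The point of the sketch is the separation Sobh emphasizes: the user supplies hardware and a task name, and the glue between them is generated rather than hand-written.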
A second project currently in the research and development phase is a robotic swarm collective intelligence behavior project that uses small non-intelligent or slightly intelligent robots that collectively perform complex tasks. The concept is borrowed from bees and ants, which have very limited intelligence and behaviors but can survive, reproduce, carry out tasks, and attack or protect one another when they work in groups or as an entire colony. The overall high intelligence of the group is actually created by the simple acts and moderate local intelligence of each individual.
In the same vein, the project’s goal is to develop very simple, inexpensive robots that individually perform basic behaviors such as capturing images with a camera, moving things around, communicating with nearby peers and doing simple laser scanning. Put together, these robots can perform significant and complex tasks in parallel far more cheaply, quickly and efficiently than one or more very complex and expensive robotic agents.
Returning to the painting example, Sobh continues, “If you want to paint a room, instead of deploying one smart and very expensive robot, you will have fifty or a hundred smaller and very cheap ones, with limited aptitude. In this case, the small robots would each have location sensors, simple communication modules, and vision capability to be able to move away from each other and start painting their little part of the wall in parallel. You will have the entire room painted in a fraction of the time as one robot, and at a much smaller price tag too.”
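The emergent behavior Sobh describes can be illustrated with a toy simulation. This is a hypothetical sketch, not the lab's code: each agent follows one local rule (paint the nearest unpainted strip of wall), with no central planner, and the whole wall is nonetheless covered.

```python
# Toy swarm-painting simulation (hypothetical): many simple agents,
# one local rule each, emergent full coverage of the wall.

def swarm_paint(wall_width, agent_positions):
    """Each agent repeatedly paints the unpainted strip nearest to it."""
    painted = [False] * wall_width
    while not all(painted):
        for pos in agent_positions:
            targets = [i for i, done in enumerate(painted) if not done]
            if not targets:
                break
            # Local rule: claim the closest unpainted strip.
            nearest = min(targets, key=lambda i: abs(i - pos))
            painted[nearest] = True
    return painted

# Fifty cheap agents spread along a 200-strip wall finish the whole job,
# each covering roughly its own neighborhood in parallel.
agents = list(range(0, 200, 4))
result = swarm_paint(200, agents)
print(all(result))
```

Because each pass through the loop lets every agent claim a strip near its own position, the work divides itself across the swarm in the way Sobh describes, in roughly one pass per strip-per-agent rather than one pass per strip.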
The potential applications for swarm intelligence are not only industrial. According to Sobh, swarm intelligence could have significant applications in the defense sector. For example, swarm robots could replace human reconnaissance teams in hostile or unfamiliar locations, observing and examining particular sites or buildings. These smaller, simpler robots would be minimally equipped with simple processors, controllers, and small cameras.
Through ongoing, real-time communication between the robots and a human-staffed “control center,” the swarm could relay critical information, and could also be reconfigured and redirected to different tasks as the situation dictates.
The sustainability of robotic reproduction is also being researched in the RISC lab. In unstructured environments that are inaccessible or harsh for humans, machines are needed not only to perform certain duties but also to build other, specialized machines. If two robots, i.e., “mom and dad,” have access to raw materials like iron, plastic, cameras, sensors and so on, they can use them to create a “baby” robot for specific tasks.
“It is a colony project with the idea to ‘procreate’ within the automation area, where robots with a limited number of tasks and sufficient raw materials can create a sustainable environment that includes assembling and programming other robots to perform new tasks,” says Sobh. To some, this idea might be reminiscent of intelligent machines depicted in futuristic movie classics like The Terminator and The Matrix, but the possibility is on the horizon and has the potential to provide significant support to human endeavors in a variety of environments and for a variety of purposes.
Sobh is just wrapping up a major research project as part of the Applied Nanotechnology Consortium, conducted with Khaled Elleithy, Ph.D., Associate Dean of Graduate Programs and Professor of Computer Science and Engineering, and Hassan Bajwa, Ph.D., Assistant Professor of Electrical Engineering. Following an earlier award for preliminary research partly conducted at UB by Sobh and select faculty, the U.S. Army awarded $2.4 million to the Applied Nanotechnology Consortium, a group comprising UB, the University of Hartford, the University of Connecticut, and area organizations and industry. Dr. Sobh led the UB engineering research team through the 18-month research project designed to develop army drones.
UB faculty were responsible for developing the computer-vision technologies—cameras and the algorithms that process images aboard the projectiles—as well as communications. The Consortium was charged with designing and creating an unmanned device that will carry a video camera over large distances. The aerial drone will give a soldier a means of “seeing” a limited range of landscape that would otherwise be hidden from view. The device will most likely be fired from a tube similar to those used in mortar fire, with images transmitted back in real time so that the viewer can see what the device “sees” during its approximately 40-second journey.
Finally, Sobh is working in the area of sustainability with Elif Kongar, Ph.D., Assistant Professor in the Departments of Mechanical Engineering and Technology Management and an expert in disassembly and green engineering. The idea behind their research is to create an automatic disassembly and recycling system for products at the end of their lives. Sobh contributes autonomous robotic and sensory capabilities that take the “dead” product, disassemble it automatically, repair and/or reuse the good components, and discard the rest.