In a research building in the heart of UConn’s Storrs campus, assistant professor Ashwin Dani is teaching a life-size industrial robot how to think.
Here, on a recent day inside the University’s Robotics and Controls Lab, Dani and a small team of graduate students are showing the humanoid bot how to assemble a simple desk drawer.
The “eyes” on the robot’s face screen look on as two students build the wooden drawer, reaching for different tools on a tabletop while they work together to complete the task.
The robot may not appear intently engaged. But it isn’t missing a thing – or at least that’s what the scientists hope. Inside the robot’s circuitry, its processors are capturing and cataloging all of the humans’ movements through an advanced camera lens and motion sensors embedded in its metallic frame.
Ultimately, the UConn scientists hope to develop software that will teach industrial robots how to use their sensory inputs to quickly “learn” the various steps for a manufacturing task – such as assembling a drawer or a circuit board – simply by watching their human counterparts do it first.