Through the years, the relationship we’ve had with humanoid robots has been fairly peripheral, kitschy and glamorized (or demonized) by science fiction.
Outside of conjured images of Star Wars’ protocol droids or the Asimov film adaptation “I, Robot,” real-life humans have thus far only been able to interact with robots through simple artificial intelligence or programmed scripts.
Concurrent Technologies Corp. is trying to take that relationship into the realm of the future through research into neurotechnology and motion capture – essentially designing and programming robots that can speak body language and read thoughts.
They call it “defining the art of the possible.”
Brett Wilmotte, information fusion and visualization director, said CTC is looking for ways to “improve the human condition” through these cold, bipedal servants.
“The focus is how do we make these robots more natural to use?” he said. “A keyboard and a mouse; we’re all used to using that, but it’s not the most natural way of controlling things.”
Using an independent research fund, CTC's UPJ interns Zachary Weaver and Jesse Davis – along with Mount Aloysius intern Justin Hoberney and Penn State Altoona intern Ryan Romano – came up with motion-control programs for a roughly one-foot-tall robot called a "bioloid."
They named it “Paul.”
Think of it as a highly sophisticated Lego set that is operated using a fairly unsophisticated tech: Microsoft’s motion-capturing Kinect peripheral for the Xbox 360.
Motion capture is a well-traveled technology – video games make good use of it – but traditional rigs are too elaborate for the everyman's needs or abilities. And hundreds of reflective balls attached to a wet suit can be cumbersome. Kinect doesn't require any of that, and you can pick one up at Walmart for about $150.
“With a couple Kinects, you can pretty much get just about the same fidelity (as full-on motion-capture technology) for $300,” said Executive Director Alan Hoberney. “The cool thing about (Kinect) is it’s cheap.”
Using the Microsoft Kinect software development kit, the intern team is able to show Paul how it should move based on how its human masters are moving. But while Paul can join you in raising the roof, its articulation is still in an early phase.
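The idea behind that kind of motion mimicry can be sketched in a few lines of code: compute a joint angle (say, the elbow) from the three tracked skeleton points around it, then map that angle onto a servo position. The function names, the 0–300 degree range, and the 10-bit servo scale below are illustrative assumptions (loosely modeled on the Bioloid kit's hobby servos), not CTC's or Microsoft's actual code.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by 3-D points a-b-c --
    e.g. the elbow angle from tracked shoulder, elbow, and wrist positions."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    mag = math.dist(a, b) * math.dist(c, b)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def angle_to_servo(angle_deg, servo_min=0, servo_max=1023):
    """Map a 0-300 degree joint angle onto a hypothetical 10-bit
    servo position range (hobby servos like the Bioloid's cover
    roughly 300 degrees across positions 0-1023)."""
    angle_deg = max(0.0, min(300.0, angle_deg))
    return round(servo_min + (angle_deg / 300.0) * (servo_max - servo_min))

# Example: an arm held straight out, so shoulder, elbow, and wrist
# are collinear and the elbow angle is 180 degrees.
shoulder, elbow, wrist = (0, 0, 0), (0.3, 0, 0), (0.6, 0, 0)
angle = joint_angle(shoulder, elbow, wrist)
print(angle_to_servo(angle))  # straight arm maps near mid-range
```

In a live loop, the skeleton points would come from the Kinect SDK's tracking stream many times per second, and the computed servo position would be sent to the robot's corresponding joint – which is why Paul can only follow "basic joint movements" at this stage.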
“This is ‘low-end,’ ” said Wilmotte. “It only has basic joint movements.”
But Wilmotte said he wonders what could be seven or 10 years down the road.
Enter the Emotiv neural headset. It is the closest anyone will come to feeling like a telepath. The noninvasive headgear measures and processes the neural activity of the user, making it translatable and applicable in many ways – like controlling a robot.
“It’s probably one of the coolest things I’m going to do in a long time,” said Weaver, whose research team worked closely with the Emotiv team on adapting the mind-control tech for Paul.
Sitting with his hands folded, Davis used a single thought to raise Paul's arms almost instantaneously.
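Under the hood, headsets like Emotiv's typically report a detected mental command alongside a confidence score, and the controlling software only acts when that confidence clears a threshold – otherwise stray thoughts would twitch the robot constantly. The sketch below illustrates that gating idea; the command names, threshold, and function are hypothetical, not Emotiv's actual API or CTC's implementation.

```python
# Map a headset's detected mental command to a robot action,
# but only when the classifier is confident enough.
ACTIONS = {
    "lift": "raise_arms",   # hypothetical trained command -> robot motion
    "neutral": None,        # baseline state: do nothing
}

def command_for(detection, confidence, threshold=0.7):
    """Return a robot action for a detected thought, or None if the
    classifier's confidence is below the acceptance threshold."""
    if confidence < threshold:
        return None
    return ACTIONS.get(detection)

print(command_for("lift", 0.9))  # confident "lift" thought -> raise_arms
print(command_for("lift", 0.4))  # too uncertain -> ignored (None)
```

The threshold is the key design choice: set it too low and noise moves the robot; set it too high and deliberate thoughts get ignored.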
“Imagine what you could do with more sophisticated hardware or software,” Wilmotte said. And CTC speculates it could be consumer affordable in five to 10 years.