This story is the fourth in a news series on artificial intelligence and machine learning, published throughout the spring and summer of 2017.
A new academic-industry collaboration at U of T Engineering is harnessing improved sensors and artificial intelligence to make electric wheelchairs self-driving. The technology could greatly simplify the lives of more than 5 million power wheelchair users across North America, and millions more worldwide.
Since electric wheelchairs were pioneered by inventors such as George Klein (MechE 2T8) in the 1950s, the fundamental technology has remained much the same. Most are controlled by joysticks, which may seem simple to use, but can be frustratingly cumbersome for many everyday tasks — from docking at a desk to traversing a narrow door frame.
“Imagine parking a car in a tight space using only a tiny joystick,” says Professor Jonathan Kelly (UTIAS), who is leading the new collaboration. “That would be annoying for anyone.”
The problem is compounded for users with multiple sclerosis, amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease) or spinal cord injuries, who often lack upper body control, and for those with Parkinson’s disease, who often have hand tremors. Some of these users rely on sip-and-puff (SNP) controllers, which accept commands as sips or puffs of air through a plastic straw. These are an alternative to joysticks, but they can make complex tasks even more overwhelming.
Robotic automation could address those challenges. Several groups around the world are working on self-driving wheelchairs, but most rely on high-end sensors that are priced out of reach of a typical consumer. Kelly, an expert in robotic sensing and perception, believes that the task could be accomplished for much less, thanks to a recent explosion in mass-produced sensor technology.
He points to the Microsoft Kinect, which contains both a visible-spectrum camera and an infrared laser to detect distances. “Sensors like that used to cost thousands of dollars, but now you can buy them for less than $200,” says Kelly. “It has been a game-changer for robotics.”
Automation could also help with tasks that are less complex, but more routine. For example, an autonomous wheelchair could use its sensors to map a space and tag key locations, such as the kitchen or bedroom. The user could then navigate to those locations with a single command.
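To make that idea concrete, here is a minimal sketch of the named-location concept described above. It is a hypothetical illustration only, not the team’s actual software, and all names and values in it are invented.

```python
# Minimal sketch of the "named locations" idea (hypothetical; not the team's
# actual software). The wheelchair tags poses on its map with memorable names,
# and the user later navigates to one of them with a single command.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # metres, in the map frame
    y: float        # metres, in the map frame
    heading: float  # radians

class WaypointNavigator:
    def __init__(self):
        self.locations: dict[str, Pose] = {}

    def tag(self, name: str, pose: Pose) -> None:
        """Save the chair's current pose under a memorable name."""
        self.locations[name] = pose

    def go_to(self, name: str) -> None:
        """Single-command navigation: hand the saved pose to a planner."""
        goal = self.locations[name]
        # A real system would pass `goal` to a path planner and motor controller.
        print(f"Navigating to '{name}' at ({goal.x:.1f}, {goal.y:.1f})")

nav = WaypointNavigator()
nav.tag("kitchen", Pose(4.2, 1.5, 0.0))
nav.go_to("kitchen")
```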
Two years ago, Kelly was approached by Vivek Burhanpurkar, the CEO of Cyberworks Robotics, Inc. The company has a long history in autonomous robotics, including self-driving industrial cleaning machines, but Burhanpurkar saw an opportunity to move into assistive devices.
“It’s only in the past five years that we’ve reached a critical inflection point, allowing us to achieve unprecedented levels of autonomous behaviours at consumer-level price points,” says Burhanpurkar, who started Cyberworks after studying under U of T Engineering Professor Emeritus K.C. Smith (ECE). “Jonathan’s group was a natural partner for us because they have the same set of altruistic values and goals as we do. We share a common vision for the future, which is a rare thing between academia and industry.”
Rather than designing a new chair from scratch, the team — composed of Cyberworks engineers and U of T Engineering researchers — focused on retrofitting existing chairs using inexpensive sensors, controllers and a small computer. Over the past two years, the team wrote software and developed algorithms to deal with many common situations, including driving down narrow corridors and avoiding obstacles. Other key collaborators on the project were François Michaud and his team at Université de Sherbrooke, who came on board in 2016.
https://www.youtube.com/watch?v=M4n6WedNzl0
See the self-driving electric wheelchair in action in this video from Cyberworks Robotics, Inc.
Rapid developments in sensor technology make this an ideal platform for aspiring engineers to gain research experience: much of the work on the U of T Engineering side has been carried out by undergraduates. The team also includes Maya Burhanpurkar, Vivek’s daughter, who joined last summer and is working on the project during a gap year before starting her undergraduate degree at Harvard next fall.
At 18, Burhanpurkar is already an accomplished researcher: she twice won the Platinum Award at the Canada-Wide Science Fair and is currently conducting cosmology research at the Perimeter Institute for Theoretical Physics. “I was blown away by how much she had accomplished,” says Kelly. “I felt that the door-traversing algorithm would be a great project for Maya to dig into.”
Watch a segment about Maya Burhanpurkar’s work on CBC’s “We Are Canada”
During her four months on the project, Maya Burhanpurkar logged many late nights in the lab — in part because she was tackling a big challenge on a tight deadline, but also to ensure that the doorways she was navigating would be clear of people.
“I would come at around 8 or 9 p.m., and then work all night,” she says. “I saw the sun rise on many occasions. I put a lot of labour into that thing.” In the end, Burhanpurkar created an algorithm that enables the wheelchair to traverse a doorway with as little as 6.5 centimetres of clearance on either side. This month, she will be presenting her work at the International Conference on Rehabilitation Robotics in London, U.K.
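For readers curious how such a manoeuvre might work in principle, the following is an illustrative sketch only. It is not Burhanpurkar’s published algorithm, and the gap threshold and control gain are invented values.

```python
# Illustrative sketch only (not the published door-traversal algorithm):
# steer the chair toward the midpoint of a detected doorway, and refuse
# gaps that are too narrow. All thresholds here are invented for the example.

def doorway_steering(left_post_y, right_post_y, chair_y,
                     min_gap=0.60, gain=1.5):
    """
    left_post_y, right_post_y: lateral positions of the two door posts (metres),
        e.g. estimated from depth-camera data.
    chair_y: the chair's current lateral position in the same frame.
    Returns a steering rate (rad/s), or None if the gap is too narrow to attempt.
    """
    gap = abs(right_post_y - left_post_y)
    if gap < min_gap:           # e.g. chair width plus a few cm of clearance
        return None
    midpoint = 0.5 * (left_post_y + right_post_y)
    error = midpoint - chair_y  # lateral offset from the doorway centre line
    return gain * error         # proportional correction toward the centre

# Door posts at 0.0 m and 0.78 m; the chair sits 5 cm off the centre line.
print(doorway_steering(0.0, 0.78, 0.34))  # -> 0.075 (gentle correction)
```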
Kelly says that the next step will be to test the wheelchair in controlled environments under the supervision of occupational therapists. He has lined up potential collaborators and is currently applying for ethics approval and funding.
“Once we have the user study data, the product is essentially ready for commercialization,” he says. “It wasn’t always easy, but I’ve been really surprised to see how far we’ve come in two years. We’ve had so many talented people working on the project, and now when I see it operating it always brings a smile to my face. I’m super excited about it now.”
Learn more about the largest and most diverse robotics program in Canada