Voice-Controlled Movements

Five Northeastern University engineering seniors have developed an innovative, voice-operated wheelchair that can navigate through a cluttered room, move alongside walls and detect stairwells and other obstacles in its path — a project, they say, that is designed to enhance the lives of people with a range of physical impairments.

Their work took first place in the Electrical and Computer Engineering Department’s capstone presentations last month. The student team comprises Barry Briggs, Dana Lopes, Shane Mulkerrin, Sam Bennett and Thayne Henry, with professor Bahram Shafai serving as their faculty advisor.

“Just knowing that we were working on something that could really make a difference for someone one day helped to boost our determination,” Briggs said.

The user would first memorize a short list of basic voice commands that move the wheelchair in different directions, stop its movement, and activate and deactivate the system. Speak into a headset, and off goes the wheelchair, obeying the user’s commands via the voice software.
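As a rough illustration of how such a command vocabulary could be handled in software, the C sketch below maps recognized phrases to motion commands. The phrase list, enum values and function names are hypothetical stand-ins, not the team's actual code.

    /* Hypothetical mapping from recognized voice phrases to wheelchair
     * commands; phrase wording and command names are illustrative only. */
    #include <string.h>

    typedef enum { CMD_NONE, CMD_FORWARD, CMD_BACKWARD, CMD_LEFT,
                   CMD_RIGHT, CMD_STOP, CMD_ACTIVATE, CMD_DEACTIVATE } Command;

    typedef struct { const char *phrase; Command cmd; } VoiceEntry;

    static const VoiceEntry voice_table[] = {
        { "forward",  CMD_FORWARD    },
        { "back",     CMD_BACKWARD   },
        { "left",     CMD_LEFT       },
        { "right",    CMD_RIGHT      },
        { "stop",     CMD_STOP       },
        { "wake up",  CMD_ACTIVATE   },
        { "sleep",    CMD_DEACTIVATE },
    };

    /* Look up the phrase returned by the speech recognizer; unrecognized
     * speech maps to CMD_NONE and is simply ignored. */
    Command lookup_command(const char *phrase)
    {
        for (size_t i = 0; i < sizeof voice_table / sizeof voice_table[0]; i++)
            if (strcmp(phrase, voice_table[i].phrase) == 0)
                return voice_table[i].cmd;
        return CMD_NONE;
    }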

A critical component of the students’ design is an integrated network of ultrasonic distance sensors that detects obstacles ahead and to the side of the wheelchair. The sensors can also determine whether the wheelchair is approaching a significant drop ahead — a set of stairs, for instance. As a safety measure, the system would override user commands to avoid such hazards.
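To make that override idea concrete, here is a minimal C sketch of the kind of check involved. The range-reading helpers and threshold values are assumptions for illustration, not details taken from the students' design.

    /* Assumed helpers: front-facing and downward-facing ultrasonic ranges
     * in centimeters, provided by the sensor network. */
    int front_range_cm(void);
    int down_range_cm(void);

    #define OBSTACLE_LIMIT_CM  40  /* stop if something is closer than this   */
    #define FLOOR_EXPECTED_CM  10  /* nominal sensor-to-floor distance        */
    #define DROP_MARGIN_CM     15  /* a larger gap suggests stairs or a ledge */

    /* Return 1 if forward motion must be overridden for safety. */
    int forward_blocked(void)
    {
        if (front_range_cm() < OBSTACLE_LIMIT_CM)
            return 1;  /* obstacle ahead */
        if (down_range_cm() > FLOOR_EXPECTED_CM + DROP_MARGIN_CM)
            return 1;  /* floor falls away: likely a drop-off */
        return 0;
    }

In a check like this, the safety result takes priority over whatever the user says, which matches the override behavior the students describe.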

Under the seat is a universal control unit — what the students call “the heart of the system” — that communicates with the sensor array and processes and sends commands to the motor controller that moves the wheelchair. Meanwhile, a display panel extending from the right armrest shows users their current direction of travel and whether they are approaching any barriers.
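One way to picture the control unit's role is as a simple polling loop that ties these pieces together. The C sketch below is an assumed outline only; it reuses the hypothetical command type and safety check from the earlier sketches and declares placeholder motor, display and timing helpers.

    typedef enum { CMD_NONE, CMD_FORWARD, CMD_BACKWARD, CMD_LEFT,
                   CMD_RIGHT, CMD_STOP, CMD_ACTIVATE, CMD_DEACTIVATE } Command;

    Command     lookup_command(const char *phrase);   /* voice sketch above    */
    const char *next_recognized_phrase(void);         /* assumed speech output */
    int         forward_blocked(void);                /* sensor sketch above   */
    void        send_to_motor_controller(Command c);  /* assumed motor driver  */
    void        update_display(Command c, int warn);  /* assumed armrest panel */
    void        sleep_ms(int ms);                     /* assumed delay helper  */

    /* Poll speech input, let the safety check override it, then drive the
     * motors and refresh the armrest display. */
    void control_loop(void)
    {
        for (;;) {
            Command cmd = lookup_command(next_recognized_phrase());
            int blocked = forward_blocked();

            if (cmd == CMD_FORWARD && blocked)
                cmd = CMD_STOP;              /* safety override wins */

            send_to_motor_controller(cmd);
            update_display(cmd, blocked);    /* direction + barrier warning */
            sleep_ms(50);                    /* run at roughly 20 Hz */
        }
    }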

The project demonstrates Northeastern’s commitment to use-inspired research, particularly in the University’s top research themes of health, sustainability and security.

While the students based their capstone project on the voice-command control system, they said their universal design would make the wheelchair easy to use and adaptable to other command technologies, such as brain-computer interfaces, eye-tracking, mouth joysticks and hand movement.

“The voice interface is an extremely robust system, and this is a flexible medium that can be extremely useful for the physically impaired,” Shafai said. “The students did a fantastic job.”

The students said their co-op experiences also played a pivotal role in developing the project, from the hands-on skills used to solder the sensors to the wheelchair to the software engineering experience Lopes gained working in robotics at QinetiQ North America.

