Can you really collaborate with a robot? This PhD grad is working on it

Adam Parker uses AI to teach a prosthetic device to give real-time feedback to its user as human and machine learn tasks together.

For his doctoral research in rehabilitation science, Adam Parker uses AI to teach a prosthetic limb to give real-time feedback to its user, allowing true collaboration between human and machine. (Photo: Alex Pugliese)

Adam Parker had a perfect 4.0 grade point average when he transferred from NAIT to the University of Alberta for his undergraduate degree in electrical engineering. But he wasn’t always an A student.

In fact, as a kid, “I mostly couldn’t be bothered with academics. I just wanted to play,” he says. “And so I would put in the bare minimum effort required to get through school and do as much playing and video games and hanging out with my friends as I could.”

Parker’s favourite game as a teen was a tabletop fantasy role-playing game called Shadowrun that involves heroes with bionic limbs battling evil tycoons in a dystopian near-future.

That’s why it seemed like a perfect fit when he met medicine professor Patrick Pilarski and learned about the Bionic Limbs for Improved Natural Control (BLINC) Lab. Parker spent the next 11 years “playing” with bionic limbs, first as an undergraduate summer researcher, and ultimately earning himself a PhD in rehabilitation science through the Faculty of Rehabilitation Medicine, to be awarded during fall convocation next week.

Parker’s research findings flip our conventional understanding of the role of artificial intelligence in developing bionic limbs: Instead of focusing on how AI can be used to teach the artificial limb its moves, Parker is developing and testing ways for limbs to learn from and give feedback to individual users in real time … in other words, to collaborate with their human users.

“Adam has changed my thinking — and that of our international community — as to how intelligent machines can be better partners and collaborators,” explains Pilarski, who is a Canada CIFAR Artificial Intelligence Chair, past Canada Research Chair in Machine Intelligence for Rehabilitation, professor in the Division of Physical Medicine and Rehabilitation and adjunct professor in the Faculty of Rehabilitation Medicine, and an Alberta Machine Intelligence Institute (Amii) fellow.

“Thanks to Adam’s research, the fields of rehabilitation medicine and assistive technology have been able to appreciate how machine learning might impact the continually evolving relationship between humans and machines that are tightly coupled to the user’s activities of daily living.”

A partner rather than just a tool

The central experiment in Parker’s thesis was entitled “Understanding Human Interaction With Real-Time Adaptive Feedback During Simulated Prosthesis Use.” He asked two groups of able-bodied participants to use a physical simulation of an arm prosthesis to move a flexible plastic cup without crushing it. Powered by electric motors, the prosthesis is controlled by the user flexing their forearm muscles, sending signals via electrodes attached to the surface of their skin. None of the participants was an expert user, so it was a difficult task to accomplish.

All 16 participants first performed the cup-moving task on their own, without any feedback from the prosthesis, simply using their own senses — what they could see and hear — to guide them. This is how commercially available prostheses currently work for people with amputated limbs.

Half of the participants then performed the task while getting direct feedback from the arm. It was programmed to make a beeping sound whenever a certain level of force — enough to crush the cup — was exceeded.

The second group of participants worked with an arm guided by a prediction learning algorithm. It didn't know the load at which the cup would crush until it had experienced a crush once. From then on, it helped the user by sounding an audible warning before they reached that critical load.
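The adaptive behavior described above — give no help at first, then warn before a learned critical load — can be sketched in a few lines. This is a hypothetical illustration only, not the BLINC Lab's actual algorithm (Parker's system uses prediction learning over real-time sensor signals); the class name, the fixed warning margin and the force values are all assumptions.

```python
class AdaptiveForceWarning:
    """Hypothetical sketch: learn a crush threshold from experience,
    then warn the user before they reach it."""

    def __init__(self, margin=0.8):
        self.margin = margin        # warn at this fraction of the crush load
        self.crush_force = None     # unknown until a crush is observed

    def observe_crush(self, force):
        # Remember the lowest force at which a cup has crushed
        if self.crush_force is None or force < self.crush_force:
            self.crush_force = force

    def should_warn(self, current_force):
        # No help at all until the system has experienced a crush once
        if self.crush_force is None:
            return False
        # Warn before the user reaches the critical load
        return current_force >= self.margin * self.crush_force
```

Before the first crush, `should_warn` always returns `False` — matching the participants' experience of a system that is useless at first but improves once it has learned from a mistake.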

There were a lot of crushed cups at the end of the experiment, but accomplishing the task wasn’t really the point. Parker used a mixed methods approach to analyze his results, including quantitative data from the experiments and qualitative data from interviews about the participants’ experiences.

Overall, the predictive learning arm didn't guide users as well as the direct-feedback arm, but users still preferred it. Parker suspects that is because while the predictive arm gave no help at all at the beginning of the task, it improved over time, and that gave the participants confidence.

“They focused on its growth, the fact that it was improving,” Parker reports. “They were more invested in the system and their perception was more positive because they were seeing this system start to get better at helping them accomplish the task.”

My goal is not to make AI function like a human. I prefer making them better at interacting with humans rather than imitating humans.

Adam Parker

He’s not ready to start using words like “empathy” and “trust” to describe the relationship between human and machine, but Parker believes that exploring the disconnect between performance metrics and user experience is key to developing prostheses that are literally more responsive to users’ needs.

“A potential path towards improving user interactions with prosthetic limbs in the current age of artificial intelligence is to view the device not as a tool being used, but as a partner assisting the user in their daily life,” he concludes.

Parker hopes to take his research further by exploring whether the robotic arm can learn body language — for example, making itself harder to move when danger is ahead — instead of just using sounds. He's interested in this because humans working together are known to rely heavily on such non-auditory cues.

“My goal is not to make AI function like a human, though,” he insists. “I prefer making them better at interacting with humans rather than imitating humans.”

Pilarski agrees with that goal. “Adam can change the way we as humans relate to the next generation of artificial intelligence technologies,” he says. “His skills, perspectives and experience will set the stage for a new era of human-aligned intelligent agents that amplify the best of both human and machine capabilities.”

A circuitous route

Parker, now 40, took an unusual route to get to this point in his academic journey. After high school near Calgary, he took a year off, then did some upgrading and got into aircraft maintenance at SAIT. He worked for a while for WestJet repairing airplanes, then went to the University of Calgary but was asked to leave because of poor grades.

More upgrading got him into NAIT in electronics engineering technology, where he excelled, ultimately transferring into electrical engineering at the U of A. Parker earned his BSc in 2017 and went straight into the PhD program.

Parker found Pilarski’s lab at an event for the U of A’s Undergraduate Research Initiative, which connects students interested in research with professors who are looking for creative junior scholars to contribute to their laboratories. Parker’s unique set of skills meant he was perfectly positioned to take on the work. 

“A lot of students say, ‘I have a research idea. How do I find a prof to do my work?’ whereas I approached Patrick and said, ‘What you do is really cool. Here’s my background. Do you have a use for me?’”

Pilarski credits the program with helping both his and Parker’s research leap forward.

“I would not have had the privilege of working with Adam for more than a decade without early support from the Undergraduate Research Initiative,” he notes. “The U of A’s sustained enthusiasm for and concrete resources in service of early-career researchers have made it possible for scholars like Adam to build and grow their careers and their global societal impact.”

In return, Parker has done a lot of community outreach to share the research at public events such as Nerd Nite and Dark Matters at the Telus World of Science. A highlight was playing guitar with a bionic arm as part of an Amii event at the Junos in 2023.

"A potential path towards improving user interactions with prosthetic limbs in the current age of artificial intelligence is to view the device not as a tool being used, but as a partner assisting the user in their daily life,” says Parker. (Photo: Amii/Chris Onciul)


Parker isn’t sure whether he will stay in academia to do research and teach, or eventually leave to work as an AI consultant. For now, he will continue his research on ways that AI can help people with medical challenges as a postdoctoral fellow in computing science, working with professor Matt Taylor.

No matter what path he takes, he will continue to tap into the well of creativity he first discovered in childhood. “Even now I describe what I do as playing,” says Parker. “I get to play with robots to try and get them to move.”

Parker’s research was funded by the Undergraduate Research Initiative Support Fund, Natural Sciences and Engineering Research Council of Canada, Alberta Innovates, the Sensory Motor Adaptive Rehabilitation Technology (SMART) Network, the Alberta Machine Intelligence Institute, and the Canada CIFAR AI Chairs program. Parker also received accommodations funding as a student with a diagnosis of a learning disability.