
From bionic arms to predicting patient surges in the ER, AI is reshaping patient care

Michael Rory Dawson sets up a task for research participant Chris Neilson to perform with the Bento Arm, while Patrick Pilarski gives it a cup to hold at BLINC lab in Edmonton, Alberta, on March 14, 2019. File photo by The Canadian Press/Amber Bracken


Chris Neilson is on his second prosthetic after losing his left arm above the elbow in a nasty work accident about six years ago. Partly myoelectric, the artificial hand and arm is a step up, cosmetically and functionally, from his first body-powered device, which featured a hook and claw.

But the 33-year-old mining industry worker is looking ahead, helping researchers at the University of Alberta test a new generation of prosthetic — an experimental bionic arm that can "learn" to adapt and anticipate an amputee's movements, employing the power of artificial intelligence.

It's just one example of how AI, or more specifically machine learning, is beginning to transform health care, propelling what was once the stuff of science fiction solidly into the realm of reality.

At the Edmonton university's Bionic Limbs for Improved Natural Control (BLINC) lab, co-director Patrick Pilarski and his research team are developing an artificial arm that uses AI/machine learning to blur the lines between the person who "needs assistive technology and the assistive technology itself."

"They may have lost a limb through injury or illness and the technology itself is in some ways trying to replace that body part," says Pilarski, Canada research chair in machine intelligence for rehabilitation.

"So when we try to build better bionic limbs, we're really looking at how do we understand the signals that a human is giving to their technology? How does that technology interpret those signals to actually do the thing they want — to pick up a cup of coffee or grab a pen or hold a hand of a loved one?

"And then how does that device give information back to the person, so they can better carry on the tasks of daily life?"

Artificial arms, reminiscent of the android Sonny's appendages in the 2004 film "I, Robot," are being combined with the use of machine learning software, which picks up an amputee's repetitive patterns of movement, then begins incorporating those motions into the way the wearer controls the prosthetic.

With Neilson's current artificial limb, he operates the hand's open-close and grip functions by contracting bicep or tricep muscles. Sensors measure electrical changes in the skin, sending signals to trigger movements in the hand.
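The control scheme described above can be sketched in a few lines. This is a hypothetical illustration of threshold-based myoelectric control, not the BLINC lab's actual software; the function names, signal values and thresholds are all invented for the example.

```python
# Hypothetical sketch of threshold-based myoelectric control: surface
# sensors report normalized muscle-activation levels, and a reading that
# crosses a threshold triggers a hand function. All values illustrative.

OPEN_THRESHOLD = 0.6   # bicep activation needed to open the hand
CLOSE_THRESHOLD = 0.6  # tricep activation needed to close it

def interpret_emg(bicep_level: float, tricep_level: float) -> str:
    """Map a pair of muscle-activation readings to a hand command."""
    if bicep_level >= OPEN_THRESHOLD and bicep_level > tricep_level:
        return "open"
    if tricep_level >= CLOSE_THRESHOLD:
        return "close"
    return "hold"  # no strong contraction: keep the current grip

print(interpret_emg(0.8, 0.1))  # a strong bicep contraction: open
print(interpret_emg(0.1, 0.9))  # a strong tricep contraction: close
```

In a real device the raw electrical signal is filtered and calibrated per wearer before any thresholding, but the basic contraction-to-command mapping is the part Neilson operates by flexing his bicep or tricep.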

But with the BLINC lab's prototype, "I could switch between say the elbow, between a wrist flexion or a wrist rotation to the open and closing of the hand to maybe even changing the grips," says Neilson of Leduc, Alta.

"And what this software did was, if you had a pattern (of movement) ... it picked up on which function you were prioritizing and then it would skip the rest after a while."
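The adaptive switching Neilson describes can be sketched as simple usage counting: the wearer cycles through functions, the software tracks which ones are actually used, and the cycle is re-ordered so favoured functions come up first. This is an assumed, simplified illustration; the function names and the counting scheme are invented, not taken from the prototype.

```python
from collections import Counter

# Hypothetical sketch of adaptive function switching: count how often
# the wearer selects each function, then surface the most-used ones
# first when cycling. Function names are illustrative.

FUNCTIONS = ["elbow", "wrist_flex", "wrist_rotate",
             "hand_open_close", "grip_change"]

usage = Counter()

def record_use(function: str) -> None:
    """Note that the wearer selected this function."""
    usage[function] += 1

def switch_order() -> list:
    """Most-used functions first; ties keep the default order."""
    return sorted(FUNCTIONS, key=lambda f: (-usage[f], FUNCTIONS.index(f)))

# After a session dominated by opening and closing the hand...
for f in ["hand_open_close", "hand_open_close",
          "wrist_rotate", "hand_open_close"]:
    record_use(f)

print(switch_order()[0])  # hand_open_close now leads the cycle
```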

For now, the AI-enabled artificial limb remains in the testing stage, but the ultimate goal is to produce a refined prosthetic that amputees can wear in their daily lives, says Pilarski, a fellow of the Alberta Machine Intelligence Institute, or Amii.

Calling manual control of an artificial limb a painstaking process, he explains that the idea of an AI-enhanced bionic arm is to remove some of the burden from the wearer by making complex hand movements more automatic.

"It means that if someone is working very hard to give all these right signals to their prosthetic limb, such that it makes all the right motions, if the prosthetic limb can anticipate where they're reaching or how they want to grab the object, then it can take away some of that overhead for the person.

"The person can have a much more natural, intuitive and, in some cases, much more efficient interaction with their device because the device is filling in the gaps for them."

Across the country, efficiency is also partly at the heart of more than a dozen AI-driven projects being developed at St. Michael's Hospital in downtown Toronto, which serves a large inner-city population and is one of Ontario's designated regional trauma centres.

Its Centre for Healthcare Analytics Research and Training (CHART) was created to design and implement innovative programs using AI/machine learning to streamline certain hospital systems and to improve care, from decreasing emergency department wait times to predicting which patients could take a turn for the worse — and when.

CHART director Dr. Muhammad Mamdani says that about twice a year, the emergency department experiences an upsurge in patients coming through its doors, leaving staff scrambling and creating "ridiculously high" wait times of eight to 16 hours. Mini-surges, which he describes as "pretty bad," also occur a couple of times each month.

So emergency department managers approached the centre to see if a program could be developed to predict patient volume in advance, so more nurses could be brought on shift or the physician schedule rejigged.

Mamdani says CHART staff looked at three years of historical data to identify ED usage patterns, then added in environmental data: Would it snow tomorrow night? Were the NBA's Raptors playing? Was there a marathon in the city's core?

All the data was fed in to create an algorithm — a set of rules telling a computer how to perform a task — using a combination of methods that included machine learning, he says. "And we found that we could predict with well over 90 per cent accuracy our patient volumes ... in six-hour intervals, three days in advance.

"So for example, if today's Monday, we can tell you that on Wednesday from noon to 6 p.m., we'll have 82 patients showing up to our emergency department. We'll be able to tell you that about 10 of them will have mental-health issues, 12 will be fairly high-intensity cases (heart attack, trauma) and the rest of them will be probably low- to moderate-intensity."
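The kind of feature table behind such a forecast can be sketched as a historical baseline per six-hour interval, adjusted by environmental signals like snowfall or a downtown event. This is a deliberately toy illustration of the idea; CHART's actual model combines machine learning methods, and every number and adjustment weight below is invented.

```python
# Hypothetical sketch of baseline-plus-signals forecasting for ED
# volumes. Real systems learn these weights from data; here they are
# hand-picked purely for illustration.

BASELINE = {("Wednesday", "12:00-18:00"): 75}  # mean arrivals in past data

ADJUSTMENTS = {
    "snow_forecast": -5,    # heavy snow tends to keep some patients home
    "downtown_event": +12,  # marathons and games bring extra visitors
}

def forecast(day: str, interval: str, signals: dict) -> int:
    """Predict arrivals for one six-hour interval on one day."""
    volume = BASELINE[(day, interval)]
    for signal, active in signals.items():
        if active:
            volume += ADJUSTMENTS[signal]
    return volume

print(forecast("Wednesday", "12:00-18:00",
               {"snow_forecast": False, "downtown_event": True}))  # 87
```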

The beauty of the program is that it's all automated, with pertinent data grabbed each night and the forecasted ED volume sent out within seconds to department managers.

CHART also addressed a concern of internal medicine specialists, who know a certain proportion of their very sick patients will either die or need transfer to the ICU — but clinicians are unable to predict which individuals are at risk.

"What they told us was that by the time they realized that these patients are deteriorating, they have on average three hours to react," said Mamdani. "They also said there are care pathways that would reduce the risk of cardiac arrest and death by 40 to 50 per cent, 'but we need about six to 13 hours to implement them.'"

So his team developed a deep-learning algorithm that pulls all kinds of data from patient records, including lab test results, medications, clinical orders and more.

"We've got it reading text notes, so it looks for word patterns from what the nurses wrote last night, what the cardiologist put in his or her notes that morning," he said. "And it will predict 12 to 24 hours in advance if a patient is going to die or go to the ICU.

"As soon as it reaches a certain threshold, it talks to our paging system and it alerts the medical team to go see that patient within the next two hours, because we think something bad is going to happen."
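The alerting step Mamdani describes reduces to a threshold check wired to the paging system: a model scores each patient's risk of deterioration, and crossing the threshold triggers a page. The sketch below assumes a risk score between 0 and 1; the threshold, patient IDs and message wording are invented, and the real scoring comes from the deep-learning model described above.

```python
# Hypothetical sketch of threshold-based alerting: when a patient's
# predicted deterioration risk crosses the threshold, the care team is
# paged to see them within two hours. All values illustrative.

ALERT_THRESHOLD = 0.8

def check_patient(patient_id: str, risk_score: float,
                  pager_log: list) -> None:
    """Page the medical team when predicted risk crosses the threshold."""
    if risk_score >= ALERT_THRESHOLD:
        pager_log.append(f"ALERT {patient_id}: see patient within 2 hours")

pages = []
check_patient("A-102", 0.65, pages)  # below threshold: no page sent
check_patient("B-417", 0.91, pages)  # above threshold: team is paged
print(pages)
```

Tuning that threshold is the hard part in practice: too low and the team is flooded with false alarms, too high and the six-to-13-hour window needed to implement the protective care pathways is lost.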

The technology behind the program may seem complex, but the goal is simple, says Mamdani.

"We're hoping this will actually save lives."
