Google is testing an AI-powered app that helps vision-impaired people run without assistance

Google is testing a new app that will allow blind people to run on their own without a guide dog or human assistant. 

Project Guideline uses a phone’s camera to track a guideline on a course and then sends audio cues to the user via bone-conducting headphones.

If the runner strays too far from the center, the sound will get louder on whichever side they’re favoring.

Still in the prototype phase, Project Guideline was developed at a Google hackathon last year when a blind runner asked developers to design a program that would allow him to jog independently.


The app uses a phone’s camera to track a painted line and then sends audio cues via bone-conducting headphones if a runner strays too far to the left or right

Thomas Panek, CEO of Guiding Eyes for the Blind, began losing his vision when he was just 8 years old and was legally blind by the time he was a teenager.

While Panek remained active and independent, he had to give up running, one of his passions.

Eventually he heard about running with human guides, who are tethered in front of a vision-impaired runner.

‘I even qualified for the New York City and Boston Marathons five years in a row,’ he wrote in a Google blog post. ‘But as grateful as I was to my human guides, I wanted more independence.’

Thomas Panek, who is blind, tasked Google with creating an app to help him to run independently. Here he uses the Project Guideline app via a phone attached to his harness

At a Google hackathon in fall 2019, Panek asked designers if they could devise technology to help a blind person run independently.

He didn’t expect much, he admitted, but by the end of the day they had designed a demo that allowed a phone to recognize a line taped to the ground and give audio cues.

Eventually a more sophisticated prototype was produced: The camera on a phone attached to a harness Panek wears uses AI to look for a marker on the ground and sends audio signals to him depending on his position.

‘If I drifted to the left of the line, the sound would get louder and more dissonant in my left ear,’ he said. ‘If I drifted to the right, the same thing would happen, but in my right ear.’
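The cue logic Panek describes could be sketched roughly as follows. This is an illustrative Python mock-up, not Google's actual code; the offset units, scaling and function names are assumptions.

```python
def audio_cue(offset, max_offset=1.0):
    """Map a runner's signed lateral offset from the guideline to
    per-ear cue volumes, mimicking the behaviour Panek describes.

    offset: hypothetical signed distance from the line's centre;
            negative means drifting left, positive means drifting right.
    Returns (left_volume, right_volume), each in the range [0, 1].
    """
    # Clamp the drift to the maximum offset the system responds to.
    drift = max(-max_offset, min(max_offset, offset))
    loudness = abs(drift) / max_offset

    if drift < 0:
        # Drifting left: the sound gets louder in the left ear.
        return (loudness, 0.0)
    if drift > 0:
        # Drifting right: louder in the right ear.
        return (0.0, loudness)
    # Centred on the line: no corrective cue.
    return (0.0, 0.0)
```

A runner halfway off to the left would hear a half-volume cue in the left ear only, e.g. `audio_cue(-0.5)` gives `(0.5, 0.0)`.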

Within a few months, and a few adjustments, he was able to run laps on an indoor track without assistance, human or canine.

‘It was the first unguided mile I had run in decades,’ Panek said.

Panek (left) ran marathons tethered to a human guide but he wanted more independence

Panek’s phone tracked the line in the road while he tested Project Guideline outdoors. ‘I began sprinting on my toes, as fast as my legs could carry me,’ he said

The developers then got to work adapting the technology for the outdoors, where runners face a whole new set of obstacles.

Once out on the open road, Panek said, ‘I began sprinting on my toes, as fast as my legs could carry me, down the hill and around a gentle bend in the road.’

The system was able to keep him on course, and with every stride, ‘I felt free, like I was effortlessly running through the clouds.’

Project Guideline doesn’t need an internet connection to work and can account for weather conditions, Engadget reports.

Later this month, Panek will attempt to run NYRR’s Virtual Run for Thanks 5K along a painted line in Central Park.

His company, Guiding Eyes for the Blind, pairs seeing-eye dogs with people with vision loss.

A machine-learning algorithm on an Android phone can detect if the runner is to the left, right or center of the guide line.

But there are millions more people with vision loss than there are available guide dogs.

He hopes Project Guideline can be adapted and expanded to provide independence to more people like him.

‘Collaborating on this project helped me realize a personal dream of mine,’ he said, thanking Google ‘and whoever came up with the idea of a hackathon.’

Google has been increasingly investing in accessibility technology: In October it unveiled Sound Notifications, a new feature for Android that notifies deaf users if there is water running, a dog barking or a fire alarm going off.

Users can be notified about ‘critical’ sounds through push notifications, vibrations on their phone or a flash from their camera light.

While the feature is designed for the estimated 466 million people in the world with hearing loss, it can also help people who are wearing headphones or otherwise distracted.

The company has also expanded Lookout, which can read mail aloud and verbally identify packaged goods.