The camera would shake constantly. Shifting sunlight would make a yellow line look different at noon than it did at dawn.
And what would happen if leaves blew onto the line and covered part of it? Would the app interpret the break in the line as a parked car and signal the runner to stop?
“We take examples and feed them into the model, classifying the pixels of the line as one class and anything that is not the line as out of class,” Ayalon said, referring to obstacles that might block the line from view. “The model learns over time.”
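The article does not publish Google’s code, but the idea Ayalon describes, a model that labels every pixel as either “line” or “not line”, can be sketched in rough form. The Python snippet below is illustrative only, not the project’s actual implementation: it assumes some segmentation model has already produced a per-pixel probability map, and shows how such a binary mask could be turned into a left/right steering cue. All names, thresholds, and sizes are hypothetical. When an occlusion such as leaves or a parked car wipes out most of the in-class pixels, the sketch returns no cue rather than guessing.

```python
import numpy as np

def line_offset_from_mask(prob_map: np.ndarray, threshold: float = 0.5,
                          min_pixels: int = 50):
    """Estimate how far the painted line sits from the center of the frame.

    prob_map: HxW array of per-pixel probabilities that a pixel belongs to
    the line (the "in class" output of a segmentation model); everything
    else is treated as out of class.

    Returns a value in [-1, 1]: negative means the line is left of center,
    positive means right of center. Returns None when too few line pixels
    are visible (e.g. the line is hidden by leaves or a parked car).
    """
    mask = prob_map >= threshold          # binary in-class / out-of-class decision
    if mask.sum() < min_pixels:           # too few line pixels: treat as occluded
        return None
    _, cols = np.nonzero(mask)            # column indices of the line pixels
    center = prob_map.shape[1] / 2.0
    return float((cols.mean() - center) / center)

# Example: a fake 100x160 probability map with the "line" right of center.
demo = np.zeros((100, 160))
demo[:, 120:124] = 0.9
print(line_offset_from_mask(demo))       # positive value -> steer left to recenter
```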
So does the runner. Panek tested the technology over short distances for months, slowly gaining confidence and learning to trust the directional cues in his ears. Then, in November, it was time for a three-mile run.
“Liberation is a great motivator,” he said, “the idea of being self-sufficient.”
Working with New York Road Runners, the organizer of the New York City Marathon, the technologists were given permission to paint their yellow line around Central Park’s North Loop, a 1.42-mile loop that includes the climb known as Harlem Hill.
Despite the cold, Panek wore short sleeves. He has the wiry build of an experienced runner. The only indication of his vision loss is that his eyes sometimes seem to focus in different directions. But he compensates skillfully, following a voice or a person’s distinctive sounds and looking toward them as he speaks.
By noon he was ready to run.
“Let’s go,” he said when it was time.
A starter told him to go, and he was off. He sprinted downhill toward his first corner as if he knew where he was going. Then, about a minute later, the voice in Panek’s headset, along with everyone around him, told him to stop. A car from the Department of Parks and Recreation was on the course.