Glidance CEO describes how the Glide device will aid people with vision impairments

A blind woman crosses the street in a crosswalk, guided by the Glidance Glide device. The mission of Glide is to provide independence and agency to sight-impaired individuals. | Credit: Glidance

Glidance Inc. has been developing a robotic walking aid for people with vision impairments. The Glide device has an ergonomic handle, and its sensors are designed to help users avoid obstacles, find waypoints on maps, and stop at stairs and elevators.

In October 2023, Glidance, which is a resident member of MassRobotics, won the RoboBusiness Pitchfire startup competition. The Consumer Technology Association (CTA) has also recognized the Seattle-based company with an innovation award, and it plans to demonstrate Glide at CES 2024 from Jan. 9 to 12 in Las Vegas.

Mobile Robot Guide recently spoke with Amos Miller, founder and CEO of Glidance, about the development of Glide and plans for commercialization.

Tell us what you are building.

Miller: At Glidance, our mission is to revolutionize independent mobility for people with sight loss. And I don’t use the word “revolutionize” lightly.

We are doing that with a new self-guided mobility aid called “Glide” that uses AI and sensors to guide a person, show them the way, help them avoid obstacles, make them aware of what’s around them, and bring back their independence and their ability to get around with confidence.

What was your inspiration for this product, and how did you come up with the concept of giving independence back to people with vision impairments?

Miller: I lost my sight in my 20s as a result of a genetic condition called retinitis pigmentosa. I lost my sight gradually while I was finishing my computer science degree and starting my career in high tech.

By the age of 30, I had lost all useful sight. I have lived in Israel, the U.K., Singapore, and now in the U.S. I have lived my entire adult life with sight loss. Everywhere I go, I have to deal with independent mobility every day of my life.

I am a guide-dog user and I can also use a cane, but I’m a terrible cane user. I’ve always appreciated the guide dog as an assistant. But a dog doesn’t help if you don’t know the layout of a train station, and you have to wait 30 minutes for somebody to come meet you and guide you to your train. Those are the types of challenges that I’ve always had to deal with daily.

Why weren’t you a good cane user? What are some of the problems that you had trying to use a cane?

Miller: The cane is an amazing technology. It has been around for thousands of years. Today, it is by far the most used assistive technology, and people can buy it for 25 bucks.

To use a cane effectively, you “shoreline” to feel an obstacle and get around it. Shorelining means that as you’re walking along a sidewalk, you tap along the edge of the building or the edge of the road so that you can keep a straight line. But you still have to be extremely well-oriented as to where you are within a town or a building.

You have to take all the signals around you too, using all of your senses, to know where you are along the street. Mentally, you are feeling for the next landmark — it could be a tree, it could be a bush, it could be a lamppost. That requires a lot of concentration and a lot of skill. This skill can be developed, but it takes time to develop.

I use a guide dog, which is a very different guiding solution, but I still need my orientation. With a guide dog, the dog guides you through the world. So it’s probably closer to what the Glide does.

I would say that a lot of blind people would consider Glide to be a little bit like an electronic guide dog. From a behavioral perspective, it has similarities in the way that it guides.

A real guide dog sees an obstacle ahead of time and takes you around it. And that’s exactly how Glide works: it sees the obstacle and takes you around it. The result is a much lower cognitive load, which allows you to listen to your e-mails or talk on the phone while you move with Glide.


The evolution of Glide

How did Glide evolve from the initial concept, which pulled the user through the world, to the final version?

Miller: Initially, we explored putting a motor on the wheels, as that was the natural place to start. If you work with a sighted person who’s guiding you, there are two ways of doing that.

One is that you touch their elbow; the other is the unwelcome way, in which they grab your hand. If they grab your hand, they are now pulling you, and you feel a total loss of agency. You’re now just a trolley, along for the ride.

The preferred way for sighted guiding is that I touch the elbow of the sighted person, and they walk. But I still determine speed, I determine the steps and the angles that I move my body in.

When you have a robot that pulls you around, you lose that agency immediately. I have tried other robots, and when the robot is completely autonomous and pulls you around, you lose all sense of agency. It just doesn’t feel right.

Because we removed the drive motors from the wheels, the user simply nudges the device forward; it’s very light. The moment they start pushing it forward, the wheels start to servo and steer the way. But all the agency and control is still with the user. They don’t have to move knobs up and down to control the speed of the robot; when they want to stop, they stop.

From an experience perspective, our users love that. It also reduces the complexity of the system, reduces the weight of the unit, doesn’t require big batteries, and lowers the overall cost.  
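To make that division of labor concrete, here is a minimal steer-only controller sketch in Python. It is purely illustrative, with hypothetical function and parameter names rather than Glidance's actual software: the user supplies all forward motion, and the device only turns its wheels toward a desired heading.

```python
import math


def steering_command(current_heading: float, desired_heading: float,
                     user_speed: float, gain: float = 1.5,
                     max_steer: float = math.radians(35)) -> float:
    """Return a steering angle (radians) for a steer-only guidance device.

    The device never propels itself: if the user is not pushing it
    forward (user_speed near zero), no steering correction is applied.
    """
    if user_speed < 0.05:  # user has stopped; hold the wheels straight
        return 0.0

    # Smallest signed angle from where the device points to where it should go.
    error = math.atan2(math.sin(desired_heading - current_heading),
                       math.cos(desired_heading - current_heading))

    # Proportional steering, clamped to the mechanical limit.
    return max(-max_steer, min(max_steer, gain * error))
```

Because speed is only measured at the wheels rather than commanded, pace and stopping stay entirely under the user's control.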

Can you take us through the obstacle-avoidance system? What are some of the sensors that you’re using? 

Miller: We have local and global planning onboard the system. We have a variety of sensors that are at the mobile unit level and another set at the bottom of the unit just above the wheels.

f there’s something in the way, the sensors will detect it, and the wheels will start to steer you around it. The user is controlling the speed, and Glide knows how fast you’re going.

How do the haptics give the user feedback about pacing and other elements that the device perceives?

Miller: The goal is for the robot to communicate with the user so that the user stops before an obstacle. Because the robot is not pulling the user along, it can indicate to the user through haptics and audio on the handle.

For example, it double-taps on the handle to indicate to the user to slow down.
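A sketch of how a cue like that might be scheduled on the handle (the vibrate callback and the timing values are assumptions for illustration, not Glidance's firmware):

```python
import time


def double_tap(vibrate, tap_s: float = 0.06, gap_s: float = 0.12) -> None:
    """Play a double-tap haptic pattern: two short pulses mean 'slow down'."""
    for _ in range(2):
        vibrate(tap_s)      # vibrate(duration_s) pulses the handle's actuator
        time.sleep(gap_s)


def pacing_cue(distance_to_obstacle: float, user_speed: float, vibrate) -> None:
    """Warn the user when an obstacle is closer than a speed-dependent margin."""
    stopping_distance = 0.5 + 0.8 * user_speed  # crude margin that grows with speed
    if distance_to_obstacle < stopping_distance:
        double_tap(vibrate)
```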

How does learning to use a guide dog compare to the experience learning to use Glide?

Miller: A guide dog is amazing, but you do have to go to a residential program and train with a new dog. It takes weeks of training to trust the animal and learn to work as a team. It takes a lot of effort, plus you have to replace your dog every five to six years, which is again another upheaval in your life.

We know that there are 7.3 million people with significant or total sight loss in the U.S., but there are only 10,000 new guide dogs available in any given year. I think dogs will continue to be part of the fabric of independent mobility for years to come.

At the same time, 99.9% of blind people will never have the benefit of a guide dog.

A new Glide user can learn to use the solution in a couple of hours. For an individual who loses their sight late in life, which is now an emerging trend, a solution like Glide may be the fastest and simplest method to return a sense of independence to that individual. This is the opportunity for Glide.

Glidance gets ready for market

Will Glide integrate with GPS, Google Maps, or Apple Maps to use navigation instructions?

Miller: We will have mapping capabilities in Glide, but I don’t plan for Glide to be a navigation aid. I don’t want to build my own navigation app.

But it will work with existing navigation apps. So if you set a restaurant as your destination in Google Maps, its walking directions are sent to Glide to use as waypoints. Glide will do the local planning to those waypoints along the way and get you to that restaurant.
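One way to picture that handoff (purely illustrative; the actual integration and data formats are not public) is a list of latitude/longitude waypoints taken from the walking route, which the device consumes one at a time while its local planner handles obstacles in between:

```python
from dataclasses import dataclass
from math import atan2, cos, radians, sin, sqrt


@dataclass
class Waypoint:
    lat: float
    lon: float


def distance_m(a: Waypoint, b: Waypoint) -> float:
    """Approximate ground distance in meters between two waypoints (haversine)."""
    r = 6371000.0
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * r * atan2(sqrt(h), sqrt(1 - h))


def next_waypoint(route: list[Waypoint], position: Waypoint,
                  reached_m: float = 5.0) -> Waypoint | None:
    """Drop waypoints the user has already passed and return the next target.

    Local planning (obstacle avoidance, curb and stair detection) steers
    toward this target between position fixes.
    """
    while route and distance_m(position, route[0]) < reached_m:
        route.pop(0)
    return route[0] if route else None
```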

We also plan to integrate with apps like the Target app, where you create your shopping list, and the app tells you where you are in the store and where each product is.

Glide has cameras and wheel odometry, and all the necessary sensors to do SLAM [simultaneous localization and mapping] in the store. So Glide could pair up with a Target app and help you get to that shelf in the store.
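Indoors, where GPS is unreliable, the position estimate has to come from those onboard sensors. One standard ingredient is differential-drive wheel odometry, sketched below only to show the kind of signal a camera-based SLAM system would correct; none of this is Glidance's actual implementation:

```python
from math import cos, sin


def update_pose(x: float, y: float, theta: float,
                d_left: float, d_right: float, wheel_base: float):
    """Dead-reckon a new (x, y, heading) from left/right wheel travel in meters.

    On its own this estimate drifts; in a SLAM pipeline it would be
    corrected by matching camera observations against a map of the store.
    """
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    x += d_center * cos(theta + d_theta / 2.0)
    y += d_center * sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```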

Can you share the price point for the production solution?

Miller: This solution must be affordable. It’s not going to be $25 like a cane, but we are aiming at the price range of a cellphone subscription.

You’ll start with a basic subscription, depending on the level of features that you want. The basic features will fit the needs of new users, and world travelers can enhance the product by turning on additional features to meet their needs.

We are also working with the VA [U.S. Department of Veterans Affairs] and with insurance companies to make sure that anyone can get the device. We expect to start our beta program in the spring of 2024.

Where can folks find you at CES 2024?

Miller: The CTA Foundation gave us an award for a free booth at Eureka Park.

Editor’s note: This interview was transcribed by https://otter.ai and edited for clarity. You can listen to the full interview with Amos Miller in this episode of The Robot Report Podcast:

Written by

Mike Oitzman

Mike Oitzman is Senior Editor of WTWH's Robotics Group, cohost of The Robot Report Podcast, and founder of the Mobile Robot Guide. Oitzman is a robotics industry veteran with 25-plus years of experience at various high-tech companies in the roles of marketing, sales and product management. He can be reached at moitzman@wtwhmedia.com.