Exercise 55 – How do you design the Lyft app for the blind?

Post your answers, and review and give feedback on other answers, in the comments section of this post.

See also:

How to answer a product design question in a product manager job interview

List of product design questions for product manager job interviews

Anonymous

Hold anywhere on the screen to request a ride. All blind-initiated ride requests will be managed via phone call. No in-app gimmicks or tech-enabled Braille/blind solutions. Just a phone call.
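One minimal way the "hold anywhere, then a phone call" idea could look on iOS. This is a sketch only; the view controller name and phone number are placeholders, not anything Lyft actually ships.

```swift
import UIKit

// Sketch: a full-screen view that starts a ride-request phone call when the
// user presses and holds anywhere. Placeholder number, not a real support line.
final class HoldToCallViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let hold = UILongPressGestureRecognizer(target: self, action: #selector(requestRideByPhone(_:)))
        hold.minimumPressDuration = 1.0  // require a deliberate one-second hold
        view.addGestureRecognizer(hold)
        view.isAccessibilityElement = true
        view.accessibilityLabel = "Hold anywhere to request a ride by phone"
    }

    @objc private func requestRideByPhone(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began,
              let phoneURL = URL(string: "tel://18005550100") else { return }  // placeholder number
        UIApplication.shared.open(phoneURL)
    }
}
```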

Bryan

You’ll need to go into far more detail in your answer. First, determine the goal of the app. Then define the persona (in this case, it’s easy – blind users). Next, talk about the use cases and needs of this persona. After listing the persona’s needs and use cases, brainstorm some ideas/features that can meet those needs. Then evaluate the ideas against some criteria and, finally, provide your recommendation based on that evaluation. Take a look at some of the answers submitted for other questions on this site; they will give you an idea of the expected structure of an answer.

Anonymous

A quick attempt between meetings.

First, I will make sure I understand what the interviewer means by “app.” Is it something that works on a regular phone as an add-on feature, or is it an app for a specialized device for the blind?

Assume the interviewer says it is just an add-on feature of the existing Lyft app.

Next, I will ask whether the user has any other special needs that I should know about.

Interviewer – No. Only that the user is blind.

Where is the app going to be released?

Interviewer – US.

Assumption: the customer speaks English.

Goal of the app:

Allow blind people to access cabs via Lyft.

Pain points addressed
For the blind customer –
1. A non-touch-based way to convey their current location and call a cab.
2. Know that a cab has been scheduled.
3. Know that the cab has arrived and has reached the destination.
4. Identify the cab that has been assigned.
5. Safety while traveling in the cab.
6. Payment options.
7. Rating/tipping options.
8. Type of cab needed (luggage space, carpool, luxury, etc.).

For the cab driver
1. Identify the customer uniquely.
2. Incentives to pick up blind people.

Solutions:

Customer
1. Voice-based input to convey where the user wishes to go, with feedback from the phone to confirm the location.
2. Feedback from the device – via voice or vibration – that a cab has been scheduled (see the sketch after this list).
3. Voice-based feedback.
4. A unique ID is communicated to the customer and has to be verified by the cab driver. The ID can be the name of the driver or any other uniquely identifying information.
5. A panic signal – either a button on the phone or an attachment given to the blind user – to help signal trouble. The system also gives feedback when something may be going wrong, for example if the driver’s vehicle is not heading in the correct direction.
6. Ability to preset credit card information.
7. Voice-based interface to hear the driver’s rating, rate the driver, and add a tip.
8. Voice-based input, with feedback from the app on the estimated fare and time of arrival.

Driver:
1. The customer’s phone screen flashes in a specific, synchronized color so that the driver can uniquely identify the customer.
2. Special rates for drivers to incentivize picking up blind people.
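A rough sketch of how the voice confirmation and “cab scheduled” feedback (customer solutions 1–3) could look on iOS. This is a minimal illustration under those assumptions; the class name and spoken wording are made up, not Lyft’s actual code.

```swift
import UIKit
import AVFoundation

// Sketch: spoken confirmation of the destination plus a haptic + VoiceOver
// "ride booked" signal. Names and wording are illustrative assumptions.
final class RideStatusAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()
    private let haptics = UINotificationFeedbackGenerator()

    // Solution 1: read the interpreted destination back so the rider can confirm it.
    func confirmDestination(_ destination: String) {
        let utterance = AVSpeechUtterance(string: "You asked for a ride to \(destination). Double-tap to confirm.")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }

    // Solutions 2 and 3: vibration plus a spoken announcement once the ride is booked.
    func announceRideBooked(driverName: String, etaMinutes: Int) {
        haptics.notificationOccurred(.success)
        UIAccessibility.post(
            notification: .announcement,
            argument: "Ride confirmed. \(driverName) will arrive in about \(etaMinutes) minutes."
        )
    }
}
```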

Prioritization.
For the customer – all the features listed would be essential for v1.

Bijan

Hi there,
Thanks for the response. I really like your first clarification question, as it helps with defining a goal for the project. A couple of pieces of feedback:

– You can surface a few more pain points by going through the customer journey from the beginning (e.g. deciding to go somewhere) to the end (e.g. the moment they have gotten out of the car and arrived at their destination). For example: how do they determine the address of the place they have to go? How do they find the app on their phone? The good thing about walking through the whole user journey in your head is that it helps you see all the pain points through the eyes of the blind person.

– Some of the solutions listed do not describe exactly how they solve the problem (e.g. ability to preset credit card information). You will want to describe the feature and how it solves the pain point as clearly as possible.

– I think the solutions can also leverage a larger variety of technologies. A couple of ideas I can think of are:

. a short training audio that teaches the user how Lyft works

. instructions for the person installing the app on where to place the Lyft app on the screen to make it easily accessible for the blind person

. an option to immediately ask for help in finding the car – enable a video call and connect to volunteers who can help them spot the car

– In the end, you want to evaluate the solutions based on a set of criteria such as impact on the customer and cost of implementation. I suggest that you list your suggestions for the V1 or MVP of the product only after that evaluation. This way, the interviewer sees your ability to execute and launch a new product.

Love the two ideas you’ve listed for the driver:)

D'Andre Ealy

Here’s how I would approach a question like this.

Step 1 – Comprehend the question

First, I would ask some questions to make sure we’re on the same page.

1. Are we building features into the current Lyft app or designing a whole new app?

Interviewer – New features into the current app

Step 2 – Identify the user

The users of our new features are blind adults who are at least 18 years old and who rely on ride-sharing services to get around.

Some potential use cases of the app could be getting to work, going to hang out with friends, and getting around town in general.

The user of these features could also be a spouse or friend.

Step 3 – Identify customer needs

Since I’ve identified who my users are, I’m going to list some of the user’s needs that we might need to solve for in our design.

1) The user needs to be able to enter their destination and confirm their current location

2) The user needs to be able to know when their ride was matched and how far away it is

3) The user needs to be able to know what the cost of the ride is

4) The user needs to be able to know when the driver arrives at their pickup location

5) The user needs to be able to determine how far they are from their drop-off location

6) The user needs to be able to communicate with the driver

7) The user needs to be able to switch blind mode on and off so that their spouse or friend might be able to book their ride for them

Step 4 – Prioritize

To prioritize our list of needs, I’m going to focus on solving core rider experience features. Those features are:

1) Entering a pickup and drop-off location

2) Know when their request for a ride was matched with the driver

3) Provide the user with the cost of their ride

4) Knowing when the driver arrived

5) Being able to communicate with the driver

I decided to go with those five features because they sum up the experience of using the Lyft app. Leaving core features out, or making them hard to use for a blind person, could hurt their overall experience and cause us to lose a user. The goal is to provide a blind user with the same experience as non-blind users, just with a different design.

Step 5 – List solutions

Now that I have my list of prioritized user needs, I’ll list some solutions and walk through the use cases.

1) Entering a pickup and dropoff location

– We could use voice to set the pickup and dropoff locations. We could skip entering the pickup location by relying on the phone’s location, but sometimes the coordinates can be off, and since our user is blind we want to cut out any opportunity for a mistake about where the user is or where the user is going. So, as soon as a user opens the app, a voice assistant can greet the user and ask where they would like to be picked up and what their dropoff location is. The assistant could confirm both locations and put in the ride request for our user. An edge case to consider is what happens when the user is in a loud setting. We could test another feature where the voice assistant calls the user on the phone and takes their information. That’s something we should track to see if it’s a problem; to track it, we could monitor how many times the user has to repeat information to the voice assistant.

2) Know when their request for a ride was matched with the driver

– To let the user know that their request was matched with a driver, we could use a sound and a vibration alert. The sound alert for a ride being matched happens today, but sometimes the phone might be on silent, so we could use a series of tactile vibrations to let the user know the driver is on the way. We could ask users how helpful these are during user interview sessions.

3) Provide the user with the cost of their ride

– Once the user inputs their pickup and dropoff locations, we can use the voice assistant to tell the user the cost of their ride before they book it. The assistant could state the cost and ask them to confirm it with their voice by saying “Book now.” We can test different phrases, or perhaps allow users to make custom ones down the road.

4) Knowing when the driver arrived

– When the driver arrives at the user’s pickup location, we could send a series of vibrations to the user’s phone letting them know their ride is here, and follow it up with a phone call telling the user where to go so they get into the correct car (sketched after this list).

5) Being able to communicate with the driver

– Since our user is blind, they have no way of knowing which car to get into. When the driver arrives and the user receives the phone call, we’ll offer an option to connect the user to the driver. Once they connect, the driver can help the user find the car and take them to their destination.
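A minimal sketch of how solutions 4 and 5 could fit together on iOS. The class, announcement wording, and phone-number handling are illustrative assumptions, not Lyft’s real implementation.

```swift
import UIKit

// Sketch: when the driver arrives, buzz the phone, announce it to VoiceOver,
// and expose a follow-up action that connects the rider to the driver by phone.
// Names and the phone-number plumbing are illustrative assumptions.
final class DriverArrivalCoordinator {
    private let haptics = UINotificationFeedbackGenerator()

    // Solution 4: vibration plus a spoken announcement when the car pulls up.
    func announceArrival(driverName: String, carDescription: String) {
        haptics.notificationOccurred(.success)
        UIAccessibility.post(
            notification: .announcement,
            argument: "\(driverName) has arrived in a \(carDescription). Double-tap anywhere to call them."
        )
    }

    // Solution 5: connect rider and driver so the driver can guide them to the car.
    // A real app would route this through a number-masking service, not a raw tel: link.
    func callDriver(maskedNumber: String) {
        guard let url = URL(string: "tel://\(maskedNumber)") else { return }
        UIApplication.shared.open(url)
    }
}
```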

Step 6 – Summary
Our goal was to redesign the Lyft app for the blind. I started by understanding who our user was, identifying their needs, prioritizing those needs, and listing potential solutions. Our solution was to include a voice assistant both for setting the pickup and drop-off locations and for telling the user the price of their ride. To let the user know when their ride has arrived, I decided to send a series of vibrations and follow them up with a phone call that can connect the user and the driver.

Questions to consider:

– How do we communicate the different ride options available?
– How do we introduce the blind mode to users?
– How do we know if our features are successful or not?

Bijan

Thank you for the answer. It’s very well articulated, easy to follow, and structured. A couple of things I really liked:
– Highlighted key steps of the ride journey
– By listing out some questions in the end, you’ve shown that you are taking more things into consideration even though you didn’t have the time to cover them

One piece of feedback I have is that it can be helpful to evaluate the solutions against criteria that include development effort. Sometimes a solution has a big impact but is very difficult to develop, making it less attractive. I would have evaluated the solutions against some criteria (my favourite criteria are impact on the user and cost of implementation) and only then described the MVP / V1 of the product by prioritizing the solutions that are simpler to implement and have a bigger impact on the user.

D'Andre Ealy

Thank you for the feedback, Bijan! I’ll make sure to include the tradeoffs and criteria next time I answer one of these questions.

Ayush

Me: Is this app an add-on to the existing Lyft application?
Interviewer: Yes!
Me: OK, so our user base is blind people who want to book cabs to get from point A to point B. Their use case is somewhat similar to that of existing Lyft users, but they need added functionality to account for the disability. Let’s walk through the customer journey when using the app:
1) Customer decides they need to book a cab, so now the first thing they will do is open the Lyft app on their smartphone.
2) Customer needs to decide on the pickup and destination.
a. Now, pickup can be the customer’s current location, which can be tracked using existing location services, but sometimes this is not accurate.
b. The destination can be something like a house address which the customer already knows. Or it can be a Best Buy or Walmart, in which case the app needs to communicate the different options to the user along with the distance from their pickup point. This would enable the user to choose the closest one.
3) Now the customer wants to choose between the different types of rides available, from mini to luxury cabs.
4) Once the customer picks the type of ride, the app needs to communicate two things:
a. Wait time
b. Fare estimate for the trip
5) If the customer decides to go ahead, they confirm the booking and are linked to a nearby driver. The app needs to notify the customer that the cab has been booked.
6) At this point the customer may want to check the feedback or rating of the driver; here it might be more relevant if we can communicate separately the driver’s rating for assisting blind customers.
7) The customer wants to track the cab’s location in real time to be aware of the wait time.
8) Once the cab arrives at the pickup location, the customer needs to be notified and be able to identify the cab they need to get into.
9) Once the customer is in the cab, they need to be able to confirm that the driver has started the trip correctly and only at that point.
10) On the way to their destination, the customer wants to be able to track that they are on the right route.
11) The customer wants to be able to raise an alarm in case they feel unsafe.
12) Once the cab reaches the destination, the customer needs to be able to confirm that it is the same location they entered in the app.
13) After the ride is complete, they want to be able to tip the driver and leave feedback or a rating.

Is there any aspect of the journey I am missing?

Interviewer: No, I think that is pretty much it.

Me: OK, all the above use cases are important for a great customer experience, but in the interest of time I would like to prioritize my top use cases based on ‘must have’ features vs. other features that would be ‘good to have’.
One of the biggest reasons that ride-sharing apps such as Lyft are so successful is that they are reliable, safe, and convenient. Based on these criteria, my top use cases from the list above are 1, 2, 4, 5, 8, and 11.
Is there any other use case you would like me to add?
Interviewer: No, this is good.
Me: OK, now that we have a prioritized list of use cases, let’s work on the scope of solutions.
1) There can be multiple ways for the customer to be able to open the Lyft app on their smartphone:
a. A voice assistant such as Siri or the Android voice assistant. This is already a built-in feature on most smartphones these days.
b. The location of the Lyft app for the blind is fixed on the phone’s screen, for example always placed in the bottom-right corner. This way the customer would know where to touch to access the app.
2) Deciding and communicating the pickup and destination location to the driver via app is very critical.
a. The most convenient way for the user to do this is through voice commands. For reliability, the voice assistant in the app should read the entered pickup and destination details back to the user for confirmation.
b. There is an existing smartphone accessibility feature called VoiceOver, which reads out loud the function of the button the user touches on the screen. However, this would be time-consuming and not a convenient process for the user.
4) For communicating the wait time and trip estimate, we can again use the voice assistant. This is a convenient and reliable way.
5) Once cab is booked, customer can be notified in multiple ways:
a. Series of vibrations signaling that a cab has been booked.
b. Voice assistant notifies the customer of the booking confirmation.
8) A very critical part of the process is for the customer to be able to identify the right cab to get into at the pickup location. This can be done in multiple ways:
a. When confirming the booking, the app asks the customer to take a picture of themselves or their surroundings to send to the cab driver. This would make it easy for the cab driver to identify the customer. However, some people might not be comfortable sending their picture to a stranger.
b. As soon as the cab arrives at the pickup location, the app automatically connects a call between the driver and the customer so that the driver can guide the customer to the cab.
11) Safety is a very critical part of the entire customer journey, so the customer wants to be able to raise an alarm if they feel threatened. They could do this in the following ways:
a. Motion detection – if the customer taps their phone three times in quick succession, an SOS is sent to the Lyft call center or the nearest authority, who can then track the cab and provide help (see the sketch after this list).
b. Voice assistant – the customer talks to the voice assistant and requests a call to the Lyft call center. This may not be the most convenient way of raising an alarm, as it is not discreet and also alerts the driver.
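For solution 11a, one simple reading of “taps their phone 3 times” is three quick taps anywhere on the in-ride screen. A minimal iOS sketch under that assumption, with the safety endpoint left as a stub (all names here are illustrative):

```swift
import UIKit

// Sketch: treat three quick taps anywhere on the in-ride screen as an SOS.
// sendSOS() is a stub; in a real app it would send the trip ID and live
// location to a safety team. Names are illustrative assumptions.
final class InRideViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let tripleTap = UITapGestureRecognizer(target: self, action: #selector(handleTripleTap))
        tripleTap.numberOfTapsRequired = 3
        view.addGestureRecognizer(tripleTap)
    }

    @objc private func handleTripleTap() {
        sendSOS()
        // Confirm discreetly with a vibration rather than sound, so the driver is not alerted.
        UINotificationFeedbackGenerator().notificationOccurred(.warning)
    }

    private func sendSOS() {
        // Placeholder: post the trip ID and current coordinates to a safety endpoint.
    }
}
```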

Now, with potential solutions in mind, let’s evaluate the tradeoffs in terms of customer impact and cost of implementation to scope an MVP (a toy scoring sketch follows this list).

1) Customer impact – high, cost of implementation – low (leverages existing voice assistants such as SIRI)
2) Customer impact – high, cost of implementation – med (in case we have to develop a dedicated voice assistant for the Lyft app)
4) Customer impact – med, cost of implementation – low (once a voice assistant is developed, the same can be used here)
5) Customer impact – high, cost of implementation – low
8) Two potential solutions
a. Customer impact – med (many might not be comfortable in sending pictures), Cost of implementation – Med
b. Customer impact – High, Cost of implementation – low
11) Two potential solutions
a. Customer impact – high, cost of implementation – med
b. Customer impact – med, cost of implementation – low
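As a back-of-the-envelope way to turn these impact/cost calls into an ordering, here is a toy scoring sketch. The numeric weights and feature names are arbitrary assumptions for illustration, not Lyft data.

```swift
// Sketch: rank candidate features by impact vs. cost. The scores mirror the
// rough high/med/low calls above; the weights are arbitrary, for illustration.
struct Candidate {
    let name: String
    let impact: Int   // 1 = low, 2 = medium, 3 = high
    let cost: Int     // 1 = low, 2 = medium, 3 = high
    var score: Int { impact - cost }  // simple value-minus-effort heuristic
}

let candidates = [
    Candidate(name: "Open app via existing voice assistant", impact: 3, cost: 1),
    Candidate(name: "In-app voice assistant for pickup/destination", impact: 3, cost: 2),
    Candidate(name: "Voice read-out of wait time and fare", impact: 2, cost: 1),
    Candidate(name: "Vibration + voice booking confirmation", impact: 3, cost: 1),
    Candidate(name: "Auto-connect call on driver arrival", impact: 3, cost: 1),
    Candidate(name: "Triple-tap SOS to safety line", impact: 3, cost: 2),
]

// Print candidates from highest to lowest score.
for c in candidates.sorted(by: { $0.score > $1.score }) {
    print("\(c.score)  \(c.name)")
}
```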

Based on the above evaluation, I propose that we build a voice assistant for the Lyft app for the blind to assist with booking. The app would automatically connect a call between the driver and the customer as soon as the cab arrives at the pickup location, and would give users the option of raising an alarm through the voice assistant.

Bijan

Great answer. It has all the elements of a good answer to a product design question. Great work, Ayush :)

Ayush

Thanks, Bijan, for your feedback. I will try to answer more questions and would really appreciate your feedback on them.

Bijan

Happy to help. It’s harder to answer the latest exercise, which doesn’t have any answers yet, but it’s really good practice to try to be the first to solve a new exercise.