Google recently developed a technology that can detect human emotions in a 2 X 2 dimension (energy level and body movement). What are some of the products you can build using that technology? List a few ideas, then choose one and design the product for that idea.

2 Answers

+6 votes

Just to reiterate, Google developed a technology that can detect human emotions based on energy level and body movement. I would ask clarifying questions about how this technology works. Does it require special cameras? Does the user have to actually turn it on? What is the level of confidence in the results? Does the user need to opt in to this program? What kinds of human emotions are we talking about?

Assuming the user does need to opt in, but the technology works with a normal laptop or phone camera and must be turned on, I would start brainstorming ways it might help solve an existing user problem.

1. This app might be useful for detecting fatigue and general tiredness while working. Google could offer a product that reminds users to get some air or take a walk during the workday.
2. Emotion detection is useful for gauging how a user feels about any particular product. This would be especially useful for both the user and the app to understand how to optimize UI/UX, product placement, etc.
3. Emotion detection would be useful for YouTube, where it could hone in on video recommendations better than a simple thumbs up or thumbs down.

I want to actually build for number 3, because a better recommendation system would be kind of amazing.

The goal of this feature is to deliver a better recommendation system for users who opt-in to this technology.

We will know the recommendations are working through longer viewing times on YouTube, generally happier users as measured by the emotion technology, and better retention.

The problem we are trying to solve for our users is how to surface more relevant content so that they can watch more videos aligned with their interests, understanding that those interests may change every day. Some key features would be:

– Users would need to opt in and calibrate the camera with their face to make sure detection works. There would also need to be an opt-in prompt every time a user navigates to YouTube.
– A backend database that tracks the videos you've watched along with the change in emotion from the beginning of the video through the end
– A front-end display showing users how their emotions changed throughout the video
– Asking users how they really felt about the video compared to the emotion captured by the technology
– An algorithm that matches your personal preferences with your emotional response to recommend the next video to watch.

Within the algorithm, there are many more features we could build. What do the emotions tell us? What does each of the four grid quadrants really mean in terms of how someone feels about a particular video? That is why we pair the detected emotions with the actual ratings the user provides, to inform how we move forward.
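To make that pairing concrete, here is a minimal sketch in Python of how the 2 X 2 grid (energy level x body movement) could be scored and blended with an explicit rating. The quadrant labels, weights, and the `video_score` blend are all illustrative assumptions, not anything Google has actually shipped:

```python
# Hypothetical sketch: score a video by combining the emotion change
# detected over the video with the user's explicit rating.
# Labels, thresholds, and weights below are assumptions for illustration.

def emotion_quadrant(energy: float, movement: float) -> str:
    """Map raw signals (each 0..1) onto one of the four grid labels."""
    if energy >= 0.5:
        return "excited" if movement >= 0.5 else "engaged"
    return "restless" if movement >= 0.5 else "calm"

# Assumed per-quadrant weight for "did the viewer enjoy this?"
QUADRANT_SCORE = {"excited": 1.0, "engaged": 0.7, "calm": 0.3, "restless": -0.5}

def video_score(start, end, explicit_rating=None, alpha=0.6):
    """Blend the emotion change over a video with the user's own rating.

    start/end: (energy, movement) tuples captured at video start and end.
    explicit_rating: optional user rating in [-1, 1]; pairing it with the
    detected emotion is how we learn what each quadrant really means.
    """
    delta = (QUADRANT_SCORE[emotion_quadrant(*end)]
             - QUADRANT_SCORE[emotion_quadrant(*start)])
    if explicit_rating is None:
        return delta
    return alpha * delta + (1 - alpha) * explicit_rating
```

A recommender could then rank candidate videos by the scores users with similar preferences gave them; the `alpha` weight controls how much we trust the detected emotion versus the self-reported rating.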

All of the bullet points would be key to the first launch, as they provide the information we need to keep iterating on the recommendation engine. We will know the engine is working when we pair an internal confidence metric with the user's actual rating, and also through traditional metrics like how many videos users watch one after another.

In summary, I would leverage the emotion engine that Google created to deliver better recommendations for users on YouTube. At first launch, we should address user privacy concerns by making it opt-in, and then gather data both through the technology and from user input to help our machine learning algorithm produce better results over time.
+2
Scott – amazing creativity! I was thinking of leveraging a tool like this to help nonverbal people communicate. If you have an autistic kid, could you use emotion recognition via facial expressions and energy levels to predict whether they are feeling hungry, bored, tired, happy, etc.?
You might want to address privacy concerns too. There should be a way to turn off facial/energy detection by voice when someone wants their privacy protected. Or, as soon as someone else comes into the frame, the application could notify them that they are being watched.
Will you record videos? Will you store them on a server to analyze emotions in real time? Talk about how you should delete video/energy recordings as soon as emotion analysis is done, unless the user explicitly chooses to save them.
+2
That’s a good one. Extending it to help blind people understand others' expressions when they are conversing in the real world.

Track people's reactions to new products in retail.

Monitor the health of collaboration in an open office.
+1 vote

I would structure my answer as follows:

 

1. Clarifying the question

What could I do if Google had developed a technology that measures energy levels and motion to detect a user’s emotions?

 

2. Brainstorm possible products that could leverage this technology

  • An app that detects your pet's emotions so that, as a pet owner, you can better understand and care for your pet 

  • An app that detects the emotions of elderly people who have trouble communicating with their loved ones 

  • A tool to help therapists understand how patients, particularly children, really feel about a specific issue during a therapy session 

  • A wellness app that recommends yoga, meditation techniques and diet tips based on users’ current mood/emotions 

 

3. Focus on one idea and develop a product for it 

I am choosing to design a product for the last idea: a wellness app that recommends yoga, meditation techniques, and diet tips based on users' current mood/emotions.

 

This product would be integrated as a new app for an existing wearable product, like the Google Watch, for example. It would use a range of user data, including energy levels and motion, to detect the user's current set of emotions and which of those emotions is prevalent at a given moment. 

I would focus on negative/challenging emotions, since this is the type of situation where a user would want to feel better and improve their mood. 

I’d start with a list of six basic emotions, which could be expanded later based on initial data:

  • Feeling sad 

  • Feeling angry  

  • Feeling afraid  

  • Feeling anxious  

  • Feeling fatigued

  • Feeling bored 

 

Some use cases where I envision the app being used:

  • User feeling afraid during a flight 

  • User feeling anxious before a job interview 

  • User feeling sad after receiving difficult news 

  • User feeling angry after a bad day at work 

 

The user would open the app and be presented with an emotional evaluation detailing how they are feeling and what the prevailing emotions are, along with recommendations based on their contextual preferences in that particular moment.

There would be 3 main sections for the user to pick based on their context and goals:

  1. Meditation sessions (specific goals and lengths available based on user’s preferences)  

  2. Yoga sessions (different goals and lengths available based on user’s preferences)  

  3. Diet recommendations (different type of recipes/snack suggestions based on season and time of day)

 

Over time, the app would gather and analyze both qualitative and quantitative data in order to optimize and personalize recommendations for the user:

  • Quantitative feedback: the app itself would measure the effectiveness of these recommendations by tracking any change in the user’s emotions, and present these changes to the user as a feedback loop mechanism  

  • Qualitative feedback: 

    • users can rate/share most useful/effective content for qualitative feedback 

    • User can save most useful/effective content for easy access in the future 

 

4. How would I measure success?

  • Like any app, I would look at DAU/WAU/MAU to understand how engaging the app is and how effective it is at driving users to engage with it regularly 

  • To understand the app's effectiveness, in addition to ratings and engagement metrics, I would measure how the user's emotions change after they engage with the app (e.g., can we see a decrease in negative emotions after the user completes a meditation session to manage their fears?)
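As a rough illustration of that last metric, here is a minimal sketch, assuming each session records a 0-to-1 negative-emotion score before and after the recommended activity (the record format and score scale are assumptions, not part of the described technology):

```python
# Hypothetical effectiveness metric for the wellness app:
# in what fraction of sessions did the user's negative-emotion
# score drop after the recommended meditation/yoga session?

def session_improvement_rate(sessions):
    """Fraction of sessions where the negative-emotion score decreased.

    Each session is a (score_before, score_after) pair, both in [0, 1],
    as reported by the emotion-detection technology.
    """
    if not sessions:
        return 0.0
    improved = sum(1 for before, after in sessions if after < before)
    return improved / len(sessions)

# e.g. anxiety scores before/after three meditation sessions
sessions = [(0.8, 0.4), (0.6, 0.7), (0.5, 0.2)]
```

Tracked per content item, a rate like this could feed directly back into which sessions the app recommends for each emotion.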

 

