Alexa, “What am I Holding?” – Amazon Show and Tell Feature

Amazon Show 2nd Gen

Alexa’s skills, as offered on the Amazon Echo Show and Echo devices, continue to expand, offering universal design services for both the general public and individuals with disabilities. A recent “What’s New with Alexa?” email featured the new Show and Tell skill supporting individuals with blindness and low vision. Show and Tell is offered on all Echo Show devices (1st and 2nd generations) and uses the device’s camera to identify and vocalize the product held in front of it. Object recognition to support individuals with visual impairment and cognitive disabilities has been of high interest, and a need I have identified when evaluating clients for their AT needs. While I have previously reviewed apps with object or product recognition tools, a standalone device offering this was new to me. So this new feature on a ubiquitous home device piqued my attention as a possible AT solution for the clients I serve. Can Alexa help with product recognition?

Amazon Show – Show and Tell Skill Operation

The Show and Tell skill of the Echo Show (1st and 2nd generation devices both have cameras) uses the camera to tell you what object you are holding. Ask Alexa, “What am I holding?” while holding a product about 1 foot away from the camera and about 1 foot above the surface the Show sits on. Show and Tell provides verbal instructions on how to use it and then speaks the item name aloud. Quite slick.

How does Show and Tell work to identify products you’re holding? Here are Amazon’s directions for asking Alexa to identify common pantry products you’re holding with Show and Tell:

Before you get started, make sure that there’s good lighting in your room and that nothing is blocking your Echo Show’s camera.

  1. Hold your item one foot from your Echo Show’s camera.
  2. Say things like, “What am I holding?” or “What’s in my hand?”
  3. When prompted, move your item around slowly to show different sides of the packaging. Alexa helps you position the item with tips and sound.

Of course, I had to trial product recognition with my Amazon Show (1st generation). Upon the verbal command “Alexa, what am I holding?”, Alexa provided a brief but adequate explanation of how it works and then promptly completed the recognition task. Sound feedback is provided when a camera shot is taken, along with prompts to turn the item to another side to gather more information. Show and Tell was very simple and intuitive to use on an initial trial.

Amazon Show – Show and Tell Trial

How did it work? I trialed 8 different products from my pantry, including canned goods, jars, and packages (snack bars, a packet of rice, etc.), using both name brand and generic products. Show and Tell provided basic information for the generic products (e.g., “grape jelly”), while it added the brand name for name brand products (e.g., Jiffy Crunchy Peanut Butter). Alexa prompted me with verbal instructions to turn the item or lift it up for better recognition. It appeared that showing the front of the package and (I suspect) the UPC bar code was key to identifying the item. As would be true when scanning any item, standard text on the product was recognized most accurately, versus word art or cursive writing. It accurately identified 8/8 items with a general description (“grape jelly,” Nature Valley oatmeal bar, olive oil). Additional information about the product may also be offered by Alexa. If help is needed, saying “Help with Show and Tell” or “More help with Show and Tell” provides the user with additional information.

Amazon Show – Show and Tell as an AT Device

Although this was a brief trial with a few pantry products, accuracy was good, with basic information provided on what each product was. Although detailed information was not always provided, this would be a helpful tool for individuals with low vision or blindness to help them recognize products in their cupboards or refrigerator if markings were lost or items were not labeled with the typical ID systems recommended. It was also easy to position the items in front of the 10″ Echo Show screen, with Alexa verbalizing directions when the item needed to be turned or repositioned. A hack that might help would be a small, low platform to place an object on, positioned 1 foot away from the device with its rise even with the bottom of the Show. This would take the guesswork away for individuals who may not be able to see the distance from the Show device or hold the item steady while it is detecting the label.

While I may not run out and get a Show just to recognize products, the Show offers many services that make it a worthwhile device for many tasks for the general population and individuals with disabilities. The Echo Show can easily be considered an AT device, offering many accessibility tools and services.

What do you need to set up an Amazon Show?

Amazon Show Devices

The Amazon Show devices now come in a variety of models. Refurbished 1st generation models are available for about $99.99 for the 10″ model, with refurbished 2nd generation models under $200. Here are a few of the newer Amazon Show models:

Amazon Show 5

This model offers a 5.5″ screen with built-in camera shutter and microphone/camera off button.

Amazon Show 8

This model offers an 8″ screen with built-in camera shutter, microphone/camera on/off button, and touch screen.

Amazon Show (2nd generation)

This model offers a 10″ screen with built-in camera shutter, microphone/camera on/off button, and touch screen.

While these models offer product recognition, many other skills are available on Amazon Echo Show devices, including reading books aloud, closed captioning, playing music with voice requests, getting news, weather, and sports, making calls, setting talking reminders and timers, researching basic information on the Internet, and automating home devices and services. The skills offered by Alexa on the Amazon Show continue to expand and are used by young and old to access information and control their environment.

What skills do you or your clients use, or which are your favorites on the Amazon Show? Ask Alexa next time what her favorite skills are (NASA and cats…?).

More for your OT eTool Kit.

Carol – OT’s with Apps and Technology

About Carol Leynse Harpold, MS, OTR/L, SCLV, ATP, CATIS

OTR/L with more than 35 years’ experience in pediatrics, school-based therapy, and adult rehabilitation. Master of Science in Adaptive Education/Assistive Technology with 20 years’ experience in AT in the education of elementary, middle school, secondary, and post-secondary students and in work environments for adult clients. A RESNA Assistive Technology Practitioner with ACVREP CATIS credentials, AOTA Specialty Certification in Low Vision, and a USC Davis Executive Certificate in Home Modifications, serving adults and students with disabilities in employment, education, and home environments. A 2020 graduate of the University of Alabama Birmingham Low Vision Certification Program.