Alexa Shopping is the Amazon.com experience for voice. Built primarily for Echo devices but available on every Alexa-enabled device, the Alexa Shopping experience helps users purchase goods through multi-modal experiences that blend voice and graphical UI touchpoints.
The Alexa Shopping team wanted to explore innovation around a hybrid design system that combined voice with the conventional GUI-focused design system. To be successful, the conceptual solution would need to support seamless blending of voice and tactile interactions across a range of device modalities.
Partnering with my lead researcher counterpart, we developed and ran an intensive 2-day series of in-person discovery and design thinking workshops with the Alexa Shopping Design System team - a cross-functional group of roughly 20 that included team leadership, engineers and developers, researchers, and UX and conversational designers.
Each of the individual workshop activities was structured around its own distinct goal.
Some compelling ideas emerged from the workshops, including ways of incorporating generative AI into design system creation and usage, and how design systems might evolve to better incorporate experiences that move between voice and tactile interfaces. After the workshops, our internal team began market and technical research based on the outcomes and conceptual directions surfaced during our sessions.
Unfortunately, external factors required the Alexa team to end the project where it stood, with directional ideas defined and conceptual strategies supported by the market research. To close the loop, we handed over all of the workshop data, packaged both in raw form and compiled into an Airtable base.