Kendra Screens Mockup

AWS Kendra

Natural Language Enterprise Search

Role

Sr. UX Designer

Responsibilities

Product strategy • UX/UI Design • Research & Testing • Information Architecture • Design Ops

Tools

Sketch, Airtable, Lucidchart

Type

New Service Creation • Public Launch • Improvements

Amazon (AWS) Kendra is an enterprise search service that organizations can use to create a variety of Natural Language Processing (NLP) based tools, such as chatbots and internal search solutions.

Kendra helps customers sync their data from a variety of business software options, enabling them to search from a single point of entry. Using machine learning, the service improves result accuracy automatically over time, while also giving users the tools to optimize their own data for even better accuracy. Additionally, customers can use the service's analytics system to monitor data streams, view query and result data, and manage cost for their organization.

Challenges

1. Data Complexity

Objective:
Consistent & streamlined process

Customers need to add data to Kendra's system before they can search it, and no two data sources are built alike. But for the individuals adding their data to Kendra, the process needs to feel the same in order to provide an intuitive and consistent experience.

2. Systems Design

Objective:
Self Improvement for data owners

To improve their own search results, data owners need to know what was searched for and what results those searches returned, and they need the mechanisms necessary to keep improving both.

3. Collaboration

Objective:
Reduced effort while improving output

Efficiency and scalability are key to the success of enterprise product teams. To realize those benefits, the team needs a shared goal, a vision to execute toward that goal, a simple process to stay on track, and good communication.

Connector List Page
Kendra offers customers over 100 data source connectors to bring all their organizational data into one searchable database. Every connector setup offers search-optimized defaults, with the ability to customize for advanced use cases or security purposes. Kendra's SDK allows customers to embed the service's search experience into any organizational web experience or knowledge base.
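The embedding described above rests on Kendra's Query API. As a rough, hypothetical sketch (the helper name, index ID, and question text below are invented for illustration), a host application might assemble a query request like this and hand it to the AWS SDK:

```python
def build_kendra_query(index_id: str, query_text: str, page_size: int = 10) -> dict:
    """Assemble keyword arguments for Kendra's Query API (hypothetical helper)."""
    return {
        "IndexId": index_id,      # the Kendra index holding the synced data sources
        "QueryText": query_text,  # the end user's natural-language question
        "PageSize": page_size,    # how many results to return per page
    }

# With the AWS SDK for Python (boto3), the request would be sent like so:
#   client = boto3.client("kendra")
#   response = client.query(**build_kendra_query("example-index-id", "How do I reset my VPN?"))
# response["ResultItems"] then contains typed results (e.g. suggested
# answers and ranked documents) to render inside the host application.
```

Keeping the request assembly separate from the network call is just one way a team might wrap the SDK for reuse across internal tools.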

Data Complexity

My main goal in creating the data management system for Kendra was to ensure the process of adding data sources always felt the same, regardless of product-specific nuances. To achieve that, I needed a few unique sources to base design decisions on, which helped me better understand technical needs, constraints, and similarities. I started with design pairing sessions with engineers to get the initial requirements on paper and begin mapping out the experience. From there I was able to start formalizing patterns and converging the experiences through subsequent versions.

One of the most challenging problems to solve was the metadata field mapping system, because different products have different purposes, information, structures, and so on. Compare a standard file storage solution like OneDrive to a product like Salesforce, which is built on a complex system of metadata and hierarchical relationships - it can be difficult to figure out how to map the two products' data together in a way that produces accurate search results based on an end user's intent. After all, that's the most important function of a search engine and the only way to build trust in it.
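To make the mapping problem concrete, here is a minimal, hypothetical sketch of the idea: each connector ships with a curated translation from its native metadata fields to a common index schema, so documents from very different products land in comparable shape. The source names and field names below are invented for illustration.

```python
# Hypothetical per-connector mappings from native metadata keys to a
# shared set of index attributes. In practice these defaults would be
# curated per data source by the team, not configured by the customer.
FIELD_MAPPINGS = {
    "onedrive": {"Author": "author", "LastModifiedDateTime": "last_updated"},
    "salesforce": {"CreatedBy": "author", "LastModifiedDate": "last_updated"},
}

def normalize_metadata(source: str, raw: dict) -> dict:
    """Translate a document's native metadata onto the common index schema."""
    mapping = FIELD_MAPPINGS[source]
    return {common: raw[native] for native, common in mapping.items() if native in raw}
```

Because both sources normalize to the same `author` and `last_updated` attributes, search filtering and relevance tuning can treat their documents uniformly.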

So, I built and implemented a process where the Kendra team would carry the burden of figuring out the best mappings before making a data source available, rather than forcing customers to read documentation and set up mappings manually. While this added scope for our engineering teams, I helped reduce their work in two other areas to offset it. First, the process itself guided engineers through a structured discovery phase instead of pulling apart APIs to figure out what we needed. Second, I developed the component design system to reduce repeated negotiations between technical and design teams. Together, this streamlined communication and development processes and removed a lot of duplicative effort.

Component design system - Early version
To solve for a consistent experience when adding sources, I created a product-specific component library built on top of AWS's design system using the "rule of 2": as we scaled the number of data sources, a new component would be created and standardized based on the common patterns found in 2 distinct data sources.
Component design system - 2 years later
I returned to Kendra 2 years after leaving the team while managing another designer who joined the project. Using the strategy and system I put in place, the Kendra team had scaled the design system and data source offering to over 100 connectors and counting.
Data Management
Advanced data management features help customers set up their search experiences with low friction. Sync monitoring helps administrators stay on top of their data for near-real-time accuracy. Metadata field management comes pre-configured with defaults optimized for out-of-the-box search accuracy, along with in-page help for teams looking for advanced configurations.

Systems Design

During the Beta phase, another senior designer and I conducted over 20 generative interviews with SMEs and customers. The sessions weren't hyper-focused on anything specific; the goal was to get a sense of general customer needs and pain points around knowledge management and information access. We spent a week synthesizing the data and gathering insights, but as features were cut from Beta, the research was shelved.

As the product scaled, so did the design team. After our Beta launch, I started moving into more of a Lead IC role, handing much of my existing feature work to new team members and beginning to build out new areas of the product, like its Analytics tooling.

I used the Analytics work as an opportunity to leverage the research that had been sitting idle, putting everything into an Airtable, labeling data, grouping themes, and so on. The problem was that the insights were too unstructured. The research was almost too general, so it didn't serve as an effective guide for modeling the experience.

So, I pivoted my approach from qualitative to quantitative. I read every data point (hundreds of unique statements from interviews) and tallied any metric that was mentioned, whether directly or alluded to. This gave me a clearly prioritized list of needs I could start building around. From there, I was able to create sitemaps, user flows, and wireframes for v1 of the Analytics dashboard. The model segmented by user type: one view for business teams to monitor cost and usage, another for knowledge/data owners to improve and manage the data and results, and a last for technical teams to monitor and troubleshoot to keep their data flowing properly.
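The tallying step above can be sketched as a simple keyword count over the interview statements. The metric names and keyword lists here are hypothetical stand-ins for the ones actually used:

```python
from collections import Counter

# Hypothetical metrics and the phrases that signal them; each interview
# statement counts once per metric it mentions, directly or indirectly.
METRIC_KEYWORDS = {
    "click-through rate": ("ctr", "click-through", "clicks"),
    "cost": ("cost", "spend", "billing"),
    "query volume": ("search volume", "number of searches"),
}

def tally_metrics(statements):
    """Count how many statements mention each metric."""
    counts = Counter()
    for statement in statements:
        text = statement.lower()
        for metric, keywords in METRIC_KEYWORDS.items():
            if any(keyword in text for keyword in keywords):
                counts[metric] += 1
    return counts
```

Sorting the resulting counts (`counts.most_common()`) yields the kind of prioritized list of needs described above.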

Finally, I did validation testing with internal SMEs and a small number of early customers, moving to higher fidelities as I went.

Kendra Analytics and Search Console
Kendra's search console includes machine-learning model tuning so customers can manually help improve search results. Cost and usage monitoring lets customers see breakdowns of activity and usage metrics, database size, and data movement. Search analytics give customers insight into what kinds of information their teams are looking for and how effective the search results are - showing them where their knowledge gaps are and offering clues on how to fill them.
Systems flow
A systems flow of Kendra product areas helped leadership better visualize and understand the dynamic flows of inputs, outputs, and dependencies across the product. This UX asset helped inform and redirect roadmaps and product strategy. By including personas, the team was able to better understand how to tailor different parts of the product to the correct users of specific functionality.

Collaboration

I drove process improvements throughout my time on the project. The team and product were relatively new, so we started from scratch on pretty much everything and functioned very much like a startup within AWS. I took the lead in many of these cases, pushing for better processes around a specific need or problem as it emerged. Some of these have already been mentioned above.

Key Results

I reduced the effort needed to build a connector by 75% in under a year.

By systematizing how data sources were built, I brought consistency to the experience of adding over 100 unique products to a customer's searchable database.

I helped develop a more strategic and holistic approach to roadmapping.

Working directly with product leadership, I helped drive a systems thinking approach to product planning & roadmapping through the creation and socialization of systems-flow diagrams.

In 18 months, I designed and helped develop the bulk of Kendra's core features and architecture.

Features include data ingestion and management, index management, metadata field mapping, FAQs, synonyms, and analytics & monitoring.

Learnings

The most interesting moment in this project happened during validation testing of the Analytics feature. To quote Henry Ford, "If I had asked people what they wanted, they would have told me faster horses." There is no real evidence he actually said that, but I digress.

What we realized was that one of the metrics almost every interviewee mentioned was CTR (click-through rate). This is arguably the most well-known metric from services like Google Analytics, and generally it's a marker of success: the higher the CTR, the better. At least in terms of marketing, SEO, e-commerce, etc.

The interesting thing about building a search engine, versus using one, is that for the purpose of search accuracy - providing customers with the fastest possible access to the correct information - CTR was actually a negative metric. A high CTR meant our AI engine wasn't able to find and surface the correct answer early enough via a suggested answer, forcing users to click into one or many results themselves to find the answers they were looking for.

The most obvious or popular answer is not always the best one.