Good UX In, Great Results Out: Designing a Data Ingestion Workflow for Real-Time Impact

Company: Splunk
Role: Director of Product Design, Unified Experiences, and Design Systems


Goal

Customers can import data easily, control where it goes, and define exactly what is analyzed, so actionable insights are clearly derived and time to value is as short as possible.

Illustration of the data ingestion process from many different kinds of data stores.
Bringing data into the Splunk portfolio of products was challenging from an internal product alignment perspective due to the wide range of potential data sources, each with specific rules for accessing and importing the data stored within.

Opportunity

Overcome competitive pressure from nimble upstarts challenging our ability to simplify our customers’ core data ingestion and processing workflows.

Diagram example of a single data ingestion workflow and its inherent complexity.
To understand the fine-grained details of the data ingestion challenges our customers faced, we spent many hours documenting the explicit steps required to complete these processes across a wide range of data storage types.

Solution

The problem I was facing was twofold, and both parts were equally urgent. The first was stopping the bleeding: we were under immense competitive pressure to address a long-standing customer complaint about the complexity of bringing data into our analytics and visualization products. The second was the longer-term problem: customers needed a more scalable, flexible model for expediting data ingestion, along with tools to be very targeted about the parameters defining the data coming into their environments. Starting with deep customer conversations to surface the critical pain points, I crafted a dual approach. We quickly spun up an iteration on our existing product that addressed the core complexity considerations, delivering some quick wins and demonstrating our ability to execute, while I simultaneously led a focused, nimble team designing a mid- to long-term solution to these problems with a 0-to-1 mindset.

Screenshots of the final designed data ingestion solution showing a catalog of different data store types and their setup.
Making the data ingestion options clear and discoverable within a catalog of providers and technologies streamlined the process of importing data and setting up data store connections.

Screenshots of the final designed data processing solution showing different data connections and flow visualizations.
Giving users powerful data pre-processing tools and visualizations ensured they accessed only the exact data they needed, increasing data security and lowering transaction costs.

Outcomes

Verbatim customer feedback from our regular research sessions highlighting our users’ high satisfaction with the designed solution.
Including customer feedback throughout the entire design process ensured that we kept our eye on our users’ needs while surfacing key insights that uncovered new opportunities to solve their existing pain points and far exceed the project’s original goals.

Accomplishments

What I learned