Improving access to public benefits

TOOLKIT Service Design, Behavioral Design, Design Research, Client Relationship Management

TEAM Business Manager

MY ROLE Led the design research process from scoping to synthesis 

This case study is kept at a high level to protect client confidentiality.

 

Challenge

The client, a U.S. state, wanted to increase enrollment in public benefits. Because state benefits can be applied for and maintained by phone, in person, and online, and we had limited time to conduct the initial research, the scope was broad and needed narrowing. My role was to work with a business manager from my organization to clearly define the client’s scope through a customer-first approach, with the ultimate goal of helping the client identify the reasons for low enrollment.

Process

  1. Define the problem space. Although many citizens began the process to receive benefits, there was a significant drop-off. I worked with the business team to examine the data and identify where applications were not continuing through to completion. Unfortunately, the state did not have enough data to pinpoint the drop-off points, and it was not feasible to recommend better data collection across the relevant areas of the system because of technical resource constraints. We had to conduct qualitative research to home in on the problem.

  2. Plan the research. Because the scope was broad and there were many unknowns, I advocated for more time in the field and a thoroughly representative group of end-customers. I then created a research plan that reached a representative sample of citizens across a diversity of field offices and call centers.

  3. Conduct research. I spent several weeks traveling to different benefit office locations, where I conducted ethnographic observations and interviews with both applicants and staff. I mapped my findings in service design blueprints to better identify issues and breakdowns. During the research, I realized that many successful applications stemmed from trust in community resources: a trusted community authority tells a citizen to seek benefits after a life event such as a pregnancy or the loss of a home. I broadened the research plan to include community centers closely linked to such life events — for example, community libraries for homelessness and food banks for job loss.

  4. Synthesize insights. Looking at the findings across the application process, there was no single set of drop-off points or bottleneck. However, I knew from talking with experts prior to the field visits that what I had discovered was unique, so I looked at the findings again. This time, I compared the journey maps across different settings (call centers, in-person offices, community centers) and experiences by persona. The analysis showed which areas had higher completion rates, and a pattern emerged: the interaction with an individual employee mattered greatly. Certain benefits offices and community locations had higher completion rates among one of the key target segments. In one instance, the way community center employees were trained to interact with potential applicants resulted in higher application completion rates than in the state offices. Comparing the two gave us a start on understanding the friction points in the process.

    I also looked for areas where defaults could reduce friction for the citizen, such as making enrollment a part of existing paperwork instead of a new process.

    For all findings, we refined our understanding through remote interviews via a paid panel, which tested our insights against a wider population.


Outcome

Although no single solution would fix challenges spread across the journey, the journey-based insights unlocked progress for the client. They were able to integrate the learnings into existing initiatives to account for the changes needed across the entire experience — for example, improving employee customer-experience training based on the training given at the community centers. The work was so effective that other states sought it out to tackle similar challenges, and the result was a state-agnostic evaluation tool. For my role leading the research, I was promoted to Associate Design Director and recognized for the inclusive research practices.


Key Takeaways

Don’t try to converge too early. It would have been easy to be disappointed at not finding a single set of answers to the drop-off question. Instead, the journey-based diagnostic tool ended up being far more applicable and valuable.

Be open to changing plans. I let the research direct me to new places, such as the community centers outside the original research plan. Without those visits, we would have missed some of the most impactful insights.

If you want something new, don’t repeat what you’ve already done. The most surprising insights came from conducting interviews across a broad, representative range of office locations and individuals. Despite the logistical challenges of reaching all geographies, including a sample of all impacted customers proved necessary.

Reduce friction. There were many areas in the process where simple changes to the choice architecture — such as better defaults — would greatly reduce friction and help create the desired outcome.