Reimagining how rural Alaskans access public benefits



How might we enable all Alaskans to get the help they need?


The State of Alaska is large, diverse in terrain and residents, and one of the few states without an online application for SNAP (food assistance). My team and I at Code for America entered an initial research and discovery phase to explore key questions about the state's current service delivery, how to help build resilient systems and employees, and if/how digital solutions could help them better serve their residents.


4-month discovery phase, 2018

Team and role

I worked with 1 other (part-time) design researcher on a cross-functional team, including 1 program manager and 2 engineers

Activities and methods

My responsibilities and contributions: workshop facilitation, user interviews, contextual research, prototyping, usability testing, interaction design, stakeholder management



Alaskans were having major problems with applying for, receiving, and retaining SNAP benefits.

The Division of Public Assistance (DPA), the agency responsible for administering SNAP benefits (colloquially food stamps), was also facing a number of problems: outdated technical infrastructure, understaffing at all levels, and a reliance on manual processes that made scaling efforts nearly impossible.

My team and I at Code for America entered into an initial research and discovery phase in order to explore some key questions about: 

  • how we might improve their current service delivery

  • how we might help build resilient employees and systems, and 

  • if/how digital solutions could help them better serve their residents.



Fee Agents, volunteers who submit assistance applications on behalf of community members, are currently required to also include a piece of paper (a Fee Agent Form) that acts as a record of intake, receipt, and submission of application materials. We designed a digital solution to remove physical barriers to submission and to address the inconsistent completeness and quality of submitted forms.



Goals and methods

We started with understanding key stakeholders

Our primary contextual research goal was to understand the priorities, challenges, and day-to-day experience of delivering benefits in Alaska.

I wrote a research plan and defined some guiding questions: 

  • What are the key priorities for the state?

  • What is the current applicant journey like? 

  • What is the current ecosystem in Alaska?

With the program manager, I planned an initial site visit in Alaska over the course of a week. This week consisted of:

  • 20 stakeholder interviews: DPA leaders, policy and operations staff, technical staff, field office management and eligibility technicians, food bank staff, and a Fee Agent.

  • 2 workshops I ran with DPA leadership:

    • “Blue skies” in Juneau to understand their vision for service delivery and tools/systems

    • “Journey of an application” in Anchorage to understand the details of service delivery/operations in Alaska

  • 5 field observation sessions with eligibility technicians

  • Across 4 cities in Alaska


We learned everything is connected

Back in San Francisco, the other design researcher and I synthesized all the information we gathered in interviews, the workshops, and observation sessions. We saw that everything was connected:

  • DPA operations and staff are siloed and don’t always communicate, and operate far from users

  • Staff across offices and DPA leaders / operations put out fires in their own domain, treating symptoms not causes

  • Structural issues within DPA and the field offices cause overwhelm and long wait times for clients

  • A large backlog of piled-up applications looms over DPA leadership and staff

  • Frontline staff deal with both client issues and antiquated systems / processes

A particularly big learning for me came during one of the workshops. A participant noted that it was the first time some of these people had been in the same room together, which struck me as odd, since they worked on different ends of delivering the same service. It reinforced how siloed government workers can be.

And rural and remote Alaskans see the worst service

The town store in Golovin, AK (in the Nome area).


Unfortunately, all clients bear the brunt of all of DPA’s internal strife, from leadership to the field offices. But clients who can’t advocate for themselves in person in the offices see the worst service.

  • Alaskans who live in rural/remote areas have little/no access to DPA offices

  • Mail can be delayed for weeks in Alaska and phone service is inconsistent

  • Fee Agents (trained DPA volunteers established in communities) administer official interviews on behalf of DPA, but the reports they submit are inconsistent and incomplete

  • Fee Agents cannot act as support for clients beyond the submission of an application, though applicants may ask them for information/support

User interviews

Next, we interviewed clients about their experiences

Our primary user research goal was to learn the biggest barriers clients face in receiving their benefits, and their experience with DPA.

I performed 3 in-depth client interviews, focused on:

  • Understanding a client’s benefits journey

  • Understanding their pain points

  • Hearing them articulate their needs

  • Learning about their relationship with DPA

  • Defining what success means to them


Clients have trouble navigating a confusing system

Their pain points were largely symptoms of the root causes we were seeing internally: 

  • Overwhelmed staff ➜ unanswered phone calls ➜ clients have to go to the lobby for answers

  • Numerous clients in lobby ➜ first come, first served support ➜ they suffer long wait times for information or action

  • Long wait times for info/action ➜ lack of transparency about case or expectation of timeline ➜ client frustration, stress, and cynicism

The more a client was willing and able to be their own advocate, the more likely they were to a) have their eligibility determined in a timely manner, b) receive necessary guidance during renewals or when declaring a household change, and c) maintain their benefits case.

Often, clients who were able to seek information/help in person from DPA workers had more success with receiving benefits.



User groups

We saw three user groups we could serve

Everything we learned from the interviews showed us that we had 3 main user groups we could attend to: all clients, remote Alaskans, and eligibility technicians. Each had a key need:

All clients

I need the ability to access services and maintain my benefits without undue burden

Remote Alaskans

I need expanded pathways to service

Eligibility technicians

I need tools and processes that are efficient and reduce strain on me

Paper prototypes

We saw three opportunity areas for 8 prototypes

But what could we technically accomplish that would have actual impact?

Each prototype met one of the larger user needs we defined, and laddered into these big opportunity areas we saw based on our research. 

We had many ideas and hypotheses, so we tested them: 8 prototypes in all. During a week in Alaska, we tested paper prototypes with: 

  • 6 eligibility technicians

  • 9 clients

  • 3 Fee Agents

We provided scenarios for each prototype, asked participants to think aloud about their thinking and experience as they used it, and took qualitative notes on their process and their responses to follow-up questions.


Our prototypes had mixed success

Of the 9 clients we spoke with, all expressed that making any part of the process digital would be better than the current state. A digital status checker would let them be more self-sufficient, give them peace of mind, and spare them from having to come into the lobby to get information about their applications. 

The eligibility technicians (ETs) were generally less enthused about the prototypes. They felt they didn't need any more calculators or information resources; the real value for them was in automation and standardization, e.g., a calculator that output a standard case note for them to enter. 

Two of the three Fee Agents we spoke with reacted positively to the idea of a digital Fee Agent 1 form. All three also responded positively to the idea of building more of a community of Fee Agents, whether a monthly call or some sort of online group.

Clients, ETs, and leadership all responded positively to any prototype that would help alleviate the stress of lobbies—namely, the status checker and helping Fee Agents facilitate accurate and complete interviews. 


Design decisions

2 prototypes had the most potential: a status checker

A status checker would be a way to test the idea of a “digital lobby,” starting with the most requested item from clients: the status of their benefits.

I would use this anytime I had something to check. I would love more information. I’m self-directed… People like me would do it themselves instead of coming to the lobby.

- Client

And a digital assister for Fee Agents

This would improve efficiency and equalize access for all Alaskans by providing better guidance and input validation for the Fee Agent 1 form, which acts as an “interview” for rural clients who apply through a Fee Agent. The focus would be on accuracy and completeness, which the paper form cannot enforce, and it would help build trust between DPA and its Fee Agents.

There are no downsides to doing it on the computer. It would be nice for clients to see what you’re doing.

- Fee Agent


The proposed solution

From all our work, we concluded that by focusing on Fee Agents, representatives of the State who live in rural and remote areas, we could have real impact: we'd see more complete and timely applications being submitted to the State. 

If I could do everything on the computer that would make my life a lot easier.

- Fee Agent 3

We proposed building a Digital Fee Agent 1 (FA1) form. We believed: 

  • Digital guidance will increase accuracy in applications submitted by Fee Agents. A digital form can encourage better, more useful, more complete answers and notes. 

    • More accurate FA1 forms will build more trust between eligibility technicians and Fee Agents. 

    • Increasing trust between FAs and ETs will decrease the amount of reworked applications. 

  • Digitizing the application and attached documents is an important part of increasing the efficiency of rural applications. 

  • Having clients and Fee Agents fill out the application online is more efficient than scanning the paper application, and results in more complete and accurate applications. 

  • We can develop suitable security processes to ensure that documents do not stay on Fee Agent phones or computers. 

As a result, we’d see: 

  • More accurate and complete applications, documents, and FA1 forms being submitted by Fee Agents

  • A decrease in ET-reworked applications

  • Shorter eligibility and enrollment times, and faster benefits delivery, for rural and remote Alaskans

The form was designed desktop-first: most Fee Agents we interviewed used either a home computer or a public computer (at a school, for example, a common use case).

We were able to leverage the design system we had established when we built GetCalFresh, making the Fee Agent form mobile-friendly and accessible. The design system also emphasized performance, which is especially important in Alaska, where broadband access is generally slower.



Alaska piloted the digital Fee Agent assister

In our final presentation to the State, we pitched a Pilot Phase: a six-month engagement during which our team would build out the prototype we presented. We’d start with building an MVP/proof of concept of the Digital Fee Agent form, then test with real folks on the ground (both clients and state employees), iterate, and work on expanding the service to more Alaskans in order to measure impact and capture learnings.

This would be a first step in understanding how something Code for America built could work on top of DPA databases and tools (as integration was off the table, technically speaking).

Initial response was positive, but a change in State administration slowed the pilot.



All of Code for America’s Integrated Benefits work mapped to bigger initiative-level learnings, and in Alaska, there were plenty of learning opportunities around access and communication. What we learned there would help inform other projects and pilots across the United States. 

On a personal level, this was the biggest coordination and facilitation project I had done to date: I arranged and scripted many hours of interviews with more than 40 people, and I led 4 workshops with State leaders and staff. 

Additionally, this project taught me, more than ever before, how to balance being a researcher with being an empathetic human. Alaska is a complicated place, and I am so grateful I got to meet and work with the amazing people I did. But Alaska has a lot of problems: the state is undergoing major budget cuts, there is a mental health crisis, and the public feels the effects profoundly. Asking people during my research to share such personal (and oftentimes heart-wrenching) experiences about their most difficult moments dealing with an entity that is supposed to provide a safety net for its citizens was difficult work. I left this project with a renewed commitment to working in the interest of the public, the people who need it most.