Perceived Barriers: Using design to break through the policy Kludgeocracy

I wrote this post in 2024 to reflect on my own process in creating the internal playbook we made for CMS employees doing user research. It also draws on a presentation I delivered with Mathias Rechtzigel about this project. This post represents my own views on how I approached this work with my team and does not reflect the views of my former employer.

Overview

Building products in government requires policy acumen. But what happens when internal policies appear to stand in the way of delivering at all? This guide explores how perceived barriers facing public servants manifest as actual barriers to delivering for the public. Drawing on our experience building the “CMS Customer Experience and Paperwork Reduction Act Playbook,” we illustrate how design thinking can bridge the gap between federal policy compliance and meaningful user engagement.

Kludgeocracy

Last November, Jennifer Pahlka, former U.S. deputy chief technology officer and author of Recoding America, penned an op-ed about the frustration public servants face when trying to deliver within the government kludgeocracy. But what does “kludge” mean in this context? Webster’s dictionary offers two relevant definitions:

  1. A clumsy but temporarily effective solution to a particular fault or problem.
  2. An ill-sorted collection of parts assembled to fulfill a particular purpose.

Both of these definitions describe the system of government.

How our government system works

In this scenario, policymakers identify a problem and create a policy. Then, people employ specific processes to implement the policy. Ideally, implementation leads to the policy’s intended outcomes.

[Image — System A: diagram showing how policy influences people and processes, and how people and processes influence outcomes.]

However, in government, multiple legislative policies pair together to influence desired outcomes. This leads us to our reality: multiple legislative policies, agency-wide policies, sub-regulatory guidance, agency memorandums, and OMB memorandums interpreting policies all inform desired outcomes. This pulls in ever more people, processes, and resources to ensure implementation meets the desired outcomes.

[Image — System C: diagram showing how policy influences people and processes, and how people and processes influence outcomes.]

However, when people and processes lose sight of the policy’s intent, we risk impeding our ability to deliver on the desired outcomes.

[Image — System D: diagram showing how policy influences people and processes, and how people and processes influence outcomes.]

This creates a situation where the system does not achieve its intended outcomes, leading to actual but unintended outcomes.

[Image — System E: diagram showing how policy influences people and processes, and how people and processes influence outcomes.]

Case Study: CMS Customer Experience and Paperwork Reduction Act Playbook

As part of the federal-wide Burden Reduction Initiative, the Centers for Medicare and Medicaid Services (CMS) Chief Experience Officer (CXO), Ariele Faber, sought ways to improve CMS’s digital service delivery efforts. The Digital Service at CMS (DSAC) led the design and development of the agency’s first playbook focused on the intersection of customer experience activities and the Paperwork Reduction Act. This playbook directly supports CMS employees and contractors, providing organizational air cover to conduct user research.

Example: The Federal Information Collection System

  • Policy: Paperwork Reduction Act (PRA), a federal law governing how federal agencies collect information from the public. It is designed to reduce the total paperwork burden the federal government imposes on businesses and citizens.
  • People: Office of Information and Regulatory Affairs (OIRA); agency PRA staff; agency customer experience (CX) practitioners
  • Processes: Steps for seeking a PRA clearance; knowing when not to seek PRA clearance approval
  • Intended outcome: Information collected from the public is simple, low-burden, and not duplicative
  • Actual outcomes: People spend too much time figuring out if they can conduct user research; people don’t do user research; services struggle to improve

For this project, we distinguished between intended and actual outcomes to illustrate a design opportunity for impact.

Framing the problem

In July 2022, a CMS employee posted in one of the human-centered design Slack channels, asking for help getting feedback from the public during public Zoom calls.

[Image — Slack message from a government employee: “Hi everyone – this may be a general question, but does anyone have any guidance on Human-Centered Design and the Paperwork Reduction Act (PRA)? I want to use some methods for upcoming meetings. The feedback I’ve been receiving is that everything needs to go through PRA.”]

Shortly after, the employee received responses from contractors pointing to the following resources: pra.digital.gov, the 18F User Experience Design Guide, and a 2017 Medium article by the current Chief Technology Officer at the Consumer Financial Protection Bureau (CFPB). It is worth noting that none of these resources were created by CMS teams or referenced ways to get in touch with the CMS PRA Staff.

One month later, we checked in with the employee, whom we’ll call Kai, to see if they had figured out how to get feedback. They shared that they were still working on it.

[Image — Slack exchange between two employees. Government Employee #2: “Did you ever get your feedback mechanism launched?” Government Employee #1: “Not yet. Thanks for reaching out because we are working on it!”]

Seven months later, Kai reported they still had no updates. They had expected this outcome but were disappointed nonetheless.

[Image — Slack message of Kai asking for help in a CMS channel.]

Considering this employee’s experience, we asked ourselves the following question: How long does this process really take?

In this case, we identified the following:

  • Perceived Barrier: I don’t have enough information to move forward confidently without feeling like I’ll get in trouble.
  • Actual Barrier: Practitioners don’t get feedback from the people they are serving.

In the next section, we discuss our discovery sprint in detail and walk through the beta version of the CX and PRA playbook.

Discovery Sprint: Defining Our Scope

In the overview section, we discussed the meaning of kludgeocracy, explained how government systems work, and introduced our problem. Now we’ll explore the discovery sprint we conducted, diving deeper into how we prioritized building the guide. Throughout this section, we define the perceived barrier we heard, name the actual barrier we observed, and share our approach for breaking through it.

For example, we would frame the logic behind this discovery sprint as the following:

  • Perceived Barrier: An expectation that people must be experts on every topic.
  • Actual Barrier: Without building internal capacity, the agency incurred real costs.
  • Approach: We listened to our colleagues' diverse experiences interacting with the policy.

We see discovery sprints as useful tools for building understanding and generating buy-in on pursuing specific design opportunities. We took this as an opportunity to provide clear justification for why addressing this problem was important for maximizing public impact.

Defining the problem

For this discovery sprint, we homed in on the following problem statement: CMS CX researchers and designers experience difficulty understanding and navigating the PRA process at CMS.

Understanding

Without a CMS-specific guide, we decrease the ability of CMS practitioners to even start user research.

  • CMS practitioners remain burdened with being PRA subject matter experts (SMEs)
  • Reduced ability of CMS employees and contractors to engage with PRA staff positively

For example, let’s consider an employee who wants to conduct user research. To start, they ideate several ways they could get feedback on their product or service:

  • Survey federal employees
  • Direct observation: usability testing
  • Unstructured listening sessions
  • Structured one-on-one interviews
  • Structured focus groups
  • Surveying 500 members of the public
  • Sitting in on a public listening session
  • Documenting a specific group’s experience over the course of a year
  • Creating a statistical persona
  • Seeing satisfaction rates at the end of an onboarding form
  • Conducting a large-scale statistical study of behavioral health anxiety outcomes

They bring these ideas to their supervisor and are immediately told, “You can’t do that because you don’t have clearance.” Suddenly, all of these opportunities to get feedback are off the table. This creates a situation where the individual does not conduct any user research at all.


However, what research activities don’t need clearance and could be done by creating a PRA-compliant study design? According to the 18F User Experience Guide, there remain several ways to get feedback without seeking a PRA clearance:

  • Survey federal employees
  • Direct observation: usability testing
  • Unstructured listening sessions
  • Structured one-on-one interviews
  • Structured focus groups
  • Surveying 500 members of the public
  • Sitting in on a public listening session
  • Documenting a specific group’s experience over the course of a year
  • Creating a statistical persona
  • Seeing satisfaction rates at the end of an onboarding form
  • Conducting a large-scale statistical study of behavioral health anxiety outcomes

However, without a CMS-specific guide, CMS employees and contractors remained without organizational air cover to back their understanding of the PRA.

Navigating

Without a CMS-specific guide clarifying how the process actually works, we increase the risk that services do not meet customer needs.

  • Product owners remain limited in their ability to bake the clearance process into their agile product development process
  • Product owners and designers remain confused about the steps they need to take internally
  • PRA officers struggle to direct customer experience questions to the right SMEs

Prioritizing a solution

Based on extensive desk research and qualitative interviews, we identified three solutions that could address the defined problem statement. Given the CMS Chief Experience Officer’s capacity, our team needed to prioritize which solution to build out as a minimum viable product.

  • Internal-facing guide: Collaborate with CMS PRA staff to craft an internal guide for CMS employees and contractors.
  • Public-facing web page: Launch a public page on PRA and CX research activities.
  • CMS-wide generic clearance: Embrace the OMB M-22-10 template for establishing agency-wide generic clearances, enabling fast-tracked information collections.

In the following subsections, we walk through the perceived barrier we observed, the actual barrier we identified, and our approach to addressing those barriers.

Read policy guidance

Perceived Barrier: I don’t know where to start when it comes to understanding the PRA and how to be compliant at my agency.

Actual Barrier: Agency staff and contractors limit, or never seek, feedback from the public because they understand the policy as barring user research.

Approach: Read through existing policy guidance for design opportunities.

In April 2022, the Office of Management and Budget (OMB) published memo M-22-10, “Improving Access to Public Benefits Programs Through the Paperwork Reduction Act.” The memo encouraged agencies to continually improve their understanding of their customers and to minimize public burdens. Overall, it affirmed that conducting user research is not a violation of the Paperwork Reduction Act. This memo emerged from previous guidance articulated in OMB’s Circular A-11, Section 280.

The memo provided the following resources for agencies:

  • Examples of where PRA may or may not apply
  • Template for agencies to establish generic clearances, allowing agency researchers to fast-track low-burden research activities

Based on this guidance, our team conducted the following analyses:

Borrowed approaches from peer agencies

Perceived Barrier: This policy guidance requires that I build everything from scratch.

Actual Barrier: The OMB guidance doesn’t speak to the specifics within agencies.

Approach: We looked at existing resources from our peers that worked well and remixed them for our own use cases.

Generic clearance comparison

This research study evaluated about 30 existing PRA generic clearances, alongside the OMB guidance for conducting customer experience research across the federal government. In looking across the clearances and OMB guidance, we had four main goals:

  1. Understand what OMB Circular A-11, Section 280 generic clearances are available:
    • What existing clearances already exist within CMS?
    • How do those compare with clearances across the federal government?
  2. Understand what generic clearances are best suited for conducting customer experience research:
    • What existing clearances exist within CMS?
    • How do those compare with clearances across the federal government?
  3. Learn from good and bad examples of what’s out there:
    • Are there incentives baked into a clearance for conducting research?
    • What amount of burden hours do agencies typically allocate for this clearance?
    • Who can use this clearance?
  4. Help us decide on the nature and scope of any recommendations CXO and OSORA should consider to solve the problem statement.

To prepare for the content comparison, we chose 30 clearances that met at least four of the following criteria (a minimal sketch of this screening logic follows the list):

  • Are generic
  • Are historically active or currently active
  • Comparable to the existing CMS generic clearance for customer experience research
  • Have “OMB Circular A-11, Section 280 Implementation” in the title
  • Used keywords: “usability testing”, “customer interviews”, “customer feedback”, “qualitative feedback”, “customer experience”, “customer satisfaction”, “user research”, and/or “service delivery”
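
To make the screening rule concrete, here is a minimal sketch of the “at least four of five” filter. The record structure, field names, and sample data are hypothetical stand-ins for the clearance entries we reviewed, not our actual tooling:

```python
# Hypothetical sketch of the clearance screening rule: keep a clearance
# only if it meets at least four of the five criteria above.

KEYWORDS = {
    "usability testing", "customer interviews", "customer feedback",
    "qualitative feedback", "customer experience", "customer satisfaction",
    "user research", "service delivery",
}

def criteria_met(clearance: dict) -> int:
    """Count how many of the five screening criteria a clearance satisfies."""
    abstract = clearance.get("abstract", "").lower()
    return sum([
        clearance.get("is_generic", False),                   # is a generic clearance
        clearance.get("status") in {"active", "historical"},  # historically or currently active
        clearance.get("comparable_to_cms_cx", False),         # comparable to the CMS CX clearance
        "section 280 implementation" in clearance.get("title", "").lower(),
        any(keyword in abstract for keyword in KEYWORDS),     # uses CX-related keywords
    ])

# Two made-up records standing in for entries pulled from the federal inventory.
candidates = [
    {"title": "OMB Circular A-11, Section 280 Implementation", "is_generic": True,
     "status": "active", "comparable_to_cms_cx": True,
     "abstract": "Generic clearance for usability testing and customer feedback."},
    {"title": "Annual Statistical Survey", "is_generic": False,
     "status": "expired", "comparable_to_cms_cx": False, "abstract": ""},
]

shortlist = [c for c in candidates if criteria_met(c) >= 4]
print(f"{len(shortlist)} of {len(candidates)} clearances met at least four criteria")
```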

As themes emerged, we focused on clearances with high numbers of fast-tracked information collections (ICs). We set out to learn what activities these clearances serve and how different clearances compare across the federal government for customer experience research. Using a set of criteria we designed, we looked closely at these clearances and evaluated them on the following dimensions (a sketch of how each record might be captured follows the list):

  • Number of burden hours
  • Annual cost to the federal government
  • Cost burden
  • Payments and gifts to participants
  • Activities supported (usability vs. surveys vs. customer interviews)
  • Supporting statements
  • Approved fast-track clearances
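
As a rough sketch (the schema and field names are ours, not an official format), each clearance in the comparison could be captured as a record mirroring these dimensions:

```python
from dataclasses import dataclass, field

@dataclass
class ClearanceRecord:
    """One clearance in the comparison; a hypothetical schema, not an official one."""
    title: str
    burden_hours: int                  # number of annual burden hours
    annual_cost_to_government: float   # annual cost to the federal government
    cost_burden: float                 # cost burden placed on respondents
    allows_compensation: bool          # payments or gifts to participants
    activities: list[str] = field(default_factory=list)  # e.g., ["usability", "surveys"]
    supporting_statement: str = ""     # link or reference to the supporting statement
    fast_track_ics: int = 0            # approved fast-track information collections

# Example: share of clearances that allow participant compensation.
# records = [...]  # one ClearanceRecord per clearance reviewed
# share = sum(r.allows_compensation for r in records) / len(records)
```

Structuring each clearance this way makes aggregate questions, such as what share of clearances allow participant compensation, straightforward to answer.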

Several things stood out when we compared 30 generic clearances:

  1. About 17% (5) of clearances allowed for participant compensation.
  2. Nearly half (13) of clearances were directly tied to Section 280.
  3. Some clearances calculated annual costs by including all internal costs to the agency.

[Image — Screenshot of the Coda dashboard we created to analyze the clearance information collected.]

Agency and internal source comparison

This research study evaluated existing Paperwork Reduction Act (PRA) informational content within CMS and at other agencies. In looking across the existing informational content, we had four main goals:

  1. Understand what agency-specific information about PRA exists publicly
  2. Understand existing CMS internal resources about PRA
  3. Learn from good and bad examples of what’s out there
  4. Help us decide on the nature and scope of any recommendations CXO and OSORA should consider to solve the problem statement.

We chose five pieces of public-facing content from other agencies and two internal CMS resources.

We set out to learn what these resources cover and how they compare across the federal government for customer experience research. Using a set of criteria we designed, we looked closely at these websites and evaluated them on:

  • References to existing clearances for CX/UX work
  • References to internal points of contact
  • Steps of approval process
  • References to PRA.digital.gov

Several things stood out when we compared the agency public-facing resources:

  1. Most information was housed within the Office of the Chief Information Officer, with some exceptions
  2. DOE offered the only resource to plainly detail the steps for navigating an in-agency fast-track system for CX work
  3. DHS offered the only resource publicly tying PRA information and guidance to CX work
  4. Only 2 out of 5 agency resources linked back out to PRA.digital.gov
  5. DOI was the only agency to directly reference existing generic and fast-track clearances available enterprise-wide

Brought stakeholders along

Perceived Barrier: I am worried about the relationship with X department. By speaking up, I might mess things up for my group.

Actual Barrier: No one talks about policies that may ruffle feathers.

Approach: We brought stakeholders along, empowering them as true co-designers.

To prioritize our focus, we facilitated a kickoff workshop with the following offices likely most impacted by these product opportunities:

  • Office of Strategic Operations and Regulatory Affairs (OSORA)
  • Office of Burden Reduction and Health Informatics (OBRHI)
  • Office of Communications (OC)
  • Office of Strategy, Performance, and Results (OSPR)
  • Digital Service at CMS (DSAC)
  • Office of the Administrator, Customer Experience Center of Excellence (CXCoE)

We started by introducing the problem framing and statements we discussed earlier in this blog. We provided participants with a brief opportunity to validate our understanding of this shared problem space at the start of the call. Additionally, we made time to meet with each team before the official kickoff meeting for additional context about their unique interactions with the PRA.

Based on that shared consensus, we introduced the three design opportunities we identified. We guided each office to share with the group the following:

  • How might we collaborate?
  • Based on your experience, what should we consider?
  • Which design opportunity should we prioritize?

By engaging in this exercise, we reduced anxiety among teams concerned about this work and increased participation from the teams responsible for delivery. Their insights led us to prioritize creating an internal guide, based on the clear short- and long-term impacts it could have on service delivery.

How we structured the guide

When creating the guide, we adopted an iterative process to refine the scope of a minimum viable product (MVP). After our first round of user research, we quickly learned that solving challenges related to using user research to inform policy, a useful application for some CMS teams, lay outside the scope that would maximize our guide’s impact. When articulating this focus to our stakeholders, we first divided our audiences into primary (prioritized) and secondary (de-prioritized) audience types.

The primary audiences included people who…

  • Do customer experience research
  • Want to talk to the public to get feedback
  • Advise on the legal implications of talking to the public

The secondary audiences included people who…

  • Do statistical research
  • Manage CX activities
  • Want to talk to the public to inform agency-wide policy or regulations

Information architecture

We structured the guide accordingly:

  • Introduction: Introduces the PRA and customer experience (CX) and provides organizational air cover
  • Get quick feedback from customers
  • Understand the PRA
  • Navigate the PRA

Introduction

The welcome page serves as a one-pager with an authoritative message. By letting users know what the guide hopes to accomplish, we set clear expectations for how it aims to help. The welcome page also introduces why the PRA matters. It immediately prompts users to reach out to OSORA or the CXCoE for additional questions about the PRA and customer experience, respectively. Lastly, the page features a recorded video with CMS Chief Experience Officer Ariele Faber, OSORA Deputy Director Kerrian Reynolds, and CMS PRA Staff Director Bill Parham that provides air cover for any low-burden customer experience research activities.

Get quick feedback from customers

Modeled after the DHS CX Directorate’s approach, this page lets CMS employees and contractors quickly see what low-burden research they could do by tomorrow. In speaking with DHS CX Directorate staff, we learned that users at DHS found this page extremely useful in validating their ability to quickly complete a small amount of user research.

Understand the PRA

This section covers the basics about the PRA identified as most relevant to CMS employees conducting customer experience research activities.

There are two subsections highlighted:

  • How the PRA works: This section describes key points to keep in mind when having conversations with OSORA about the PRA. By no means is this content exhaustive; the Office of Strategic Operations and Regulatory Affairs closely reviewed it for accuracy.
  • What to consider: This section describes PRA considerations when planning research. It walks through how the project or product lifecycle serves as a useful framing when deciding why and how to collect feedback.

Navigate the PRA

This section covers the basics about the people you need to connect with and what to consider when choosing the right process to pursue for your research study.

There are two subsections highlighted:

  • Connect with people: This section shares contact information for people responsible for the PRA, internal organizations with their own PRA resources, and relevant CMS communities of practice.
  • Select a PRA-compliant process: This section shares study designs for PRA compliance without clearance, steps for the normal clearance process, and the fast-track process for individual information collections.

Reflecting on impact

By building this guide, we grew a community of practitioners who are building their understanding of the policies that impact customer experience research activities. More importantly, we removed some of our colleagues' administrative burden, enabling them to have conversations with the public. Additionally, we built and shared knowledge of what’s working, and of the progress we’ve made in addressing what doesn’t, when it comes to improving internal customer experience operations.

We know that users may not read the guide in its entirety. However, documenting authoritative perspectives empowers CMS employees and contractors to ask questions and to confirm that conducting user research is not illegal. We enjoyed sharing our learnings and building on this work with our colleagues at 10x, OMB, and other digital teams across the federal government.

In the next section, we close out this discussion with key takeaways from this work.

Conclusion

Addressing perceived barriers in government policy requires an approach that emphasizes collaboration, capacity-building, and design thinking. Our experience highlighted the importance of bridging gaps between policy compliance and user engagement. By engaging stakeholders, refining internal guides, and learning from peer agencies, we created a resource that empowers CMS employees and contractors to conduct meaningful user research without undue administrative burden. This effort not only clarified complex processes but also fostered a community of practice dedicated to improving public service delivery.

As we continue to build capacity in our agencies, it is crucial to continue sharing roadblocks, involving policy colleagues as co-designers, and leveraging existing solutions to enhance our work. Building organizational capacity and creating sustainable processes are key to maximizing public impact. While the guide may not resolve all PRA-related concerns, it serves as a way to share the expertise within CMS and a foundational step towards reducing fear and increasing trust among practitioners.

Ultimately, by aligning our efforts with the policy's intent to simplify public interactions, we can better support our colleagues and enhance the public's experience with government services.

The guide alone did not make all of CMS employees’ and contractors’ concerns and questions about the PRA go away. Instead, the resource stands as proof of the subject matter expertise that lies across the agency and serves as a foundational capacity-building step. By building capacity on this subject, we created organizational air cover for all.

Reflection questions for user experience teams

  • How might we reduce fear and increase trust?
  • How might we better support our colleagues to increase public impact?
  • If the spirit of the law is to make things easier for the public, does your team embrace the policy’s intent or uphold it as a barrier?