DELOITTE DIGITAL 2020

Inclusive redesign of a .gov website

I led the redesign of a premier .gov career application flow from discovery to testing, reducing time to application for target candidates from several minutes to 20-30 seconds.


7 minutes to read, but feel free to skim. Includes scrubbed research artifacts and models. Scrubbed wireframes on request.

Top Secret Notice 🙊

This project is not yet complete, so I can't share finer details or designs (yay gov). Instead, I have provided a light and censored summary of the project and my accomplishments to date below.

If you want to get to know my process more, feel free to contact me. I will gladly nerd out for hours on my strategy. 🤓

Article Outline

  1. Overview and Key Goals
    1. Dream Team and Roles
    2. Challenges
  2. Research
    1. Stakeholder Research
    2. User Research
    3. Inclusive Sampling Strategy
    4. User Interviews
  3. Analysis and Reports
    1. Contextual Analysis
    2. Personas and Journey Maps
  4. Wireframing and Testing

1. Overview and Key Goals

Our client was a premier government agency looking to redesign their website. We had two key goals: increase the number of diverse and qualified “top-talent” STEM applicants and improve their perception of the agency's brand.

1.1 Dream Team and Roles

  • All-star Deloitte Digital cast featuring UX, Visual, Dev, PM, and QA teams.
  • Kickass UX crew including a Creative Director, UX Lead, Design Analyst, Content Analyst, and UX Researcher/Designer (it me).
  • I led our team through discovery, designed the most critical flow of the website (careers), served as a SME for accessibility, and facilitated usability testing.

1.2 Challenges

  • Time and budget for research were not explicitly defined in the contract, despite being requested.
  • The client was made up of a collection of offices, each with their own goals and expectations for the redesign.
  • Short six-month timeline from kickoff to post-delivery QA (later extended by two months).
  • No access to technology during client presentations and limited communication (this was fun /s).

2. Research

2.1 Stakeholder Research

Before this project was green-lit, I was part of a separate team that conducted stakeholder research several months earlier. I served as a UX researcher and was responsible for planning and facilitating interviews with 12 unique offices within the agency. Based on prior informal conversations, the employees each office provided were of similar seniority (no power conflicts) and similar opinion (homogeneous in goals), so we decided to conduct the interviews in a panel format.

After conducting qualitative analysis on over 730 utterances of anecdotal data, I distilled a list of 15 high-level goals, e.g. "There are two distinct purposes for the website: to recruit and to educate." We printed each of these goals onto physical note cards and facilitated a prioritization workshop with all of the offices present. Through this workshop, the offices were able to come to a consensus on what was feasible and desired for their new website. The key goals they decided upon were to increase the number of diverse and qualified “top-talent” STEM applicants and to improve their perception of the agency.


By serving as a facilitator and helping the offices work through their own internal politics in a design workshop, we were able to determine a north star that everyone felt satisfied with.

2.2 User Research

Fast forward back to the present. Despite having business goals established by our stakeholder interviews, we were still in the dark when it came to actual users of the website. There were also some particularly tricky challenges I had to navigate as the research lead:

  • 4 week timeline due to budget and contract constraints.
  • Unable to recruit users through a public vendor due to the nature of the client's work.
  • My team did not have substantial past experience with user research.

Instead of panicking, I began to plan out a research strategy based on my constraints. During the first week of research I:

  • Created a research protocol document to share with the team that outlined goals, approach, desired candidate characteristics, research focus areas, and analysis strategy.
  • Facilitated a series of internal interviewing workshops to train a team of secondary interviewers from a mix of competencies (dev, pm, etc). This would expedite analysis and naturally build cognitive empathy for our users across the full team.
  • Began to recruit potential interviewees.

2.3 Inclusive Sampling Strategy

A top-talent STEM candidate can't be pinned down to a single gender, race, or ability/disability (to mention just a few factors). I wanted to make sure we could talk to many different kinds of people so we could build an experience that says, "hey, you belong here," regardless of any factors beyond one's qualifications to do the job.

Since we couldn't use an outside vendor for recruiting, I began by gathering a large list of potential interviewees by tapping the broader team's connections. Since convenience sampling runs the risk of recruiting homogenous groups, I created a live Confluence table inspired by the Matrix of Oppression.


By using social identity categories from the Matrix of Oppression, I hoped to mitigate bias from the convenience sampling process.

I plotted the potential interviewees onto the table to clearly visualize and identify where our unrepresented intersections were. This allowed me to pivot and refocus the team’s recruitment efforts specifically onto those unrepresented intersections until all gaps were filled or directly brought up to leadership (it’s better to recognize where exclusion is than to ignore it outright).

The dangers of convenience sampling become clear when translated into a table. Can you tell what's wrong with the example below? What intersections should we be focusing on next?

                    JK   LA   WG   JD
Race
  White
  Biracial
  Asian                       x
  Black
  Latinx
  Native
Ability
  Temp able bodied   x    x    x    x
  Temp disabled
  Disabled
Sex
  Bio man            x    x    x
  Transexual
  Bio woman                         x

For this particular project, I plotted potential interviewees on race, sex, gender, sexual orientation, class (using salary bands), ability/disability, age, major, and education. For privacy reasons, interviewees could opt out of disclosing information and their names were only shared with interviewers. Information was collected through surveys during recruitment.
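
To give a feel for how the live table worked, here's a minimal sketch in Python. The candidate records, field names, and category values below are illustrative stand-ins (the real tracker was a Confluence table with more categories and privacy opt-outs); the point is simply to flag any identity value that no one in the current pool covers.

```python
from collections import defaultdict

# Illustrative, self-reported recruitment survey responses. Names, fields,
# and values are hypothetical stand-ins; fields left as None were opted out.
candidates = [
    {"id": "JK", "race": None,    "ability": "Temp able bodied", "sex": "Bio man"},
    {"id": "LA", "race": None,    "ability": "Temp able bodied", "sex": "Bio man"},
    {"id": "WG", "race": "Asian", "ability": "Temp able bodied", "sex": "Bio man"},
    {"id": "JD", "race": None,    "ability": "Temp able bodied", "sex": "Bio woman"},
]

# The rows of the matrix: every identity value we want represented.
categories = {
    "race": ["White", "Biracial", "Asian", "Black", "Latinx", "Native"],
    "ability": ["Temp able bodied", "Temp disabled", "Disabled"],
    "sex": ["Bio man", "Transexual", "Bio woman"],
}

# Count who covers each row; anything left empty is an unrepresented
# intersection to refocus recruitment on (or to flag to leadership).
coverage = defaultdict(list)
for person in candidates:
    for category, values in categories.items():
        if person.get(category) in values:
            coverage[(category, person[category])].append(person["id"])

for category, values in categories.items():
    for value in values:
        ids = coverage[(category, value)]
        print(f"{category:>8} | {value:<17} | {', '.join(ids) or 'GAP'}")
```

Run against the example table above, every race row except Asian and every disability row prints as a gap, which is exactly the signal to redirect recruitment.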

I selected 10 individuals out of a list of 30+ to schedule interviews with for the following week. To double-check my unconscious biases, I asked one of our design analysts to create her own list from the matrix for validation purposes. We promptly scheduled interviews with the potential interviewees who matched on both of our lists (~70%), then talked through our reasoning for the remaining candidates before coming up with our final selection together.

Notably, one of the key gaps in our sample was people with disabilities. There are many different kinds of disabilities spanning vision, cognition, hearing, and mobility (just to name a few). Although two individuals in our sample had a disability, that was not enough to account for all disabilities. To make up for this gap, I made sure to schedule SME interviews with accommodation experts at Deloitte. This opened the door to an incredible opportunity to design with an accessibility champion, which ultimately led to a better and more inclusive final product than I could ever have designed on my own.

2.4 User Interviews

By the second week, I was ready to begin interviews. I had already finished training my secondary interviewers through an interactive interviewing workshop.

Each interview was split into two parts: a general background interview and a contextual goal-based exercise with prompts to search for a job in order to observe live behavior. The background interview in the first half allowed us to select what prompt to provide the interviewee. For example, if we learned that they had recently conducted a job search on a website, we would ask them to show us how they did it.

Participants were encouraged to bring and use the technology most familiar to them. This naturally surfaced important information on mobile usage and behavior.

At the end of each interview, we asked the interviewee if they would be interested in keeping in touch; I wanted to be able to include them in the design process later on during wireframing and testing. Once the interviewee left, I checked in with the secondary interviewer to discuss our key takeaways. The secondary interviewer would then transcribe the interview into my pre-formatted Excel spreadsheet, which set us up nicely for contextual analysis.
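
For illustration, here's roughly what a row of that transcription sheet looked like as data. The column names and the sample utterance are hypothetical (the real file was an Excel workbook), but the principle carries over: one row per utterance, so every line of anecdotal data stays individually trackable through contextual analysis.

```python
import csv

# Hypothetical column layout for the pre-formatted transcription sheet.
FIELDS = ["utterance_id", "participant", "interviewer", "section", "utterance", "takeaway"]

rows = [
    {
        "utterance_id": "P03-017",              # participant number + line number
        "participant": "P03",
        "interviewer": "secondary (dev)",
        "section": "goal-based exercise",       # vs. background interview
        "utterance": "I filter by salary first, then skim the titles.",  # made-up example
        "takeaway": "Salary acts as the first-pass filter on results.",
    },
]

with open("interviews.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```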

3. Analysis and Reports

3.1 Contextual Analysis

I decided to use Hartson’s trusty WAAD (Work Activity Affinity Diagram) method to conduct contextual analysis as it was a good fit given the scale and complexity of the anecdotal data. It also would allow me to get the entire team engaged with the research process. Historically this has helped me sync up cross-competency teams, save time, build empathy with end users, and ease design handoffs.

During the WAAD, we had 20+ participants including product managers, designers, developers, and more in attendance over the course of two days! I prepared a short introductory workshop to explain the rules of the game, and we were off to a quick start. Incredibly, we were able to organize 600+ lines of trackable anecdotal data in just two days.

Seriously, bagels were the MVPs of this activity. We also took a small bit of time to come up with rodent mascots to cheer us on throughout the activity.

3.2 Personas and Journey Maps

Thanks to the WAAD, we were now familiar with what functions and features were most important to top-talent STEM candidates. Although the WAAD naturally categorized these in an easily digestible manner by topic and chronological order, there was still more to abstract in terms of behavioral patterns. For example, could we identify a pattern between filters used (e.g. salary) and make-or-break values (e.g. clout)?

Using Alan Cooper's behavior mapping method, I drew out several scales across a variety of categories identified from the WAAD, including filters, devices, 3rd party services, values, and more. I then led a small team of design analysts in plotting the interviewees (numbered 1-10) onto the scales using small colorful stickies. This allowed us to identify recurring clusters of interviewees, which we named and converted into our first provisional personas.
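
The plotting itself happened on a wall with stickies, but the clustering idea behind it can be sketched in a few lines of Python. The scale names, positions, and distance threshold below are made up purely for illustration; the real clusters came from spotting recurring groupings by eye.

```python
# Each interviewee (numbered 1-10) gets a 0.0-1.0 position on every scale.
# Scale names and values here are hypothetical.
scales = ["salary_filter_reliance", "mobile_first", "values_clout", "speed_to_apply"]

positions = {
    1: [0.9, 0.8, 0.2, 0.9],  # one value per entry in `scales`, in order
    2: [0.8, 0.7, 0.3, 0.8],
    3: [0.2, 0.1, 0.9, 0.3],
    4: [0.3, 0.2, 0.8, 0.2],
    # ...remaining interviewees omitted
}

def distance(a, b):
    """Euclidean distance between two interviewees across all scales."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Greedy pass: an interviewee joins a cluster only if they sit close to every
# existing member; otherwise they start a new one. Each surviving cluster
# becomes a named provisional persona.
THRESHOLD = 0.5
clusters = []
for person, pos in positions.items():
    for cluster in clusters:
        if all(distance(pos, positions[member]) < THRESHOLD for member in cluster):
            cluster.append(person)
            break
    else:
        clusters.append([person])

print(clusters)  # e.g. [[1, 2], [3, 4]] -> two provisional personas
```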

We continued to flesh out the personas by cross-checking them with our anecdotal data until we felt confident they were representative of all the individuals we had interviewed. To mitigate unconscious bias and to encourage inclusion, I also opted to use monikers instead of real names and pictures. For example, “The G2G (got to go)” helps focus conversations on a key behavioral trait, whereas “John Gates, the Harvard Graduate” may unintentionally transform a persona into a stereotype. This process ultimately produced four final personas.

Our journey maps featured each persona's expected route to apply to a job, layered on top of a rudimentary emotional chart that mapped how those expectations would be met or unmet in the current state of the website we were redesigning. Collectively, this helped us understand which parts of the experience were more or less important for each of the personas, which in turn allowed us to contextually select a primary persona based on the part of the experience we were designing. For example, one persona, the G2G, was significantly more likely to drop off from the experience early on if they were not provided easily scannable job information. I designated the G2G as our primary for our search results page, but not for our benefits page, which was not a highly valued part of their experience.
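
As a rough illustration of how a journey map fed primary-persona selection, here's a toy structure for the G2G. Every step label, expectation, and emotion score is hypothetical; only the shape reflects the real artifact: an expected route layered over an emotion scale, with drop-off risk at the lowest points.

```python
# Hypothetical journey-map data for the G2G persona; values are illustrative.
g2g_journey = [
    {"step": "Land on careers page", "expectation": "scannable job info up front", "met": False, "emotion": -2},
    {"step": "Search results",       "expectation": "filter and skim quickly",     "met": False, "emotion": -3},
    {"step": "Job description",      "expectation": "clear qualifications",        "met": True,  "emotion": 1},
    {"step": "Application form",     "expectation": "short, mobile-friendly form", "met": False, "emotion": -2},
]

# The lowest-scoring steps are where this persona is most likely to drop off,
# which is how a primary persona gets assigned per part of the experience.
riskiest = min(g2g_journey, key=lambda s: s["emotion"])
print(riskiest["step"])  # -> "Search results"
```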

Looking back, I wish I had included images of possessions to add more interest and depth to the personas.

4. Wireframing and Testing

Due to the nature of work, I cannot publish any wireframes publicly. Please contact me to get the full scoop.

  • Wireframed complete flows (100+ high-fidelity screens) based on the models produced during research.
  • Designed form states and provided annotations to account for WCAG 2.1 compliance.
  • Presented wireframes directly to client leadership with repeated success, thanks to findings grounded in research.
  • Provided QA support to the dev team while conducting usability tests to ensure a successful launch.
