This application is far too large to cover in a single consumable case study. To best facilitate your understanding of my approach, I will tell the story through the lens of a wicked problem we are helping to solve: the modernization of the electronic health record system at the Department of Veterans Affairs.
While the story of the VA will be the main focus of this case study, my responsibilities involve directing the overall design vision and strategy for the organization, as well as developing and mentoring a rapidly growing design team. These higher-level functions have informed and shaped how I work, and I will pull from those experiences as well.
Over the next 10 years, the Department of Veterans Affairs is focused on an initiative to modernize its electronic health record system.
The Department of Veterans Affairs provides the following information on its objectives for this migration:
All in all, the project sounds like a win for veterans and providers alike. However, updating VistA has proven incredibly complicated, and has indeed defeated multiple modernization attempts, due to the large number of package and data dependencies throughout the system; the VA struggles to maintain an accurate view of its own system. VistA, as a piece of modeled software, is an absolute hairball.
EAGLE6 helps untangle the hairball that the VA has been unable to unravel on its own.
Traditionally, EAGLE6 was a piecemeal product: each module existed as its own self-contained experience that only loosely worked with the others. It was not until 2018, when I moved into a Principal Product Designer role, that I was tasked with designing a product that would become a complete enterprise solution.
I work on a team with a product owner and a business analyst, and bring in other designers for support, consultation, and brainstorming sessions. We often worked directly with our end users, and sometimes their project managers, to ensure that ideation translated into features that addressed their needs and motivations.
I am the primary designer and stakeholder for our design system, leading the charge on process, tool sets, collaboration, patterns, principles, and our UI kit. I am the primary touchpoint for all design decisions and for how the rest of the organization interacts with design and our artifacts. I adopted and tuned a workflow status tracking system for our Figma prototypes to promote alignment with engineering and keep product owners up to date. Our prototypes, combined with our design system and workflow status tracking, helped evangelize ideas and drive faster decision making with fewer errors.
On the overall product, I design across, and promote platform consistency within, a large enterprise application supporting 8 different feature “verticals” staffed by up to 13 designers, 6 product owners, and upwards of 60 front- and back-end engineers. I am the lead designer for the VistA modernization features, and while they are incredible features aimed at improving veterans’ lives, my primary job function is overall design vision and strategy; I needed to make sure the patterns we were adopting here worked across EAGLE6 as a whole.
I executed research, journey maps, wireframes, and prototypes, and constructed a design system with a UI kit built from the Material Design guidelines.
I am the Principal Product Designer at EAGLE6 and evangelize design and design thinking throughout our organization and within our application. I presented work to gain buy-in from the executive team, senior stakeholders, and the remaining feature vertical teams across EAGLE6.
We needed a way to quickly identify the status of a Figma file, as well as direct team members and key stakeholders through the file in bite-sized journey pieces.
This left little doubt as to which stage of our process any particular feature was in, and also provided walkthroughs and descriptions to facilitate documentation and review across key stakeholders in the software development life cycle.
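The status system itself was a Figma convention rather than code, but the idea maps cleanly onto a tiny state machine. The stage names below are hypothetical stand-ins for the labels we used on each file; the point is simply that every file has exactly one stage and moves through stages in a defined order.

```python
# Hypothetical sketch of a file-status workflow as a state machine.
# Stage names are illustrative, not the actual labels used at EAGLE6.
ALLOWED = {
    "Exploration": {"In Design"},
    "In Design": {"Ready for Review"},
    "Ready for Review": {"Approved", "In Design"},  # a review can bounce back
    "Approved": {"In Development"},
    "In Development": {"Shipped"},
    "Shipped": set(),
}

def advance(current: str, target: str) -> str:
    """Move a file to a new stage, rejecting skipped or invalid steps."""
    if target not in ALLOWED.get(current, set()):
        raise ValueError(f"Cannot move from {current!r} to {target!r}")
    return target

stage = advance("Ready for Review", "Approved")
# stage == "Approved"; advance("Exploration", "Shipped") would raise
```

Making the legal transitions explicit is what gave engineering and product owners a shared, unambiguous vocabulary for "where is this feature right now."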
"Design thinking is a human-centered approach to innovation that draws from the designer’s toolkit to integrate the needs of people, the possibilities of technology, and the requirements for business success."
TIM BROWN, EXECUTIVE CHAIR OF IDEO
I try to take a lean approach to design. I believe that measuring design by the breadth and depth of deliverables is inferior to measuring success by the experiences we deliver. Lean UX allowed us to focus on rapid prototyping, user feedback, and quick iteration of design mockups. This approach helps ensure early team-wide alignment, encourages new ideas, and honestly, the ownership felt across the multiple disciplines is amazing.
Collaboration is key to my design process, and throughout each step I make sure to involve key partners and focus on the deliverables my team feels are valuable to them. Delivering an artifact that no one gets use out of, and that doesn’t serve as useful documentation, is costly and, honestly, wastes valuable time. I want to make sure I’m solving the right problem, not chasing symptoms, and testing as many iterations as I can for valuable feedback.
Defining problems and coming to understand them takes a large amount of both time and effort. Every time you have a conversation, run a live design session, or test an iteration, you learn something new. Honestly, that is my favorite part of product design: I can always deepen my understanding to improve an experience and make someone’s day even a little easier.
Fortunately, we have the ability to have some really great open dialogues not only with the stakeholders and project leaders at the VA, but also with those down in the mud doing the day-to-day work. Being able to talk with your users, and even the folks who will be receiving output from your application, is fantastic and brings so much to the design process. We’re able to brainstorm, storyboard, and partake in my favorite design activity: live design. Getting into a Figma file with users, product owners, and other stakeholders to work through a problem, massaging an idea into something everyone feels good about, is a lot of fun and leads to some pretty miraculous things.
Throughout the beginning of the product lifecycle we did as much research as we could while still moving the needle. We worked on assumptions while researching, challenged those assumptions, and adjusted where we needed to. To get started, we first had to sort out who we were empathizing with to help solve this problem. Fortunately for us, the VA was clear about the teams and users that would be engaging with our product, and we were able to set up some interviews to talk with them.
In all, we target 11 personas for EAGLE6, but functionally only 5 of those are supported for the VA initiative. The Software Engineer (MUMPS) is the most relevant, so I’ve chosen to highlight him here. Gathering the data to model this user was actually really fun: we sat down with an engineer working at the VA on the very system we’re trying to help modernize and had a conversation.
The conversation involved the following stakeholders:
Given the nature of the problem we are helping solve, getting the user model of the software engineer as accurate and focused as possible was paramount to the success of the experience. They would, after all, be our primary end user. We’d need to provide some reporting and progress tracking views for the project managers and leadership team, but at the end of the day, it’s the Software Engineer that will consume the bulk of this feature set.
The other personas were modeled using a combination of research and interviews involving the majority of the EAGLE6 team.
The personas were incredibly helpful throughout the entirety of the process and they were even revamped as business asks shifted and interviews continued.
The VA MUMPS comparison feature allows the VA’s project team to load in two different versions of their VistA system and compare them to each other to see what’s different. This journey map shows not only the path, but also how I relied on existing work from other designers to support the business ask.
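To make the comparison idea concrete, here is a minimal, illustrative sketch: given two snapshots of a system’s package manifest, report what was added, removed, or changed between versions. The data shape and field names are hypothetical stand-ins — the real VistA model, and EAGLE6’s handling of it, are far richer than a flat name-to-version map.

```python
# Hypothetical sketch of version comparison: two {package: version}
# manifests in, a categorized diff out. Real VistA data is far richer.
def compare_versions(old: dict, new: dict) -> dict:
    """Compare two {package_name: version} manifests."""
    old_keys, new_keys = set(old), set(new)
    common = old_keys & new_keys
    return {
        "added": sorted(new_keys - old_keys),
        "removed": sorted(old_keys - new_keys),
        "changed": sorted(n for n in common if old[n] != new[n]),
        "unchanged": sorted(n for n in common if old[n] == new[n]),
    }

# Toy example (package names borrowed from common VistA module names):
v1 = {"PHARMACY": "7.0", "LAB": "5.2", "SCHEDULING": "5.3"}
v2 = {"PHARMACY": "7.1", "LAB": "5.2", "REGISTRATION": "5.3"}
diff = compare_versions(v1, v2)
# diff["added"] == ["REGISTRATION"], diff["removed"] == ["SCHEDULING"],
# diff["changed"] == ["PHARMACY"], diff["unchanged"] == ["LAB"]
```

The design challenge was never the diff itself — it was presenting thousands of these differences, plus their dependencies, in a way a project team could actually act on.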
Talking through how someone is going to walk around your system is important. During interviews with users and key stakeholders, we would identify expected journeys through key features.
Live design is my favorite activity to do with my team and my users. It’s interesting because, in terms of the VA, we are solving a pretty specific problem: the users are quite good at describing the perfect output, and we are able to design backwards from that.
Here you see some direct output of a live design session where I worked directly with my team and an end user. They needed a report to help them identify out-of-order menu items and how those items related to the overall VistA system.
The output of this session helped fulfill a final proof of concept that led to a multi-million dollar contract award.
I conducted the following iterative loop an obscene number of times.
The scope of what we were working on, the technical issues, and sometimes even the people issues were mind-blowing. In hindsight, we were very fortunate to have as much access to users as we did. At one point, I was bouncing prototypes off an actual user nearly once a day!
The bulk of our testing was remote, with the occasional in-person testing. Sometimes our testing was done through a project management team inside a partner company's Client Services, and sometimes I was able to just sit down directly with our target group and do some testing.
I tested everything from on-the-spot whiteboard sketches to low-fidelity prototypes, high-fidelity prototypes, and even the occasional rough coded project. Sometimes the data was so complex that we had to code a small iteration, far below MVP, just to make sure we were on the right track.
A lot of our asks came from a Client Services team at a partner company. Testing to make sure that we were solving the right problems, and not a symptom of the problem, was paramount. For example, when the requirements for the Comparison Matrix came across my JIRA status board, I said, "Wait. What?" This requirement meant that they wanted to see 131 hospital columns and up to 16,000 (!!!) rows on a screen at once. The number of times the developers or I said, "There is no way they want this. They can't even use that much data at once; the browser might even blow up," is astounding. I distinctly remember asking, "Well, why don't we just ask them?" So that's what we did.
It turns out, they wanted a version of that. Based on the original requirements, I had initially mocked up a quick, very low-fidelity prototype to test. To our surprise, it was actually pretty close to what they wanted, but it was far less usable than they expected. A key piece of information: the VA doesn't actually know how complicated its VistA system is. They had very little idea that the matrix would have been nearly 43 feet long had we printed it at a one-inch data point scale.
The initial usability testing of this was, as expected, very poor. The browser took a very long time to load it, and there were so many rows and columns that tracking down the ones that mattered was very difficult. I'll be honest, though: we put less effort into this than we should have. We assumed the requirements were wrong and that they didn't want anything like this. We tested our own assumption, and found out our assumption was very wrong.
Instead of a report-style matrix, this needed to be a sort of explorer with a quick heads-up display of classification points. They had been tracking this by hand, and it was often inaccurate and very error prone.
Iteration 2 of this feature led us to something much more usable and gave us a valuable insight into how the project teams at the VA not only reported their work, but could use our tool to justify spending and explain progress to Congress. Our next iteration allowed them to search for and view specific packages along the left side and select only the hospitals they were responsible for along the top. We also included some key interactions that let them deep dive through the matrix to gain insight into package status that they had never had before.
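The interaction model from iteration 2 can be sketched as a simple filter over the full matrix: rather than rendering all ~131 hospitals by ~16,000 packages at once, the user searches for packages and selects only the hospitals they own, and the view shows just that slice. The data shape and status values below are hypothetical, purely to illustrate the idea.

```python
# Hypothetical sketch of the "search + select" filtering model.
# matrix: {package_name: {hospital_name: status}}; names are illustrative.
def filter_matrix(matrix, package_query="", hospitals=None):
    """Return only packages matching the query, restricted to the
    selected hospitals (None means keep all hospitals)."""
    q = package_query.lower()
    result = {}
    for package, statuses in matrix.items():
        if q and q not in package.lower():
            continue  # package doesn't match the search
        if hospitals is not None:
            statuses = {h: s for h, s in statuses.items() if h in hospitals}
        result[package] = statuses
    return result

full = {
    "PHARMACY": {"Boston": "modified", "Dayton": "standard", "Tampa": "modified"},
    "LAB": {"Boston": "standard", "Dayton": "standard", "Tampa": "unknown"},
}
view = filter_matrix(full, package_query="pharm", hospitals={"Boston", "Tampa"})
# view == {"PHARMACY": {"Boston": "modified", "Tampa": "modified"}}
```

The same full dataset stays available underneath; the design change was about letting users carve out the slice they are actually responsible for, instead of confronting all of it at once.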
Our team learned a lot about assumptions throughout this particular feature's usability testing and iteration loop.
The number of user interviews, data points analyzed, persona iterations, wireframes, and prototypes made during the course of my iterations on this project over 2 years still boggles my mind.
Throughout my development of this feature set, I was also building a design department, crafting an interview process, mentoring 3 junior designers, and working on a proof of concept for the Department of Defense.
I've never worked on anything of this scale, application-wise, as well as potential earnings-wise. This project was big for my company and the award eventually led to an explosion of company growth adding nearly 80 people and even allowing us to move into a newer space. I am proud to have contributed to this.
I previously worked as a front-end engineer, so I had a good bit of technical knowledge, but I quickly learned this was out of my depth. I initially struggled with this, but once I was able to bring in the developers and include them in my process, we started firing on all cylinders. There is something truly magical about actual collaboration taking place throughout a product lifecycle.
Perhaps my biggest fear, though, was failure. This contract meant a lot to my company; it was something they had been working toward for nearly 6 years before it was finally awarded. There were a number of big data issues to climb over, not to mention the cognitive load that some of the views could impose. By breaking this wicked problem down with design thinking and working as a truly collaborative team toward a single goal, we were able to solve it and help the VA provide our nation's veterans with even more outstanding care.
Disclaimer: This is a very small selection of over 150 UI screens, the portion I am able to share. The majority of the detail has been scrubbed from these due to the sensitive nature of this project.