Scaling a product is tough. My design team exploded from 2 to 10 designers rather quickly and eventually settled around 8. The product was growing, and entirely new product verticals were being added that needed to fit into an application experience that already had a large customer at the Department of Veterans Affairs. A UI kit was my first attempt at a quick "get everyone on the same page" solution.
While that got us most of the way there, it was relatively inflexible, and it only accounted for the UI and a small number of UX cases within our product. It didn't cover many of our patterns, and it gave designers very little direction about how to use it.
Around this time, I watched a really great presentation hosted by Figma about the design system at Credit Karma and the kinds of things they were doing to help with alignment in their organization; it addressed many of the issues I was seeing. You can check out that presentation here: In the file: Aligning around a design systems workspace. I based all of our individual file structure on it. It is ideal for thorough documentation, and I saw no need to mess with something that already worked so well. This presentation motivated me to build something I could scale and refine over time with feedback from my team.
With the design team and the product growing as rapidly as they were, I started looking for ways to keep everyone in sync and the product manageable and consistent. It became evident that my designers needed not only tooling to help them produce a consistent user experience, but also guidelines and patterns they could easily describe to engineers and the QA team.
At the time, the business, mostly Training and Client Services, was struggling with the number of minor consistency issues the product had. Material Design is relatively vague in a number of places, and the design team, spread out across the globe, was operating inside these grey areas. The problem the business faced was that things became slightly different throughout the app, and the downstream implications of that compounded rather quickly into a fairly large amount of design debt.
We, as a company, had made the decision that we didn't want strict approval processes. Product verticals were able to move relatively independently in order to promote development speed, but it eventually led to an accumulation of 500 UI bugs eating away at a polished user experience.
After a meeting with the CEO, VP of Application Development, VP of Architecture, and the Principal Front-end Engineer, I was able to convince them that I needed to focus less on being an individual contributor and more on DesignOps. We needed a design system made up of light processes for validation, approval, and quality checks, plus a UI kit that encouraged the flexibility they wanted the product verticals to have, but that operated within a strict set of guidelines focused on accessibility, flexibility, and scalability.
I worked as the sole contributor to this design system to construct a team, processes, and tools that would allow us to operate efficiently inside an Agile development shop. My designers, front-end engineers, and quality assurance team were my target audience and I was off to develop a system that would remove ambiguity from design and promote rapid design work.
I based my system off Material Design, Fluent Design, Credit Karma's documentation and process frame, and the book Expressive Design Systems by Yesenia Perez-Cruz.
Material is an adaptable system of guidelines, components, and tools that support the best practices of user interface design.
Our product already had a bit of a UI kit based on Material Design guidelines, and we were exploring other options like it at the request of business leadership. This led to some large conversations spanning Design, Engineering, Architecture, Client Services, and Leadership. From these conversations, we discovered that our current approach to design didn't offer the flexibility we needed to support our product growth. We weren't ready to budget for a large-scale swap from Material Design to something like Fluent Design, but we wanted to build a design system that we could scale in those directions and still feel cohesive.
I had to make sure my system was more than just a UI kit. I needed research, decision support, audits, interactions, patterns, and process all to be accounted for. To get this started, though, I needed buy-in from my design team. I could build the best system in the world, but if it didn't solve their needs, they'd simply ignore it. This would be a key disruptor to how they worked every day. It had to be comfortable, introduce no burden to their workflow, and add no overhead to approval.
Throughout my first iteration, I spent a great deal of time on interviews with the Design, Engineering, Quality, and Training teams.
Oddly, I'm not sure that interviews are always the best research method for solving problems. I knew in this case, though, that they were going to be my primary source for defining my problem. I needed the teams to tell me what Design, and particularly my design system, could do to help them be more efficient in their work.
I'm going to start with the Training team, because they brought up a few things I had never thought about. The training team consisted of between 3 and 5 people. They originally were not on my list of users, but questions kept popping up from them, and I realized I had completely neglected an entire channel of the user experience. I'm really big on cross-channel user experience as part of an organization's strategy, so missing this surprised me. Their key issue mostly came down to content strategy: there was no documentation on what we were calling things, which led to a fair number of inconsistencies in the UI. The trainers found themselves confused a fair number of times, and it led to a lower-quality user manual and, more importantly, a poorer customer experience.
They requested we create a component glossary that identified each component, all its variations, its official name, and its last updated date. This allowed them to standardize their training manuals and also gave them a key piece of information about when they needed to account for changes.
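To make the shape of that glossary concrete, here is a minimal sketch in TypeScript of how glossary records could be derived from a flat list of component names and update dates. The "Component / Variation" naming and the `GlossaryEntry` fields are assumptions for illustration; the real glossary's fields and tooling may have differed.

```typescript
// Hypothetical sketch: deriving glossary entries from component metadata.
// Assumes components are named "Official Name / Variation" in Figma.

interface GlossaryEntry {
  officialName: string;
  variations: string[];
  lastUpdated: string; // ISO date of the most recent change to any variation
}

function buildGlossary(
  components: { name: string; updatedAt: string }[]
): GlossaryEntry[] {
  const byName = new Map<string, GlossaryEntry>();
  for (const c of components) {
    // "Button / Outlined" -> official name "Button", variation "Outlined"
    const [official, ...rest] = c.name.split("/").map((s) => s.trim());
    const entry = byName.get(official) ?? {
      officialName: official,
      variations: [],
      lastUpdated: c.updatedAt,
    };
    const variation = rest.join(" / ") || "Default";
    if (!entry.variations.includes(variation)) entry.variations.push(variation);
    // Track the latest update so trainers know when manuals need review
    if (c.updatedAt > entry.lastUpdated) entry.lastUpdated = c.updatedAt;
    byName.set(official, entry);
  }
  return [...byName.values()];
}
```

A script like this could feed a table the Training team reads, grouping every variation under one official name with a single "last updated" signal.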
For the Design team interviews, I had prepared a list of questions I wanted to hit throughout the conversation. I try to drive my interviews as a conversation and not in a Q/A format. I believe Q/A formats can sometimes feel like an interrogation and leave no room for elaboration and exploration. I had a number of points I wanted to hit with all 8 designers.
I was most curious about how they wanted to work. We had started to hit a nice stride as a design team and I didn't want to interrupt that. There were consistency issues though and it was causing bugs. My primary concern was building a UI kit and supporting processes they would use.
All in all, this fully defined how I approached the whole project. As I worked, I continued to interview them. They were the primary consumers. If it wasn't easy to use, or better yet, enjoyable to use, they wouldn't use it, and the UI would continue to average 50 "design" bugs in each one-month release.
Perhaps the most resounding agreement amongst the design team was that they didn't want the component naming rules to force too deep a nesting. They wanted to be able to find a component quickly with a keyword search, or with as few clicks through a nested menu as possible. Because of this, I knew I would need a consistent content strategy that let me easily add keywords to the description field in Figma so designers could find components quickly.
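One way to picture that content strategy is a small helper that flattens a nested component path into the keyword string pasted into Figma's description field, so description search matches more than the literal name. This is a sketch under assumptions: the synonym map and example names are illustrative, not the actual vocabulary we used.

```typescript
// Hypothetical sketch: generating search keywords for a component's
// Figma description field from its nested path, plus a synonym map
// (illustrative entries only) so designers can search by the words
// they actually reach for.

const SYNONYMS: Record<string, string[]> = {
  button: ["cta", "action"],
  chip: ["tag", "pill"],
};

function descriptionKeywords(componentPath: string): string {
  // "Inputs / Chip / Selected" -> "inputs chip tag pill selected"
  const parts = componentPath
    .toLowerCase()
    .split("/")
    .map((p) => p.trim())
    .filter(Boolean);
  const expanded = parts.flatMap((p) => [p, ...(SYNONYMS[p] ?? [])]);
  return [...new Set(expanded)].join(" ");
}
```

The design choice here is that the nesting can stay shallow for browsing while search coverage comes from the description, so deep menu trees never become the only way to find a component.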
The second most impactful insight the designers provided was that their agile teams were on their own individual sprint cycles and needed more granular control over component updates and how those propagated throughout their design files. One vertical being ready to make an update didn't mean all verticals were ready for the same update. This led to the file structure you see in the image below, as well as a process for introducing component updates with the Product Owners and the VP of Application Development.
The Engineers and Testers were asked a simple question that led to a specific change in how we worked: "How can Design give you what you need, so that we can get out of your way and let you do what you do?" They needed components where they could easily see specs not only for the whole component, but also for its individual sub-element components.
All the 4 Engineers and 6 Testers I talked to cared about was getting exact information. They wanted to rely on their designers to use the components correctly; Engineers wanted to copy the styles straight from Figma with confidence, while Testers wanted to be able to easily compare implemented components to documented components in Figma. This led to two large changes from my previous approach to UI kits. First, we included a documentation page where we redlined components.
The redlines included all the correct sizing, but also the color properties both Engineers and Testers needed to properly evaluate color.
Our documentation also needed to account for interaction states and what the implementation of those properties looked like. The interviews with Engineering and Testing also added a sub-element architecture to my components. I used Atomic Design principles and user testing with our front-end engineers to get to the correct level of granularity. Throughout the UI kit, I leveraged both a ".sub-element /" naming convention and an individual Elements file to help the developers identify reuse and dependencies across files. They in turn used this identifier to construct their Angular templates in a more consumable fashion, which led to greater adoption of code reuse across product verticals.
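The ".sub-element /" convention described above can be sketched as a simple audit helper: anything whose name begins with "." is a shared sub-element meant to be consumed inside other components rather than placed directly. The function and example names below are hypothetical illustrations of the convention, not tooling from the actual project.

```typescript
// Hypothetical sketch: splitting a component list by the ".sub-element /"
// naming convention, so a reuse/dependency report can be generated for
// engineers mapping Figma components onto Angular templates.

function splitByConvention(names: string[]): {
  publicComponents: string[]; // safe to place directly in designs
  subElements: string[];      // shared building blocks, consumed by others
} {
  const publicComponents: string[] = [];
  const subElements: string[] = [];
  for (const name of names) {
    // A leading "." marks a sub-element, e.g. ".icon / trailing"
    (name.trim().startsWith(".") ? subElements : publicComponents).push(name);
  }
  return { publicComponents, subElements };
}
```

Because the marker lives in the name itself, both designers scanning the layer list and engineers scanning a component inventory can tell at a glance which pieces are shared dependencies.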
As I completed components inside the system, I added them to a stress-test environment where the other designers could test them.
The components were tested as early and as often as I could manage. Because of the way the files were structured, I was able to pull in downstream changes at a granular level. Changes don't always happen across all components at once, and I wanted to mirror that in my UI kit. This meant I could make larger changes along the pipeline and then test their impact on small chunks of the design system. I had to make sure the system was usable and that the designers would actually use it. If constraints weren't properly set, auto-layout was done incorrectly, or grids were messed up, then time would just be spent building something no one would use, and we'd be back to square one.
While my design team would be the primary user of the full design system, Engineering would also be a big consumer. Making sure they were able to get the details they needed from the components, i.e. size, color, and elevation, was paramount to its success. We had gone through a large front-end technology migration previously, and it left a significant amount of debt in the implementation of the UI as designed. The old UI kit didn't give the engineers what they needed, and I didn't have a design system in place with the proper review steps to avoid ending up with a large pile of "design" bugs.
Testing and validating what I was doing with the UI kit helped me develop downstream processes that still let our teams remain agile, but record fewer UI bugs. I could watch the front-end engineers seek out a component, consume it, and then validate their implementation. As testing went on, I was able to iterate and build components that were easier to code.
Throughout this stress testing, I gathered feedback and surveyed my team as much as possible about naming and how they wanted to work with component state. There were a lot of opinions about naming conventions, but none nearly as strong as the opinions about organization. It was interesting working through a couple of different organization systems at the same time and iterating on them.
In the end, we settled on having light and dark stickersheets organized by component / use case / state. It works quite well, and through the description search built into Figma, we can very quickly find components to swap between.
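That "component / use case / state" organization lends itself to a simple consistency check: parse each sticker name into its three parts and flag any component/use-case pair that is missing a state. The sketch below assumes a Material-style list of interaction states; both the state list and the parsing are illustrative, not the project's actual tooling.

```typescript
// Hypothetical sketch: validating stickersheet coverage against the
// "component / use case / state" naming convention.

interface StickerName {
  component: string;
  useCase: string;
  state: string;
}

function parseStickerName(name: string): StickerName | null {
  const parts = name.split("/").map((p) => p.trim());
  if (parts.length !== 3) return null; // doesn't follow the convention
  const [component, useCase, state] = parts;
  return { component, useCase, state };
}

// Assumed state set, based on common Material interaction states.
const REQUIRED_STATES = ["enabled", "hover", "focused", "pressed", "disabled"];

function missingStates(
  names: string[],
  component: string,
  useCase: string
): string[] {
  const present = new Set(
    names
      .map(parseStickerName)
      .filter((n): n is StickerName => n !== null)
      .filter((n) => n.component === component && n.useCase === useCase)
      .map((n) => n.state.toLowerCase())
  );
  return REQUIRED_STATES.filter((s) => !present.has(s));
}
```

A check like this is what makes the stickersheet organization pay off: the naming convention doubles as machine-readable structure, so gaps in state coverage can be caught before they become UI bugs.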
The designers were going to be forced to recreate some screens using the new UI kit components. This allowed us to test that the new components were flexible enough not to break existing views, so we wouldn't lose our current-state source-of-truth documentation, but also that they could be used to create something that felt fresh and new.
All of this testing definitely surfaced some issues. In some cases, I had taken Atomic Design principles too far, and the engineers struggled to get Angular components and design components in Figma to line up in a consumable fashion. In other cases, I didn't take them far enough, and designers were losing their overrides when swapping components based on interaction state.
I learned a great deal about building a system that's not only flexible to design with, but also one that is flexible in how changes roll through the system.
DesignOps is amazing.
DesignOps really brought my creativity alive. As someone who has been in this field for almost 10 years, I've had my share of struggles with process and UI kits. I always wanted the tools to get out of the way and let me do my job.
Growing a design department from the ground up is tough. Designing a product with 8 verticals and 100+ developers working on it at any given time really puts pressure on design.
Designers, engineers, and testers all consume the same documentation to make the best product and the best UI they can. There are a lot of potential holes when designing purely from a UI kit. The process that everyone seems to hate really boosts the ability to build a strong user interface and user experience, and it doesn't have to be heavy or burdensome.
If you know your team will require some flexibility in component architecture, you can build that into your UI kit; then, when it's time to check in on it, it's easy to scroll down the layer list and look for broken components. A broken component is an ideal place to have a conversation: broken components are where consistency issues come into play and where you'll see UX metrics take a potential hit or a UI bug crop up. When we started, we had an enormous number of broken components, but now we have a process for new component introduction and validation that leads to better documentation, better implementation, and a better product.
The training piece was a huge surprise for me. It allowed me to integrate a new channel into my design system and provide a better overall customer experience, which led to a decreased UAT environment acceptance time.
This was a huge undertaking. The bulk of the manual work was definitely in the UI kit and I was terrified throughout the whole process. However, I was talking to my designers, engineers, and testers all the time. I took their feedback seriously and at the end of the day, I made something useful that increased productivity and led to a much stronger product and a high functioning design team.
Disclaimer: Of over 150 UI screens, this is the very small selection I am able to share. The majority of the detail has been scrubbed due to the sensitive nature of this project.