Symantec's Mountain View headquarters

Symantec
A look at two high-impact projects

Role Overview
Symantec is the largest pure-play cybersecurity company in the world. The business unit I was part of, Network Protection Products, represents dozens of products under the Symantec Enterprise umbrella. The products under NPP are dedicated to policy, analysis, and investigation.
I was hired as the second designer in the Utah office to assist the UX Design Manager for NPP, David Monson, in building a strong Design discipline for our division from the ground up.
At Symantec, I was lucky enough to stretch in ways that were outside of my comfort zone.
I was forced to reckon with extreme constraints like:
Esoteric, highly complex concepts and taxonomies
Analytics on petabytes worth of highly sensitive data
Powerful customers like government agencies, militaries, banks, etc.
Navigating bureaucracy in a large, changing organization
Bloated product portfolio from acquisitions and products designed by engineers
I also faced problems more often associated with start-ups or SMBs:
Educating nondesigners on the value & role of design
Developing Design process and strategy from the ground up
Helping Product Mgmt determine the right success metrics and product strategy
Bringing developers together to improve front-end implementations
Maximizing efficiency through tooling and clever shortcuts
Below, I'll highlight the two most interesting initiatives I was part of:
Creating foundations for a fully-realized design system
Bringing a flagship product, Security Analytics, into a new and improved paradigm
Discovery
Problems encountered:
Experiences are wildly inconsistent between products, despite a business strategy based on creating complementary product suites to be sold together
The Design organization is under-resourced, meaning a design system will be difficult to get off the ground but vital to mature
I recorded some important known unknowns to start:
Is there something similar to a design system in any part of our organization?
If there is, does it meet the needs of us and our users? How might we improve it?
What are the right tactical objectives and how might we ensure those align with the long-term strategic goals for the project?
I needed to seek out the right people to get answers.

Example of "Stratocumulus" PDF style guide
Learning from others' failure
I learned from a UX designer in another business unit that their organization had made many separate attempts to create a single source of truth over the previous couple of years.
These efforts included:
Three internally developed PDF style guides for UI
Several style guides from acquired companies
Abandoned front-end UI toolkit
Why had all of these previous efforts failed despite good intentions and talented contributors?
I spoke with anyone who might have useful info to learn more.
Identified factors:
Lack of buy-in from upper management
The UX leader for the other division had no background in Design
Headcount was too low for desired scope
Failed to consider variability of product and user needs
Stealing like a designer
We didn't have the person-hours to create a fully custom design system from scratch.
Upper management wasn't motivated to fund the kind of design system one would see at a more design-mature company.
This was a reality we sought to change over time, but one we had to work within in the meantime.
So I used resources like designsystemsrepo.com and designbetter.co's Design Systems Handbook to scour every system and case study I could find, evaluating each on the following criteria:
Completeness of UI library (esp. pertaining to needs for data-heavy enterprise apps)
Resources for developers
Clear guidance and principles for UI elements
Flexibility, rapid customization, and rapid deployment
After evaluation, I recommended starting with Google's Material Design which, serendipitously, had just been overhauled to improve flexibility and data treatment and also released additional tools for designers and developers.
As Material Design was created primarily with consumer applications in mind, I cautioned users of the system to look out for gaps in enterprise use cases and pointed them to IBM's design language and to testing when such gaps arose.

Pages from Material Design

Roadmap to V1 of the design system
Sharper definition
We had more autonomy than the other team, putting us in a position to achieve better outcomes.
I now had enough information to brainstorm tactical objectives and strategic goals.
Near term:
Use Material Design to build out customized UI elements
Organize and distribute the system using InVision's Design System Manager
Iterate on processes for constant curation, ensuring a living ecosystem
Involve product teams to validate direction
Long term:
Prove value to business-side decision makers to gain support and resources
Establish regular reviews and a shared understanding of how to apply the system
Collaborate with Engineering to bridge the gap between UI illustrations and code
Foster shared understanding of full product portfolio company-wide
I created a roadmap to guide the first version and got to work.
Design system design
I collaborated with the system's intended users on styling, and everyone shared examples from our products so we could find overlaps and differences.
I created the elements in Sketch and uploaded them to InVision's DSM. For most items, I included short guidance on how the pieces were intended to be used, as well as links to the Material Design website for more detailed guidance.
On the front page, I provided instructions on what to do if anything in the system wasn't the best solution for the product experience.
The resource we compiled:
Allows us to move our attention away from low-impact, duplicative UI work to high-impact activities like user research, testing, collaboration with product teams, and design experimentation
Makes onboarding new designers far less painful and time-consuming
Ensures consistency across products with the best solutions we can find
Helps everyone learn from problems that other designers run into

Where we are today
The silos we discovered while trying to standardize design across the company were disturbing.
Acting as separate companies is antithetical to the vision of a complementary product suite we were working towards, and it discourages valuable interaction between teams.
A fantastic designer and leader, Dale Brewer, is now leading Design for Symantec Enterprise. Dale agreed with me and my team about the shortcomings of previous efforts and lauded our approach.
My team joined Dale to run the first in a series of workshops, kicking off a multi-faceted standardization push across Symantec Enterprise.
By carefully choosing our timing and who we presented with our case, we gained the active support of key people in upper management.
We're also in talks with engineering leadership and front-enders to understand and eventually standardize frameworks, technologies, methods, etc.
We plan to collaborate on the creation of code UI kits and tokenization of simple elements.
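As a rough sketch of what tokenizing simple elements could look like, here is a minimal, hypothetical example in TypeScript. The token names and hex values are placeholders for illustration, not our actual palette or tooling.

```typescript
// Hypothetical design tokens: primitive values live in one place,
// and semantic tokens reference them, so a theme or brand change
// only touches a single layer.

// Primitive tokens (placeholder values)
export const color = {
  surfaceDark: "#1a1a24",                  // dark-theme background
  textOnDark: "rgba(255, 255, 255, 0.87)", // high-emphasis text
  accent: "#5f6cff",                       // subtle indigo accent
} as const;

export const spacing = {
  xs: "4px",
  sm: "8px",
  md: "16px",
} as const;

// Semantic tokens for one simple element (a button on the dark theme)
export const buttonDark = {
  background: color.surfaceDark,
  text: color.textOnDark,
  paddingX: spacing.md,
  paddingY: spacing.sm,
} as const;
```

Exporting tokens like these from a shared package is one way a code UI kit could consume the same values the Sketch library uses, keeping design and implementation from drifting apart.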

Background
Security Analytics (SA) enables the world's best security analysts to capture and analyze every packet on a network to find and triage the newest, most sophisticated attacks. I worked on many products while at Symantec, but I made Security Analytics my top priority because it had critical problems and represented a significant portion of revenue.

Examples of other pages in Security Analytics
Generative Research
Early research included a site visit and conversations with Sales, Support, Product, Engineering, and internal users.
This helped flesh out a few starting points:
The UI was visually dated and unattractive, which failed to communicate the sophistication of our technology and gave competitors an advantage
Data visualization was limited to the defaults of a mediocre third-party technology
SA had no dark theme despite being used in a dark environment, causing eye strain
Workflows were poorly thought out because of feature-request-driven development
No in-app onboarding or novice user support, only in-person training or modules
My approach:
Understand and validate each problem to prioritize work
Ask the right questions, find reliable answers, and encourage new insight
Decide on appropriate methodology and depth required to move forward


Diagram of one user's setup
Overcoming distance between us and our users
Getting access to our actual users was challenging because:
Security professionals are wary of providing data freely
Users and customers are often different people
Historically, designers had trouble setting expectations and coordinating with Sales, which negatively affected contract negotiations
Some companies don't or cannot send data to us
The way we overcame this was to:
Recruit highly motivated sponsor users through Sales, Support, surveys, and site visits
Set clear expectations for/during interactions and coordinate with the entire team
Define success metrics, get as much clean data as possible, and strive to learn about the stories behind the data

Report of critical survey findings
Survey
Interviews helped us group our users into addressable buckets, but we still didn't know details like the ratio, overlap, or differences between them. We also needed a deeper understanding of the areas of opportunity.
I created a brief with clear goals and lines of questioning to keep us on track as we iterated on the survey design. With the help of our sales team and cash incentives, we distributed it to users.
We estimated that 20 responses would be an attainable minimum level of engagement to get usable quantitative and qualitative data.
With 24 responses we were able to map out essential parts of the experience:
Breakdown of user backgrounds
Differences in how Incident Responders and Threat Hunters begin their workflows
Workflows within the product; the product's place in the overall security workflow
Why multi-dimensional data visualizations could be useful
How to make dashboards more useful
We also recruited some respondents as sponsor users to provide feedback in the future.

Aftermath of an ideation session
Translating information into ideas
Everyone involved debriefed on the survey results to adapt to changes in understanding, evolve ideas for solutions, and prioritize the work ahead.
A gameplan arose from ideation and planning sessions:
Redesign the entire UI and rewrite the front-end code
Iterate on and test the alerts section to unlock its potential
Observe workflows to understand how to design around the investigation methods of our users
Improve data visualizations
First on the list: new light and dark themes for the product and a visual direction moving forward.
Synthesizing a new aesthetic
Symantec had no existing products with dark modes to look to for inspiration, so I worked closely with my mentor, David, to create a visual direction for our products going forward.
We put together a mood board with examples of well executed dark themes, UI from science fiction, and images of sophisticated, powerful technology from reality or fiction.
We iterated on structure and layers of colors:

I tried purely grayscale and heavily colored schemes as well as different gradient treatments.
The navy iterations were favored early on as they were based on Symantec's branding and stood out among competitors' dark themes. However, we found that the navy presented challenges with contrast and wasn't dark enough for the environments it would be used in.
With these insights in mind, we moved to a darker and less saturated scheme that still had a very subtle indigo hue to set it apart aesthetically.
We also put our data visualization color sets through accessibility testing:

I leaned heavily on IBM's data visualization resources, which link to more than 40 external studies and articles alongside the principles they explain.
After choosing colors, I ordered the divergent color sets and used the Sketch plugin Stark to check for color-blindness issues and contrast. David gets credit for the methodology behind the progressive color sets.
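To give a sense of what the contrast half of that check involves, below is an illustrative TypeScript sketch of the WCAG 2.x relative-luminance and contrast-ratio formulas that tools like Stark automate. The hex values in the example are placeholders, not colors from our palette.

```typescript
// WCAG 2.x contrast check, sketched by hand for illustration.

// Convert an 8-bit sRGB channel to its linear-light value.
function srgbChannelToLinear(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#rrggbb" color.
function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) =>
    srgbChannelToLinear(parseInt(hex.slice(i, i + 2), 16))
  );
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio ranges from 1:1 to 21:1; WCAG AA asks for 4.5:1
// for normal text and 3:1 for large text or UI components.
function contrastRatio(hexA: string, hexB: string): number {
  const [lighter, darker] = [relativeLuminance(hexA), relativeLuminance(hexB)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Example with placeholder colors: light text on a dark surface.
console.log(contrastRatio("#e0e0e0", "#1a1a24").toFixed(2)); // roughly 13:1, passes AA
```

Color-blindness simulation is a separate step that Stark handles in-app; the ratio check above is just the arithmetic behind the contrast portion of the report.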
Alerts redesign
The first thing I tackled was organizing the crowded data cells by logical grouping, eye-scan patterns, and importance.
Luckily I had experience with this from an earlier project:


Guided by my mistakes from the earlier project, I had a much easier time fixing the cells.
I watched and discussed real-world alerts workflows with one of our most knowledgeable internal users to figure out what was breaking down.
These three tabs seemed to have distinct, important purposes at first glance:



I discovered this wasn't necessarily true: there were too many clicks to achieve user goals, functionality was unclear, and the tabs weren't organized around actual workflows.
I iterated toward a single-page design containing all of the tabs' functionality, plus more:

We added in-context alert tuning to aid in reducing false positives on the fly.
We are currently evaluating the feasibility of providing real-time previews or estimates based on configuration changes.
I prototyped an example of a complex workflow to test the new ideas and validate the concept. Early responses have been mostly positive but have also led to changes in other parts of the product; for example, customizable, high-level breakdowns of alerts were designed for the dashboard page as a result.

Session-based paradigm
Introducing a session-based model into the product is the largest undertaking of any UX project we identified.
We held ideation sessions to gather a diverse range of ideas using sketches and whiteboards, both physical and digital. These sessions helped prove the value of the overall concept.
After several meetings with subject matter experts, I came to realize I didn't have the depth of empathy for our users to know what form this solution should take. I worked with the team to schedule a site visit with a receptive customer. I explained the kind of contextual inquiry I would be conducting and the questions I would be asking.

Reflection
At Symantec I improved in the following areas:
Overcoming obstacles to practicing good design.
Communicating effectively and running cross-functional meetings.
Applying a Lean UX approach to both speed up work and produce better outcomes.
I want to continue improving on the items above, as well as:
The more nuanced aspects of applying methods effectively.
Testing more regularly with true end users.
Working with product teams to prioritize design work.
A special thanks to my mentor, David Monson, for trusting and teaching me.