AGENT OS

Self-Install

A B2B/B2C internal tool designed for Spectrum agents to help residential customers with the self-installation process of their equipment.

Role

UX Designer, Facilitator

Tools

Figma, Axure, Miro, Zeplin

Timeline

11 days

Methodology

Modified GV Design Sprint

Modified Design Sprint? 

The team anticipated solving for multiple scenarios within the self-install scope, based on research data obtained a year earlier, when self-install first came into the picture. The scope included tackling the processes of activation, terms & conditions, and fulfillment. To allow more time for prototyping multiple scenarios, we decided to run an experiment and extend the sprint to 2 weeks, or 11 working days.

Learn, Map & Target

MONDAY

Learn about users, create a map of the problem, choose 1 target.

Sketch

TUESDAY

Sketch out possible solutions to the problem.

Decide

WEDNESDAY

Deciders pick the best solution.

Prototype

THURSDAY

Build a realistic prototype by Thursday of the following week.

Test

THURSDAY next week

Test prototypes with Spectrum call agents.

Retrospect

FRIDAY next week

Discuss takeaways from testing + next steps for a deep-dive session.

Team Structure

The organization comprised 3 omni teams solving various problems across a series of design sprints. All of the problems related to building a new experimental tool for agents called Agent OS, intended as a replacement for an existing internal tool called ‘Gateway’. I was part of omni team 2, which focused on tackling self-installation and repair issues.

FACILITATOR

UX STRATEGIST

SR. UI DESIGNER

UX DESIGNER

SOFTWARE ENGINEERS

CONTENT DESIGNER

UX RESEARCHERS

STAKEHOLDERS

3 DECIDERS

Problem Space

Spectrum self-install agents receive a high volume of calls from residential customers every year. This number skyrocketed during the pandemic, when self-install was the only available option under social distancing to ensure the safety of our customers & technicians. The self-installation process in 2020 drove 4.4M calls & 1.5M truck rolls, and self-install calls averaged 28% longer than the overall AHT (average handle time).

Spectrum agents also repeatedly reported multiple issues with Gateway, the internal tool used before and during the pandemic. Eventually, it was decided to build a new experimental tool, Agent OS, to replace Gateway. The goal was to solve existing problems, fix system limitations, and add new functionality to allow Spectrum call agents to assist customers better and more efficiently.

Sprint Objective

In 2 years, Spectrum agents will have end-to-end visibility into the customer’s self-installation experience, contextual to where the customer is in the process and where problems are detected. Spectrum agents will have a single tool that predicts the call reason and reduces repeat calls.

Hypothesis

We believe that building a contextual tool that proactively detects the customer’s self-installation experience/issues will increase FCR (first call resolution), reduce truck rolls, reduce AHT (average handle time), and increase customer satisfaction. AHT & FCR are the key metrics that measure the company’s success.

Day 1

GOALS

• Understand the problem and users’ pain points
• Pick an important area to focus on
• Start at the end by mapping a possible end-to-end user scenario  

Research Readout

In early 2020, the research team ran an initial study to identify issues with Gateway, the internal tool agents relied on. Seven months later, a second study was conducted as part of the Agent OS initiative. For this sprint, the team consolidated data from both studies to address recurring self-install challenges. The first two hours of the sprint were dedicated to a comprehensive research readout, during which researchers presented findings from both studies alongside Engineering Tech Support (ETS) insights. This provided a shared foundation of knowledge to guide problem-solving during the sprint.

Agent Interviews

2 CSG & 2 ICOMs in Nov & Dec
2 CSG & 2 ICOMs in Feb

Lead Interviews

2 leads (both CSG) in Feb
2 CSG & 2 ICOM leads in Nov

Agent Surveys

118 agents from CSG & ICOMs on the self-install tab

Customer Interviews

6 customers who called Spectrum regarding a self-install in the last 90 days

ETS Call Observation

50 calls from ICOMs & CSG

Data Discovery

SME interviews, CSAT data
BPI self-install statistics

Call Drivers

Evaluated 209,766 total calls from 11/1/20 through 11/15/20.

75%

Activation

15%

Installation

5%

Post Connect

5%

Professional Install & Missing Equipment

Key Insights

Biggest issues resolving a self-install call.

Login Info

Getting customers through the process of creating a user ID/password, then logging into the portal.

T&C

Non-acceptance of terms and conditions creates roadblocks in the equipment activation process.

No Active Taps

Self-install is being sold to locations with no active taps, as there’s no related info in the current system.

Intimidated

Convincing customers to self-install when they are intimidated upon receiving a box full of cables and equipment.

No Checklist

There is no required checklist of steps that tracks completion, so agents can’t tell where the customer is in the process or what still needs to be done.

Liability

Spectrum agents found a loophole: trying to activate devices even when customers had not accepted the terms & conditions.

User Map: Ideal Experience + Target

We mapped the customer journey from ordering to device activation, highlighting customers, agents, and issues along the way. A key challenge was separating self-install from repair, since they often overlap. After alignment, we defined the endpoint at device activation—with anything beyond considered repair. For the sprint, we focused on two targets: activation as the primary goal (the critical success moment), and all preceding steps as the secondary goal, since they directly influence whether activation succeeds.
👉 This mapping exercise gave the team a clear scope and shared focus, enabling us to prioritize the right problems and design solutions with measurable impact.

SME Interviews

The next step for our team was to speak with 4 subject matter experts to close any remaining knowledge gaps.

Tony B.

Data Analytics

Michael L.

Customer Compliance

Rene M.

Self-Install Call Center Supervisor

Shailu J.

Portal Auto Activation Demo

SME Interview Highlights

The most frequent reason for scheduling a truck roll is a house that has no outlets and has never had service.

Internet activation, connectivity, and modem offline are the 3 main reasons for calling in. Additional problems include equipment appearing in CSG but not in Gateway, and CSG hierarchy issues.

If he had a magic wand, Rene would improve the troubleshooting trees and enable the ability to quickly test outlets.

When the customer acknowledges with a check box in the portal, the system collects the IP address and location as a fingerprint.

Agreements are specific to the line of business a customer purchases from (for example, E911 is only required for phone customers).

Agents currently try to activate equipment without terms and conditions acceptance, which is a huge liability.

To activate equipment in the portal, customers must confirm their account, verify their identity, create a username and password, sign in and then review and accept the agreements.

Agents are not allowed to create accounts for customers. If customers struggle with this, agents can use Agreement Initiator for agreements and activate the equipment themselves.

In the case of a customer with physical or other limitations that prevent them from agreeing online, a friend or relative would need to bring technology to them and assist them in agreeing.

Day 2

GOALS

• Competitive analysis
• Identifying the critical screen
• Sketch out concepts & possible solutions

Lightning Demos

Each participant was given 30 minutes to do competitive research. Since competitors’ internal tools are usually not publicly available, the intent was to explore what other companies in our segment do well, and what doesn’t work as well. After the research, each participant presented their findings in breakout rooms. Following some discussion, each team chose 3 options to bring back to the bigger group. My choice was AT&T, as I had recently placed an order with them and wanted to share my experience and point out what I found useful.

Communication

As a customer, I’d rate communication with AT&T as 10 out of 10. Right after placing my order, I received an email with a clear summary of the order and charges, followed by details about the upcoming installation. Since I forgot to register my user ID right away, I also received a few helpful reminders. What stood out to me was how AT&T handles legal consents and terms & conditions.

At Spectrum, agents often cannot activate devices because customers haven’t accepted all the required agreements—this is one of the main reasons for support calls, and it takes significant time and effort to identify that the issue is incomplete consent. AT&T, on the other hand, makes it impossible to proceed with installation unless customers have agreed to all terms in advance, sending multiple reminders before the appointment.

Smart Home Manager 

I took screenshots of AT&T’s app to gain inspiration and a clearer understanding of how user accounts are created — the steps involved, when customers are required to accept terms and conditions, and where all legal documents are located within the experience.

Crazy 8s

For the Crazy 8s exercise, each participant had time to individually sketch a variation of one of their best ideas, followed by creating a three-panel solution sketch. We then moved into breakout rooms to present our sketches and select a few to bring back to the entire group for the Art Museum activity, which took place on Day 3.

Day 3

GOALS

• Turn ideas & concepts into a testable solution
• Design explorations

Solution Sketches

During the Art Museum activity, each team presented two to three solution sketches to the larger group. Afterward, the deciders who joined the session voted on the solutions they believed would work best.

Prioritization Matrix

After selecting a solution sketch, the team realized we were trying to solve too many problems at once. We agreed to narrow our focus by prioritizing the challenges we wanted to address in this sprint. To build consensus, we used a prioritization matrix to collaboratively identify the most important problems and distinguish must-haves from nice-to-haves.

Agents must memorize or search in CoPilot for the correct codes to manually add customer-owned equipment into CSG.

Agents need to understand equipment capabilities to determine whether they support the purchased services.

Agents lack visibility into customers’ previous problems and resolutions.

It’s difficult for agents to determine if a device is quarantined.

Agents need to know how many active charter taps exist at a given location.

Some orders are submitted with equipment assigned to the wrong taps or hierarchy, requiring agents to correct them in CSG before activation.

Day 4 – 9

GOALS

• Creating a realistic prototype

From Sketches To Hi-Fi Designs  ✨

High-fidelity designs of the most critical screens, collaboratively designed by the team.

Explorations: Devices Received Tab

The initial design shown below was essentially a repaint of the existing Gateway tool that agents had been using. It represented the UI designer’s first attempt to align the look and feel with Agent OS. However, it did not address the key problems uncovered during the sprint. At this stage, I stepped in to lead the design effort, creating a data-informed experience grounded in both agent and customer needs.

Explorations: Top Nav

As a team, we conducted numerous design explorations of the top navigation to address the existing problems. One of the key challenges was refining the information architecture, which determined both the number of tabs and their order. Another challenge was designing a notification system within each tab to clearly communicate the status of every step in the process.

Prototype No. 1

SCENARIO 1: Order Missing

• John Mclane calls and says he didn’t receive all of the equipment he ordered
• He thinks he’s missing a Set Top Box
• Determine what the situation is
• Order a replacement, if needed

Prototype No. 2

SCENARIO 2: Activation

• John Mclane calls and says that he’s set up all of his equipment, but it’s not working
• Determine what the situation is
• Take any action needed

Prototype No. 3

SCENARIO 3: Equipment Not Online

• John Mclane calls and says that he’s set up all of his equipment, but it’s not working
• Determine what the situation is
• Take any action needed

Day 10

GOALS

• Validate assumptions
• Get feedback from real users​​​​​​​

Usability Testing

We conducted one round of moderated usability testing with six agents, carried out remotely via WebEx. Each 45-minute session asked participants to imagine they were assisting a customer who had called in about a self-installation.

GOALS

Shipment Status

Can agents clearly identify and communicate the status of equipment shipments?

Order Status

Can agents clearly identify and communicate the status of orders?

Equipment Status

Can agents clearly identify the status of customers’ equipment?

Terms & Conditions

Can agents clearly identify the status of customers’ acceptance of T&C?

Location History

Can agents determine the history of service to a customer location?

Activation

Are agents able to activate the customers’ equipment during the call?

Affinity Mapping

STEP 1:

During the testing sessions, sprint participants observed and captured notes directly on a Miro board overlaid with the interface outline. All team members then contributed to the affinity mapping exercise by adding virtual post-it notes.

STEP 2:

To synthesize our findings, the sprint team divided into breakout groups. Each group reviewed the observation notes and organized them into thematic clusters, which allowed us to identify patterns in agent behavior and surface the most critical pain points. This collaborative process ensured that all perspectives were considered and directly informed the next iteration of the design.

Day 11

• Discuss feedback and takeaways from user testing
• Discuss design implications & next steps for a deep-dive session

User Testing Takeaways

Activation

• Most Spectrum agents navigated the device list without any issue.
• All agents understood and activated devices without an issue.
• The majority of agents weren’t sure how to manage devices in the swap or replacement workflow.
• The signal strength graph didn’t add value for some call agents.

Bad Outlet 

• Online status was clear for all agents.
• Users liked the detailed and visual instructions.
• Users liked the simple layout with consolidated information.
• Some users were slow to identify the troubleshooting trees, including the toggle and CTA.

Order Screen

• Users easily navigated the order screen.
• Users wanted codes for services.
• Several users missed the second order entirely.
The caret at the top of the order card was misunderstood.
• Include job numbers instead of order numbers.

Shipment Missing

Agents liked the UI – navigation was easy and required minimal scrolling.
• Some were confused by the number of packages vs the number of devices.
• Carrier info would be helpful.
The “wrong/missing device” label was misunderstood.

Design Iterations

The next step for the team was to discuss the design implications as a group, based on what we heard from the agents. We talked through all the insights to build a list of items for a design deep-dive session, where all the issues were addressed.

WHAT WE LEARNED 🚀

Key Takeaways

After everything we uncovered during the sprint, I joked with the team that I felt ready to take a part-time job as a Spectrum technician or agent — full-time even, if design weren’t my first passion. The sentiment was shared across the group: we had gained a much deeper appreciation of the agent and technician experience.

As a designer, this sprint reminded me how essential curiosity is. The answers are often right in front of us; we just need to go out, connect with the people we’re designing for, build empathy, and translate those insights into better experiences. At the time, I was moving into a new apartment during COVID, when self-installation was the only option. My installation failed because there were no active outlets, and a technician had to be sent. That turned out to be the best outcome: I asked questions at every step, observed the process, tested signal devices, and even climbed to the rooftop to locate the right switch and cable. Every pain point I noticed was brought back to the team. Other designers did the same, digging out old bills and inspecting devices, so together we built a far richer understanding of the experience we wanted to improve.

Another key takeaway was about team structure. In a previous sprint, we had four times as many participants, which slowed us down. This time, with fewer people, we were able to streamline processes and make decisions more quickly. As I facilitated the first few days of the sprint, I also learned how crucial time management and risk mitigation are. With an aggressive timeline, it was important to drive the team toward fast decisions and keep momentum, even when discussions dragged on. It was challenging at times, but I learned how to help the group reach compromise under pressure and keep the sprint moving forward.