AGENT OS

Self-Install

A B2B/B2C internal tool designed for Spectrum agents to help residential customers through the equipment self-installation process.

Role

UX Designer, Facilitator

Tools

Figma, Axure, Miro, Zeplin

Timeline

11 days

Methodology

Modified GV Design Sprint

Modified Design Sprint? 

EXPERIMENT

The team anticipated solving for multiple scenarios within the self-install scope, based on research data obtained a year earlier, when self-install first came into the picture. The scope included the processes of activation, terms & conditions, and fulfillment. To allow more time for prototyping multiple scenarios, we decided to run an experiment and extend the sprint to 2 weeks, or 11 work days.

Learn, Map & Target

MONDAY

Learn about users, create a map of the problem, choose 1 target.

Sketch

TUESDAY

Sketch out possible solutions to the problem.

Decide

WEDNESDAY

Deciders pick the best solution.

Prototype

THURSDAY

Build a realistic prototype by Thursday of the following week.

Test

THURSDAY next week

Test prototypes with Spectrum call agents.

Retrospect

FRIDAY next week

Discuss takeaways from testing + next steps for a deep-dive session.

Team Structure

The organization comprised 3 omni teams that were solving various problems in a series of design sprints. All of the problems related to building a new experimental tool for agents called Agent OS, intended as a replacement for an existing internal tool called ‘Gateway’. I was part of omni team 2, which focused on tackling self-installation and repair issues.

FACILITATOR

UX STRATEGIST

SR. UI DESIGNER

UX DESIGNER

SOFTWARE ENGINEERS

CONTENT DESIGNER

UX RESEARCHERS

STAKEHOLDERS

3 DECIDERS

Problem Space

Spectrum self-install agents receive a high volume of calls from residential customers every year. This number skyrocketed during the pandemic, when self-install was the only available option under social distancing, protecting the safety of our customers & technicians. The self-installation process in 2020 drove 4.4M calls & 1.5M truck rolls, and self-install calls averaged 28% longer than the overall AHT (average handle time).

Spectrum agents also repeatedly reported multiple issues with Gateway, the internal tool used before and during the pandemic. Eventually, it was decided to build a new experimental tool, Agent OS, to replace Gateway. The goal was to solve existing problems, fix system limitations, and add new functionality so Spectrum call agents could assist customers better and more efficiently.

Sprint Objective

In 2 years, Spectrum agents will have end-to-end visibility into the customer’s self-installation experience, contextual to where the customer is in the process and where problems are detected. Spectrum agents will have a single tool that predicts the call reason and reduces repeat calls.

Hypothesis

We believe that building a contextual tool that proactively detects the customer’s self-installation experience/issues will increase FCR (first call resolution), reduce truck rolls, reduce AHT (average handle time), and increase customer satisfaction. AHT & FCR are the key metrics that measure the company’s success.

Day 1

GOALS

• Understand the problem and users’ pain points
• Pick an important area to focus on
• Start at the end by mapping a possible end-to-end user scenario  

Research Readout

The first research study was conducted at the beginning of 2020 to uncover issues related to Gateway, the internal tool agents were utilizing at that time. The second study took place 7 months later as part of the Agent OS effort. The research team combined the 2 data sets for this sprint, which targeted in-depth self-install problems. The first 2 hours of the sprint were dedicated to the research readout, where researchers covered the research and ETS findings to help the team understand all the information at hand. The research methodologies are listed below.

Agent Interviews

2 CSG & 2 ICOMs in Nov & Dec
2 CSG & 2 ICOMs in Feb

Lead Interviews

2 leads (both CSG) in Feb
2 CSG & 2 ICOM leads in Nov

Agent Surveys

118 agents from CSG & ICOMs on the self-install tab

Customer Interviews

6 customers that called Spectrum regarding a self-install in the last 90 days

ETS Call Observation

50 calls from ICOMs & CSG

Data Discovery

SME interviews, CSAT data
BPI self-install statistics

Call Drivers

We evaluated 209,766 total calls from 11/1/20 through 11/15/20.

75%

Activation

15%

Installation

5%

Post Connect

5%

Professional Install & Missing Equipment

Key Insights

Biggest issues resolving a self-install call.

Login Info

Getting customers through the process of creating a user ID and password, then logging into the portal.

T&C

Non-acceptance of terms and conditions creates roadblocks in the equipment activation process.

No Active Taps

Self-install is being sold to locations with no active taps, as the current system provides no related info.

Intimidated

Convincing customers to self-install when they are intimidated upon receiving a box full of cables and equipment.

No Checklist

There is no checklist of required steps that tracks completion, so agents can’t tell where the customer is in the process and what still needs to be done.

Liability

Spectrum agents found a loophole: they try to activate devices even when customers have not accepted the terms & conditions.

User Map: Ideal Experience + Target

First, we listed customers, agents, and issues on the left, then drew the ending, with the goal completed, on the right. In between, we created a flowchart showing how we want customers to interact with our product: the ideal customer experience. The team wasn’t sure where the map should end, as self-install & repair are interchangeable processes that tend to overlap depending on the customer’s situation, which made it quite challenging to elicit the steps related only to self-installation. After some discussion, we agreed that the self-install call reason would end after device activation; any problems beyond that point would be regarded as a standard repair call.

The next step was to discuss the customer journey and pick a target to focus on during this sprint. As you can see on the map below, we picked 2 targets: activation of the devices as the primary, and everything preceding it as the secondary.

SME Interviews

The next step for our team was to speak with 4 SMEs to close any remaining knowledge gaps.

Tony B.

Data Analytics

Michael L.

Customer Compliance

Rene M.

Self-Install Call Center Supervisor

Shailu J.

Portal Auto Activation Demo

SME Interview Highlights

The most frequent reason for scheduling a truck roll is a house that has no outlets and has never had service.

Internet activation, connectivity, and modem offline are the 3 main reasons for calling in. Additional problems include equipment appearing in CSG but not in Gateway, and CSG hierarchy issues.

If he had a magic wand, Rene would improve the troubleshooting trees and enable agents to quickly test outlets.

When the customer acknowledges with a checkbox in the portal, the system collects the IP address and location as a fingerprint.

Agreements are specific to the line of business a customer purchases (for example, E911 is only required for phone customers).

Agents currently try to activate equipment without terms and conditions acceptance, which is a huge liability.

To activate equipment in the portal, customers must confirm their account, verify their identity, create a username and password, sign in and then review and accept the agreements.

Agents are not allowed to create accounts for customers. If customers struggle with this, agents can use Agreement Initiator for agreements and activate the equipment themselves.

A customer with physical or other limitations that prevent them from agreeing online would need a friend or relative to bring technology to them and assist with agreeing.

Day 2

GOALS

• Competitive analysis
• Identifying the critical screen
• Sketch out concepts & possible solutions

Lightning Demos

Each participant was given 30 mins to do competitive research. As competitors’ internal tools are usually not publicly available on the internet, the intent was to explore what other companies in our segment do, what works well, and what doesn’t. After the research, each participant presented their findings in breakout rooms. After some discussion, each team chose 3 options that it wanted to bring back to the bigger group. My choice was AT&T, as I had recently placed an order with them and wanted to share my experience and point out what I found useful.

Communication

As a customer, I’d rate communication with AT&T as 10 out of 10. Right after placing the order, I received an email with my order summary and charges, followed by details about the upcoming installation. As you can see, I forgot to register my user ID right away and received a few reminders. What was interesting is how AT&T handles legal consents and terms & conditions. At Spectrum, agents aren’t able to activate devices until customers accept all the agreements. That is one of the main call reasons, and it takes a lot of time and effort to figure out that unaccepted terms and conditions were the cause. AT&T can’t even install devices without the customer’s consent to all agreements, so they notify customers multiple times before the installation.

Smart Home Manager 

I took screenshots of the AT&T app to give the team inspiration and a better understanding of how the user account is created, what steps are needed, at which point in the customer’s experience the T&Cs have to be accepted, and where all the legal documents are located.

Crazy 8s

For Crazy 8s, each participant had time to individually sketch out variations of one of their best ideas, and then create a 3-panel solution sketch. We then worked in breakout rooms to present our solution sketches and pick a few to bring back to the entire team for the Art Museum activity, which took place on day 3.

Day 3

GOALS

• Turn ideas & concepts into a testable solution
• Design explorations

Solution Sketches

During the Art Museum activity, each team presented 2-3 solution sketches to the big group. After that, the deciders who joined the call for this activity voted for the solution that, in their opinion, worked best.

Prioritization Matrix

After deciding on the solution sketch, the team realized we were trying to solve too many problems at once, so we needed to prioritize the problems we wanted to solve in this sprint. As a team, we used a prioritization matrix to reach collaborative consensus on the most important problems & to separate the must-haves from the nice-to-haves.

• Agents need to memorize or research in CoPilot which codes are appropriate for customer-owned equipment to manually add them to CSG.
• Agents need to know the capabilities of the equipment to know if it’s appropriate for the purchased services.
• Agents don’t have visibility into previous problems & resolutions for customers.
• It’s difficult for agents to determine if the device is quarantined.
• Agents need to know how many active charter taps are at a location.
• Some orders are entered with equipment assigned to the wrong taps/hierarchy, requiring an agent to correct them in CSG before the equipment can be activated.

Day 4 – 9

GOALS

• Creating a realistic prototype

From Sketches To Hi-Fi Designs  ✨

High-fidelity designs of the most critical screens, collaboratively designed by the team.

Explorations: Devices Received Tab

The first design shown below is a repaint of the existing Gateway tool that agents previously used. It was the first step taken by the UI designer on the team to establish the look and feel of Agent OS. However, it didn’t solve the problems uncovered during the sprint. This is where I took the lead as a designer to create a data-informed experience based on agents’ and customers’ concerns.

Explorations: Top Nav

As a team, we did numerous design explorations of the top navigation to solve the existing problems. One of the challenges was the information architecture, which would determine the number of tabs and their order.

The second was establishing a notification system for each tab to communicate the state of each step.

Prototype No. 1

SCENARIO 1: Order Missing

• John Mclane calls and says he didn’t receive all of the equipment he ordered
• He thinks he’s missing a Set Top Box
• Determine what the situation is
• Order a replacement, if needed

Prototype No. 2

SCENARIO 2: Activation

• John Mclane calls and says that he’s set up all of his equipment, but it’s not working
• Determine what the situation is
• Take any action needed

Prototype No. 3

SCENARIO 3: Equipment Not Online

• John Mclane calls and says that he’s set up all of his equipment, but it’s not working
• Determine what the situation is
• Take any action needed

Day 10

GOALS

• Validate assumptions
• Get feedback from real users​​​​​​​

Usability Testing

One round of moderated usability testing was conducted remotely over WebEx with 6 agents, in 45-minute sessions. Participants were asked to imagine they were responding to a customer who had called about self-install.

GOALS

Shipment Status

Can agents clearly identify and communicate the status of equipment shipments?

Order Status

Can agents clearly identify and communicate the status of orders?

Equipment Status

Can agents clearly identify the status of customers’ equipment?

Terms & Conditions

Can agents clearly identify the status of customers’ acceptance of T&C?

Location History

Can agents determine the history of service to a customer location?

Activation

Are agents able to activate the customers’ equipment during the call?

Affinity Mapping

STEP 1:

During the testing session, sprint participants observed and made notes on a Miro board over an outline of the interface. All team members contributed to affinity mapping by adding post-it notes.

STEP 2:

The sprint team was divided into breakout groups. Each group reviewed the notes & grouped them by theme to describe what we learned from agents.

Day 11

• Discuss feedback and takeaways from user testing
• Discuss design implications & next steps for a deep-dive session

User Testing Takeaways

Activation

• Most Spectrum agents navigated the device list without any issue.
• All agents understood and activated devices without an issue.
• The majority of agents weren’t sure how to manage devices in the swap or replacement workflow.
• The signal strength graph didn’t add value for some call agents.

Bad Outlet 

• Online status was clear for all agents.
• Users liked the detailed and visual instructions.
• Users liked the simple layout with consolidated information.
• Some users were slow to identify the troubleshooting trees, including the toggle and CTA.

Order Screen

• Users easily navigated the order screen.
• Users wanted codes for services.
• Several users missed the second order entirely.
• The caret at the top of the order card was misunderstood.
• Include job numbers instead of order numbers.

Shipment Missing

• Agents liked the UI: navigation was easy, and they appreciated the minimal scrolling.
• Some were confused by the number of packages vs the number of devices.
• Carrier info would be helpful.
• The “wrong/missing device” was misunderstood.

Design Iterations

The next step for the team was to discuss the design implications as a group based on what we heard from the agents. We talked through all the insights to make a list of items for a design deep-dive session where all the issues were addressed.

WHAT WE LEARNED 🚀

Key Takeaways

After all the information we learned during the sprint, I feel confident I could take a part-time job as a Spectrum technician or agent. Even full-time, had design not been my first passion. The team shared the same sentiment.

As a designer, I learned once again how important curiosity is. The answers are right there in front of us; all we need is to go out there, connect with the people we are designing for, build empathy, and create better experiences. At the time of the sprint, I was moving to a new apartment, and guess what? I needed my internet installed. It was during COVID times, when self-installation was the only available option. And I failed it: there were no active outlets in my unit, so they had to send a technician. It was the best thing that could have happened to me. I had a chance to ask my tech all my questions, observe the installation process, and test the signal devices. We even went to the rooftop to locate the right switch and cable for my apartment. I was curious about every single step and every pain point, which I then brought back to my team. And I know other designers were doing the same thing, pulling up their old bills and inspecting their devices so we could all build a better understanding of the experience we wanted to design.

Another takeaway relates to team structure. We learned from our mistakes in the previous sprint, when we had 4 times more participants, which held us back. This time we had fewer people, which helped us optimize our processes and move faster on decisions and design. As I also facilitated the sprint’s first few days, I learned the importance of 1) time management and 2) mitigation. Sprints have an aggressive timeline, and the team needs to make decisions fast and move on. This was challenging at times, as discussions would drag on when the team couldn’t reach an agreement. I had to learn how to bring everyone to a compromise and make decisions quickly under pressure.