

AT&T | TV Recommender Tool
Address customers who need a helping hand by recommending the best TV products AT&T has to offer.
CHALLENGE
Provide customers with a white-glove online experience that they would otherwise only receive by calling in or visiting a store. Develop a tool to recommend the best TV product to users based on their unique preferences. The tool should also first help identify whether certain products, speeds, etc. are even available for them to purchase at their location.
Problems we needed to solve
- Users consume information differently depending on personal preferences: some prefer to have all of their info shown up front, while others need more help when making a decision. Currently, we provide only one way for users to consume information, and we'd like to offer a secondary, interactive method to help those who need the extra hand-holding.
- Users want information quickly and easily, but our TV landing page does not clearly explain why a consumer should choose one service over another, making it difficult to make the right decision. Instead, we use a lot of the same language and imagery to depict our services and don't focus on the major content differences or unique features.
- Reduce the number of steps a user needs to go through to make a purchase, an issue we see across att.com right now. The tool should also let users add their preferences directly to the cart so that they aren't forced to make the same decisions all over again.
MY CONTRIBUTION
I joined the Recommender team to lead the UX design of the TV Recommender Tool. Leveraging the research from our specialized teams, along with the Design Jam findings, I focused my efforts on the sketch, design, and implementation phases.
PROCESS
Learn | Research | Sketch | Design | Implement | Evaluate
Learn
Drawing on the insights gathered at the AT&T Design Jam, I built on them by analyzing the strategy and research. Insights came from 1-on-1 user testing sessions, conducted and moderated remotely with 5 participants. The Recommender tool was just one portion of the overall study of the entire TV buy flow.
Metrics
Customers would enter the Recommender through a call to action on the att.com/tv index page, which averages 15k customers daily. 30% arrive on the TV lander via the homepage, 30% via natural search, and 75% of the overall traffic is in-footprint.
TV buy flow (focus area highlighted)

User testing

Research
The provided research, along with the competitive analysis my team conducted at the AT&T Design Jam, gave me the materials I needed to move quickly into the Sketch phase.
Prior to starting this project, I was involved in some ideation around a configurator tool. That tool worked more like a dashboard: users could adjust settings, answer questions, and get real-time feedback on TV package recommendations. The project eventually morphed into the current Recommender project, and we shifted the approach to avoid overwhelming users with too much cognitive load. I feel there is a place for an advanced configurator-type tool, but it would need to be introduced at a later date, once we're able to gather metrics on its performance. I had some good takeaways from that project and wanted to bring them into the Recommender tool, most notably being able to determine a user's actual address so that we could provide better recommendations.
Configurator tool concept


UX flow created using Sketch and Overflow

The business team recommended that user engagement begin with entering only a ZIP code, because the system was already set up to work this way.
I recommended we ask for a full address instead. For some users, we couldn't recommend the actual packages available at their location based on a ZIP code alone, so those users would be shown popular packages to choose from. Upon selecting one of the recommended popular packages, the user would move into the buy flow and then have to validate their full address to check product availability, resulting in a terrible user experience.
Working together with the Strategist, we determined two approaches that we could test.
Scenario 1: Ask the user, "What type of home do you live in?" with only one answer allowed. Only have the customer enter their full address if they choose "apartment" or "college campus"; if they select "house" or "condo", just ask for their ZIP code. Even so, there would be a benefit to having them enter a full address, since it could be carried through to the buy flow.
Scenario 2: Ask for a ZIP code or address, but recommend to customers, since they are already opting in, that entering a full address will give them the best package recommendations for their specific area. The full address would be carried through to the buy flow, so users wouldn't need to enter it again.
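To make the address logic concrete, here is a minimal sketch, in TypeScript, of the Scenario 1 branching. The type names, home-type values, and step names are assumptions for illustration only, not the production implementation.

type HomeType = "house" | "condo" | "apartment" | "college campus";

interface LocationInput {
  homeType: HomeType;
  zip: string;
  fullAddress?: string; // carried through to the buy flow when provided
}

// Per Scenario 1, only apartment and college-campus users are asked for a full address.
function requiresFullAddress(homeType: HomeType): boolean {
  return homeType === "apartment" || homeType === "college campus";
}

function nextStep(input: LocationInput): "askFullAddress" | "recommendPackages" {
  if (requiresFullAddress(input.homeType) && !input.fullAddress) {
    return "askFullAddress";
  }
  // A house/condo ZIP code, or any provided full address, is enough to move on
  // to recommendations; the address carries forward so it never has to be
  // re-entered in the buy flow.
  return "recommendPackages";
}

In both scenarios the shared goal is the same: once a full address is captured, keep it with the user's input so the buy flow never asks for it again.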

Exploration: multi-select add pattern


Help me choose a TV Package
Scroll to the "Need help choosing" section and select the "Get started" CTA.
Design
Our new design stack was pushed to production with a limited UI design system in place. The att.com homepage did not require certain UI patterns/components, but as other pages are developed, the need for these patterns grows. The input form patterns needed for the Recommender tool hadn't been developed yet, so I had to finalize those UI patterns in the att.com design system.
Form UI patterns


Implementation
I'm currently working closely with the development team to QA test the recommender tool. The tool is being built through a series of design sprints in an Agile environment. I'm participating in daily checkpoints to ensure the tool is developed to spec and working with the team to define interaction and animation.


Redlines created with Zeplin
Test
Takeaways through the fifth round of testing:
- Participants understood and appreciated that information they entered early in the process carried forward to later steps.
- Participants liked the questions presented, but access to additional content could help them better understand the decisions they needed to make, e.g., how streaming works, what devices are needed, satellite installation information, NFL Sunday Ticket details, the impact of internet speed on streaming service/usage, and contract details (whether one is required and how it works).
RESULTS
- By adjusting our approach from a configurator tool to a recommender tool, we reduced the cognitive load on our users by asking only one question at a time.
- By asking the "sports fan" question early in the process, we created an early exit scenario: those interested in Premium Sports Packs can only get them with specific packages, so we push the user directly to the final "Our recommendations" page (this logic is sketched after the list).
- Added a scenario for users who select "Skip this question" too many times: we recommend the most popular packages but state that we need more information to recommend the packages actually available to them, and we make it easy to restart the questionnaire via a "Start again" CTA.
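For illustration, here is a minimal sketch, again in TypeScript, of the early-exit and skip-fallback behaviors described in the results above. The question IDs, skip threshold, and result types are hypothetical, not the production logic.

type Answer = { questionId: string; value?: string; skipped: boolean };

interface RecommendationResult {
  kind: "recommendation" | "popularFallback";
  note?: string;
}

const MAX_SKIPS = 3; // assumed threshold for "skipped too many times"

function evaluateAnswers(answers: Answer[]): RecommendationResult | null {
  // Early exit: Premium Sports Packs are only available with specific
  // packages, so a "yes" to the sports fan question jumps straight to the
  // "Our recommendations" page.
  const sportsFan = answers.find(
    (a) => a.questionId === "premium-sports" && a.value === "yes"
  );
  if (sportsFan) {
    return { kind: "recommendation" };
  }

  // Skip fallback: too many skipped questions means we can't personalize,
  // so show the most popular packages and offer a "Start again" CTA.
  const skips = answers.filter((a) => a.skipped).length;
  if (skips >= MAX_SKIPS) {
    return {
      kind: "popularFallback",
      note: "We need more information to recommend packages available to you.",
    };
  }

  return null; // keep asking one question at a time
}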
Overall, this continues to be a challenging project. Through multiple rounds of testing, I feel we're on the right track to launching a successful Recommender tool. I'm excited to see the results of our A/B testing, but until then I'll continue to enhance the overall user experience.