

STREAMLINING CONTENT REVIEW WITH LLMS, AUTOMATION, & UX IMPROVEMENTS

Company: Rokt

Team: Advertiser platform

Tools

  • Figma

  • Figjam

  • Excalidraw

  • Framer

  • Jira


Team

  • 1 product designer

  • 1 product manager

  • 4 developers

  • 1 operations director

  • 1 account management director


My role

  • UX/UI design

  • Design system

  • UX research

  • Content strategy

  • Product strategy

  • Performance analysis


Timeline

  • Overall: 6 months

  • Discovery & research: 4+ weeks

  • Design & testing: 5 months

My design process

[Process diagram images]

Context

What is the content review process?

All advertiser content is manually reviewed and approved by our Customer Operations team against a defined set of requirements before going live on Rokt's network. This process ensures that all advertiser content on our platform consistently meets the high-quality standards that Rokt promises its e-commerce clients.

Business problem

The content review process is slow, costly, and results in high error rates. This process in its current state is not sustainable and will not scale with the rapid growth of Rokt’s advertiser network. Below are a few numbers that demonstrate these issues:

$600K

annual spend on analysts dedicated to content moderation

38%

of live ads that have at least one policy violation

16 hrs

on average from the time an ad is added to when it is live

Understand & define

Initial user research to understand the existing user flow

To better understand the ins-and-outs of the current content review process and empathize with the user, I conducted the following two exercises: 1) I shadowed all four of the content moderation analysts as well as an account manager from each vertical and 2) to further immerse myself, I personally submitted and reviewed dozens of content changes.

Existing UX/UI for content review and approval

Below are examples of screens that provide insight into the existing content review and approval flow that Account Managers and Operations Analysts are using.

Advertisement creation

The following screen shows the creation flow and form that Account Managers use to configure advertisement settings and content. 

creation.png

Advertisement review and approval

This screen shows the page that Operations Analysts use to find, review, and approve or reject the changes that are submitted by Account Managers.

approvals bin and task details.png
Defining problem, solution, & goals

User problem

The current content review process for the Rokt operations team and account managers is inefficient, unintuitive, and time-consuming, leading to wasted resources and inaccurate content approval decisions.

Solution

Redesign and streamline the approvals workflow with a two-pronged approach:

  1. Prevention - introduce improvements upstream during the creation process to prevent errors and reduce the number of checks that Operations Analysts have to conduct manually.

  2. Remediation - introduce efficiencies to the review process that will enable Operations Analysts and Account Managers to review, remediate, and approve changes faster to reduce the time it takes for advertisements to go live on Rokt's network.

Goal

Improve the efficiency and enhance the accuracy of the content review process so that the process can scale with the rapid growth of Rokt's advertiser network, remain affordable, and increase the company's revenue and profits.

Performance metrics

Time-to-live, # of checks that need review, time spent per review, % of changes that need remediation, % of live advertisements that have errors.

Design question

How can we modify and improve the current content review process to make it more efficient, accurate, and scalable?

User pain points

Identifying user pain points

To better understand the pain points and identify opportunities, I interviewed 4 Operations team members and 8 Account Managers and created customer journey maps.

Account manager

User profile & journey

accountmanagerprofileandjourney.png
Pain points
  1. Decentralized process: The review process and communication are spread out across numerous apps, including Jira, the Operations Service Desk, Rokt's Ad Platform, email, and chat.

  2. Redundant and manual review request submission: Account Managers have to submit the same review request in both the Advertiser Platform and in the Operations Service Desk. These review requests require Account Managers to fill out a lot of tedious, detailed information manually, which is a time-consuming and error-prone process.

  3. Lack of clarity into field requirements and best practices: When creating advertisements, Account Managers are not provided with requirements and best practices within the ad management platform.

  4. Manual enforcement of field requirements: There is very little data validation or guardrails when Account Managers are filling out information within the advertiser platform, so compliance depends on manual enforcement by Account Managers, which is tedious, time-consuming, and error-prone.

  5. Lack of transparency into review statuses and rejection reasons: There is no easy way for Account Managers to get real-time status updates or detailed information about rejection reasons without waiting and clarifying with Operations Analysts, which slows down the review process.

Operations analyst

User profile & journey

operationsanalystuserprofileandjourney.png
Pain points
  1. Poor system performance: Loading the review request information takes 11 seconds on average, which is significant for a process done hundreds of times a day. Additionally, the review item search functionality is limited to IDs, which makes it much more difficult to find and navigate to the desired review item.

  2. Receiving inaccurate review request information: Since Account Managers fill out the review request form manually and there is little data validation, many times requests come in with inaccurate details or missing information, which creates more back and forth.

  3. Highly decentralized process: Even more than the review process for Account Managers, the review process for Operations Analysts is spread out across even more apps and tools. The unnecessary switching between pages is disorienting, inefficient, and leads to errors.

  4. Cluttered UI makes quick and accurate review difficult: The existing design system and UI is antiquated and not well maintained, leading to frustrating UI that makes information hard to analyze quickly and accurately.

  5. Missing and irrelevant information: Important information needed to make an approval decision is missing, while other information that is presented is irrelevant.

  6. No indication or logs of what information needs to be reviewed: When only a part of an ad is updated there is no indication or logs of what was changed, which causes Operations to re-review the entire ad instead of just what was updated.

  7. Very complex, nuanced, and sometimes subjective requirements: Requirements are very detailed, vary a lot from item to item, and sometimes rely on analyst interpretation, which makes accuracy across all reviews extremely difficult.

  8. There is no standardized review process across analysts: The process is not streamlined, so each analyst's review process differs.

  9. Manual review of requirements that could easily be automated: Many tedious requirement checks are done manually, even though they could easily be automated.

  10. Rejection reasons are not granular enough and the data is not stored: Rejection reasons are submitted at the item level so it's unclear what specific requirement is violated & these rejection reasons are not saved anywhere, which doesn't allow for analysis & optimization.

Solutions (Prevention)

Enhanced data validation and error messaging

In the existing process there are many requirement checks that are done manually by Operations Analysts that could easily be checked upstream or automated. To address these requirement checks before they reach the review phase, we introduced in-line data validation for things like character counts, punctuation, etc. This included very clear error messaging to ensure Account Managers know exactly what needs to change.
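To illustrate the idea, here is a minimal sketch of such in-line checks (hypothetical, not Rokt's actual implementation; the 45-character limit and the `noTrailingPunctuation` rule are invented examples of requirement checks):

```typescript
// A validator returns an error message, or null when the value passes.
type Validator = (value: string) => string | null;

// Hypothetical field rules -- the limits here are illustrative only.
const maxLength = (limit: number): Validator => (value) =>
  value.length > limit
    ? `Must be ${limit} characters or fewer (currently ${value.length}).`
    : null;

const noTrailingPunctuation: Validator = (value) =>
  /[.,;:!?]$/.test(value.trim()) ? "Remove trailing punctuation." : null;

// Run every rule for a field and collect clear, specific error messages
// that can be shown in-line as the Account Manager types.
function validateField(value: string, rules: Validator[]): string[] {
  return rules
    .map((rule) => rule(value))
    .filter((msg): msg is string => msg !== null);
}

const headlineRules = [maxLength(45), noTrailingPunctuation];
```

Composing small validators this way makes it cheap to add a new requirement check upstream instead of leaving it to manual review.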

Enhanced helper text, placeholder values, and access to field requirements

We added descriptive helper text, realistic placeholder values, and in-context access to the full field requirements directly within the creation form, so Account Managers can see exactly what each field expects as they fill it out.

Introducing LLMs to provide real-time requirement compliance feedback

In addition to the requirements that have simpler logic and can be addressed by in-line data validation, there are also many more nuanced requirements that require additional logic. To address these requirements we implemented LLMs to provide real-time feedback to address requirement violations and also provide best practice suggestions.
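A rough sketch of how such a feedback loop can be structured (hypothetical; the `ComplianceFinding` shape, prompt wording, and JSON contract are assumptions, and the actual model call is omitted):

```typescript
// Hypothetical shape of the model's structured verdict for one field.
interface ComplianceFinding {
  requirement: string; // which requirement was checked
  compliant: boolean;
  suggestion: string;  // best-practice feedback shown to the Account Manager
}

// Build a prompt that pins the model to the relevant requirements only.
function buildCompliancePrompt(
  field: string,
  value: string,
  requirements: string[],
): string {
  return [
    `You are reviewing the "${field}" field of an advertisement.`,
    `Value: "${value}"`,
    "Check it against each requirement below and respond with JSON:",
    '[{"requirement": "...", "compliant": true|false, "suggestion": "..."}]',
    ...requirements.map((r, i) => `${i + 1}. ${r}`),
  ].join("\n");
}

// Parse the model's JSON reply defensively; malformed output falls back
// to manual review (returning null) rather than blocking the user.
function parseFindings(raw: string): ComplianceFinding[] | null {
  try {
    const parsed = JSON.parse(raw);
    return Array.isArray(parsed) ? (parsed as ComplianceFinding[]) : null;
  } catch {
    return null;
  }
}
```

Constraining the model to a fixed JSON contract and parsing it defensively is one way to keep an LLM check from ever blocking the creation flow.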

Consolidating and automating review request submission

To address the duplicative form submission by Account Managers that resulted in manual input errors, missing information, and wasted time, we moved the form submission into the app and auto-filled the item IDs and other review information.


Solutions (Remediation)

Transition from Angular to React

The existing platform was built in Angular and the codebase was extremely complex, greatly limiting the engineers' ability to modify the code and constraining the design possibilities. To enable an elevated user experience, I convinced the product team to build a new standalone app in React, which enabled a complete design overhaul of the existing review process.

angulartoreact.png
Introducing a new design system

The existing design system was outdated, unchanged since the product launched 10+ years ago, and not well maintained. This led to inconsistent, low-quality UI and frustrating, sub-par experiences for the user throughout the platform.

oldDStonewDS.png
New IA to decrease loading time

The current tool loads tasks and all nested review item details at the same time. Loading this unnecessary information from irrelevant tasks is inefficient and leads to slow loading times of 11s on average. To fix this, the new designs use an index page containing only the necessary, high-level task information; the full task details are loaded on a separate page only when a specific task is clicked.
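The split can be sketched as follows (a hypothetical illustration; `TaskSummary`, `TaskDetails`, and the caching behavior are assumptions, not the production code):

```typescript
// Lightweight row for the index page -- enough to find and triage a task.
interface TaskSummary {
  id: string;
  advertiser: string;
  status: string;
}

// Full payload, fetched only when a task is opened (hypothetical shape).
interface TaskDetails extends TaskSummary {
  items: unknown[];
  activityLog: unknown[];
}

// fetchDetails is injected so the pattern is testable without a network.
function createTaskLoader(fetchDetails: (id: string) => Promise<TaskDetails>) {
  const cache = new Map<string, Promise<TaskDetails>>();
  // Details are requested at most once per task, on first open only;
  // the index page never triggers these fetches.
  return (id: string): Promise<TaskDetails> => {
    if (!cache.has(id)) cache.set(id, fetchDetails(id));
    return cache.get(id)!;
  };
}
```

Deferring the heavy payload until a task is opened is what lets the index page stay fast regardless of how many tasks are in the queue.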

Centralize process by reducing the number of apps and streamlining communication

To make the process more efficient and accurate, we greatly reduced the number of apps used by Operations Analysts and Account Managers. This included introducing automated form submissions with auto-filled item information and status notifications to reduce the human error within the process, speed up communication, and save everyone time.

Cleaner, easier-to-scan UI

To reduce the cognitive load on Operations Analysts I updated the presentation of the information to be reviewed from numerous layers of nested items and information presented in multiple rows and columns to a single row of data that is grouped by category. This makes the information much easier to scan and absorb, reducing the chance of errors and speeding up the time it takes to review information.

Clear indication of what information changed and what needs to be reviewed

To increase efficiency, reduce cognitive load, and provide context on what has taken place, I: 1) removed unnecessary information, 2) added clear indicators on the status of each attribute to show what needs to be reviewed, and 3) added an activity log to show what happened, what specific information changed, who changed it, and when.

Surfacing relevant requirements at the attribute level

To take the guesswork out of what requirements need to be checked for each item and to ensure all requirements are checked by Operations Analysts, I brought the checklist into the review task and only surfaced the relevant requirements. Additionally, Operations Analysts must mark each requirement as compliant or non-compliant, ensuring that all requirements are actually acknowledged.

Attribute requirement level rejection and data collection

Instead of having Operations Analysts reject entire items, rejections and rejection reasons now happen at the specific requirement level, making it easier for Analysts to communicate what is wrong and for Account Managers to know what needs to be changed. This rejection data is now collected as well so analyses can be done to identify the most common issues and further optimize the process.
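A sketch of what requirement-level rejection data might look like, and why storing it makes the follow-up analysis trivial (the `RejectionRecord` schema and field names are hypothetical):

```typescript
// One rejection is recorded per violated requirement (hypothetical schema).
interface RejectionRecord {
  taskId: string;
  attribute: string;   // e.g. "headline"
  requirement: string; // the specific rule that was violated
  reason: string;      // analyst's note to the Account Manager
  rejectedAt: string;  // ISO timestamp
}

// With requirement-level data stored, finding the most common violations
// becomes a simple aggregation, sorted by frequency.
function topViolations(records: RejectionRecord[]): [string, number][] {
  const counts = new Map<string, number>();
  for (const r of records) {
    counts.set(r.requirement, (counts.get(r.requirement) ?? 0) + 1);
  }
  return [...counts.entries()].sort((a, b) => b[1] - a[1]);
}
```

Item-level rejections can only tell you *that* something failed; requirement-level records tell you *what* fails most often, which is what drives upstream prevention work.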


Test & iterate

Soft launch to identify any issues or improvements

Since this process is so vital to Rokt's business, before rolling it out to the entire Operations team and sunsetting the existing review process, we had analysts test the new process for two weeks alongside the existing one to ensure everything was working as expected. The soft launch was overwhelmingly successful; however, through feedback from the Operations Analysts we identified a couple of tweaks and additional features that would make the process even more efficient, so we added these improvements before the full rollout.

Enabling bulk actions

In some cases the same information appears across multiple review items (in some very rare edge cases, 50+ items). So that Operations Analysts don't have to individually mark the requirements as compliant or non-compliant on each individual item, this feature indicates when there are matching values and gives analysts the option to apply review decisions to all or a subset of the items.
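A minimal sketch of the bulk-apply logic (hypothetical types and function names; the real feature also includes UI affordances for selecting the subset, which are not shown here):

```typescript
interface ReviewItem {
  id: string;
  value: string;
  decision?: "compliant" | "non-compliant";
}

// Apply one decision to every item whose value matches the reviewed one,
// restricted to the subset of item ids the analyst selected.
function applyBulkDecision(
  items: ReviewItem[],
  value: string,
  decision: "compliant" | "non-compliant",
  selectedIds: Set<string>,
): ReviewItem[] {
  return items.map((item) =>
    item.value === value && selectedIds.has(item.id)
      ? { ...item, decision }
      : item,
  );
}
```

Keeping the analyst's subset selection explicit (rather than always applying to all matches) preserves the ability to handle the occasional item that looks identical but needs a different decision.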

Add contextual account and campaign information for reference

Analysts often need broader context to make an accurate review decision, so we surfaced relevant account- and campaign-level information directly within the review task for quick reference, removing the need to navigate elsewhere to find it.


Finalized designs

Content review and approvals (remediation)

w.i.p

Content creation (prevention)

Post-launch analysis

Results

The new content review process was a massive success. It's estimated that it will generate millions of dollars in savings and revenue through automation and increased efficiency, as well as through the performance boost from advertisements going live faster and from improved content quality (lower error rates). Below are a few numbers demonstrating the success of this project:

  • Decreased review task loading time by 82% from 11s to 2s

  • Decreased advertisement time to live by 50% from 16 hrs to 8 hrs

  • Decreased error rates by 70% from 40% to 12%

  • Eliminated any immediate need to hire additional Operations Analysts

  • Created a scalable solution that laid the foundation for a fully automated solution

Moving forward

Since this is a new process and a complete overhaul of the old process, there is still a lot of opportunity to fine tune and optimize. In the short term, here are a couple of our next steps:

  • Slowly automate more requirement checks with LLMs as we test and find more requirement checks that the LLM can conduct with high confidence.

  • Analyze the newly collected rejection reason data to identify what the most common issues are and brainstorm ways to modify the process upstream to prevent these issues before they are sent to Operations Analysts to review.

Learnings from this Project

At the onset of this project, the discussions were focused on simply introducing improvements to the existing flow. However, through user research and data analysis I was able to advocate for and convince product partners that this would be a temporary, band-aid fix and that we needed to think bigger. To address the real problems, we would need to explore options beyond the limitations imposed by the existing Angular codebase, which eventually led to our total revamp of the process and massive improvements that wouldn't have been possible otherwise. The big lesson I took away from the success of this project is that I should always provide north-star or stretch designs to get conversations started; you never know what is possible or what will resonate with the product team. As a designer, it is my role to provide these ideas and thought leadership.
