
How Uber creates a unified knowledge ecosystem with generative AI

A conversation with Director of Global Community Operations Hadley Ferguson
and Global Knowledge Production Lead Peter Tu

Uber customer story
AT A GLANCE
Millions
of inbound support requests
~40,000
support agents
5+ million
knowledge assets
across the company
LISTEN TO THE INTERVIEW

Listen to the Uber story or read the edited story below

Writer is the full-stack generative AI platform for enterprises. We make it easy for organizations to adopt AI apps and workflows that deliver quantifiable ROI quickly.

Writer helps organizations build highly customized AI apps that compress entire business processes, support complex use cases, and infuse work with company intelligence. Our enterprise-grade platform can be deployed flexibly, keeps your data private, and adheres to global privacy laws and security standards. Leading enterprises choose Writer, including Vanguard, Intuit, L’Oréal, Accenture, and Kenvue.

Hadley Ferguson is the Director of Global Community Operations at Uber, where she leads a team focused on creating and improving support resources for users and agents worldwide. With over ten years of experience at Uber, Hadley is dedicated to ensuring the community operations strategy at Uber not only matches its global vision and values but also empowers and educates its diverse and vibrant community.

Peter Tu is the Global Knowledge Production Lead at Uber, focusing on maintaining high-quality support knowledge and resources through integrated, standardized global tools. Peter is dedicated to helping multinational corporations adopt a ‘Glocal’ approach and enhance the end-to-end customer support experience.

Read or listen to the story of how Hadley, Peter, and the team at Uber are using Writer to scale a central knowledge system, automatically create high-quality support experiences, and adopt an end-user-first approach to implementation.

Tell us about yourself, what you do at Uber, and your value to your customers.

Hadley Ferguson

I’m Hadley Ferguson, Director of Global Community Operations at Uber. I’ve been at Uber for eleven years, always on the operations side.

We’re a global team of over 250 people spread across US&C, APAC, EMEA, and LATAM. Our team primarily focuses on setting strategy and developing knowledge resources for all the topics you see in Uber Help, both in our app and on the web. This includes any material a support agent might use to answer a question through Uber Help, including self-serve and automated processes. Our work is really diverse, because we’re responsible for both creating knowledge for internal users and creating external help resources for all users.

Peter Tu

I’m Peter Tu, and I’ve been with Uber for close to seven years. I started my career as a support specialist and moved into knowledge production around four years ago. Today, I manage a footprint of close to 250 people across the world.

Hadley Ferguson

At Uber, we’re a tech company that connects the physical and digital worlds to make movement happen at the tap of a button, because we believe movement should be accessible so that everyone can move and earn safely. At its core, Uber is about moving people. But I also see us changing the world for the better: not just investing in key spaces like sustainability, but creating opportunities and jobs for our users and earners.

The Uber website

What are your team’s top-level goals?

Hadley Ferguson

Our team is responsible for creating and maintaining internal and external support resources for any customer issue. We have a large and diverse network of support agents across the world, and we manage a library of around five million support assets across five to ten different platforms. Our goal is to transform our support materials into accurate, curated, and accessible knowledge that supports the customer experience. We want to create one source of truth for support knowledge, and establish a “create once and publish everywhere” model to ensure scalable, compliant, and non-duplicative work for the future.

In early 2023, we first started to see people thinking about generative AI use cases at Uber, and we realized that the resources and knowledge that generative AI would rely on were extremely fragmented and decentralized. For Uber’s support teams to scale and adopt generative AI, we needed to use generative AI to prepare our knowledge ecosystem for both machine and human consumption.

Peter Tu

Uber operates in a unique environment where there are footprints across the different regions and cultures that we serve. There are experts in these specific regions who have specific skillsets and can share regional best practices. However, a lot of the knowledge is tribal knowledge that you can’t easily consolidate and scale. But with a generative AI platform like Writer, we’re able to compress all of that knowledge into a single source of truth so that everyone can adopt best practices in a more scalable way.

“We selected Writer because it had the fastest speed-to-market and robust capabilities, in addition to being able to scale with us as we grow.”

Hadley Ferguson

Hadley Ferguson
Director of Global Community Operations

What is your overall strategy when it comes to generative AI?

Hadley Ferguson

Right now, there is a lot of investment in generative AI use cases at Uber, especially where there are opportunities for not just efficiency, but quality of customer support experiences. Within the support organization, our leadership teams are evaluating how we can implement generative AI to make user experiences more contextualized, intuitive, and have less friction.

Within my organization, we want to use generative AI to supercharge knowledge production, continuous improvement, and governance. When thinking about our framework for implementing generative AI, we focused on three categories: product, process, and people.

Peter Tu

Think of generative AI as a super-smart colleague, but a super-smart colleague that doesn’t know anything about your business. First, you have to introduce the colleague to the team, saying, “this is what you’re going to work with in the future,” and they have to learn how to interact with it. Then, you have to consider the knowledge you need to provide to onboard this colleague. There are a lot of rules and tribal knowledge within the Uber ecosystem, and so we need to consider how we’d set that up in different technologies, like prompt engineering or retrieval-augmented generation (RAG).

Once people are enabled in the product and it becomes your thought partner, you’d assume a smart colleague would bounce around ideas with you. That’s where you’d start to interact with them more and think about how to improve performance down the line. It’s like a maturity curve, where we enable people, then the product, and then home in on the process.
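Peter’s mention of retrieval-augmented generation (RAG) can be sketched in miniature: retrieve the knowledge snippets most relevant to a question, then fold them into the model prompt. The snippet below is a toy illustration with an in-memory knowledge base and word-overlap scoring, not Uber’s or Writer’s actual pipeline, which would use a vector store and an LLM behind the scenes.

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase bag-of-words tokenization, ignoring punctuation."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(question: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Rank knowledge snippets by word overlap with the question."""
    q = tokenize(question)
    ranked = sorted(knowledge_base, key=lambda doc: len(q & tokenize(doc)), reverse=True)
    return ranked[:top_k]

def build_prompt(question: str, knowledge_base: list[str]) -> str:
    """Augment the model prompt with retrieved context (the RAG step)."""
    context = "\n".join(retrieve(question, knowledge_base))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Invented example snippets standing in for Uber's tribal knowledge
kb = [
    "Riders can update their payment method in the Wallet section of the app.",
    "Drivers must complete a background check before their first trip.",
    "Promo codes are applied automatically at checkout.",
]
prompt = build_prompt("How do I change my payment method?", kb)
```

The prompt-engineering alternative Peter mentions would bake the rules directly into the instruction text instead of retrieving them per question.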

Saved reply update custom AI app in Writer
Implementation of saved reply app in Google Sheets
Hadley Ferguson

We prioritized use cases that were highly recurring and had clear and structured input and output requirements. We also wanted to make sure that our approach would meet all of our global users where they are, by selecting a solution that could be easily integrated into our daily routines.

Compliance and overall quality is extremely important to us, including the format and tone of voice of our support materials. We wanted to use AI to create support assets that were compliant, brand-consistent, and eliminated duplication.

With all of this, we also wanted to ensure that the people on our team feel supported and empowered. The customer support space has changed a lot over the last few years, and we wanted to make sure that our people would be set up for success with great training on the generative AI solution we chose.

What led you to choose Writer?

Hadley Ferguson

We were really lucky in that we had a lot of support from our leadership teams to invest in a generative AI solution. There was such a large ecosystem of AI apps available, and we really wanted to focus on speed to market. How could we go fast, have a quick impact, and scale up? We selected Writer because it had the fastest speed-to-market for us and robust capabilities, in addition to being able to scale with us as we grow.

When we looked for a generative AI solution, we thought about how we could use a tool that could help someone take all the pieces of information and generate high-quality knowledge assets. And on top of that, we needed a solution that could take into account brand and legal language, which differ by country. Writer is our source of truth to make sure that our teams are adhering to the legal and cultural differences for specific countries. It’s so powerful because it enables us to do this all with the touch of a button.

“Writer is our source of truth to make sure that our teams are adhering to the legal and cultural differences for specific countries. It’s so powerful because it enables us to do this all with the touch of a button.”

Hadley Ferguson

Hadley Ferguson
Director of Global Community Operations

What are some of your top use cases?

Hadley Ferguson

Our initial focus with generative AI was to build custom AI apps with Writer to create and update saved replies and generate Uber Help Center articles. We started here because of the customer-facing nature of these resources and the opportunity size — these two use cases are our biggest focus in terms of volume. We create upwards of one million saved replies today, and have over 50,000 help center articles. The Writer apps we’ve built for these use cases are helping us improve our user experience, while simultaneously optimizing our team’s capacity.

Peter Tu

Saved replies are commonly known as “macros” in the support domain. They are the source of truth for the correct response that support agents should use for a particular user question. This is very repeatable work, so instead of writing these responses from scratch every time, we created an AI app with Writer that automatically recommends and generates these saved replies. Instead of having a human read through multiple docs to distill and summarize replies, you can rely on generative AI to go through those docs and produce a high-quality response. It greatly expedites the whole process.

On our side, the data sources for the saved replies custom AI app are the internal knowledge and context on our users and product. We have different answers for Uber users who are recent adopters versus ones who are loyal customers, and the Writer model takes that difference into account. The custom AI app also considers output factors like length, format, tone, and syntax. We’re essentially transferring human thought processes to AI.
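The idea of “transferring human thought processes to AI” can be illustrated with a prompt template that encodes the output factors Peter lists (length, format, tone) along with the user segment. The field names and template below are illustrative assumptions, not Writer’s actual app configuration.

```python
from dataclasses import dataclass

@dataclass
class ReplySpec:
    issue: str           # user question the reply addresses
    user_segment: str    # e.g. "recent adopter" vs. "loyal customer"
    tone: str            # e.g. "empathetic and concise"
    output_format: str   # e.g. "greeting, resolution, closing"
    max_words: int       # length constraint on the generated reply

def build_saved_reply_prompt(spec: ReplySpec, source_docs: list[str]) -> str:
    """Distill several source documents into one generation instruction."""
    docs = "\n".join(f"- {d}" for d in source_docs)
    return (
        f"Draft a saved reply for: {spec.issue}\n"
        f"Audience: {spec.user_segment}\n"
        f"Tone: {spec.tone}\n"
        f"Format: {spec.output_format}\n"
        f"Keep it under {spec.max_words} words.\n"
        f"Source documents:\n{docs}"
    )

# Hypothetical example spec and source doc
spec = ReplySpec(
    issue="Rider was charged a cancellation fee",
    user_segment="loyal customer",
    tone="empathetic and concise",
    output_format="greeting, resolution, closing",
    max_words=120,
)
prompt = build_saved_reply_prompt(
    spec, ["Cancellation fees may be refunded on a case-by-case basis."]
)
```

Changing `user_segment` is how a template like this would produce different answers for recent adopters versus loyal customers, as Peter describes.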

“We wanted to make sure that whatever AI solution we chose, it would be one that could grow with us and help us grow as well. Writer felt like a differentiated company in that way.”

Hadley Ferguson

Hadley Ferguson
Director of Global Community Operations

Hadley Ferguson

We not only want to use generative AI to help us draft the best versions of these saved replies but also to help us identify and maintain that ecosystem, helping us find duplication and inconsistencies in our knowledge sources.

Creating net-new responses for every product launch takes a large amount of our team’s time, so taking this work off their plate with this Writer AI app is huge. We can save the team time so they can think more about the strategic approach for improving the product. For example, if we’re writing for Uber Eaters, our team could spend more time on the UX side versus the initial drafting of support materials.

Peter Tu

Another use case we’ve invested in is FAQs. FAQs are a perfect use case for generative AI because they come in very structured components, so they’re easy to replicate with the technology.

FAQs have a title, a body paragraph explaining the policy, and a call to action to nudge users towards the right answer if they aren’t in the right place. Creating FAQs requires translating product knowledge into layperson terms and generating the output in a very specific format. With Writer, we were able to build high-quality FAQ apps that met all those requirements.
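Because FAQs decompose into fixed components, they map naturally onto a typed structure the generation app can fill in and validate. A minimal sketch, assuming a simple Markdown-style rendering (the real Help Center format isn’t specified here):

```python
from dataclasses import dataclass

@dataclass
class FAQ:
    title: str           # the question, phrased in layperson terms
    body: str            # paragraph explaining the policy
    call_to_action: str  # nudge toward the right next step

def render_faq(faq: FAQ) -> str:
    """Validate the three required components and render the article."""
    if not (faq.title and faq.body and faq.call_to_action):
        raise ValueError("every FAQ needs a title, body, and call to action")
    return f"## {faq.title}\n\n{faq.body}\n\n**{faq.call_to_action}**"

# Invented example content
article = render_faq(FAQ(
    title="Why was I charged a cleaning fee?",
    body="Cleaning fees cover the cost of returning a vehicle to service.",
    call_to_action="Still need help? Contact support in the app.",
))
```

In practice the model would generate the field values, and the fixed structure is what makes the output easy to check and replicate at scale.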

Help Center FAQ article generation
FAQ generation custom AI app in Writer
Implementation of FAQ generation app in Google Sheets

Your implementation of your AI apps is pretty unique! Tell us more.

Hadley Ferguson

We put a lot of thought into meeting our people where they’re at. Our teams who build our support resources work in so many different internal and external tools, but they spend most of their time in Google Sheets and Google Docs. Rather than adding another tool for them to work in, we thought that it would be easiest to integrate Writer AI apps directly in Google Sheets.

We also spent a lot of time thinking about how to address all of our legal guidelines and factors that could be hyper-regional or hyper-nuanced to local markets. For example, the UK is a very highly regulated environment for us. We wanted to use Writer to take that stress off our team, to implement all of those different regional nuances automatically from a centralized source of truth.

We uploaded the various legal language and style guides, which varied by region and local market, to Writer. Then, we worked with the Writer team to set up a custom implementation where the support assets generated by our AI apps would automatically adhere to the appropriate legal, regulatory, and brand rules.
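The mechanism Hadley describes, a centralized rule store that generated assets are checked against per market, can be sketched as a lookup table of regional rules applied to a draft. The rule contents below are invented examples, not Uber’s or Writer’s actual legal guidelines.

```python
# Hypothetical centralized rule store; real rules would come from the
# uploaded legal and style guides, not hardcoded dictionaries.
REGIONAL_RULES = {
    "UK": {
        "banned_terms": {"taxi"},  # stand-in for a regulated term
        "required_footer": "Operated in line with local licensing rules.",
    },
    "US": {"banned_terms": set(), "required_footer": ""},
}

def apply_regional_rules(draft: str, region: str) -> str:
    """Reject banned terms and append any required legal footer."""
    rules = REGIONAL_RULES[region]
    lowered = draft.lower()
    for term in rules["banned_terms"]:
        if term in lowered:
            raise ValueError(f"draft uses banned term for {region}: {term!r}")
    footer = rules["required_footer"]
    return f"{draft}\n\n{footer}" if footer else draft

uk_asset = apply_regional_rules("Request a ride in the app.", "UK")
us_asset = apply_regional_rules("Request a ride in the app.", "US")
```

The point of centralizing the rules is that authors never have to remember them: the same draft picks up different constraints depending on the target market.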

Our team is very diverse, with regional footprints around the globe, which is a huge benefit to us. However, it can be challenging as we’re moving towards creating an English source of truth for support resources, and then translating or localizing it into different languages. Writer empowers our non-native English speakers to create high-quality support assets in English that take into account all the regional nuances, which can be really challenging.

With this implementation, we’re able to give our teams a new tool that can make them more efficient, effective, and confident, directly integrated into their existing workflow. On top of that, we’re able to address concerns that our stakeholders may have, like adhering to legal terminology and regulatory requirements, which have historically been complex.

“Rather than adding another tool for our people to work in, we wanted to integrate Writer into their day-to-day life, which is why we implemented our AI apps directly in Google Sheets.”

Hadley Ferguson

Hadley Ferguson
Director of Global Community Operations

How has generative AI impacted the way you think about building your organization?

Hadley Ferguson

It’s impossible to predict the future, but as much as we can, we are focusing on how generative AI is going to change our organization, and what skills we’ll need to make it effective. We want to be more strategic rather than reactive to the industry changes that come with generative AI.

Peter Tu

Something crucial that we learned about large language models is that there are two types of thinking they can handle. System one is a quick reaction, like an instant FAQ answer, while system two is more reasoning and planning. What we’ve learned over time is that LLMs are very good at system one and can do repeatable work very fast. They can do calculations and transform wordings, but they sometimes lack context beyond one single point of knowledge. From a business perspective, our customer support teams know what the operations team thinks and what marketing thinks, so they can drive that higher-level system-two thinking that the language model isn’t fully capable of yet.

With that in mind, if people can free up a lot of their workload from system one thinking, then we should reinvest their time in challenges that require system two, such as reasoning and planning, which ties back into business acumen.

How do you plan to measure the outcomes of generative AI?

Hadley Ferguson

We’re still in the process of implementing generative AI, but metrics are very important to us. As our account matures, there are three categories we want to look at — tool usage and enhancement, operational efficiency, and quality output and experience. We want to focus on how we can create more high-quality support resources faster, and deliver them seamlessly. We don’t just want to be the enabler for success at Uber, but the reason for success.

“We’re using Writer Knowledge Graph to make it easier for our internal teams to access accurate and compliant company and product information.”

Peter Tu

Peter Tu
Global Knowledge Production Lead

What advice do you have for other AI leaders?

Hadley Ferguson

You can’t look much further than six months down the road when it comes to generative AI, so don’t overthink your first use case, and don’t avoid jumping in. We committed to our leadership team to have a use case in place by Q1 2024, so we had full support from the top down, which allowed us to easily go to market.

Peter Tu

In addition to that, never underestimate the paradigm shift required for these new technologies. Uber’s own technology went through an analogous shift: we disrupted the market by allowing users to hail a cab online. People were used to hailing a cab right on the street, and drivers were very unfamiliar with this new technology. It took us some time to go from zero to one and get people on board with this paradigm shift.

What’s next for your team?

Peter Tu

We’re excited about our implementation of Writer Knowledge Graph, the Writer-built graph-based RAG solution. We’re using Knowledge Graph to make it easier for our internal teams to access accurate and compliant company and product information.

But we know that with generative AI, you need to have great input so that you can have great output, otherwise it’ll be garbage in and garbage out. We are working to enhance our knowledge reference points so that when the LLM consumes that knowledge, it’s looking at the correct information in the correct way. Knowledge Graph ties all the loose ends of taxonomy, metadata, research, and labeling together.
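The way taxonomy and metadata tie knowledge together can be illustrated with a toy graph: articles are nodes, metadata links are edges, and retrieval walks the graph to gather related context instead of matching a single document in isolation. This dict-based structure is a stand-in for the concept, not Writer Knowledge Graph’s implementation.

```python
from collections import deque

# Hypothetical article graph: each node has text plus metadata links
articles = {
    "refunds-policy": {
        "text": "Refunds are issued to the original payment method.",
        "links": ["payment-methods"],
    },
    "payment-methods": {
        "text": "Supported payment methods include cards and Uber Cash.",
        "links": ["uber-cash"],
    },
    "uber-cash": {
        "text": "Uber Cash balances never expire.",
        "links": [],
    },
}

def gather_context(start: str, max_hops: int = 2) -> list[str]:
    """Breadth-first walk from a matched article, following metadata links."""
    seen, queue, context = {start}, deque([(start, 0)]), []
    while queue:
        node, depth = queue.popleft()
        context.append(articles[node]["text"])
        if depth < max_hops:
            for nxt in articles[node]["links"]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, depth + 1))
    return context

context = gather_context("refunds-policy")
```

This is why the input cleanup Peter describes matters: the walk is only as good as the links and labels it follows, so bad metadata is garbage in, garbage out.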

Hadley Ferguson

Additionally, as Writer evolves, the product and its possible use cases keep expanding. There is a lot of opportunity for teams to think about and develop rigor around how to use Writer in their workflows. We also want to better understand where there is a gap that Writer might fill. We need to take a holistic look at our ecosystem and the platforms we’re already using, and how we can take full advantage of them.

Writer

AI your people will love

Request a demo