Google announced MUM (Multitask Unified Model) on May 18th, 2021 as a new milestone in understanding information. MUM uses the T5 text-to-text framework and is 1,000 times more powerful than BERT. Google announced on September 29th, 2021 that they will be integrating MUM into search over the next few months.

Video Transcript:

In this episode, we’re going to be looking at Google MUM. We’re going to look at specifically how it’s going to be integrated into search, what are some of the changes that we’re going to see, and how we can set ourselves up for success both today and in the long run.

All right. We got a big one here. This is Google MUM and the future of search.

What is MUM and Why Should We Care?

Google announced MUM on May 18th, 2021, and MUM stands for Multitask Unified Model.

MUM uses something called T5. It’s a text-to-text framework, kind of one of the iterations or improvements on BERT, but it’s 1,000 times more powerful than BERT. This is not BERT, this is something new. BERT really jump-started a lot of this really in-depth NLP work, and there’s been a number of changes and growth, and it’s kind of spurred all these other types of language models like GPT, RoBERTa, and ALBERT. There’s a ton of them out there.

MUM is using T5. And as you can see here, T5 does a really good job of a number of tasks. And as you can see, there’s a little GIF here to the side which is from Google that shows T5 and kind of what it’s doing. And it does a number of really cool things from language transformation to extraction, to text generation, summarization, and more. Google has used that in order to build MUM. Google announced on September 29th, 2021, that now they’ll be integrating MUM into Google search over the next few months.

What Makes MUM So Special?

MUM – Text Generation

Well, the first thing is text generation. MUM not only understands language, but it can generate it. That’s one of the areas where T5 was making some massive changes and advancements, and MUM is taking it even further. So not only can MUM look at the language and understand what it means, it can reply. It can give answers in text. It can generate text based on the inputs that you’ve given it.

MUM – Multilingual

It’s multilingual. Now, this is a pretty big deal. It’s been trained across 75 different languages and a ton of different tasks at once, which allows it to develop a more comprehensive understanding of information than previous models. It understands the meaning between one language and another. So let’s say you ask a question in Spanish, but you want an English answer. MUM knows that answer and it can understand the context between languages. That’s a huge advancement when it comes to language models.

MUM – Multimodal

And then it’s also multimodal, which means that it understands information across text as well as images. And in the future, it’s going to expand even further to things like video and audio. So not only can it read text and know what it means, it can look at an image, understand what’s happening in that image, look and see if there’s text within that image and understand the information and the connection between the two. And like I said, it’s going to be expanding into video and audio as well. This is a pretty big advancement when it comes to language models. And now that it’s going to be used within Google search, it’s going to allow them to really understand context even further.

So on this slide, we’ve got two different applications of MUM within search. If we look to the right, this is from the Google product blog post about this. With Google Lens, you can actually zoom in on a shirt. So this guy’s got a shirt on with flowers, but say you want socks with that pattern on them. Now you can ask, “Can you find me socks with that same pattern?” And MUM can identify that floral pattern and try to find socks that might match or go along with that shirt. That’s pretty impressive.

Now, on the other side, let’s say you want to fix something, but you don’t know what it is. With Google Lens, you can take a picture and ask, “How do I fix this thing?” And then MUM will go, “Okay, I know what that is. And based on what you’re looking for, here are some videos on how to fix it.”

So as you can see, it’s making search much more helpful. It’s allowing us to do a lot more things. It’s allowing us to ask different questions about the things that we’re seeing, almost like we’re interacting with a person, which is pretty intense, right? So like, “How do I fix this thing?” You could send it to a friend. Instead of doing that, you’re sending it to Google. They’re understanding what you mean. They understand exactly what it is, and they’re giving you the information you need.

Google is also going to be doing a redesign of search based on the learnings that they’ve had with MUM and the capabilities of MUM. There are three big changes coming to Google search: Things to Know, the ability to zoom in and out of topics, and an increased use of visual search.

Things to Know

So ‘Things to Know’ looks similar to ‘People Also Ask’, but it goes much deeper. Google now has a much deeper understanding of how people explore a specific topic, and they’re going to show the aspects that people are most likely to look at first.

They’re going to give you step-by-step instructions, guides, styles, tips, and more within this ‘Things to Know’ area. This is a very interesting feature because it’s not just questions from users. Google is now taking all the data it has, aggregating the most-explored parts of a topic, or the aspects people are most likely to start with, and then giving that to the user to walk them through the process.

Zoom In and Out of Topics

Google is also expanding the ability to broaden or refine your search, giving users the ability to explore ideas and letting them zoom in and out of a specific topic. So you’ve got options like ‘Refine this search’, where they’re going to give you some different keywords. This is kind of like related search terms, right? That’s going to allow you to go deeper into something. But you can also broaden it.

They can give you the option of going up a level as well. So they’re covering the entire topic, the entire journey, from a very high level to a very refined level. It’s very important to understand that they’re looking at things within an entire topic set and not just within a single keyword or a specific niche. This is something we really need to pay attention to. And if you haven’t been creating content like this, now is the time to really start thinking about it.

Visual Search Update

They’re also going to be making a big update to visual search, making it easier for users to find visual inspiration with images right within search.

For instance, they’re looking at pour painting ideas, and here you can see these are not just images coming from image search, which we have at the top. If you scroll down, you see that Google is now pulling images from web articles, web pages, videos, and more to make the results more visually appealing.

So all the different aspects of your page are going to play a huge role, based on how Google is going to be extracting and leveraging images. So again, from a marketing standpoint: what kind of images are you using, and are you visually capturing the things that you’re talking about within your content?

Bonus: Video Search Upgrade

Here’s a bonus. This wasn’t technically part of the search redesign, but it is something within videos, and it’s pretty expansive: Google now also understands topics within videos. Based on this advanced algorithm update, they can show users related topics even if those topics aren’t explicitly mentioned in the video.

This system understands the topics in the videos and how they relate. So again, you could look at a video of penguins, and right here they’ve got macaroni penguins and the related topic, South Georgia Island. That’s not in the title. Maybe it’s mentioned in the video, but Google is now starting to surface all these related topics that it’s extracting from what it’s learned by crawling that video. This feature is starting to roll out in the next few weeks, and they’re going to add more enhancements as time goes on.

Again, if you’re doing video search, if you’re creating video content, look at these things and understand these related topics because they could really help you go deeper and cover other aspects that you might not be covering if you were just going off a keyword research tool.

Google MUM: Key Takeaways

So what are some key takeaways?

Well, MUM is going to have a huge impact on search. It not only understands language itself, but it can generate language. MUM is multilingual and it can connect information between languages. It’s also multimodal, which means it understands information across text and images, and it’s going to expand into video and audio.

And then Google is using MUM to redesign search and create a really different, and probably better, user experience based on what it’s learned. So we’ve got things coming like Things to Know, the ability to broaden and refine your search, an increased use of visual search, and topic search within video content itself.

What Can You Do?

So what can you do? How can you leverage this information to create better content strategies and better search strategies, to make sure that you don’t just rank, but rank through the entire funnel?

Well, MUM at its core is all about creating deeper connections between concepts and how people search for that information within a topic. So you need to focus on creating topical authority, and you need to do it in a way that’s also visually appealing. Cover your niche in detail and help your audience find the information they need, from the very broad to the very refined. Make sure you have all of that covered if it fits within what you’re trying to do.

You also need to think about building a knowledge graph within your site and leveraging structured data to help crawlers better understand the context of your content. Structured data is built on linked open data, and these language tools learn from linked open data. So when you give them that information, you remove ambiguity and you allow them to understand exactly what you’re talking about.
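To make that a little more concrete, here’s a minimal sketch of what structured data can look like: a JSON-LD snippet using schema.org’s Article type. The `about` and `sameAs` properties are real schema.org vocabulary, but the specific values below are hypothetical placeholders you’d swap for your own page’s details:

```html
<!-- Hypothetical JSON-LD example: schema.org Article markup embedded in a page.
     The headline, author, and topic values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "A Beginner's Guide to Semantic Search",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "about": {
    "@type": "Thing",
    "name": "Search engine optimization",
    "sameAs": "https://en.wikipedia.org/wiki/Search_engine_optimization"
  }
}
</script>
```

The `sameAs` link is the linked open data piece: by pointing at an unambiguous, well-known entity (here, a Wikipedia page), you tell the crawler exactly which concept your content is about instead of leaving it to guess.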

So if you’re new to semantic search, or you’re just getting started with it and you want to set yourself up for success, I have a course that will give you the tools you need to get started. You don’t have to be an expert and you don’t need any coding background; we walk you through the process as we go.

So whether you’re a long-time SEO or whether you’re just getting started, this course has actionable insights for you. Because you’re a YouTube watcher, you can get this course at 25% off by using the code SEMANTICSEO and you can sign up at

Thanks again for watching this video.

If you’ve got any questions on MUM and the impact that you might see in the future, or you want to add any insight to this video, please comment below. We’d love to continue that conversation with you.