Did MUM Just Kill BERT?

Google's Multitask Unified Model may replace BERT

Google’s new Multitask Unified Model (MUM) could revolutionize question answering and search.

We’ve argued repeatedly that consumer experience with Google internet search drives expectations for enterprise search and other applications (shout-out to Bing for also doing a good job). The expected experience has morphed from keyword search returning a list of results to question answering systems (“how tall is Mt. Fuji?”).

This has forced a rethink of enterprise and other search application architectures: beyond content ingestion and search engine indexing, they now need AI technologies like machine learning and natural language processing to better understand user queries and extract answers from content stored in knowledge graphs or authoritative documents like FAQs. This is where Google’s BERT (Bidirectional Encoder Representations from Transformers) and its offshoots have played a big role.
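The answer-extraction step above can be sketched in a few lines. What follows is a toy, purely illustrative example (the FAQ entries and the keyword-overlap scoring are assumptions; a real system would use a trained language model like BERT to match questions to answers):

```python
import re

# Hypothetical in-memory FAQ store standing in for "authoritative documents".
FAQ = {
    "How tall is Mt. Fuji?": "Mt. Fuji is 3,776 meters (12,389 feet) tall.",
    "Where is Mt. Fuji located?": "Mt. Fuji is on the island of Honshu, Japan.",
}

def tokenize(text):
    """Lowercase the text and return its set of alphanumeric terms."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def best_answer(query):
    """Return the answer whose FAQ question shares the most terms with the query."""
    q_terms = tokenize(query)
    score, answer = max(
        (len(q_terms & tokenize(question)), answer)
        for question, answer in FAQ.items()
    )
    return answer if score > 0 else None
```

Naive term overlap like this breaks down quickly, which is exactly why transformer models that understand query intent became central to the architecture.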

But what if the answer you are looking for doesn’t already exist in one place? Or what if the question is much more complex?
“I’ve hiked Mt. Adams and I am hiking Mt. Fuji in the Fall. What should I do differently to prepare?”

Unless someone has already answered this exact question, getting an answer will probably require multiple Google searches and sifting through a lot of information. But what if a search bar could work out the answer with a single query?

That is the problem the folks at Google are researching, as described in this very interesting blog post by Pandu Nayak (Google Fellow and VP of Search). The post introduces MUM (Multitask Unified Model) as a likely successor to BERT (OK, not by assassination). I highly recommend reading the short piece, but MUM’s most exciting developments can be summarized as follows:

  • MUM uses the T5 text-to-text framework and is 1,000 times more powerful than BERT
  • MUM is trained across 75 different languages and many different tasks at once
  • MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio

The solution architecture we use for most clients is very flexible, so we look forward to exploring what we can do with MUM. BERT will just have to stay at home.

If you have any questions about BERT, or MUM, or anything else related to your next search application project, drop me a note at info@pureinsights.com.



