“It’s not expected to become this question-answering system,” Nayak told Search Engine Land, adding that such a system is “simply not useful” for complex demands. For the past two decades, search engines have mostly operated in the same way. They’ve gotten better at detecting intent, surfacing relevant results, and blending different verticals (such as image, video, or local search). But the principle is still the same: type in a text query, and the search engine will return a mix of organic links, rich results, and advertisements. Recent advances, such as BERT, have improved search engines’ language processing capabilities, allowing them to better interpret queries and offer more relevant results. Google recently introduced its Multitask Unified Model (MUM), a technology it says is 1,000 times more powerful than BERT. According to Google, it combines multitasking and multimodal input capabilities with language understanding. In an interview with Search Engine Land, Pandu Nayak, Google’s vice