It seems fitting that one of Google’s most important inventions — one that would come back to haunt the company — was initially devised over lunch.
In 2017, researchers at Alphabet’s Mountain View, California, headquarters were talking over their midday meal about how to make computers generate text more efficiently. Over the next five months they ran experiments and, not realizing the magnitude of what they’d discovered, wrote their findings up in a research paper called “Attention is All You Need.” The result was a leap forward in AI.
The paper’s eight authors had created the Transformer, a system that made it possible for machines to generate humanlike text, images, DNA sequences and many other kinds of data more efficiently than ever before. Their paper would eventually be cited more than 80,000 times by other researchers, and the AI architecture they designed would underpin OpenAI’s ChatGPT (the “T” stands for Transformer), image-generating tools like Midjourney and more.