Innovation

Why Google Wants You to Smell This Article with AI

SNIFF TEST

It’s one step closer to a real-life smell-o-vision.

Photo Illustration by Elizabeth Brockway/The Daily Beast/Getty

Of all of your bodily senses, there’s perhaps none more powerful than smell. Catch a whiff of cologne or perfume from an old flame. Or the pages of a book from when you first fell in love with reading. Or freshly cut grass on a Saturday morning. All of a sudden, you’re taken back to the time and place where you first experienced that scent.

That’s because your olfactory bulb, the part of your brain that processes smell, sits very close to the region responsible for memory. That makes it incredibly powerful at triggering strong emotions and memories from deep in your past.

As powerful as it is, though, smell is a sense that has long perplexed—and even eluded—scientists. We still don’t have a complete understanding of why molecules create the smells that they do. So, unlike sights and sounds, we’ve been unable to digitize smells: to literally turn them into a digital format made of numbers.


Think about that cologne, or book, or freshly cut grass again. Digitizing them would mean turning those smells into strings of numbers and storing them on your computer so you can experience them again.

“Your phone can easily share images and sound,” Joel Mainland, an olfactory researcher at the Monell Chemical Senses Center in Philadelphia, Pennsylvania, told The Daily Beast. “You can archive them and look at them over and over again without destroying them—but you can’t really do that with smells.”

Being able to do so would be a fragrant revolution in the world of neuroscience. Aside from creating an actual smell-o-vision by allowing people to share smells with one another in a digital format, it could lead to the development of new odors and flavors for food, perfumes, and mosquito repellents; it could even help identify diseases.

Researchers at Monell and Osmo, a Google-backed olfactory AI startup, have created a machine-learning model that can accurately predict what a chemical smells like from its molecular structure. The team found that the AI could describe scents more accurately than trained humans. This breakthrough, detailed in the journal Science on Thursday, is a major step forward in the quest to digitize smells.

“We’d like to do things like record smells and share them,” said Mainland, who is a co-senior author on the paper. “We’d also like to be able to specify odors so you can do things like disease diagnosis, quantify flavor, disrupt [bad odors], identify molecules that represent odors that have never been synthesized before, and optimize molecules in current flavors and fragrances to be more environmentally friendly.”

The model was trained on a dataset containing the molecular structures and scent descriptors of 5,000 odorants. Researchers wanted to be able to essentially prompt the model with a molecule’s shape and have it describe how that molecule smells.
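Whatever the model’s internals, the basic contract is easy to sketch in code: turn a molecular structure into numbers, then predict a rating for each descriptor word. The example below is a stand-in, not the paper’s model; the SMILES strings, the shortened descriptor list, the made-up ratings, and the fingerprint-plus-random-forest pipeline are all placeholder assumptions used purely for illustration.

```python
# Illustrative sketch only: this stand-in uses RDKit fingerprints plus a
# random-forest regressor to show the mapping the article describes
# (molecular structure in, odor-descriptor ratings out).
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestRegressor

DESCRIPTORS = ["mint", "ozone", "garlic", "musty"]  # the real panel used 55 such words


def featurize(smiles: str) -> np.ndarray:
    """Turn a molecule's structure (given as a SMILES string) into a fixed-length fingerprint."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    return np.array(fp)


# Placeholder training data: each molecule is paired with panel-style ratings (1-5)
# for every descriptor. The actual dataset holds ~5,000 labeled molecules.
train_smiles = ["CCO", "CC(=O)OC1=CC=CC=C1C(=O)O", "C1=CC=C(C=C1)C=O"]
train_ratings = np.array([
    [1.2, 1.0, 1.1, 1.5],
    [1.0, 1.3, 1.0, 2.0],
    [1.1, 1.0, 1.2, 1.4],
])

X = np.stack([featurize(s) for s in train_smiles])
model = RandomForestRegressor(n_estimators=200).fit(X, train_ratings)

# "Prompt" the model with a new molecular structure and get back a predicted smell profile.
query = featurize("CC(=O)C")
predicted = model.predict(query.reshape(1, -1))[0]
print(dict(zip(DESCRIPTORS, predicted.round(2))))
```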

The team then gave a group of 15 human participants a set of 400 odorants and a set of 55 words to describe them. The words included things like mint, ozone, garlic, and musty. The participants were trained to recognize the smells and select the most appropriate descriptors.

Both the panelists and the AI were tasked with rating each descriptor for an odor on a 1-to-5 scale. For example, a participant might describe a molecule as very garlicky (5) and somewhat sweet (3). In the end, the AI’s descriptions landed closer to the panel’s average than the typical panelist’s did, outperforming the median participant on 53 percent of the molecules tested.
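One way to keep score in a comparison like that, assuming ratings shaped like the ones described above, is to check for each odor whether the model’s descriptor profile lands closer to the panel’s average than an individual panelist’s does, and then count how often the model wins. The numbers below are fabricated; only the bookkeeping is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated stand-in ratings: 10 odors x 4 descriptors, scored 1-5 by 15 panelists
# and once by the model. The study itself used 400 odorants and 55 descriptors.
n_odors, n_desc, n_panelists = 10, 4, 15
panel = rng.integers(1, 6, size=(n_panelists, n_odors, n_desc)).astype(float)
model_pred = rng.uniform(1, 5, size=(n_odors, n_desc))

wins = 0
for i in range(n_odors):
    # Distance from the model's profile to the full panel's mean profile for this odor.
    model_err = np.linalg.norm(model_pred[i] - panel[:, i].mean(axis=0))
    # Each panelist is compared with the mean of the *other* panelists (leave-one-out),
    # then the group is summarized by the median panelist's distance.
    panelist_errs = [
        np.linalg.norm(panel[p, i] - np.delete(panel[:, i], p, axis=0).mean(axis=0))
        for p in range(n_panelists)
    ]
    if model_err < np.median(panelist_errs):
        wins += 1

print(f"Model beat the median panelist on {wins}/{n_odors} odors")
```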

“This is something that was developed to better represent molecules,” Mainland explained. “Combining the larger dataset and the new model architecture really improved performance to the point where we could now replace a person on our panel with a computer and improve our ability to describe smells.”

Mainland added that the end goal is to eventually identify “primary odors,” the same way the images on our digital screens are made up of three primary colors: red, green, and blue. If we find these primary odors, we could mix and match them to digitally create pretty much any smell we want.
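If such primary odors exist, recreating a target smell becomes a mixing problem, much like matching a color by blending red, green, and blue. The sketch below treats each odor as a vector of descriptor ratings and solves for non-negative mixing weights; the “primaries” and the target profile are invented purely for illustration.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical "primary odors," each expressed as a profile over four descriptors
# (mint, ozone, garlic, musty). Real primaries, if they exist, haven't been identified yet.
primaries = np.array([
    [5.0, 1.0, 1.0, 1.0],   # minty primary
    [1.0, 5.0, 1.0, 1.0],   # ozonic primary
    [1.0, 1.0, 5.0, 2.0],   # garlicky primary
]).T  # after transposing: rows are descriptors, columns are primaries

target = np.array([3.0, 1.5, 2.5, 1.3])  # the smell profile we want to recreate

# Solve target ~ primaries @ weights with non-negative weights, i.e. how much of
# each primary to "mix in" -- analogous to picking RGB values to match a color.
weights, residual = nnls(primaries, target)
print("mixing weights:", weights.round(3), "residual:", round(residual, 3))
```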

“To really digitize odors, we need to understand this mixture piece,” Mainland said. “Then you can find out primary odors to understand how to make a huge variety of other odors.”

There’s still a long way to go before we start emailing each other smells. However, this work is a big step in that direction—and lets us catch a whiff of the future of digital smells.
