
AI-Powered Electronic Tongue Successfully Distinguishes Coca-Cola from Pepsi in New Study


Can an AI-powered electronic tongue tell the difference between Coke and Pepsi? A team of scientists says it has built one that can.

A new graphene electronic tongue can distinguish between similar liquids, such as milk with different water content, and identify different products, such as various sodas and coffee blends. It can even detect spoilage in fruit juices and flag potential food safety issues. Researchers at Penn State found that the electronic tongue’s accuracy improved when it was paired with artificial intelligence that learns its own way of interpreting the sensor data.

Graphene is a single layer of carbon atoms arranged in a honeycomb lattice structure. It’s considered one of the strongest materials known, yet it’s incredibly thin and flexible.

The scientists explained that graphene has long been sought after as a chemical sensor, but minute differences between individual devices made it unreliable. The team behind the electronic tongue says it solved this problem by “training” an AI to tell similar liquids apart regardless of variations between graphene devices. They hope the work shows that “imperfect” chemical sensors can still deliver accurate readings, and that the tongue will eventually help detect problems with food.
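To make that idea concrete, here is a minimal, hypothetical Python sketch of the general approach: a classifier is trained on readings pooled from several simulated sensor devices and then tested on a device it has never seen. The device model, the per-channel standardization, and every number below are illustrative assumptions, not details from the Penn State study.

    # Hypothetical sketch: train on readings from several simulated "graphene
    # sensor" devices, then test on an unseen device, to illustrate tolerating
    # device-to-device variation. All values are made up for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_devices, n_samples, n_channels = 5, 200, 8

    # Two "liquids" differ only by a small shift in the channel response.
    liquid_signatures = {0: 0.0, 1: 0.4}

    def simulate_device():
        offset = rng.normal(0, 0.5, n_channels)   # fabrication variation
        gain = rng.normal(1.0, 0.1, n_channels)
        labels = rng.integers(0, 2, n_samples)
        shifts = np.array([liquid_signatures[y] for y in labels])[:, None]
        noise = rng.normal(0, 0.2, (n_samples, n_channels))
        return gain * (shifts + noise) + offset, labels

    # Train on four devices, hold out the fifth entirely.
    train_X, train_y = [], []
    for _ in range(n_devices - 1):
        X, y = simulate_device()
        # Per-device standardization removes most of the offset/gain drift.
        train_X.append(StandardScaler().fit_transform(X))
        train_y.append(y)
    X_test, y_test = simulate_device()
    X_test = StandardScaler().fit_transform(X_test)

    clf = LogisticRegression(max_iter=1000)
    clf.fit(np.vstack(train_X), np.concatenate(train_y))
    print("accuracy on unseen device:", clf.score(X_test, y_test))

The per-device normalization step is one simple way a model can learn differences between liquids while shrugging off differences between devices, which is the spirit of what the researchers describe, though their actual method is not shown here.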

“We’re trying to make an artificial tongue, but the process of how we experience different foods involves more than just the tongue,” said corresponding author Saptarshi Das, the Ackley Professor of Engineering and professor of engineering science and mechanics. “We have the tongue itself, consisting of taste receptors that interact with food species and send their information to the gustatory cortex — a biological neural network.”

The gustatory cortex is the region of the brain that perceives and interprets tastes beyond what the taste receptors alone can sense; the receptors primarily sort foods into the five broad categories of sweet, sour, bitter, salty and savory. As the brain learns the nuances of these tastes, it becomes better at telling subtle flavors apart. To imitate the gustatory cortex artificially, the researchers developed a neural network, a machine learning algorithm that mimics how the human brain assesses and interprets data.
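As a rough, hypothetical sketch of that idea, not the team’s actual architecture, a small feedforward neural network can be trained to map sensor feature vectors to the five broad taste categories; the features, labels and layer sizes below are invented for illustration.

    # Hypothetical "artificial gustatory cortex": a small neural network maps
    # sensor feature vectors to one of five broad taste categories.
    # Data, labels and sizes are made up; the study's architecture differs.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)
    tastes = ["sweet", "sour", "bitter", "salty", "savory"]
    n_per_class, n_features = 100, 16

    # Each taste gets a made-up characteristic response pattern, plus noise.
    prototypes = rng.normal(0, 1, (len(tastes), n_features))
    X = np.vstack([p + rng.normal(0, 0.3, (n_per_class, n_features))
                   for p in prototypes])
    y = np.repeat(tastes, n_per_class)

    # A small feedforward net stands in for the "biological neural network."
    net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000,
                        random_state=0)
    net.fit(X, y)
    print(net.predict(X[:1]))   # e.g. ['sweet']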

“Previously, we investigated how the brain reacts to different tastes and mimicked this process by integrating different 2D materials to develop a kind of blueprint as to how AI can process information more like a human being,” said co-author Harikrishnan Ravichandran, a doctoral student in engineering science and mechanics advised by Das. “Now, in this work, we’re considering several chemicals to see if the sensors can accurately detect them, and furthermore, whether they can detect minute differences between similar foods and discern instances of food safety concerns.”