
Mice: The unusual heroes that fight fake news

5 minute read
By Gabriela Patrón


Thanks to the internet, we have a vast amount of information at our fingertips. From magazines and books to the latest news, a wealth of content is available to inform us and teach us more about the world around us. But not all the information shared on social networks, forums, and platforms is true. Fake news is a clear example of this overabundance of information.

This digital phenomenon has grown in recent years, thanks to social media. Fake news masquerades as real content while propagating harmful information, often with the intent of damaging a politician’s reputation or promoting hatred against a particular group.

The spread of this content is accelerated by distribution over social networks like Facebook, Instagram, and WhatsApp, as well as by the rise of so-called “post-truth.” According to Oxford’s definition, post-truth arises when “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” In other words, people allow themselves to be guided more by their emotions than by the facts. Fake news targets exactly this human discontent, creating content that amplifies negative sentiment.

Deepfakes: a new breed of fake news

In the majority of cases, fake news is published as informative text. But with technological advances, it has become easier and easier to falsify video and audio files. Using Artificial Intelligence (AI), unscrupulous individuals can trick users by creating deepfakes in which famous people appear to say whatever the creators want.

According to The Economist, the term “deepfake” was coined on Reddit, when false sex videos of celebrities went viral. The term is a fusion of two words: “deep,” referring to the deep learning techniques that let AI recognize human facial gestures, expressions, and movements from millions of photographs; and “fake,” which signals the falsehood.

The reality is that deepfakes are increasingly difficult to spot, thanks to the impressive quality of the video produced. Easily recognizable people such as politicians and actors tend to be the prime targets for this type of cybercrime: thousands of public video and audio samples of them exist on the internet, which gives a deepfake algorithm more data from which to create false content that seems very true to life. The news platform BuzzFeed demonstrated this technology’s danger by publishing a video of Obama appearing to insult the current US president.

The fight against deepfakes

A research team from the Institute of Neuroscience at the University of Oregon determined that mice could be great allies in the fight against these falsifications. Their findings demonstrate that these rodents’ auditory system can detect modified audio.

“We believe that mice are a promising model to study complex sound processing,” explains Jonathan Saunders, the study’s lead researcher, in a press release. “Studying the computational mechanisms by which the mammalian auditory system detects fake audio could inform next-generation, generalizable algorithms for spoof detection.”

While mice cannot identify words the way human beings do, they can detect subtle errors in speech, for example, one phoneme swapped for another. By studying how mice learn to make these distinctions, researchers hope to derive an algorithm that allows computers to spot a deepfake.
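To make the idea concrete, here is a minimal sketch (in Python, using only NumPy and scikit-learn) of how audio spoof detection can be framed as a classification problem. Everything in it is a toy assumption: the “real” and “fake” clips are synthetic tones, and a subtle pitch drift stands in for the phonetic artifacts the researchers describe. It is not the Oregon team’s actual method, only an illustration of the general approach.

```python
# Illustrative sketch only (not the Oregon team's pipeline): audio spoof
# detection framed as a binary classifier over spectral features.
# All data is synthetic; "fake" clips get a subtle pitch drift that
# stands in for the phonetic artifacts described in the article.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz

def make_clip(fake: bool) -> np.ndarray:
    """One second of a vowel-like harmonic tone; fakes drift in pitch."""
    t = np.arange(SR) / SR
    f0 = rng.uniform(100.0, 200.0)           # fundamental frequency
    drift = 0.03 * f0 * t if fake else 0.0   # the tell-tale artifact
    sig = sum(np.sin(2 * np.pi * k * (f0 + drift) * t) / k for k in (1, 2, 3))
    return sig + 0.05 * rng.standard_normal(SR)  # light background noise

def features(clip: np.ndarray) -> np.ndarray:
    """Two crude spectral features: centroid and bandwidth."""
    mag = np.abs(np.fft.rfft(clip))
    freqs = np.fft.rfftfreq(len(clip), 1.0 / SR)
    centroid = (freqs * mag).sum() / mag.sum()
    bandwidth = np.sqrt((((freqs - centroid) ** 2) * mag).sum() / mag.sum())
    return np.array([centroid, bandwidth])

labels = np.array([i % 2 for i in range(400)])  # 0 = real, 1 = fake
X = np.array([features(make_clip(bool(y))) for y in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

clf = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

In a real system, the features would of course come from genuine and manipulated speech recordings rather than synthetic tones, and the classifier would be far more sophisticated; the sketch only shows the shape of the problem.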

Large companies like Facebook are investing hefty sums of money into eliminating this type of content from their platforms. A few days ago, Google published over 3,000 deepfakes so that specialists can use them to create new detection algorithms. Another effective approach is to empower news channels and technology specialists to eliminate false content and offer real facts. That way, readers will be able to inform themselves accurately and draw their own conclusions.
