Climate misinformation may be thriving on YouTube
Of 200 videos analyzed, a majority supported views not upheld by science
By Sujata Gupta
Beware what you view about climate change on YouTube. Some climate-change skeptics have hijacked common climate-related search terms on the video-sharing site, a social scientist now warns. That lets them deceive viewers. He urges other scientists to respond by getting accurate info about their work onto the site. Indeed, he’d like them to flood YouTube with scientifically accurate content.
Facebook and Twitter often get the most attention when it comes to concerns over fake news. But YouTube is also hugely popular, says Joachim Allgaier. YouTube claims to reach some 2 billion users each month. That’s nearly half of all internet users. That reach makes the site a powerful communication tool, the social scientist notes. He works at RWTH Aachen University in Germany. His work focuses on how people communicate science online.
Allgaier started out studying science-themed music videos on YouTube. “I was amazed by the creativity,” he says. He found several on Darwin’s theory of evolution. There’s even a song about the periodic table by the band They Might Be Giants. But in 2012, Allgaier became disturbed by music videos that attacked well-established science. Some questioned whether human activities are driving climate change. Others challenged the value of drugs to treat cancer. Still others attacked the safety of vaccines.
Allgaier decided to take a closer look at the climate videos. His new analysis appears July 25 in Frontiers in Communication.
He started by searching YouTube for 10 different terms. These included “climate change,” “global warming” and “climate science.” He also searched for “climate manipulation” and “geoengineering.” Those last two refer to emerging ideas for how to cool the Earth. One such approach would add tiny particles high in Earth’s atmosphere. There, they would block some of the sun’s energy. That cooling could offset some global warming. (It also, however, might affect the planet in some yet-unknown ways.)
Earlier internet searches can shape the results of later ones. So Allgaier obscured his computer’s address, location and search history. Then, he picked the top 20 videos for each of the 10 terms.
Of these 200, 89 supported the idea that human activities affect climate change. Ample science supports this idea. Four videos featured neutral debates between scientists and climate-change skeptics. (Scientists far outnumber skeptics. So videos presenting them in equal numbers would be misleading.) Sixteen videos denied that humans are causing climate change.
There also were 91 videos that promoted conspiracy theories. The most notorious of these theories centers on so-called chemtrails. Some people believe politicians or government agents have been spreading toxic chemicals through airplane condensation trails, which they call “chemtrails.” It’s an idea unrelated to climate change. And it’s not supported by science.
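For readers who want to check the arithmetic, here is a minimal Python sketch. It is not code from the study itself, just an illustration built from the category counts reported above. It tallies the 200 videos and shows that the 107 videos denying human-caused warming or pushing conspiracy theories make up the majority mentioned in the subheadline.

```python
# Illustrative tally of the 200 sampled videos, using the category
# counts reported in this article (not Allgaier's actual analysis code).
video_counts = {
    "supports human-caused climate change": 89,
    "neutral scientist-vs-skeptic debate": 4,
    "denies human-caused climate change": 16,
    "promotes conspiracy theories (e.g., chemtrails)": 91,
}

total = sum(video_counts.values())  # 200 videos in all
not_science_based = (video_counts["denies human-caused climate change"]
                     + video_counts["promotes conspiracy theories (e.g., chemtrails)"])
share = not_science_based * 100 / total  # 107 * 100 / 200 = 53.5

print(f"Total videos analyzed: {total}")
print(f"Videos at odds with established science: {not_science_based} ({share}%)")
```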
The dangers
The words someone puts into the YouTube search engine matter. Common terms like “climate change” and “global warming” typically lead to accurate videos, Allgaier found. But newer terms like “geoengineering” and “climate modification” yield different results. They lead to those chemtrail videos almost 93 percent of the time.
Some geoengineering ideas are perfect fuel for conspiracies. They feed into suspicions that a government could be doing bad things and not telling the public. Geoengineering is also a fairly new research area. And the scientists and engineers who work in it do not often show up on YouTube, notes Allgaier.
All this has let conspiracy theorists hijack terms related to the technology, he says. One tactic they use is mirroring. This is when followers upload the same video to many YouTube channels. They then tag each copy with different keywords so their content dominates the site’s search results.
Another tactic makes it easy for people to find links to the conspiracy videos during searches. It relies on what’s known as search engine optimization, which means using some of the most commonly searched-for terms. Yet another approach is to comment on legitimate science videos. Those comments then link to conspiracy content.
One 2018 survey by the Pew Research Center in Washington, D.C., found that 21 percent of U.S. adults (more than one in every five) get news from YouTube. In Germany, another survey looked at people aged 14 to 29. Of those, 70 percent reported using YouTube and other online video sites to learn about science. Based on such stats, Allgaier worries that many people could become misinformed about climate science from YouTube videos.
Scientists and science communicators can reclaim terms that have been hijacked, Allgaier says. To do so, they may even need to mimic those conspiracy tactics. If researchers remain silent, they risk losing control of information about their work to the conspirators. “It’s necessary to take action,” he says.