November 13, 2024

Biased or radical content on YouTube; is AI the solution?



How serious a problem is biased or radical content on YouTube? Education and AI provide the fix

How serious is the problem of YouTube acting as a medium for promoting radical or extremist content? Researchers have been investigating.

"We find little evidence that the YouTube recommendation algorithm is driving attention to," FarRight and anti woke content, finds a study.

"Data suggests that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content," finds another study,

So does that prove there is no link between YouTube and extremism? Well, let's not go so fast; there is nuance.

Look at the comments

Play this game: watch a video expressing views with which you strongly disagree. You may find the process quite upsetting. But then look at the comments; chances are you will find very few dissenting views, rather a small army of people agreeing with each other. Few things are more depressing than being surrounded by people who hold views that directly contradict your own.

Unconscious bias

There is another problem. And prepare yourself for a shock: not all content that is presented as factual turns out to be accurate.

And perhaps a more serious problem than radical content is content that contains errors, or that unintentionally promotes a particular political leaning, playing to the unconscious biases of the video's presenters.

Take as an example the YouTube channel Economics Explained. Up to a point, it does a reasonable job of explaining economics. But it comes with a twist: there does seem to be an agenda to promote certain political leanings. In this video, it suggests that the Netherlands is the most economically unequal nation on Earth; so much, then, for liberal politics. It is just that, as this video explains, the claim is not true.

Or take the hyperinflation discussion. Economics Explained provides a compelling list of reasons why hyperinflation is nigh. And as a general rule, the 'inflation is coming' meme is often associated with right-wing thinking. But as this video points out, once again, many of the arguments made by Economics Explained are highly debatable.

Here is the problem. Of the several million people who subscribe to the Economics Explained channel, very few are aware of any controversy surrounding its claims.

And that in part is where the danger lies.

Homa Hosseinmardi, Amir Ghasemian, and Aaron Clauset examined that question in a paper published in August 2021, entitled 'Examining the consumption of radical content on YouTube'.

Its conclusion: "We find that news consumption on YouTube is dominated by mainstream and largely centrist sources."

"Consumers of far-right content, while more engaged than average, represent a small and stable percentage of news consumers."

"However, consumption of 'anti-woke' content, defined in terms of its opposition to progressive intellectual and political agendas, grew steadily in popularity and is correlated with consumption of far-right content off-platform."

"We find no evidence that engagement with far-right content is caused by YouTube recommendations systematically, nor do we find clear evidence that anti-woke channels serve as a gateway to the far right. Rather, consumption of political content on YouTube appears to reflect individual preferences that extend across the web as a whole."

Another paper, published in 2020 by Mark Ledwich and Anna Zaitsev, concludes:

"After categorizing nearly 800 political channels, we were able to differentiate between political schemas to analyse the algorithm traffic flows out and between each group."

"After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims."

"To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favour mainstream media and cable news content over independent YouTube channels with slant towards left-leaning or politically neutral channels."

"Our study thus suggests that YouTube's recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets."

The catch — the echo chamber still exists

But the two papers concerned were focusing on YouTube's recommendation algorithm. That is not the same thing as saying YouTube doesn't encourage extremist views. The distinction may be subtle, but it is important.

In another paper, published in November 2020 and partly written by the three authors of the 'Examining the consumption of radical content on YouTube' paper, the authors conclude:

"We also find evidence for a small but growing 'echo chamber' of far-right content consumption. Users in this community show higher engagement and greater "stickiness" than users who consume any other category of content. Moreover, YouTube accounts for an increasing fraction of these users' overall online news consumption."

The long tail

In theory, there has never been a time in human history when we have been so wise. Nearly all of us carry around in our pockets or handbags access to the greatest library that has ever existed. In times gone by, learned men and women spent thousands of hours in libraries poring over documents, and few doubted that they possessed enquiring minds. Today, we all spend thousands of hours in this massive library, and yet enquiring minds, it appears, have to a large extent gone absent without leave.

And YouTube is an incredible resource. Let's say you want to know about the fall of the Roman Empire, the city of Babylon or the discovery of DNA, or to investigate the careers of West Ham's best football players: the material is there for you.

Yet most of us don't use it that way. Maybe we will drill down into a particular topic we find interesting, but most of the time our YouTube viewing history is not so educational. We may watch cats being cute, and there is nothing wrong with that, but a broader range of interests would be good.

If we choose to watch fun videos on YouTube with no educational value, there is no harm done.

Maybe the real harm is done when we watch content with some educational element but lack the broader base of knowledge to tell when that content is misleading or inaccurate.

We all suffer from confirmation bias, and YouTube provides the greatest opportunity in history to feed our biases with videos that in some cases make unintentional errors and in other cases deliberately distort the truth.

Whatever your thing, whether it is gaining a better understanding of the Wars of the Roses, fixing bicycles or the influx of immigrants, the information is there in vast variety. And if your passion is medieval wars or the workings of bikes, then that is great. If, on the other hand, your passion is more political in nature, then the odds are that your YouTube viewing will lead you to ever more exaggerated versions of the views you already hold.

So, what is the answer?

We are not arguing for censoring YouTube, or at least only for censorship in extreme circumstances.

The solution probably lies with education and teaching critical thinking skills.

AI could provide a partial solution. For example, algorithms could push us towards watching diverse views on a topic, such as a recommender that automatically surfaces videos presenting an opposing point of view. This may help to offer a more complete understanding of a particular subject.
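
To make the suggestion concrete, here is a minimal sketch in Python of such a re-ranker, using a hypothetical `stance` label and made-up videos; it is an illustration of the idea, not a description of how YouTube's recommender actually works.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    relevance: float   # how closely it matches the viewer's interests (0-1)
    stance: str        # hypothetical label: "left", "right" or "neutral"

def diversify(candidates, viewer_stance, opposing_share=0.3, k=10):
    """Return k recommendations, reserving a share of slots for videos
    whose stance differs from the viewer's usual leaning."""
    opposing = sorted((v for v in candidates if v.stance not in (viewer_stance, "neutral")),
                      key=lambda v: v.relevance, reverse=True)
    familiar = sorted((v for v in candidates if v.stance in (viewer_stance, "neutral")),
                      key=lambda v: v.relevance, reverse=True)
    n_opposing = max(1, int(k * opposing_share))
    picks = opposing[:n_opposing] + familiar[:k - n_opposing]
    return sorted(picks, key=lambda v: v.relevance, reverse=True)

# Example: a viewer who usually watches left-leaning content.
candidates = [
    Video("Why policy X works", 0.9, "left"),
    Video("The case against policy X", 0.7, "right"),
    Video("Policy X explained", 0.8, "neutral"),
]
for v in diversify(candidates, viewer_stance="left", k=3):
    print(v.title)
```

The `opposing_share` parameter is the editorial dial: how much challenge to inject relative to pure relevance.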

Imagine, a decade or two from now, that our smartphones have mutated into always-with-us digital assistants, their screens forming part of contact lenses and their voice interfaces working via subtle earpieces, so that the device whispers into our ear. It may whisper travel directions: turn right at the next junction. It may whisper radical content, and that would be terrifying. Or it may challenge us and offer different views. So, if a politician states a bare-faced lie, our AI assistant whispers into our ears, 'these statements are controversial, and some believe …'

What we can say for sure is that however we consume digital content today, it will be different in five years, more different in ten years and a good deal more different in 20 years. The danger that we will be fed views that play to our biases is real, but bias in the digital world is now a recognised issue and something that is being addressed.

Education in how to consume online content and how to build our critical thinking abilities is lacking at the moment. So perhaps AI that nudges us to question what we see and hear might be our best short-term solution, helping us to avoid, or at least question, the vortex of our own biases.
