Category: Civic Participation

The Importance of Digital Skills for Countering Disinformation

25/05/2025 

Main author: Sally Heier

Contributing researcher and editor: Dr Mikael Leidenhag

 

Background

Disinformation, the intentional spread of false or misleading information in order to deceive, is often referred to as “fake news” or conflated with “misinformation” (Githaiga, 2019). The issue has become increasingly pressing, though its full extent remains unclear. Although disinformation is not a recent phenomenon, today’s digital landscape enables it to spread far more rapidly than earlier media environments did (Githaiga, 2019; Maréchal et al., 2020; Law, 2017).


Unchecked disinformation can have widespread repercussions, influencing public health decisions and voting behaviour (McGrew and Kohnen, 2024). It also intersects with harmful practices such as racism, misogyny, and the exploitation of vulnerable populations, sowing further confusion among users and amplifying its impact (McDougall, 2019).


Challenges in Countering Disinformation

Speed: The internet enables disinformation to be created and disseminated rapidly and effortlessly, at a pace to which we have yet to adapt (McDougall, 2019; Maréchal et al., 2020; McGrew and Kohnen, 2024). Disinformation spreads fastest in its initial phase, when it is most viral and shared far more frequently than at later stages. Some interventions, such as fact-checking, arrive too slowly to diminish engagement with disinformation during these highly viral stages (Chuai et al., 2024).
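To see why slow interventions miss most engagement, consider a minimal illustrative model in which a post’s share rate decays exponentially after publication. The half-life and timings below are hypothetical, chosen only to make the asymmetry concrete; they are not figures from Chuai et al. (2024).

```python
# Minimal illustrative model: if a post's share rate decays exponentially,
# most engagement happens before a slow intervention arrives.
import math

def shares_before(t_hours: float, half_life_hours: float = 6.0) -> float:
    """Fraction of all eventual shares that occur before time t, assuming
    the share rate decays exponentially with the given half-life."""
    decay = math.log(2) / half_life_hours
    return 1 - math.exp(-decay * t_hours)

# Hypothetical numbers: with a 6-hour half-life, a fact-check published
# 24 hours after posting misses roughly 94% of all shares.
print(f"{shares_before(24):.0%} of shares happen before a 24-hour fact-check")
```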


Platform Algorithms: Social media platforms facilitate the dissemination of disinformation (Githaiga, 2019; Law, 2017). Competing for users' attention, these platforms promote highly personalised and often controversial content, typically in the form of targeted advertising, to maximise engagement (Matschke et al., 2021). This targeted-advertising business model is driven by the platforms’ internal algorithms, which can aid the spread of disinformation (Maréchal et al., 2020). The mechanisms behind these algorithms are rarely transparent, making it difficult to predict which content will be promoted (Maréchal et al., 2020).
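Because the actual ranking systems are opaque, the following toy sketch can only illustrate the underlying incentive: when a feed is ordered purely by predicted engagement, accuracy plays no part in the objective, so provocative falsehoods can outrank measured reporting. All names, numbers, and weights here are invented for illustration and do not describe any real platform’s algorithm.

```python
# Toy sketch of engagement-based feed ranking. Real platform algorithms are
# proprietary and far more complex; this only illustrates the incentive.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float   # platform's engagement predictions
    predicted_shares: float
    is_accurate: bool         # invisible to the ranking objective

def engagement_score(post: Post) -> float:
    # Accuracy plays no role in the score; only predicted engagement does.
    return post.predicted_clicks + 3.0 * post.predicted_shares

feed = [
    Post("Measured, accurate report", predicted_clicks=0.10,
         predicted_shares=0.02, is_accurate=True),
    Post("Outrage-bait falsehood", predicted_clicks=0.40,
         predicted_shares=0.25, is_accurate=False),
]
# The falsehood is ranked first because it is predicted to engage more.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.text}")
```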


Digital Literacy: Many individuals lack the essential skills to identify and combat disinformation effectively (Matschke et al., 2021). 


Together, these asymmetries foster a digital environment governed by Brandolini's law: “The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it” (Williamson, 2016).


Strategies for Addressing Disinformation

Addressing and combating disinformation necessitates a holistic approach with complementary measures across various areas, including platform reforms, policy changes, and educational initiatives (McGrew and Kohnen, 2024).


1.    Policy and Regulation (top-down approach)

Privacy laws could be amended to allow users to opt out of targeted advertising and to gain greater control over how their information is collected and shared, while platforms could be mandated to maintain public databases documenting their advertising and algorithmic practices (Maréchal et al., 2020). Additionally, platforms could be required to assess their social impact and human rights risks (Maréchal et al., 2020). Shareholders could be encouraged to take a more active role in platform governance and to hold leadership accountable for the social implications of the business model (Maréchal et al., 2020).


Actors in international cooperation contexts, such as governments, donors, and civil society organisations, should support initiatives that promote democratic and civil rights-based content regulation on digital platforms, for example, through the establishment of advisory councils comprising representatives from civil society (Matschke et al., 2021).

Beyond mainstream digital platforms, the same actors should help provide high-quality journalism, particularly in rural areas, by promoting community media, supporting reporting in minority languages, and facilitating cross-border media (Matschke et al., 2021). Furthermore, universal access to media and the internet must be encouraged, especially to enable such alternative reporting. Achieving this requires investment in digital infrastructure, improved access to it (Matschke et al., 2021), and stronger skills for navigating and creating media.


2.    Platform-Based Solutions (top-down approach)

While the previous strategies rely on legal mechanisms, platforms themselves can also implement processes to mitigate the spread and effects of disinformation. Although fact-checking is often too slow to defuse disinformation in its most viral phase (Chuai et al., 2024), it remains a useful tool when disinformation resurfaces.


Friction refers to behavioural design measures that create obstacles to sharing content, reducing the spread of disinformation by making the process of posting or sharing slower and more deliberate (Jahn et al., 2023). Examples include delaying the completion of an action with a prompt asking the user to reflect; requiring users to solve micro-exams such as quizzes or puzzles before they can complete an action (Jahn et al., 2023); allowing a post to be shared only after it has been read in full; and letting users review what is about to be sent before it is dispatched (House of Commons Digital, Culture, Media and Sport Committee, 2019).
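As a concrete illustration, here is a hypothetical sketch of such a friction flow, combining a read-before-share check, a reflection delay, and a confirmation step standing in for a quiz-style micro-exam. The reading-time heuristic and all thresholds are invented for illustration and do not describe any real platform’s implementation.

```python
# Hypothetical sketch of a "friction" share flow (Jahn et al., 2023).
# Thresholds and heuristics are invented; no real platform works exactly so.
import time

def share_with_friction(post_text: str, seconds_spent_reading: float) -> bool:
    # 1. Read-before-share: estimate a minimum plausible reading time
    #    (here, a rough assumption of ~4 words per second).
    min_read = max(2.0, len(post_text.split()) / 4.0)
    if seconds_spent_reading < min_read:
        print("Please read the post in full before sharing it.")
        return False
    # 2. Reflection prompt with a short enforced delay, making the
    #    action slower and more deliberate.
    print("Before you share: have you checked the source of this claim?")
    time.sleep(3)
    # 3. Explicit confirmation, standing in for a quiz-style micro-exam.
    answer = input("Type 'share' to confirm: ")
    return answer.strip().lower() == "share"
```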


However, introducing friction in this way runs counter to social media platforms’ competition for user attention, since it risks reducing the time users spend on them in the long term.


Digital platforms can also play a role in enhancing digital literacy, for instance by promoting and funding literacy initiatives among vulnerable and hard-to-reach groups (Maréchal et al., 2020). Brief messages on social media reminding users to verify sources and claims can also be beneficial, as they prompt users to practise their media literacy (see below).


3.    Educational Initiatives: Digital Literacy (bottom-up approach)

Digital and broader media literacy programmes are identified as essential tools in combating the negative effects of disinformation. These programmes train individuals of all ages to identify and counteract common disinformation tactics while strengthening critical thinking skills (European Digital Media Observatory, 2024). Digital literacy focuses on building knowledge and understanding of the digital landscape and how it functions (European Digital Media Observatory, 2024), fostering a sceptical resilience when interacting with digital media (McDougall, 2019).

Digital literacy should be integrated into overall media literacy and critical thinking skills, particularly since established media outlets can also propagate disinformation (Githaiga, 2019).


Targeted digital literacy initiatives should be developed for particularly vulnerable populations, especially older people, as well as for users who have previously shared disinformation (Brashier, 2024).

While not sufficient as a standalone solution, media literacy is essential in combating disinformation and its effects by enabling individuals to:


a) Evaluate online information critically


b) Make informed decisions as both citizens and consumers


c) Participate more creatively and fully in the realms of online and offline media (European Digital Media Observatory, 2024).


Given that digital and media landscapes are continuously changing, media literacy should be viewed as a lifelong learning endeavour that requires training and continual practice from both children and adults (European Digital Media Observatory, 2024).


Current UK recommendations and legislation

The House of Commons Digital, Culture, Media and Sport Committee’s (2019) report on disinformation and ‘fake news’ illustrates how a holistic approach to combating disinformation could be structured in the UK. The report places significant emphasis on legal, top-down strategies, aiming to establish a more accountable and transparent framework for tech companies operating in the UK, particularly regarding the protection of user data and the integrity of political processes.


  1. Legal Liability: Social media companies should be held accountable for the content shared on their platforms rather than being allowed to claim the status of neutral platforms. The report proposes a new category of tech company that would bear legal liability for harmful user-posted content, and urges the government to reflect this in forthcoming legislation.

  2. Independent Regulation: An independent regulatory framework for tech companies should be established, including a compulsory Code of Ethics that defines harmful content. An independent regulator with statutory powers should oversee and enforce compliance with this code.

  3. Data Protection: Recommendations include extending privacy safeguards to inferred data and introducing a levy on technology firms to fund regulators such as the Information Commissioner’s Office (ICO).

  4. Public Access to Advertising Data: A searchable database of political advertisements is proposed to make transparent who funds ads and whom they target, thereby improving public awareness of political influence. A minimal sketch of what such a record might contain follows this list.
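The following sketch shows one plausible shape for a record in such a database, together with a simple public-facing query. The field names and the query are assumptions for illustration only; the DCMS report does not specify a schema.

```python
# Hypothetical record for a searchable political-ad database, as the DCMS
# report recommends. Field names are assumptions, not any real schema.
from dataclasses import dataclass, field

@dataclass
class PoliticalAdRecord:
    ad_id: str
    advertiser: str                 # who placed the ad
    funder: str                     # ultimate funding source
    spend_gbp: float
    start_date: str                 # ISO 8601, e.g. "2024-05-01"
    end_date: str
    targeting: dict = field(default_factory=dict)  # e.g. {"region": "Yorkshire"}
    impressions: int = 0

def search_by_funder(records: list[PoliticalAdRecord], name: str) -> list[PoliticalAdRecord]:
    """Simple public-facing query: all ads whose funder matches a name."""
    return [r for r in records if name.lower() in r.funder.lower()]
```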

The Online Safety Act 2023 incorporates several of these recommendations with the aim of enhancing digital safety. The legislation mandates that platforms promptly remove illegal content and limit access to harmful material. Oversight falls to Ofcom, the UK’s communications regulator, which can impose fines of up to £18 million or 10% of a company’s global revenue, whichever is greater, on those who do not comply. Companies will be required to verify user ages, publish risk assessments, and maintain clear reporting systems for incidents of online abuse, bullying, and exposure to content promoting self-harm, violence, or eating disorders (UK Parliament, 2024b).


Nevertheless, the government currently has an inconsistent strategy for addressing digital exclusion (UK Parliament, 2023), and it is projected that basic digital skills will represent the UK’s most significant skills gap by 2030 (UK Parliament, 2024a).

 

References

Brashier, N.M. (2024). Fighting misinformation among the most vulnerable users. Current Opinion in Psychology, 57, p.101813.


Chuai, Y., Tian, H., Pröllochs, N. and Lenzini, G. (2024). Did the Roll-Out of Community Notes Reduce Engagement with Misinformation on X/Twitter? [Online]. Available from: https://arxiv.org/abs/2307.07960


European Digital Media Observatory. (2024). The Importance of Media Literacy in Countering Disinformation. [Online]. Available from: https://edmo.eu/areas-of-activities/media-literacy/the-importance-of-media-literacy-in-countering-disinformation/


Githaiga, G. (2019). Fake news: A threat to digital inclusion? [Online]. Available from: https://waccglobal.org/fake-news-a-threat-to-digital-inclusion/


House of Commons Digital, Culture, Media and Sport Committee. (2019). Disinformation and ‘fake news’: Final Report. [Online]. Available from: https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1791/1791.pdf


Jahn, L., Rendsvig, R.K., Flammini, A., Menczer, F. and Hendricks, V.F. (2023). Friction Interventions to Curb the Spread of Misinformation on Social Media. [Online]. Available from: https://arxiv.org/abs/2307.11498


Law, A. (2017). Post-truth and fake news. Media Education Journal, 61, pp.3–6.


Maréchal, N., MacKinnon, R., and Dheere, J. (2020). Getting to the Source of Infodemics: It’s the Business Model. [Online]. Available from: https://www.newamerica.org/oti/reports/getting-to-the-source-of-infodemics-its-the-business-model/


Matschke, A., Mack, J., Reineck, D. and Zaitoonie, R. (2021). Media and information literacy and digital inclusion. [Online]. Available from: https://akademie.dw.com/en/3-media-and-information-literacy-and-digital-inclusion/a-59767639


McDougall, J. (2019). Editorial: Digital Literacy, Fake News and Education. [Online]. Available from: https://eprints.bournemouth.ac.uk/32314/3/Editorial%20v4%20%281%29.pdf


McGrew, S. and Kohnen, A.M. (2024). Tackling misinformation through online information literacy: Structural and contextual considerations. Journal of Research on Technology in Education, 56(1), pp.1–6.


UK Parliament. (2024a). Digital exclusion in the UK: Communications and Digital Committee report. [Online]. Available from: https://lordslibrary.parliament.uk/digital-exclusion-in-the-uk-communications-and-digital-committee-report/#heading-1


UK Parliament. (2024b). Online Safety Act 2023. [Online]. Available from: https://bills.parliament.uk/bills/3137


UK Parliament. (2023). The Government has “no credible strategy” to tackle digital exclusion. [Online]. Available from: https://committees.parliament.uk/committee/170/communications-and-digital-committee/news/196028/the-government-has-no-credible-strategy-to-tackle-digital-exclusion/ 


Williamson, P. (2016). Take the time and effort to correct misinformation. Nature, 540, p.171.