Navigating the surge of misinformation and disinformation

In the early 2000s, the world witnessed a profound shift in the way information is disseminated and consumed. The rise of the internet made news and editorial publications widely available online, bringing information to a global audience. However, with the rapid ascent of social media between 2007 and 2014, the nature of information dissemination transformed once again. Social media became a business-critical lens, giving individuals and organizations an unprecedented platform to share their views and information. Unfortunately, this transition was not without its challenges. The years that followed saw explicit tampering with elections and democratic processes, accompanied by the proliferation of “fake news” and the blurring of lines between fact and fiction. As we enter 2023, we are facing another significant transformation: the democratization of AI. This future presents both opportunities and challenges, particularly in terms of content and communities.

This blog is based on Meltwater’s “Navigating the Surge of Misinformation and Disinformation” webinar and my notes from attending it; I take no credit for this helpful knowledge sharing.

Understanding Misinformation and Disinformation 

Before delving into strategies to combat misinformation and disinformation, it’s crucial to understand the difference between the two: 

  • Misinformation refers to the inadvertent spreading of false information. It can result from errors, misunderstandings, or honest mistakes. 
  • Disinformation, on the other hand, is the deliberate spreading of false information with malign intent. It is often driven by a desire to deceive, manipulate, or sow discord. 

To combat misinformation and disinformation effectively, it is essential to further break down these threats into their core components: 

  1. Actors: These are individuals, groups, or organizations with malign intent who are behind the spread of disinformation. 
  2. Behaviors: The tactics, techniques, and procedures used by the actors to spread false information. 
  3. Content: The actual substance of the false information. 
  4. Domain: Where the disinformation is disseminated, whether through social media, websites, or other platforms. 
  5. Environment: The audience the disinformation is intended to influence, which can range from specific demographics to entire populations. 

Learning from Conspiracy Theories 

Conspiracy theories often play a significant role in the spread of misinformation and disinformation. They serve several basic human needs: 

  1. Making Sense of the World: Conspiracy theories offer simplistic explanations for complex events, helping individuals make sense of a chaotic world. 
  2. Narrative Control: They allow people to control the narrative, reduce uncertainty, and assert agency in a world where they may feel powerless. 
  3. Fulfilling Relational Needs: Conspiracy theories provide a sense of belonging to marginalized or like-minded communities. 

The 4T Framework: Building Resilience Against Disinformation 

To build resilience against misinformation and disinformation, organizations can employ the 4T Framework: 

Track

  • Utilize data to gain insights and detect early warning signs. 
  • Engage in social listening across multiple channels to understand different communities. 
  • Monitor disreputable sources to track narratives and suspicious activities. 
  • Be cautious about programmatic advertising placement, avoiding harmful content.

Trust

  • Demonstrating that you care is crucial for gaining trust. 
  • Focus on conveying your organization’s values, not just virtue signaling. 
  • Communicate clearly and consistently about both your capability and character. 
  • Incorporate trust metrics into core KPIs and executive compensation plans. 
  • Emphasize local engagement and outreach through appropriate spokespeople and influencers.

Test

  • Expand the cybersecurity perimeter to protect against disinformation campaigns. 
  • Use market engagement skills to test for brand resilience. 
  • Assess effectiveness across different consumer and demographic groups. 
  • Engage the employee stakeholder group to learn from their perspectives. 
  • Quantify trust, credibility, and organizational risk. 
  • Experiment with counter-narratives and strategic communication opportunities. 

Tune

  • Make intervention a priority to adapt and fine-tune your approach. 
  • Continuously track additional factors and vulnerabilities. 
  • Clarify and reinforce your organization’s values and purpose. 
  • Be prepared to test and adjust strategies as necessary. 

Final Remarks 

Misinformation and disinformation represent systemic problems that require structural changes. Advances in technology, coupled with the rise of populism, have amplified these issues. In places like Vietnam, where information control has been historically tight, these challenges take on unique dimensions. Bad actors can undermine public institutions, weaken economies, and stoke divisions by targeting businesses and governments. A lack of information, trust, and belonging can spark and sustain misinformation campaigns, impacting perception and, ultimately, behavior. 

The 4T Framework offers a robust rubric for organizations to build resilience and remain effective, all while staying compliant with the principles of organizations like the OECD. As we navigate the future of information dissemination and AI, the ability to combat misinformation and disinformation will be a defining factor in ensuring a more informed, connected, and resilient society. 

(Cross-posted on Clāra’s Insight)
