The solution to Fake News

This post was originally written on my LinkedIn and is being reposted here on the blog.

The solution to fake news is much closer than you think.

Fake news, fake news! That's all you read, hear, and talk about these days, am I right? (Well, I'm exaggerating, but not by much.)

Since the Facebook/Cambridge Analytica scandal, this topic has become popular.

If you don't know what I'm talking about, check out this summary:

In summary: the British company Cambridge Analytica used personality tests and Facebook likes to collect data, lots of data. This information revealed the complete psychological profile of 50 million people on the social network, who then began receiving highly personalized election propaganda. This occurred during, for example, the 2016 US presidential campaign that elected Donald Trump, and Brexit, the referendum last year that led the United Kingdom to leave the European Union. Source: UOL Technology

But what if I told you that the solution to an internet problem lies within the internet itself? But first:

What is fake news?

Fake news is a strategy for spreading misinformation or rumors via print media, television, radio, and especially online.

"Fake news is written and published with the intention to deceive, in order to obtain financial or political gain, often with sensationalist, exaggerated, or blatantly false headlines to attract attention." – according to Wikipedia.

The name of this solution is Linked Data!

But what is Linked Data?

Linked data is a set of best practices, introduced in Tim Berners-Lee's web architecture note "Linked Data," for publishing and structuring data on the web.

How did I arrive at this conclusion?

In Professor Teodora Petkova's course, "Content Writing in the Semantic Web," I came into contact with the principles of linked data and the Rhetorical Triangle. In this triangle, Petkova relates Ethos (Credibility), Logos (Trust), and Pathos (Relevance) to what she calls the 4 Vs of Data:

  1. Velocity
  2. Variety
  3. Veracity
  4. Volume

I recommend the course if you want to know everything about the subject, but what interests me now is the third V, the one for Veracity.

How does connecting data on the web keep us away from fake news?

Is there a solution to fake news?

Quite simply: everything we put on the internet becomes data. The problem is that only we humans can easily understand the meaning of what is written. And even then, we have trouble identifying a lie, sarcasm, etc., but that's another topic.

The important question is: if we have reliable sources for fact-checking, why aren't they used by machines to, for example, determine if a particular news item is fake?

This is because these data sources are isolated from one another, and computers cannot understand the meaning of each piece without us humans "translating" it into a language they understand.

And that's where Semantic Tagging, linked data, RDFa, and everything else that's part of Semantic Optimization come in. On this subject, see Google's presentation "How Google is using Linked Data Today and Vision For Tomorrow".
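To make this concrete, here is a minimal sketch (in Python, using only the standard library) of how a fact-checking agency could publish a verdict as schema.org ClaimReview structured data, the JSON-LD vocabulary search engines actually read for fact-check results. The claim, URLs, and rating scale below are invented for illustration.

```python
import json

# A fact-check expressed as schema.org ClaimReview structured data (JSON-LD).
# The claim, URLs, and rating values are hypothetical examples.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/reviews/123",  # hypothetical URL
    "claimReviewed": "Politician X said Y on date Z",
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,        # 1 = False on this hypothetical agency's scale
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False",
    },
}

# Serialized as JSON-LD, this block can be embedded in a page inside a
# <script type="application/ld+json"> tag, where crawlers can parse it.
print(json.dumps(claim_review, indent=2))
```

Because the verdict is published as data rather than prose, a machine encountering the claim elsewhere can link back to this review and its rating without needing to "read" the article.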

But getting back to the subject: to combat what is false, we need velocity in verifying data, variety of sources, veracity of the information we are going to use, and volume of data (there you have the 4 Vs).

If we all work together to create content for the web (writers, programmers, researchers, etc.), we can finally build the dream Tim Berners-Lee had when he wrote "Linked Data":

The Semantic Web isn't just about putting data on the web. It's about creating links, so that a person or a machine can explore the network of data. With linked data, when you have some of it, you can find other related data.

And with that, we can put an end to the proliferation of lies on the web.
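Tim's idea, that from the data you have you can find other related data, can be sketched in a few lines. The toy graph and its URIs below are invented for illustration; real linked data would be fetched over HTTP and parsed as RDF, but the traversal logic is the same.

```python
# A toy linked-data graph: each (subject, predicate, object) triple links
# one resource to another. All names here are hypothetical.
TRIPLES = [
    ("ex:news_item_1", "ex:claims", "ex:claim_42"),
    ("ex:claim_42", "ex:checkedBy", "ex:fact_checker"),
    ("ex:fact_checker", "ex:rated", "ex:claim_42_rating"),
    ("ex:claim_42_rating", "ex:value", "False"),
]

def explore(start: str) -> dict:
    """Follow every link reachable from `start`, breadth-first."""
    found, frontier = {}, [start]
    while frontier:
        node = frontier.pop(0)
        for s, p, o in TRIPLES:
            if s == node and (s, p) not in found:
                found[(s, p)] = o
                frontier.append(o)  # each object is itself a new starting point
    return found

# Starting from the news item alone, we reach the fact-checker's verdict.
facts = explore("ex:news_item_1")
print(facts[("ex:claim_42_rating", "ex:value")])  # False
```

The point is not the code but the principle: because every piece of data links to the next, a machine that starts with a single news item can walk the graph to a verdict, with no human translation in between.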

Robots, Cyborgs, and Elections in Brazil

A lengthy and detailed BBC report recently revealed that bots have been key players in Brazilian politics since 2014. These automated systems are allegedly creating a cyborg democracy, aiming to obscure debates on Twitter, Facebook, and WhatsApp.

In Braincast 266, Carlos Merigo, Alexandre Maron, André Pontes from NBW, and Marco Konopacki, project coordinator at the Institute of Technology and Society of Rio and creator of PegaBot, discuss the ability of computational political propaganda to alter our perception of reality, taking advantage of the herd effect to manipulate public opinion and inflate numbers.

For those of you who listened (yes, I know, you just scrolled past), here are two of the fact-checking agencies that were mentioned in the conversation:

http://piaui.folha.uol.com.br/lupa/

https://apublica.org/checagem

Update April 2018:

On April 24, 2018, The Web Conf was held in Lyon, France, and one of the topics discussed there resonates with what I'm talking about:

Epistemology in the Cloud: on fake news and digital sovereignty

In this lecture, Henry Story presented the main points of his paper, "Epistemology in the Cloud" (which can be downloaded as an epub), where he philosophically defines what knowledge is and its connections to logic.

You don't need any knowledge of logic to read this material.

In it, we find a very clear picture of how this logic works, applied first to the simplest everyday issues we face, before turning to the problem of fake news.

Next, Henry uses this analysis to outline requirements for creating a kind of epistemological web, shows its political dimension, and ultimately suggests a techno-social response: integrating knowledge institutions into the web in order to reduce many of our problems with fake news.

Hello, I'm Alexander Rodrigues Silva, SEO specialist and author of the book "Semantic SEO: Semantic Workflow". I've worked in the digital world for over two decades, focusing on website optimization since 2009. My choices have led me to explore the intersection between user experience and content marketing strategies, always with a focus on increasing organic traffic over the long term.

My research and specialization focus on Semantic SEO, where I investigate and apply semantics and connected data to website optimization. It's a fascinating field that allows me to combine my background in advertising with library science. In my second degree, in Library and Information Science, I seek to expand my knowledge of Indexing, Classification, and Categorization of Information, seeing an intrinsic connection and great application of these concepts to SEO work.

I have been researching and connecting Library Science tools (such as Domain Analysis, Controlled Vocabularies, Taxonomies, and Ontologies) with new Artificial Intelligence (AI) tools and Large Language Models (LLMs), exploring everything from Knowledge Graphs to the role of autonomous agents. In my role as an SEO consultant, I seek to bring a new perspective to optimization, integrating a long-term vision, content engineering, and the possibilities offered by artificial intelligence. For me, SEO is a strategy that needs to be aligned with your business objectives, but it requires a deep understanding of how search engines work and an ability to interpret search results.
