The impact of fake news on the recent election has focused public attention on this multi-tentacled and growing problem. Vast swaths of the population fall prey to such misinformation, while others struggle to discern unbiased truth from the morass of lies and distortions that surrounds us.
Experts recommend that we follow basic principles of information hygiene to separate fake from real, including checking sources, looking for bad grammar and typos, and seeking out corroborating information. And at the top of the list: never believe anything you read on Facebook.
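Those hygiene checks are simple enough to sketch in code. The snippet below is a purely illustrative toy, assuming a hypothetical allowlist of trusted domains and a caller-supplied count of corroborating outlets; it is nothing like a real fake-news detector.

```python
import re

# Hypothetical allowlist -- an assumption for illustration, not a real product list.
TRUSTED_DOMAINS = {"reuters.com", "apnews.com", "bbc.com"}

def hygiene_score(text: str, source_domain: str, corroborating_sources: int) -> int:
    """Score an article 0-3 using the basic 'information hygiene' checks:
    source reputation, writing quality, and corroboration."""
    score = 0
    # 1. Check the source against the (hypothetical) allowlist.
    if source_domain.lower() in TRUSTED_DOMAINS:
        score += 1
    # 2. Look for crude signs of sloppy writing: repeated punctuation
    #    or ALL-CAPS shouting, which often accompany low-quality stories.
    sloppy = re.search(r"[!?]{2,}", text) or re.search(r"\b[A-Z]{5,}\b", text)
    if not sloppy:
        score += 1
    # 3. Seek corroboration: require at least two independent outlets.
    if corroborating_sources >= 2:
        score += 1
    return score
```

For example, a calm, well-sourced report from a trusted outlet would score 3, while an all-caps, uncorroborated story from an unknown domain would score 0, which previews the article's next point about how blunt these heuristics are.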
However, none of these techniques is particularly effective. The quantity of fake news is now reaching crisis proportions, and the problem is only getting worse. Furthermore, the challenge of misinformation reaches well beyond the realm of public discourse, impacting the core of business as well.
Fake News as a Big Data Problem
Among the many ‘V’s’ that characterize big data (volume, variety, and velocity being the most familiar), we now have the added challenge of data veracity. Fake news, after all, is in essence a big data veracity challenge. It doesn’t matter how well we move, process, or secure our information if our information is simply incorrect.
Even the definition of data veracity is surprisingly muddled. Common sense would suggest that information has veracity if it accurately represents the facts in question. Yet facts – or truth more broadly – are surprisingly hard to discern.
Read the entire article at http://www.forbes.com/sites/jasonbloomberg/2017/01/08/fake-news-big-data-and-artificial-intelligence-to-the-rescue/.
Intellyx publishes the Agile Digital Transformation Roadmap poster, advises companies on their digital transformation initiatives, and helps vendors communicate their agility stories. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers. Image credit: Philipp Rudloff, Jim Lipsey, and Jason Bloomberg.