In most of my writing and speaking about the knowledge value chain (KVC), I have intentionally avoided discussions of Truth and Falsehood. Not because I don’t believe these are valid and important constructs (they definitely are), but because the KVC is essentially content-agnostic: it works the same way regardless of what is fed into it.
The old data-processing expression “garbage in, garbage out” (GIGO) applies to knowledge as well. Even if you have a high-fidelity process that moves data with integrity into analysis, decisions, and actions, the latter will be flawed if they rest on faulty data. Each stage in the process depends on the accuracy of the stage preceding it, and each stage includes and propagates whatever flaws it has “inherited” from upstream.
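To make the inheritance point concrete, here is a minimal sketch in Python (my illustration, not part of any formal KVC model; the stage names, per-stage fidelity, and starting accuracy are assumptions chosen for the example):

```python
# Toy model of GIGO propagating along the knowledge value chain.
# Stage names, the per-stage fidelity, and the starting accuracy
# are illustrative assumptions, not measurements of any real pipeline.

STAGES = ["data", "information", "knowledge", "intelligence", "decision", "action"]

def propagate(initial_accuracy: float, stage_fidelity: float = 0.98) -> None:
    """Print how accuracy degrades as each stage inherits the flaws
    of the stage before it."""
    accuracy = initial_accuracy
    for stage in STAGES:
        accuracy *= stage_fidelity  # even a high-fidelity stage adds a little loss
        print(f"{stage:>12}: {accuracy:.1%} of the content is still sound")

propagate(initial_accuracy=0.70)  # garbage in ...
```

The takeaway is structural: no downstream stage can raise accuracy above what it inherits, so even near-perfect analysis cannot rescue a garbage input.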
Such flaws can be the result of honest errors (i.e., mis-information) or of intentional actions by bad actors (i.e., dis-information).
The ability to represent carries inextricably with it the ability to misrepresent. Thus when we speak of dis-information, by implication we are also speaking of dis-data, dis-knowledge, and dis-intelligence. Every part of the knowledge value chain has a “dark side” counterpart.
Lying is a powerful tool primarily because, too often, it works. “Strategic misrepresentation” has long been recognized as a (not-so-savory, but widely used) business practice. The whole modern ethic of “fake it till you make it” has led to as-if-ness being embraced as a valid, even cool and sexy, business strategy. Theranos, built from the ground up on fraud, is a poster child for the triumph of truthiness over truth. It sometimes seems as if we are living in a real-life Truman Show in which image trumps substance.
This has been going on in the political world for a while. Russia has adopted systematic disinformation campaigns as an integral part of its conceptually advanced hybrid warfare programs. These techniques were highly successful in its 2014 invasion of Ukraine and its 2016 attacks on the integrity of the U.S. presidential election.
Yale professor Timothy Snyder’s book The Road to Unfreedom is a scholarly yet breathtaking look at the recent history of Russian political disinformation. His sobering conclusion is that democracy depends on trust, and trust in turn depends on truth, a commodity we need much more of. “Freedom depends upon citizens who are able to make a distinction between what is true and what they want to hear. Authoritarianism arrives not because people say that they want it, but because they lose the ability to distinguish between facts and desires.”
My firm, TKA, recently conducted a study for The Conference Board on the brand equity risk created by digital and social media. One of the top-rated concerns among large organizations is Disinformation: the risk created by fake followers, misrepresentations by celebrity endorsers, doppelgänger websites, intentionally fake news stories, and more. This is not just paranoia; more than 40% of the sample in our study reported having already been affected by disinformation directed at their brand(s).
Where Disinformation is concerned, there are two kinds of companies — those who have already been victimized by it, and those who will be.
The technology for creating doctored or fake content is already highly advanced, and it will likely become easier to use and more widely available. The recent videos of Nancy Pelosi, slowed down to make her appear drunk, were relatively easy to make. Adobe recently released a “content-aware fill” feature for its After Effects video software that allows people or objects to be deleted entirely from videos, as Photoshop has long made feasible for still images. You can be sure that propaganda ministries within certain countries are hard at work on this.
Even newer technology (neural rendering and reenactment) allows synthesis, i.e., total fabrication, of videos by, in effect, replacing an actor’s head with the head of the person one wishes to impersonate (if that is the right word).
Who would want to trash-talk a company or brand, just as they might a political candidate? The most obvious answer, a rival brand, is only part of the story. Columbia law professor Joshua Mitts has documented several cases in which activist investors took short positions, then trash-talked companies and brands in anonymous online forums, driving down their stock prices and yielding a fat, easy profit for the trader. Mitts calls this technique “short and distort,” which is also the title of his paper. Not only is the technique successful and widely used; it is not illegal under current U.S. securities law, a situation Mitts hopes to draw attention to and change.
I just found this interesting guide describing how to spot fake videos. And here’s a fascinating look at how social media memes can rapidly escalate. To top things off, here are some scarily realistic examples of deepfakes at work.