The promises of technology are vast. I recall the first time, decades ago, that I was impressed with “artificial intelligence” — which I continue to call “machine intelligence,” since there’s nothing artificial about it. At that time, I became aware that a major credit card company was (and probably still is) using neural networks to detect changes in spending patterns that could be the result of fraudulent use. If your card company has ever contacted you to alert you that your card may have been compromised, you, too, have likely benefited from this capability. When it works, it seems like magic.
Behind the scenes, a positive customer experience (CX) is the result of great technologies combined with business processes that deliver the greatest value impact at the user level. Having great technology is a necessary — but not sufficient — condition for a great CX. Without business processes that support and enable it, technology will not — indeed, cannot — deliver its full value potential.
There is a vignette in my upcoming book (Section 2.1.2) in which I address the phenomenal versus the epistemic worlds. The phenomenal world is the “real” world, the world of people, places, and things, the world in which I am a flesh-and-blood person. The epistemic world consists of the representations of the phenomenal world — numbers, words, and images that stand for or signify phenomenal entities.
Like most of us in the wired world, the flesh-and-blood me carries with me an “epistemic shadow” — the mini-universe of data that describes and characterizes me.
Of course, my epistemic shadow is not me — any more than the shadow I cast on the ground on a sunny day is me. I remember a time not long ago when I was confident that “I am not my data” — even though I am conservative when it comes to using credit cards and social media.
Then I was hacked, more than once, in important financial accounts. I can’t say my identity was stolen, since I am still me — and even my epistemic shadow is largely intact (consisting, as it does, largely of things I can’t change easily, if at all — like my mother’s maiden name and where I live). My identity was used by someone I don’t know to represent themself as me, and thereby to do things in my name that I neither authorized nor wanted (like borrowing a five-figure sum of money).
I have come to realize that my epistemic shadow is, in effect, my “data doppelgänger” — it identifies as me, under certain conditions and for certain important purposes. Now if I were to meet some friends for dinner, and send my data self instead of my real self, I have every confidence they would immediately notice and not buy into it.
But the Very Large Bank (VLB) — of which I was a customer for nearly a half-century — was apparently convinced over the phone, with a handful of data points and keywords, that someone else was me — convinced enough to lend that someone a large sum of money. This egregious error on VLB’s part took me over six weeks to resolve, and I am now their unhappy (and talkative) ex-client. They have, to be fair, reversed their error and offered me (after some digging on my part) some explanation of how it happened — but they have neither apologized nor offered any credible assurance that it will not happen again, both of which I would consider mandatory in order to win my continued patronage (in which they demonstrate little interest).
The features we like for their convenience are the same ones that can be most easily turned against us. The ease of transferring funds to another bank, for example, is something I use myself and love. When this same capability was used to siphon money illegally out of my account, I started rethinking that convenience value proposition.
To cite another example, the little RFID chips in your credit card that enable tap-to-pay can also be read remotely, through a wallet or purse, by an easily obtained reader.
In September 2017, 147 million accounts at the credit-reporting firm Equifax were compromised. The US federal government and the states reached a settlement that included up to $425 million in victim compensation — less than $3 per case. Though Equifax stock took a hit at that time, it now trades at an all-time high more than 60% above that low.
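The per-case figure above is simple arithmetic on the two numbers cited — a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the settlement figures cited above:
# up to $425 million in compensation spread across 147 million
# compromised accounts.
settlement_usd = 425_000_000
compromised_accounts = 147_000_000

per_account = settlement_usd / compromised_accounts
print(f"${per_account:.2f} per account")  # roughly $2.89 — under $3
```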
This means that over half of the US adult population has already had their privacy and identity severely compromised — in this one major event! And these hacks now happen on an alarmingly frequent and broad basis. Everyone is at risk, and the proportion who have not been affected is shrinking rapidly.
It is time for companies to radically re-think their privacy and data security practices and procedures — which many will be doing to comply with GDPR and the California CCPA. But beyond that, companies need to examine their practices with regard to after-the-fact mitigation and communications with customers. I’ve been hearing the term “zero trust” with regard to data security — that is, even if you seem to be legitimately inside the firewall, we first verify that you actually are who you say you are. Zero-basing trust and authentication is a good place to start and a mindset we need to adopt.
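The zero-trust idea described above — verify every request on its own merits, regardless of where it comes from — can be sketched in a few lines. This is a minimal illustration, not any particular vendor’s implementation; the function names and the shared-secret scheme are assumptions chosen for brevity:

```python
# Minimal sketch of the zero-trust principle: authenticate every
# request cryptographically, and deliberately ignore network location.
# Names and the HMAC-over-shared-secret scheme are illustrative only.

import hashlib
import hmac

SECRET_KEY = b"example-shared-secret"  # illustrative; never hard-code in practice


def sign(request_body: bytes) -> str:
    """Signature a legitimate caller attaches to each request."""
    return hmac.new(SECRET_KEY, request_body, hashlib.sha256).hexdigest()


def handle_request(request_body: bytes, signature: str,
                   from_internal_network: bool) -> str:
    # Zero trust: `from_internal_network` is ignored on purpose.
    # Being "inside the firewall" proves nothing about identity.
    expected = sign(request_body)
    if not hmac.compare_digest(expected, signature):
        return "denied"
    return "ok"
```

Under this mindset, a forged request is refused even when it originates inside the perimeter, and a properly authenticated one succeeds even from outside — identity, not location, is what earns trust.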
Despite the advances in technology from which we all benefit, one of the most intractable problems seems to be answering the simple, existential question: Who am I? Privacy is a human right; my privacy is part of what makes me human — and anyone or anything that compromises my privacy thereby compromises my humanity.