Poe’s famous line, “Believe only half of what you see and nothing that you hear,” may have had a point in his day, but it’s time to rethink his original position.
In short, “don’t believe a damn thing unless you were there and even then, be suspicious of your memory.”
What a way to live!
Let’s examine the current state of the art. This isn’t science fiction or fantasy; it’s here now and being unleashed.
And whatever “Fake News” is being bandied about in American politics, we’ve only scratched the surface.
Faking A Voice
Well, yes, but how can they really fake somebody’s voice? You can tell the difference between fake and real, right? OK then, here’s a company that’s about to prove you wrong. Understand that this is demo-level technology that hasn’t yet reached the market, or perfection.
But how far out can it be?
Faking An Image Of Somebody Talking
You mean you haven’t seen a CGI (computer-generated imagery) movie yet? Seriously, with a decent skill level and the right software, anybody can be made to appear to say anything.
It’s Now Simple
Combine the faked voice with the CGI video and you have…
That’s the world we’re headed towards at the speed of technological intrusion.
And if I can think of it, there’s a backroom political operative who’s planning on doing it.
But Here Come The White Hats
When it comes to voice technology, there are white hats among us. One group at M.I.T. is using it to assist people who struggle to recognize social cues. Those are good people.
And then there are the folks who’ve already recognized the dangers around fake voices and fake news, and who are working on a system that will help you identify synthetic speech.
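To make the detection problem concrete: real systems like the ones those researchers are building rely on models trained on large speech datasets, which is far beyond a blog post. But one classic, much simpler signal statistic from audio analysis, spectral flatness, illustrates the general idea of measuring *how* a signal's energy is distributed. The sketch below is purely my own toy illustration, not anyone's actual detector: a pure tone scores near 0, while noise-like audio scores much higher.

```python
import math
import random

def power_spectrum(samples):
    """Naive O(n^2) DFT power spectrum -- fine for a short demo window."""
    n = len(samples)
    spec = []
    for k in range(1, n // 2):  # skip the DC bin
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        spec.append(re * re + im * im)
    return spec

def spectral_flatness(samples):
    """Geometric mean / arithmetic mean of the power spectrum, in [0, 1].

    Near 0 when energy is concentrated (tone-like), higher when energy
    is spread across frequencies (noise-like).
    """
    spec = [p + 1e-12 for p in power_spectrum(samples)]  # floor avoids log(0)
    log_mean = sum(math.log(p) for p in spec) / len(spec)
    return math.exp(log_mean) / (sum(spec) / len(spec))

# Two synthetic test signals: a pure tone vs. uniform white noise.
random.seed(0)
n = 256
tone = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
noise = [random.uniform(-1.0, 1.0) for _ in range(n)]

print(spectral_flatness(tone))   # very close to 0
print(spectral_flatness(noise))  # much higher
```

A single number like this can't catch a modern synthetic voice, of course; the point is only that fake-audio detection comes down to finding statistical fingerprints that the synthesis process leaves behind.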
This Is Your Very Real Future
I’ve been working on some new stories, and somehow my thoughts wandered down this rabbit hole of voice synthesis.
- Given voice synthesis as accurate as Lyrebird’s, how do you tell the real from the fake going forward?
- How do you make decisions about future policies and political leadership when this technology exists and is likely to be used by the opposition?
- Not only are we in a “he-said, she-said” kind of info-war, we’re in a technological battle to identify, or at the very least label, your opponent as having used “fake” data.
I can hear you saying, “This is terrible but it really doesn’t apply to me in my personal life.”
This is clearly a problem for anybody in a leadership position at the moment, but it doesn’t take much imagination to ask what would happen if somebody mimicked your voice in a conversation with your banker or your spouse.
Now that I have your attention: this is me speaking to you, not some robot.