Perhaps you noticed from a recent Vanity Fair issue that Oprah Winfrey has three hands and Reese Witherspoon has some odd-looking legs. Of course, they really don’t. This was just “magic gone wrong” in the world of photo editing, and it likely provoked more than a few Homer Simpson “d’ohs!” and forehead smacks.
Goofy mistakes aside, though, some photo editing and CGI work has been quite impressive and will surely get better. AI is even playing a role in this space. We’re going to keep this blog G-rated, but if you’re following the technology, you know it is possible to put somebody’s face on somebody else’s body in videos that are highly suggestive. Thankfully, at a quick glance you can still tell these are fakes, but for how long will the naked eye be able to spot a fake?
So what do fake images and videos have to do with cybersecurity? Well, it’s a question of data integrity.
You see, there was a time when we considered a picture or a video definitive proof that something had happened. We are well into the early stages of “that may no longer be the case,” and here is why.
Hollywood and LA music studio tricks have been commercialized and miniaturized. Steven Soderbergh (of the Ocean’s Eleven remake and series) said at Sundance 2018 that he only wants to shoot movies on iPhones from now on, just as his latest horror-thriller Unsane was. As somebody who lived a past life as a DJ and producer, I can tell you that music production is no different. I literally have the tools to produce a studio-quality album on the same computer I’m typing up this blog post on.
Now, given that I have these tools at my fingertips, with some work and practice, I can really do some incredible things. Never mind simple audio editing, like adding pauses or cutting out certain pieces of what was said. I can alter things like voice inflection, pitch, speed, you name it. So not only can I change what you said, I can change how you said it.
Does this worry you? It should.
And I am certain you have heard the term “Photoshopping” (which comes from using Adobe’s Photoshop to create an image of something that may never have happened). Sure, it’s fun to add filters to your pictures or cut and paste things to make goofy-looking images, but what happens when these fakes become indistinguishable from the real thing?
Now you really should be worried.
The technology is good enough to make that happen, especially if you have a dedicated and meticulous user trying to alter the data. If somebody is committed enough, they will go pixel by pixel, nanosecond by nanosecond, to eliminate all possible traces of fraud. Add AI assistance into the mix and the process will only become easier.
We’ve clearly fallen behind the times legislatively with respect to cybersecurity laws. I’ve commented in the past that we still don’t have our basic terminology right. For example, “stealing” somebody’s emails is profoundly different from “copying” somebody’s emails, but we still have too many “experts” and pundits using the terms interchangeably, not knowing the difference.
My concern is that we’re going to fall behind the curve when it comes to the integrity of evidence as well. Consider this scenario: a crime happens and the victim calls the police to investigate. The perpetrator knows that the victim has a video system protecting their facility, but is able to hack into that system and alter the video on the DVR. Of course, the investigators will ask for the video, but will only find out – from the video – that no crime took place. Or worse, the victim will be accused of, or even charged with, making a false claim!
Have some integrity.
The key to all of these problems is data integrity. Some of our systems have ways to ensure data hasn’t been tampered with. For example, most email systems stamp messages with header metadata, and many add cryptographic signatures (DKIM, for example), meaning it is pretty tough to fake emails (tough, though not impossible, if somebody is truly committed and depending on the email system being used). The same, though, cannot be said for picture, video, and audio files.
Are there ways to ensure digital data has not been tampered with? Yes, the most common being the practice of hashing. Super simple version: take a file or some text, run it through a “checksum” utility using some algorithm (MD5, SHA-256, take your choice, though MD5 is no longer considered secure against deliberate tampering), and watch some funky garble get generated. If the original file or text is changed, even by one character, that garble will come out completely different.
For example, the paragraph above, when run through a SHA-256 calculator, gives the following string:
7C15BFDDE577EA8C58BB317741FEF1017CC4A5EA052086226B61F12B057BE648
If I take out the quotation marks around the word checksum, the string changes to:
D8536F24D5695955FB65A03D1309959971509F6A550E28E8660937795D943CF8
And if I put back the quotations, but change the capitalization of “Are” to “are” the string changes to:
F450B0275C9B61687601A6F98C698769D1880AA95F4A719A3A11A00C5E0426EE
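If you want to try this yourself, here is a minimal sketch in Python using the standard hashlib library. The two sentences hashed below are sample inputs I picked for illustration, not the exact paragraph used to produce the strings above, so the digests you get will differ from those strings.

import hashlib

def sha256_hex(text: str) -> str:
    """Return the SHA-256 digest of a UTF-8 string as uppercase hex 'garble'."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest().upper()

original = "Are there ways to ensure digital data has not been tampered with?"
tampered = "are there ways to ensure digital data has not been tampered with?"

print(sha256_hex(original))
print(sha256_hex(tampered))
# Only the capitalization of "Are" changed, yet the two digests bear no
# resemblance to each other; that unpredictability is what makes tampering detectable.

On most systems a command-line tool will do the same job for whole files, for example sha256sum on Linux or shasum -a 256 on macOS.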
This is a problem we really need to spend some serious time on, because new technology will play a big role in how we treat and handle evidence. Best we start having this discussion before the stakes are higher than Oprah’s hand being misplaced.
By George Platsis, SDI Cyber Risk Practice
February 13, 2018