An Eye on GDPR

There is a lot of talk about the European Union’s General Data Protection Regulation (Regulation (EU) 2016/679).  And rightly so, because it will impact a great many organizations, many of which reside in the U.S.  Set to come fully into effect on May 25, 2018, the GDPR has understandably caused a lot of headaches because it is a wide-sweeping and costly regulation, especially if you are in violation.

Clearly, the first question to ask is whether the GDPR applies to you. If it doesn’t, you are in the clear (but that is not an excuse to relax your data protection measures).  If it does, well, you have work to do if you haven’t been on top of your GDPR compliance. This is especially true if you are a big organization, are not based in the EU, and have a lot of EU customers and clients.

I would like to take a step back here for a moment and perhaps calm some of the GDPR hysteria out there. Yes, some commenters and compliance professionals are rightly having heartburn over the GDPR. Others, like Elizabeth Denham, the UK Information Commissioner, have said not to freak out, stating that the GDPR should be looked at as an “evolution” in data protection and not a revolution.

My humble opinion is that if the GDPR applies to you and you are a non-EU organization, your worry should be greater than zero.  Here is why: the EU needs money. And who do you think they will fine first?  EU-based organizations or non-EU-based organizations?  Option 1 seems like it could be detrimental to the EU economy (something about hurting your own), but Option 2 seems like a nice windfall extracted from a competitor.  If I’m the EU, I know who I am fining first.

But the fines can’t be that bad, can they?  Yes, they can. Violators of the GDPR can be fined up to 4 percent of annual global turnover or €20 million, whichever is greater. That sounds like industrial-strength motivation to take the GDPR seriously, especially if you could end up near the top of the pecking order.
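Because the math behind that threshold trips people up, here is a minimal sketch (Python, with hypothetical turnover figures) of how the “whichever is greater” rule plays out:

```python
def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Upper tier of GDPR administrative fines: the greater of
    4% of annual global turnover or EUR 20 million."""
    return max(0.04 * annual_global_turnover_eur, 20_000_000)

# Hypothetical turnover figures, purely for illustration
print(max_gdpr_fine(100_000_000))    # 20,000,000 -- the flat EUR 20M floor applies
print(max_gdpr_fine(2_000_000_000))  # 80,000,000 -- 4% of turnover dominates
```

Either way, the number is large enough to deserve a line item in next year’s budget.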

Apart from all your usual data protection and cybersecurity grief, the real shift of power under the GDPR comes in the form of individual rights, specifically in terms of privacy. This nuance is important culturally, because Europeans have generally had more constitutional protections relating to privacy than to, say, freedom of speech.  And from a business perspective, what that means is that individual consumers will have incredible leverage over organizations.

The GDPR will give individual consumers the following powers:

– The right to be informed

– The right of access

– The right to rectification

– The right to erasure

– The right to restrict processing

– The right to data portability

– The right to object

– Rights related to automated decision making and profiling

All of this sounds pretty straightforward, but think of all the resources required to implement and comply.  To begin, anything that could be considered “personal data” is swallowed up by the GDPR. This could be a name, a credit card number, an IP address, or a set of preferences.  As you can imagine, the list goes on and on. This raises the question: have you identified all possible pieces of “personal data” within your organization?  By the way, charities are not exempt from the GDPR, so if your thought is that your well-meaning, good-cause not-for-profit will be given a pass, I wouldn’t bet the farm on that sort of wishful thinking.
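To make that inventory exercise a bit more concrete, here is a minimal, hypothetical sketch (Python, illustrative regex patterns only, not a compliance tool) of flagging a few machine-detectable categories of personal data in free-text records; names and preferences, of course, cannot be caught this way:

```python
import re

# Illustrative patterns only; real personal data discovery needs far more than regex.
PATTERNS = {
    "email":       re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ipv4":        re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def flag_personal_data(record: str) -> list[str]:
    """Return the categories of possible personal data found in a text record."""
    return [name for name, rx in PATTERNS.items() if rx.search(record)]

print(flag_personal_data("Ticket opened by jane@example.com from 192.168.1.10"))
# ['email', 'ipv4']
```

A real data-mapping effort also has to cover structured databases, backups, logs, and third-party processors, which is exactly why the resource question matters.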

Of course, each of these rights presents its own set of headaches for the organization, but I will pick the first, “the right to be informed,” as an example. Think Equifax. Think Uber. Now think about notifying your supervisory authority within 72 hours of discovering a breach, and then reaching the tens or hundreds of millions of affected individuals without undue delay. That is the sort of headache you are going to have to deal with.

A single blog post is not going to give you all the answers you need regarding the GDPR, but I will close with this: the Data Protection Officer (DPO) could end up making or breaking you. The comparison to a Chief Compliance Officer is not quite right, because the DPO has some incredible powers that other C-suite officers may not have.  For example, the DPO must:

– Act “independently”

– Not take instructions from their employer regarding the exercise of their tasks

– Have expert knowledge of data protection law

– Be provided with sufficient resources

– Not be dismissed merely for performing their tasks

– Report directly to the “highest management level”

And guess what?  You could be fined for not allowing your DPO to do their job!  If this GDPR thing is starting to give you some unexpected heartburn, it would be completely expected.

While I would like to believe the intent of the GDPR is to instill some good data protection and cybersecurity habits into all of us, remember what is driving it: a focus on privacy and a very big stick (with no apparent carrot in sight).  The coffers in Brussels need to be refilled, so don’t be surprised if the bureaucrats are looking across the pond for a way to do just that.

In closing, a very Merry Christmas and Season’s Greetings!  May the Holiday Season and the New Year be full of health, happiness, and success for you and yours!  See you in 2018!

By George Platsis, SDI Cyber Risk Practice

December 5, 2017

 

Susan Davis International Wins Prestigious Stevie Awards!

SDI Executive Vice President Judy Whittlesey accepts Gold Stevie Award. Photo credit: Stevie Awards.

During the annual Stevie Awards for Women in Business ceremony in New York City, SDI received the Gold Stevie Award for Communications or PR Campaign of the Year for the Elizabeth Dole Foundation Hidden Heroes Campaign and a Silver Stevie Award for Women-Run Workplace of the Year – More Than 10 Employees – Advertising, Marketing, Public Relations and Business Services.

The Gold Stevie Award, celebrating businesses, organizations, and individual achievements in more than 60 nations, recognizes SDI’s role in the 2016 launch of the Hidden Heroes campaign for the Elizabeth Dole Foundation (EDF). EDF, founded by Senator Elizabeth Dole in 2012, is a non-profit organization strengthening and empowering America’s military caregivers and their families by raising public awareness, driving research, championing policy, and leading collaborations that make a significant impact on their lives.

SDI’s Judy Whittlesey and Dan Gregory joined the Army Historical Foundation, National Museum of the United States Army and Clark Construction to sign the Museum’s final steel beam. Photo credit: Frank Ruggles.

The Silver Stevie Award for a Women-Run Workplace of the Year – More Than 10 Employees – Advertising, Marketing, Public Relations and Business Services was awarded to SDI for the firm’s work with diverse clients ranging from nonprofits to corporations and government agencies. Competition for both Stevie awards was global.

The Stevie® Awards are the world’s premier business awards.  They were created in 2002 to honor and generate public recognition of the achievements and positive contributions of organizations and working professionals worldwide.

Susan Davis; Donald Cardinal Wuerl, Archbishop of Washington; Roma Downey and Mark Burnett celebrating the dedication of the Museum of the Bible.

Winning the Stevie Awards capped a banner month for SDI. During November, SDI spearheaded the grand opening of the Museum of the Bible, the topping-out ceremony for the National Museum of the United States Army, and the Elizabeth Dole Foundation and U.S. Department of Veterans Affairs’ 2nd National Convening: The Military Caregiver Journey.  SDI also supported the 20th anniversary of the Women’s Memorial and the groundbreaking of the WWI Memorial.

SDI salutes the team members whose outstanding work contributed to such an extraordinary month.

Senator Elizabeth Dole, Former First Lady Laura Bush, and U.S. Secretary of Veterans Affairs David Shulkin meet with caregivers at the Elizabeth Dole Foundation and VA’s 2nd Annual National Convening, managed by SDI. Photo credit: Lisa Nipp.

Prepare to Defend Your Reputation

“Lose money and I will forgive you. Lose even a shred of reputation and I will be ruthless. …Wealth can always be recreated, but reputation takes a lifetime to build and often only a moment to destroy.”

Warren Buffett

There is widespread acknowledgement that corporate reputation has significant value.  Calculating that value with any precision is a bit dicier. Many have attempted to quantify reputational value, with estimates of how much of a company’s worth its reputation represents ranging from 20 percent on the low end to 70 or 80 percent on the high end. One can accept that the value is real, and that it represents an asset that must be protected and, ideally, enhanced. An article in the Harvard Business Review sought to assess reputational risk, positing three determinants: “Three things determine the extent to which a company is exposed to reputational risk. The first is whether its reputation exceeds its true character. The second is how much external beliefs and expectations change, which can widen or (less likely) narrow this gap. The third is the quality of internal coordination, which also can affect the gap.”

Today I want to focus on the second of those determinants. A recent article by Dan Kiely in Entrepreneur looked at how the reputations of smaller firms can be adversely affected by cyber breaches:  “…don’t be fooled into thinking that you have to be a Fortune 500 corporation to be a target. Cybercrime is an equal opportunity menace. Larger mature companies are hit most often, but smaller scale-ups are hit the hardest, and it takes longer for them to recover. Only 14 percent of small businesses rate their ability to mitigate cyber risks, vulnerabilities and attacks as highly effective. In today’s digital economy, winning and maintaining the trust of your customers is central to business growth, and nothing erodes trust quite like a cyber breach.”

The many people who have a trust relationship with a business (customers, clients, shareholders, investors, and employees alike) expect that certain standards will be met with regard to cybersecurity. They do not expect perfection, and may even have some tolerance for breaches if the business can show that it has engaged in a rigorous process to defend itself against being breached and communicates effectively before, during, and after a breach. However, if analysis of the breach exposes unexpected shortcomings in preparation and/or response, beliefs and expectations about the company will change for the worse, and reputation will suffer.

Heed Warren Buffett’s words: protect your reputation.

By Tom Davis, SDI Cyber Risk Practice

November 21, 2017

 

Controlling Your Cyber Supply Chain

Back in September, I wrote a piece that questioned whether or not you trust your network. As an extension of that discussion, this post focuses on your cyber supply chain.

Let’s begin with this simple premise: you may never fully know who is a part of your cyber supply chain. Why do I say that?  Because it is practically impossible for you to keep a watchful eye on every part of the supply chain; doing so would be a full-time job in itself. In my view, the only entity that could have full control of its cyber supply chain is a government (emphasis on could, because even for a government, full control of the cyber supply chain would be an incredibly difficult and expensive proposition).

If you accept that simple premise, then by extension, you will have no problem accepting this one as well: the probability of you being breached is greater than zero.

If you are with me so far, excellent. It means you have not bought a bag of magic beans from vendors or consultants who are preaching that you are already on the way to the cyber-secure promised land.

My point is this: you don’t know what you don’t know, and when that is the case, make sure you are taking some extra precautionary steps. That is why I will reference a very handy resource from NIST that outlines some basic principles of cyber supply chain risk management. I won’t go through the entire document, just two areas: principles and key risks.

Principles

I’m not going to reinvent the wheel, so I will simply say that the majority of what you need to know about cybersecurity is captured in these three principles:

1) Develop your defenses based on the principle that your systems will be breached.

2) Cybersecurity is never just a technology problem, it’s a people, processes and knowledge problem.

3) Security is Security.

I recommend viewing the tool, but here is my brief commentary on each point:

1) If you believe – even for a nanosecond – that you have an impenetrable system (or let somebody convince you that one is possible), you may also believe that all is well in the world right now.  Caveat: even if we achieve some incredible technology, like Quantum Key Distribution (QKD) for communications, there will still be other threats, which is a perfect lead-in to the next comment.

2) If you are not placing considerable emphasis on the human element, your cybersecurity strategy will always fail.  What started as a hypothesis of mine has turned into a truism over the years: I am so certain of the human element issue that I am willing to personally guarantee your cybersecurity strategy will fail, 100 percent of the time, if you are not putting significant effort into the human side of the problem. Plenty more on this issue can be found in previous SDI posts and on LinkedIn.

3) If your cybersecurity strategy is independent of your security posture, you’re looking for trouble. This is why we say cybersecurity should be viewed through an organizational risk management lens. If your IT department is not working with your security department, and both are not working with every other department in the organization, the question is not “when will I get breached?” but rather “how badly will I be breached when it happens?”  Leadership at the top is absolutely necessary. The C-suite needs to adopt a risk management mentality and instill a culture of being “security smart” within the organization.

You are probably wondering what I mean by “security smart” right now.  It’s simple: make sure everybody has a generally good idea of what the cyber risks are.  Don’t be paranoid.  Just get your staff to understand these threats are real and they can impact your organization and their jobs.  You do not see people freaking out that a fire may spontaneously erupt in the middle of your organization’s lobby, but people are trained enough to know that if they smell something burning or see some smoke, it’s best to warn others, quickly investigate, and if needed, pull a fire alarm or call 911.

We don’t have “hall monitors” walking around our offices checking for fires.  It’s something everyone in the organization watches out for (in large part because of personal safety).  Well, if your company goes bankrupt because all its IP has been stolen, I think that impacts your personal safety too.  So start a program of being “security smart” within your organization (hint: SDI Cyber can help there).

Key Risks

The next section is straightforward, again drawn from the NIST resource.  All you need to know is that these risks exist and that you should be thinking about how to deal with them. They include:

  • Third-party service providers or vendors – from janitorial services to software engineering – with physical or virtual access to information systems, software code, or IP.
  • Poor information security practices by lower-tier suppliers.
  • Compromised software or hardware purchased from suppliers.
  • Software security vulnerabilities in supply chain management or supplier systems.
  • Counterfeit hardware or hardware with embedded malware.
  • Third-party data storage or data aggregators.

It’s a bit of a raw deal, but yes, you have to worry about everybody else that’s part of your supply chain. And here’s the real kicker: you may have no option other than altering your supply chain, which could be an expensive proposition. This is where risk management comes into play: do you accept that risk (and the associated potential costs), or do you do something about it?  That’s your decision, but it’s something you need to think about.  Otherwise, you’re setting yourself up for a world of hurt that you may not be able to recover from.
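As a back-of-the-envelope way to frame that accept-or-mitigate decision, here is a minimal sketch (Python, with made-up likelihood and impact figures) comparing the annualized expected loss from a supplier compromise against the cost of doing something about it:

```python
def annualized_expected_loss(incident_probability: float, impact_cost: float) -> float:
    """Single-scenario risk estimate: annual likelihood times impact."""
    return incident_probability * impact_cost

# Hypothetical figures: a 10% annual chance that a compromised supplier costs us $2M
expected_loss = annualized_expected_loss(0.10, 2_000_000)   # $200,000 per year
mitigation_cost = 150_000  # e.g., vetting or switching suppliers, per year (assumed)

if mitigation_cost < expected_loss:
    print("Mitigation is cheaper than the risk you are carrying.")
else:
    print("Accepting the risk may be defensible -- but make it an explicit decision.")
```

The numbers will always be rough, but writing them down turns “a world of hurt” into a decision you can actually defend.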

By George Platsis, SDI Cyber Risk Practice

November 7, 2017

Beary Scary

We are slowly easing through the languorous days of fall, reluctantly trading daylight for darkness, feeling the crunch of leaves, inhaling the smoke-tinged air that marks the fullness of the season. Soon it will be All Hallows Eve, a night when witches ride high across cloud-strewn skies and spirits restlessly roam the earth below. They will be joined by millions of children less concerned about the spirits than the potential bounty that awaits behind closed doors. Tiny princesses will race alongside pirates and ballerinas, each eager to ring a doorbell and shout in unison “trick or treat!” Older adolescents and young adults will gorge on horror shows, feasting on the fright inspired by vampires, werewolves, goblins, and countless maladjusted individuals who act out in truly horrific fashion. Those who’ve been around for a while may think of frightening figures such as Nosferatu, Frankenstein’s monster, the Mummy, and more recently Candyman, Pennywise, Leatherface, and Berserk Bear.

Astute readers may have tripped over Berserk Bear, but Berserk Bear may be very scary indeed. The world was introduced to Berserk Bear in CrowdStrike’s 2014 Global Threat Intel Report. “Proactive analysis during 2014 revealed another Russian actor that has not encountered public exposure, yet appears to have been tasked by Russian state interests. BERSERK BEAR has conducted operations from 2004 through to the present day, primarily aimed at collecting intelligence but has also provided capability in support of offensive operations in parallel to the Russia/Georgia conflict in August 2008.”

Since then, the legend of Berserk Bear has grown. In 2016 it was reported to be attacking energy interests in the Middle East. In September 2017, Symantec said Berserk Bear had penetrated firms in the U.S., Turkey, and Switzerland, and had the ability to cause mass power outages, shut down electrical grids, and disrupt utilities. That report was confirmed last Friday, when the Department of Homeland Security (DHS) and the FBI issued an alert warning critical infrastructure companies of “advanced persistent threat (APT) actions targeting government entities and organizations in the energy, nuclear, water, aviation, and critical manufacturing sectors.”

What we know at this point is that the attacks have been successful, and critical parts of the infrastructure have been breached. DHS has reported the attack is ongoing. There are no reports of damage to this point. We are left to speculate as to motivation, and what might happen next.  Like many scary stories, this one may have a sequel. Stay tuned.

By Tom Davis, SDI Cyber Risk Practice

October 24, 2017

 

Have We Normalized Theft?

When did cyberattacks truly begin to concern us?  Was it the Morris worm of 1988?  One would have wished it was, but clearly that is not the case.  How about the 2008 cyberattack on USCENTCOM?  That worm, likely injected into the DoD network through a single USB key, took about 14 months to clean up by some estimates.  Fast forward nine years to Equifax: 145 million records stolen.  Have we learned yet?  I wish I could say “okay, this time we will do something about it!” but I am not too optimistic.

Why?

Because I feel we have slipped into a dangerous place: we have allowed the normalization of data theft.  And today, data theft means anything from personally identifiable information to R&D and intellectual property to good old-fashioned money.  My feeling is that because we don’t “feel” data the same way we would, say, a stack of $20s, we don’t really appreciate what is being lost.

Let’s try to put this into perspective.  If in fact 145 million records were stolen from Equifax, what would that look like in a “smash-and-grab” operation?  For simplicity, let’s assume one record is one page.  The average thickness of paper is 0.1 mm (0.0039 inches).  How high would the paper stack in this case?  Well, those 565,500,000 inches equate to about the distance from New York to Manila (over the Pacific), give or take a few hundred miles.*

To think that somebody could perform a break-and-enter like this (and get away with it) sounds so preposterous that the idea wouldn’t even make it into a B-movie script.  But when all these “pieces of information” are digitized into a bunch of zeros and ones, well, you can fit all of it into the palm of your hand.

And that’s what gives me heartburn, because we are doing such a poor job of understanding what is being stolen.  We spend billions of dollars innovating, labor for years, and all these valuable resources can be gone, poof, just like that, because somebody missed patching a system, left a terminal unprotected, or clicked a link they shouldn’t have.  This is asymmetry of galactic proportions.

So back to my point about normalizing theft: I think because we can’t “feel” the pain, we don’t give this issue the attention it deserves.  If I were a nefarious actor and I was able to siphon $5 a month from your bank account, would you care?  Before you answer … would you notice?  What if I disguised this siphoning as some sort of “fee” or a common everyday purchase?  You might not give it much thought and let it slide.  Now let me do that to a million people.  And let me do that to a different million people every week.  How does $260 million a year sound to you?

Does this sound like a tenable business model for an economy to survive?  Nope.  But that’s what we are dealing with when we normalize theft.

Sure, some may say “but we have services to protect us.”  Okay, but those services cost money, $10 a month, let’s say.  That’s $120 a year per individual.  To protect the 52 million people who would have gotten ripped off in the earlier scenario, that’s a hit of $6.24 billion annually.  That’s $6.24 billion that could have gone toward paying rent, buying a meal, helping a local foundation, or covering tuition or medication.
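For the skeptics, the back-of-the-envelope arithmetic behind those two figures is easy to check (Python, using the hypothetical numbers above):

```python
# Hypothetical siphoning scenario from the text
skim_per_victim = 5               # $5 quietly taken from each victim
new_victims_per_week = 1_000_000  # a different million people every week
weeks_per_year = 52

annual_take = skim_per_victim * new_victims_per_week * weeks_per_year
print(annual_take)                # 260000000 -> "$260 million a year"

# Cost of protection services for everyone hit over that year
protection_per_person = 120                               # $10 a month
victims_per_year = new_victims_per_week * weeks_per_year  # 52 million people
print(protection_per_person * victims_per_year)           # 6240000000 -> $6.24 billion
```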

Lost in so much of the cybersecurity conversation is that protection rarely offers a return on investment.  Protection is a tax on business and a tax on individuals.  So unless we start “feeling” this theft on a more personal level and take steps to properly educate ourselves about the human dimension, we are going to run out of money to invest in protection real fast.  People are generally not good at understanding risk, and we have often farmed that risk out to somebody else (insurers, public officials, you name it).  But even this model is becoming too expensive.  So it’s time we take a closer look at ourselves and ask whether we are part of the problem for having allowed data theft to be normalized.  We shouldn’t be so passive about it.  We should be outraged, because this is a slow strategic bleed of national strength and stability.

By George Platsis, SDI Cyber Risk Practice

October 3, 2017

* Correction: “I’m tempted to say what’s a few extra zeros among friends, but am forced to heed my own counsel…when you make a mistake, own it: it’s actually 565,500 inches, which is closer to 9 miles, more like New York to Hoboken and back…but that’s still a lot!”
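For anyone who wants to verify the corrected figure, the arithmetic fits in a few lines (Python):

```python
records = 145_000_000
inches_per_record = 0.0039        # roughly 0.1 mm per sheet of paper
stack_inches = records * inches_per_record
print(stack_inches)               # ~565,500 inches
print(stack_inches / 63_360)      # ~8.9 miles (63,360 inches in a mile)
```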

A New Shakespearean Tragedy?

Once more unto the breach, dear friends, once more;
Or close the wall up with our English dead.

KING HENRY V

In Shakespeare’s retelling of the life of King Henry V, he has the king urging his brave soldiers forward once more, hurling themselves at the breached walls of Harfleur in the campaign that culminated in the decisive battle of Agincourt.  The line has survived to become a common exhortation for giving something another try. One notes that King Henry did offer the alternative of dying in the gap of the wall, but the essential idea is to flow through the breach to victory.

Today we are dealing with a breach in which the flow is outbound, and there is no victory in sight.  The massive data breach suffered by Equifax has exposed the personal identifying information of over 143 million people. The attackers took people’s names, Social Security numbers, birth dates, addresses and, in some instances, driver’s license numbers. They also stole credit card numbers for about 209,000 consumers.  The breach is rightly seen as a monumental failing on the part of Equifax, and the repercussions are mounting rapidly.

Writing on the Gartner Blog Network, John Wheeler calls the breach a game changer for cybersecurity.   Among his predictions, Equifax will cease to exist. “In the last 4 business days since the company disclosed the data breach Equifax has suffered a $5.3 billion loss in market capitalization which represents almost a third of the company’s total value. When considering an estimate of the potential costs associated with the data breach (based on the 2017 IBM/Ponemon Institute Cost of Data Breach Study), Equifax faces a potential loss of $20.2 billion which currently exceeds their total market value by $8.3 billion. Also, the company currently faces more than 23 class actions lawsuits with at least one seeking more than $70 billion in damages. The death spiral will soon take on greater momentum when executives are required to testify before Congress and criminally investigated for potential insider trading related to the delayed disclosure of the data breach. Equifax will ultimately be acquired out of bankruptcy by one of the remaining two credit reporting companies – TransUnion or Experian.”

The “delayed disclosure” noted by Wheeler is extremely problematic. Equifax said it first detected suspicious behavior on July 29. It appears the breach dates back to May of this year, and some reports suggest it may have happened even earlier. Even if one accepts the July 29 date as the first instance in which Equifax became aware of the breach, several weeks went by before customers were made aware. The delay triggered outrage, and credit reporting companies have few friends, so the fury goes on unabated.

The fallout continues. Equifax’s Chief Information Officer and Chief Security Officer “retired,” and its CEO stepped down. More heads will likely roll. Forty states are investigating how Equifax handled the breach. Other regulatory agencies are launching investigations, and there is a real possibility that this breach will lead to significant change in law and regulation.

Once more, out through the breach.

By Tom Davis, SDI Cyber Risk Practice

September 26, 2017

Do You Trust Your Network?

 

The question seems simple enough, doesn’t it? But have you actually asked it? My feeling is that not enough people do. Of course, a natural response may be: isn’t that a question for my IT department to answer?

Yes and no (more on that in a moment). And I promise I am not trying to play word games, but words and their meanings matter, so I am placing particular focus on the word trust. Trust is different from confidence. Trust is different from transparency. Trust has a much more “personal” element than the others. And so much of what we do in the world today is based on trust.

There are times when confidence is the appropriate word. For example, “I am confident in Joe’s abilities, but I do not trust he will finish the job.” And there are times when transparency is appropriate, such as, “blockchain technologies offer transparency, but I do not trust them to serve as the backbone for a currency.”

Notice where I am going? These terms are not interchangeable. Somebody can be “transparent” with you but it is quite possible you do not trust them at all. Conversely, somebody who is not wholly transparent with you may earn your trust.

And trust is a funny thing because it guides so many of our actions. A simple example:

“Would you do business with Bob?”
“No. I know he has a solid track record, but something about him I just don’t trust.”

“Would you do business with Sally?”
“Yes. I know she doesn’t have the track record of Bob, but something about her just makes me feel she’s the right person to do business with.”

In other words, we are dealing with emotion and rational action may be taking a back seat.

So let’s get back to the IT department. I am not asking: do you trust your IT department? Rather, I am asking: do you trust your network? There is a difference. It’s huge. And if you don’t see it as being huge, your cybersecurity nightmares may only be in their opening act.

If you have 20 minutes, there is a 2010 podcast worth listening to by Brian Snow, who was the technical director of information assurance at the National Security Agency. It can be found here, and special thanks to my fellow #CyberAvenger Chris Veltsos for pointing out this podcast. At around the 16-minute mark, Brian Snow talks about the “trust bubble” and notes that while “trust” is “widely used,” it is also complicated and poorly understood.

Our world operates with so much going on in the background that we seldom give thought to how complicated things can be. Therefore, the only way we can operate and conduct business is with certain levels of transparency, confidence, and trust. For example, I am confident my ISP will provide reliable service so I can get my professional work done, but I do not trust my ISP when they say they are the “best service provider” or “the fastest network” or that they will “have 99.9999% uptime” or whatever else you can think of (nor do I think they make their billing particularly transparent, but that is unrelated to network reliability). In other words, I’m keeping my expectations in check.

In fact, I keep my expectations so “in check” that I expect my services to go down from time to time, because that’s just life! Bad connections, server timeouts, bandwidth issues, and yes, even potential DDoS attacks and hacks! I expect all of these to happen because my trust in network capabilities can only go so far. Sure, I can invest more capital and overhead, but I do not have a printing press for money, so that solution is untenable over time. You need to use your resources wisely, which is why I do things like regularly patch and update, keep offline backups, back up devices, have alternate connectivity means, and – get ready for it – even plan for total shutdown (and sometimes the plan is “no way to do work today, find something else to do”).

In summary, I simply do not trust my network to be as reliable as the sun coming up in the east every morning. And keep your expectations in check: there are very, very few operations that can justify the need (and cost) for 100% uptime, and even those are susceptible to the freak event that shuts them down.
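To keep those uptime claims in perspective, here is a quick sketch (Python) of what different uptime percentages actually allow in downtime per year:

```python
def downtime_seconds_per_year(uptime_fraction: float) -> float:
    """Seconds of allowed downtime per year at a given uptime level."""
    seconds_per_year = 365 * 24 * 3600
    return (1 - uptime_fraction) * seconds_per_year

for uptime in (0.999, 0.9999, 0.999999):
    print(f"{uptime:.4%} uptime -> "
          f"{downtime_seconds_per_year(uptime):,.0f} seconds down per year")
```

Three nines buys you almost nine hours of outage a year; the six nines in the marketing copy allows about half a minute. Plan for the former, not the latter.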

As for social engineering attacks, shame on me if I get suckered into one. I don’t expect my network to protect me from them. Remember, a social engineering attack goes after YOU FIRST, before the actors execute the rest of their plan.

Side commentary: WOW! Some of these social engineering attacks are getting really sophisticated, and I am impressed. One of the best I have seen in the last few months is the attacker pretending that you initiated the conversation, so that the attacker appears to be “replying” to your original query. Be careful before you click “reply,” because sometimes all the attacker wants you to do is just that: click reply, and hand over an e-mail address, a device ID, an OS version, message headers, or the basic information on your signature line. All of these information leaks can come back to haunt you.

But back to my original question: do you trust your network? If your trust in network reliability is rooted in the trust you have in your IT department, I have a car I want to sell you. I do not say this as a knock against your IT department, but if we can be perfectly candid for a moment: if your IT department has full trust in your network’s reliability, you should be concerned. Granted, the IT department can be confident about the network, but confidence should come from an honest and thorough assessment of the situation.

Therefore, if your IT department says to you, “we’re confident we do not have any malware on our network” ask how they came to that conclusion. If instead they say, “we do not have any malware on our network, honest, trust us!” then raise an eyebrow and get your hands dirty, because you have work to do.

By George Platsis, SDI Cyber Risk Practice
September 12, 2017

 

 

Cybersecurity Valuation and Your Organization

Cybersecurity is everywhere. Everybody is talking about it. Everybody is worried about it. And everybody thinks they need to do something about it.

The problem is that everywhere we look, we get this general feeling that we are failing. One report suggests that only 1 in 5 organizations is “very mature” in its adoption of the NIST Cybersecurity Framework. The GDPR is around the corner (May 2018), but some estimates show only 25% of EU countries are ready for it. Good luck to the rest when those astronomically heavy fines kick in.  And how long until non-New York State entities are forced to follow the NY Department of Financial Services’ new cybersecurity regulations just so they can keep doing business in NY? The transitional period for covered entities ends on August 28, 2017, so you had better be ready!

So fine, we get that there are regulations and statutes and frameworks, all of which need to be followed. But there is a much more basic question that does not necessarily get asked: do you, as an organization, value cybersecurity? I am quite certain most will say “yes,” but do you value cybersecurity as a “nice to have” kind of thing, or as an “I need this or my life will be over” kind of thing?

I believe one of the greatest challenges we face when trying to address our cybersecurity issues is that we have done a poor job of valuing our assets. In the traditional brick-and-mortar sense, we would normally hire an appraiser or an insurance company to assist with this task.  If a sale were more complex, such as the valuation of goodwill, we would bring in a legal or financial firm that specializes in mergers and acquisitions. Could these firms help you with cyber-related valuations? Perhaps, but they are still trying to get their own heads wrapped around the cybersecurity problem.

Ultimately, you should be able to “put a price” on your organization. In the brick-and-mortar model, it is pretty easy: I have building X, its market value is Y, and its replacement value is Z if something goes wrong in a flood, a fire, or whatever other “tangible” crisis you could face. Not only can you put a price on these issues, you can estimate recovery times, and you probably even have a Rolodex of contractors or service providers who could help you out. And perhaps most importantly, you can budget for such a tangible crisis. All of this is pretty straightforward stuff: have insurance, keep an operating line of credit handy, keep your debt leverage in check, and have some cash on hand (also known as the “rainy day” fund for most of us).

Do we do any of these things for cybersecurity-related issues?

My feeling is that we do not, because we have not valued our assets from a cybersecurity perspective. We do not know the true cost of a damaging social media campaign. We do not know the true cost of massive intellectual property theft. And we do not know the true cost of network downtime.

Why?

I have a couple of theories why, in no particular order:

1) This is hard to do and when things are hard to do, we like to avoid them.

2) We do not know where to start.  How many of us can actually put a dollar figure on the goodwill value of our firm?

3) We still think cybersecurity is a technical issue, so leave it to IT to figure out.  (This would be a big mistake by the way.)

4) We do not have a true appreciation of how much we really rely on technology.

I could go on, but I think this is a good enough list to start with.  Your question now could be: okay, stop telling me problems and start giving me solutions!

Here is my first and perhaps most important solution: put a number on what you value, even if that number has to be somewhat arbitrary, especially for the intangible things like client records, intellectual property, goodwill, and brand.

Why?  

Because it gives you a starting point. If I think the goodwill of my business is worth $100,000, I will not spend $100,001 on cybersecurity measures. But if I think the goodwill of my business is worth $10,000,000, then perhaps spending $500,000 on cybersecurity measures seems like a good idea, whatever those measures are (technical fixes, employee training, system upgrades, crisis communication plans, social media response teams, you name it).
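Here is a minimal sketch of that budgeting logic (Python, with made-up asset values and an arbitrary spending ratio, purely to illustrate the “put a number on it” starting point):

```python
# Hypothetical, admittedly arbitrary valuations -- the point is to write numbers down
asset_values = {
    "client_records": 3_000_000,
    "intellectual_property": 5_000_000,
    "goodwill_and_brand": 2_000_000,
}

total_value = sum(asset_values.values())    # $10,000,000
spend_ratio = 0.05                          # arbitrary: willing to spend 5% of value at risk

budget_ceiling = total_value * spend_ratio  # $500,000
print(f"Assets valued at ${total_value:,}; "
      f"starting cybersecurity budget ceiling ${budget_ceiling:,.0f}")
```

The ratio itself matters less than the discipline of revisiting it once the valuations stop being arbitrary.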

If you think your client Rolodex (which is all digitized now) is worth gold because it took your firm 30 years to build up that network, treat that Rolodex as though it belongs in Fort Knox. If the reason you are able to charge a significant premium over your competitors is the brand value you have built over years of interpersonal relationships with your stakeholders, protect the brand like it is the most important thing in the world to you.

But put a number on it! The value of “the number” is that you can at least start to budget what you are willing to spend, especially when you are not sure where to start.

Like I noted, this isn’t easy, but it’s necessary. And it will be an important first step to help you with your own cybersecurity challenges.

By George Platsis, SDI Cyber Risk Practice

August 22, 2017

Collateral Damage in Cyber Warfare

Hot on the heels of the infamous WannaCry ransomware attack came the less heralded and seemingly less consequential Petya cyberattack. WannaCry was big and bold, and obviously well named. Petya didn’t seem to measure up, and researchers noted that less than $10,000 was paid in ransom. However, it soon became apparent that Petya was not a ransomware attack, but actually aimed at destroying data. Given that much of the damage associated with Petya focused on Ukraine, suspicion quickly turned to Russia, the assumption being the attack was part of Russia’s ongoing efforts to destabilize Ukraine. Whether the attack actually was carried out by individuals acting on behalf of Russia remains unproven, but what is clear is that, as is the case in all conflicts, there are ancillary casualties.

Take, for example, FedEx, which acquired Dutch shipping company TNT Express for $4.8 billion last year to compete with United Parcel Service Inc. and Deutsche Post AG’s DHL. What seemed like a good, aggressive business move has now become a major headache. TNT operations were completely disrupted by the Petya attack, and FedEx now says it has not been able to recover some systems and may never be able to recover some critical business data.

FedEx just filed its Securities and Exchange Commission (SEC) Form 10-K, and it forecasts material losses. The list of reasons why those losses are mounting is instructive:

⋄ loss of revenue resulting from the operational disruption immediately following the cyber-attack;
⋄ loss of revenue or increased bad debt expense due to the inability to invoice properly;
⋄ loss of revenue due to permanent customer loss;
⋄ remediation costs to restore systems;
⋄ increased operational costs due to contingency plans that remain in place;
⋄ investments in enhanced systems in order to prevent future attacks;
⋄ cost of incentives offered to customers to restore confidence and maintain business relationships;
⋄ reputational damage resulting in the failure to retain or attract customers;
⋄ costs associated with potential litigation or governmental investigations;
⋄ costs associated with any data breach or data loss to third parties that is discovered;
⋄ costs associated with the potential loss of critical business data;
⋄ longer and more costly integration (due to increased expenses and capital spending requirements) of TNT Express and FedEx Express; and
⋄ other consequences of which we are not currently aware but will discover through the remediation process.

Oh, and FedEx also noted it did not have insurance against these losses. Going forward, FedEx may become the poster child for why cyber insurance makes sense.

By Tom Davis, SDI Cyber Risk Practice

July 25, 2017
