Wednesday, May 28, 2014

Would You Spend $100 to Save $1 Billion? If Not, You Could Be the Next Target.

Estimates put the cost of this past December’s relatively simple hack on retail giant Target at over $1 billion to date, not counting the ongoing legal exposure and the subsequent damage to its high-value brand. The ripple effect of more than one hundred million credit card numbers being stolen and quickly sold online to a global black market of ready buyers is staggering, and will ultimately affect millions of consumers, as well as thousands of banks and retailers worldwide.

Remarkably, Target was one of the largest retail customers of Wall Street darling FireEye, which failed to identify and thwart the hack. Despite a marketing budget of more than $40 million last year, FireEye’s advertising no doubt rings hollow with senior management at Target, particularly the CIO and CEO, who were terminated in the wake of the hack. This is just another example of an internal IT group that, lacking the needed cyber security domain knowledge, put the fate of its company in the hands of some well-marketed but flawed legacy cyber security solutions. It has now been proven that these legacy solutions simply cannot deliver genuine cyber security. What is particularly poignant is that for as little as $100 per credit card terminal, Target could have secured these critical infrastructure components of its high-value digital information assets.
There are thousands of companies similarly deluded by well-meaning IT marketers, and by their own generally competent IT managers. In fact, most IT managers will admit that cyber security is a horse of a different color, far different from the typical challenges they regularly face. In general, IT managers understand the vastness of the problem, but have been led to believe by legacy providers that there is no “silver bullet” solution. Like any good fallacy, this folklore has some truth in it. There is no single solution to the problem of cyber security; however, there are several components that, when fully integrated and complemented by tools that incorporate IT management’s intimate knowledge of their own business, can push the effectiveness of cyber security to 99.99% or better. The best part is that it’s not expensive.

I’ll be writing about these very real “silver bullets” and how to properly deploy them in my next article on the subject, some time in June. Until then, if you can’t wait, email me (Ed@vir2us.com) and I’ll lay out the seven steps for you.

Thursday, June 27, 2013

Snowden Classified Data Theft Incident was Avoidable


The Snowden incident (in which a government intelligence worker was able to easily copy and disseminate large amounts of highly classified data) highlights one of the fundamental problems of legacy cyber security and the thinking behind it. As with many complex technology problems, people who lack the domain knowledge needed to identify solutions tend to focus on the symptom, at least in part to cover up the fact that the knowledge is lacking. Unfortunately, next-generation cyber security technology, which the government is trying to adopt and implement, is a solution that few people in government understand. The Federal Government is not alone in its slowness to implement next-generation cyber security, however. Banks, oil, gas, water and power utilities are similarly vulnerable when it comes to protecting digital assets and critical infrastructure.
The Snowden incident could easily have been avoided with some next-generation digital asset protection. Snowden’s ability to simply copy terabytes of classified data was possible, at least in part, because of a reliance on obsolete technologies, security strategies and processes. The government (the NSA in particular) has long focused on high-grade cryptography to protect data, and in this area commercial firms have tended to follow the government’s lead. However, the advent of the Internet and global networks changed the game significantly with respect to protecting data.
 
The government tends to use encryption as an all-or-nothing proposition, encrypting whole hard drives or databases at the file level. The problem with this approach is that once a user has entered the access credentials, the entire file or drive is completely exposed. Using triplex authentication in conjunction with folder- and record-level encryption solves the problem. In such an environment Snowden would have been able to do his job, and even bring together large amounts of data and data files, but all of the data would have remained encrypted except when he was viewing query results or a limited number of individual records. He could never have copied entire files, at least not without triplex-authentication notification and the approval of a higher-up, and not without the copied files remaining encrypted at the record level. This means that even if he had gotten approval to copy the data to an external storage medium or the cloud, the files would not have been divorced from the triplex authentication required to view or query the data. Additional protections are available that would have destroyed the encryption lock if authentication failed even once, since the files would have been tagged as copies outside their home domain.
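To make the folder- and record-level idea concrete, here is a minimal sketch of per-record key derivation in Python. This is my own toy illustration, not any vendor’s product: the names (`encrypt_record`, the SHA-256 keystream) are hypothetical, and a real deployment would use a vetted AEAD cipher rather than this demonstration keystream. The point is the structure: each record is sealed under its own derived key, so unlocking one record (say, a query result) exposes nothing else, and a bulk-copied file remains ciphertext.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream built from SHA-256 -- for illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_record(master_key: bytes, record_id: str, plaintext: bytes):
    # Each record gets its own key, derived from the master key and record ID,
    # so possession of one record key reveals nothing about the others.
    record_key = hmac.new(master_key, record_id.encode(), hashlib.sha256).digest()
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(record_key, nonce, len(plaintext))))
    return nonce, ct

def decrypt_record(master_key: bytes, record_id: str, nonce: bytes, ct: bytes) -> bytes:
    record_key = hmac.new(master_key, record_id.encode(), hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(ct, keystream(record_key, nonce, len(ct))))

# A "file" of records: copying this store wholesale yields only ciphertext.
master = secrets.token_bytes(32)
store = {rid: encrypt_record(master, rid, data)
         for rid, data in {"rec-001": b"alpha", "rec-002": b"bravo"}.items()}

# Viewing a single query result decrypts one record; the rest stay sealed.
nonce, ct = store["rec-001"]
print(decrypt_record(master, "rec-001", nonce, ct))
```

In a real system the master key would never sit beside the data; it would be released record-by-record only after the (triplex) authentication step succeeds.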
There are other considerations and failures the government says may have occurred in this incident, but most of these revolve around manual processes, policies and procedures that are reliable only when they are part of closed-loop processes, and even then they rely on timely communication. Finally, the Federal Government continues to operate with legacy cyber security that provides little or no protection once access is achieved. The President recently issued an executive order to address the issue, and I recommend firms consider doing the same.

Thursday, May 30, 2013

Why Many Companies Are Failing to Achieve Genuine Cyber Security

There are several key reasons many companies are failing to implement genuine cyber security. Cyber security was an afterthought of a computer industry that did not envision or plan for the connected world we live in today. Nearly all cyber security solutions on the market today fail to follow the eight time-tested principles of security, relying instead on a post-attack ability to identify and catalog known threats after the damage has been done. Nearly all available solutions were not “built in,” but instead sit on top of the OS and rely on it for their functionality. Another major reason for this failure is that senior managers are looking to IT professionals to solve a problem that is less about IT than it is about process and mathematics, and few IT professionals are process engineers or mathematicians.

Next-generation cyber security will be built into applications and computing environments to create inherently secure processes that do not need to identify threats, but instead handle processing in a way that makes such threats irrelevant. Many still don’t realize that the computing platform architectures we rely on today are more than thirty years old and reaching the end of their lifecycles. They were not designed with the Internet in mind, nor did their designers envision the secure-computing problems such an environment would produce.

Wednesday, May 29, 2013

Is Your Company in Denial about Denial-of-Service Attacks?

Denial-of-service attacks are a direct assault on your company’s online revenue stream. These attacks are easy for hackers to pull off, and your company should not simply hope it won’t be targeted. Denial-of-service attacks are not limited to a few high-profile companies; every company with significant online revenue is at risk, and the attacks are costing firms billions of dollars. The bad news is that legacy cyber-security firms have no genuine solution, in part because most of them lack the deeper domain knowledge required to problem-solve and innovate in this space.
  
Back in the very early 1990s, when the Internet was still new, some of the big ISPs, like UUNet, were positioning themselves to be acquired by big telecom operators (UUNet was, in fact, acquired by WorldCom). I remember a network planning session at which I pointed out to UUNet executives that the Internet lacked the identifiers that governed telecom networks, and that these would be easy to add at that early stage of development. The response was, well, I don’t recall precisely what it was, but it went something like, “we don’t need no stinking identifiers.” Their attitude was understandable at the time. Demand for access and bandwidth was already growing at a mesmerizing rate, and all they could think of was how to feed the beast.

The design I had suggested would have identified every user who hopped onto the Internet, along with their location, point of access, and so on. Like telecom networks, it would also have assigned each user a class of service, or COS, that determined what they were or were not allowed to do. If for any reason someone managed to get on the network without this independent-channel authentication (something that was very difficult to do), they were assigned a default class of service that allowed them to do almost nothing.
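The default-deny policy table at the heart of that design can be sketched in a few lines. This is a hypothetical illustration of the concept, not the actual network implementation; the identity strings and permission fields are invented for the example. The essential property is the fall-through: any identity not established through out-of-band authentication lands in a class of service that permits almost nothing.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ClassOfService:
    name: str
    may_query: bool
    may_copy: bool
    max_requests_per_sec: int

# Hypothetical policy table: identities authenticated over the independent
# channel map to a class of service with real privileges.
POLICY = {
    "analyst@hq": ClassOfService("analyst", may_query=True, may_copy=False,
                                 max_requests_per_sec=100),
    "backup-svc": ClassOfService("backup", may_query=True, may_copy=True,
                                 max_requests_per_sec=10),
}

# Anyone who reaches the network without that authentication falls through
# to a default class that allows almost nothing.
DEFAULT = ClassOfService("untrusted", may_query=False, may_copy=False,
                         max_requests_per_sec=1)

def class_for(identity: Optional[str]) -> ClassOfService:
    if identity is None:
        return DEFAULT
    return POLICY.get(identity, DEFAULT)
```

Note that the safe outcome is the default: an unknown or missing identity requires no special-case handling to be contained.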
I recently resurrected this design with my engineering group to create a denial-of-service solution that Vir2us will offer this fall (2013). I’ve added some cool features and tools we didn’t have back when processors were slower, storage and memory were not such low-cost commodities, and we lacked cloud-based speed and scalability. There is some complexity here to be sure, and we’ve created some new IP with these innovations that we expect to license to others, but we know the approach works because we implemented its older brother in hundreds of early private and public digital networks.

Just how does all this stop denial-of-service attacks? It’s really quite elegant, and it also solves some other annoying problems with the Internet’s architecture. A denial-of-service attack is like too many people asking you questions all at the same moment rather than in succession: at some point you simply can’t respond quickly enough, and everything stops. Now imagine that only the people you pre-selected were allowed to ask you questions, and that you and they spoke a language known only to you and that select group. You simply wouldn’t hear requests made in other languages, and therefore would feel no need to respond. There’s a little more to it, of course, but you get the idea. You can get notice of the beta release by subscribing to this blog.
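The “private language” analogy can be sketched with a shared-secret message tag. To be clear, this is my own simplified illustration of the idea, not the Vir2us design: the function names are invented, and a production system would handle key distribution and rotation far more carefully. A request that doesn’t carry a valid tag is simply dropped, with no reply and almost no work done, so flood traffic from outsiders never reaches the expensive part of the server.

```python
import hashlib
import hmac
import secrets
from typing import Optional

# Shared secret distributed out-of-band to the pre-selected group --
# this is the "language" only you and they speak.
SHARED_SECRET = secrets.token_bytes(32)

def tag(message: bytes, secret: bytes = SHARED_SECRET) -> bytes:
    # Clients attach this tag to every request they send.
    return hmac.new(secret, message, hashlib.sha256).digest()

def handle(message: bytes, mac: bytes) -> Optional[str]:
    # Requests not "spoken in our language" are silently discarded:
    # one cheap constant-time comparison, no response, no state kept.
    if not hmac.compare_digest(mac, tag(message)):
        return None
    return f"processed {len(message)} bytes"

# A pre-selected client is heard; an outsider's flood packet is ignored.
good = b"GET /quote"
print(handle(good, tag(good)))
print(handle(b"flood-packet", b"\x00" * 32))
```

The design choice worth noticing is that rejection costs one hash comparison and sends nothing back, so the attacker gets no amplification and no acknowledgment that a server even exists at that address.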

Sunday, May 5, 2013

Next-Gen Cyber Security Is About Big Profits, Not IT Budgets

 
One of the challenges of cyber security at the CXO level is that legacy cyber security solutions have historically been seen, for the most part, as just an expense. Next-generation solutions, however, are poised to become one of the biggest profit opportunities for many firms. Senior management has been reluctant to move quickly on these solutions, and it’s no surprise: they have spent millions (or in some cases billions) on legacy security solutions that simply don’t work. Many managers know this truth first hand, while others have only read about it.
 
Some senior executives know their present security solutions are not working but don’t want to pay two or three times for what they feel they should have gotten from the first investment. They and their IT staff also lack the know-how to assess new solutions. Other companies see this merely as a “tech-buying” budget decision. Many of these managers are reluctant to admit that what they have is not working, while simultaneously hoping they don’t get hacked.
 
The fact is, next-generation cyber security is poised to create one of the largest profit-enhancement opportunities available to business today. Why? Because legacy solutions are not just flawed; they eat up to 80% of network bandwidth capacity and computer processing power, and they put firms at risk for large chunks of downtime. That means large chunks of money right off bottom-line profits (just ask Sony), to say nothing of the cost of fixing compromised systems and networks after the attacks are over (U of M and Saudi Aramco). This failure to achieve real cyber security also stifles new and innovative product and service offerings, because those products cannot be secured for export; IP is too easily stolen, copied or counterfeited (IDC analyst report, Oct. 2008).
 
What we have learned over the years is that firms don’t focus on what they are not measuring. In the telecom industry, revenue-assurance solutions were quickly created once providers began measuring unbillable network call records for the first time and discovered that these had grown from a fraction of one percent to over six percent of total revenue in ten years. Last year President Obama said the world is losing a trillion dollars to cyber-related crime each year. The U.S. government is beginning to measure this; most firms still don’t.
 
In the near future, products, services and online platforms with next-generation, built-in security and privacy features (so that customer information and the products or services themselves cannot easily be hacked) will be given higher valuations by Wall Street and investors. Firms that lack such a strategy for enhancing profit and shareholder value will be eliminated as serious competitors in their market space.