Image from cxotoday.com.
Cross-posted to the Blue Coat security blog.
Apologies for the delay in getting this last post in the series out, but I recently got married! We’ve looked at five previous trends for this year and what they’ll mean in the cybersecurity landscape:
- botclouds not botnets: the idea that physical machines are becoming less and less important for botnets, and that malware authors will take advantage of the same trends in virtualisation to which everyone else has access
- litecoin not bitcoin: the idea that cryptocurrency malware will start to target the easier-to-create scrypt-based altcoins
- wearable malware: the idea that as devices start to accompany us permanently, so too will malware
- deep web surfacing: the idea that the deep web (based on Tor, et al.) which has been largely the domain of the ‘hard-core geek’ is now starting to move more mainstream
- 3rd party security requirements: the idea that organisations will require anyone connecting to their networks to pass the same stringent security requirements that they themselves must pass
The sixth trend logically follows from all of this: what is now thought of as “Big Data” will soon become “Huge Data”. The idea behind big data (and I’ll refuse to keep capitalising it from now on 😉 ) is that the amount of data that organisations are accumulating (in any area, but especially in cybersecurity) is so voluminous that it is simply impossible for any human, and for most automated systems, to scale to it. This requires bigger and better automated systems, and even getting through the data becomes a science in and of itself. This is of course true, and the issue will only grow.

Think of a standard corporate network: there will be hundreds of users, each with their requisite computers and files, making thousands of network access requests a day, including to the internet. A responsible organisation will at least need to be aware of these requests and have some measure of visibility into them. Most organisations will also need monitoring and alerting systems for when things go wrong, and the ability to turn some of this voluminous data into actionable intelligence when responding to incidents.

The real issue comes when one stops to think about how much data is currently hidden. Estimates of how much of an organisation’s traffic is SSL vary widely, but even if it’s 25%, you can safely assume that one out of every four requests crossing your network is one you are generally unable to see and unable to do anything with. If the figure is higher, the issue is only greater. Malware authors have known about, and exploited, this blind spot for a long time: most C&C (command and control) traffic that malware uses is encrypted over port 443. This essentially gives them free communication, because there is no distinguishing bad encrypted traffic from good encrypted traffic. That will change.
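To make the visibility gap concrete, here is a rough sketch of estimating how much traffic crosses a network opaquely. The flow log, port list, and byte counts are entirely illustrative (not from any real capture); a real deployment would read flows from a NetFlow or proxy log rather than a hard-coded list.

```python
# Rough sketch: estimate the share of network traffic that is opaque to
# inspection, assuming a simple flow log of (destination_port, byte_count)
# tuples. All values below are illustrative.

flows = [
    (80, 120_000),   # plain HTTP  - inspectable
    (443, 90_000),   # SSL/TLS     - opaque without SSL visibility
    (25, 4_000),     # SMTP        - inspectable
    (443, 60_000),   # SSL/TLS     - opaque
    (8080, 30_000),  # HTTP proxy  - inspectable
]

# Ports assumed to carry encrypted traffic we cannot currently inspect.
OPAQUE_PORTS = {443}

total = sum(byte_count for _, byte_count in flows)
opaque = sum(byte_count for port, byte_count in flows if port in OPAQUE_PORTS)

print(f"Opaque share of traffic: {opaque / total:.0%}")  # prints "Opaque share of traffic: 49%"
```

Even this toy log shows the point: the moment SSL becomes a meaningful fraction of flows, a large slice of what crosses the network is invisible to the tools watching it.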
SSL visibility refers to an organisation responsibly examining SSL traffic so as to get a handle on the information it isn’t currently seeing. Responsible SSL visibility means doing this in such a way that the organisation inspects what it needs to inspect without examining all the online banking of its employees. This is possible today.

What this will do, though, is significantly increase the amount of data that organisations are seeing, and in just this one area. Incident response teams will become much busier as previously unseen C&C traffic becomes visible; content filtering and/or blocking of SSL-accessible-only sites becomes more necessary; and the amount of data skyrockets. This trend will continue for the near future, well into 2015 and beyond.

Now that it’s April, I’ll start thinking about trends for 2015, and hopefully get those out a little sooner than April of 2015. 😉
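The “responsible SSL visibility” idea above (decrypt and inspect by default, but leave employees’ sensitive sessions alone) can be sketched as a simple bypass policy. The hostnames and category mapping here are hypothetical; real deployments would pull categories from a URL-categorisation feed rather than a hard-coded dictionary.

```python
# Minimal sketch of a selective SSL inspection policy: decrypt by default,
# but bypass categories where employee privacy should win. All hostnames
# and categories below are illustrative assumptions.

BYPASS_CATEGORIES = {"banking", "healthcare"}

SITE_CATEGORIES = {  # hypothetical categorisation data
    "mybank.example": "banking",
    "clinic.example": "healthcare",
    "filedrop.example": "file-sharing",
}

def should_decrypt(hostname: str) -> bool:
    """Return True if this SSL session should be decrypted for inspection."""
    category = SITE_CATEGORIES.get(hostname, "uncategorised")
    return category not in BYPASS_CATEGORIES

assert should_decrypt("filedrop.example")    # inspected
assert not should_decrypt("mybank.example")  # privacy bypass
```

Note that unknown hostnames fall through to inspection by default, which is the conservative choice here: uncategorised destinations are exactly where previously unseen C&C traffic is likely to hide.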