by Ross Oliver | Technical Consultant at Taos
October is National Cyber Security Awareness Month (https://www.dhs.gov/national-cyber-security-awareness-month), and Halloween, October's signature holiday, has never seemed more appropriate: no ghost, witch, or goblin can be as scary as this month's cyber security events.
October 3rd: Yahoo revised its 2013 breach estimate upward to include every single Yahoo account, more than three billion in total.
October 6th: web commenting service Disqus announced a 2012 data breach affecting a mere 17 million user accounts.
October 10th: consulting firm Accenture confirmed that a store of access credentials and encryption keys ("keys to the kingdom," as the announcement put it) had been exposed on several publicly accessible servers.
October 11th: September's Equifax breach continued to grow new warts, as security researchers discovered a public-facing Equifax website redirecting visitors to malware.
October 12th: Hyatt Hotels announced its second breach in 2 years.
October 16th: researchers disclosed KRACK, a serious vulnerability in the WPA2 Wi-Fi protocol.
Although these events have dominated the headlines, the now seemingly routine attacks of ransomware, CEO fraud, DDoS, and botnets continue unabated.
The Big Data era has given rise to the Big Breach epoch.
Why do these breaches seem to be accelerating in both frequency and size, and what can be done to prevent them?
One contributing cause is technological advancement itself. Even 10 years ago, constructing a single data store for hundreds of millions or billions of records was next to impossible. Implementations generally consisted of several separate data stores linked with custom software, and attempting to steal all the data was equally complex. However, advances in software, data storage, computing power, and network bandwidth have vastly simplified the creation, and by corollary the theft, of large datasets. Centralization means that billions of records may be protected by only a single access credential that, when obtained (or bypassed), grants access to all the data.
Several defense-in-depth strategies can potentially be applied to address the access brittleness of large databases:
– Segment the data, requiring separate credentials for different sections
– Require multiple credentials to access particularly sensitive records or fields
– Apply query limits and/or throttling mechanisms to impede attempts to extract unusually large amounts of data
– Impose additional software or service layers between the database and public networks
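As an illustration of the third item, query throttling, here is a minimal sketch of a sliding-window throttle that caps how many rows any one client may fetch per minute. The class and its parameters (`QueryThrottle`, `max_rows`, `window_seconds`) are hypothetical names for this example, not part of any particular database product:

```python
import time
from collections import deque

class QueryThrottle:
    """Hypothetical sketch: deny a client's query once it would exceed
    max_rows total rows fetched within a sliding time window."""

    def __init__(self, max_rows=10_000, window_seconds=60):
        self.max_rows = max_rows
        self.window = window_seconds
        self.history = {}  # client_id -> deque of (timestamp, row_count)

    def allow(self, client_id, requested_rows):
        now = time.monotonic()
        events = self.history.setdefault(client_id, deque())
        # Discard fetch events that have aged out of the window.
        while events and now - events[0][0] > self.window:
            events.popleft()
        used = sum(rows for _, rows in events)
        if used + requested_rows > self.max_rows:
            return False  # deny: looks like an unusually large extraction
        events.append((now, requested_rows))
        return True

throttle = QueryThrottle(max_rows=10_000, window_seconds=60)
print(throttle.allow("analyst-1", 9_000))  # within budget
print(throttle.allow("analyst-1", 5_000))  # would exceed 10,000 rows: denied
```

In practice a check like this would sit in the service layer in front of the database (the fourth item above), so that no single credential can quietly stream out an entire table.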
As databases grow large yet access grows easier, we must devise new methods to impede unauthorized access while still enabling legitimate uses of the data.