I was tempted to use the 1980s-era Irish political acronym GUBU (Grotesque, Unbelievable, Bizarre and Unprecedented) to describe yesterday's announcement by Chancellor Darling of the loss of 25 million UK citizens' data records. Grotesque – yes; bizarre – putting 25 million private records on two unencrypted CD/DVD disks and sending them to London, unregistered, via an external courier – yes; unprecedented – in its enormous scale – perhaps; but unbelievable – most definitely not!
This particular accident was bound to happen sooner or later, and not just at HMRC. Organisations large and small, private and public, needed a wake-up call on the dangers of ignoring the seething monster that lies at the heart of modern business – the database – and this may be it!
As the scale and scope of data that even small businesses regularly accumulate increases, the organisational and technical architecture to manage this new "resource" has not kept pace. The ability of internet protocols and personal storage devices such as USB "sticks" and CD/DVDs to transport and copy large amounts of structured data has not yet sunk into most organisations' collective thinking. In fact, the open and seemingly wild'n'dangerous internet is seen as the greatest cause of concern, when it's actually the risks within the firewall that really need attention.
When using the internet, most people are now aware of the risks and either avoid them entirely or, more commonly, manage them. Most organisations, while actively managing internet access (often to the extreme), totally let down their guard within the firewall, often depending on nothing more than "password protection" and/or "security policies". In many companies you'll find that 90% of the staff have access to 90% of the data held by the company, with little or no "need-to-know" monitoring.
I firmly believe that the WAN is the new LAN, driven not just by the expansion of broadband and cheap laptops but by the increasing use of contractors, external providers and free agents within business processes. Just as businesses no longer run their own electricity plant and networks, they will soon no longer maintain their own local area networks. With this move to the WAN, the way we access and analyse our data stores has to change. Luckily there's been a world-wide experiment going on for the last 15 years or so, trying out new ideas, coming up with best practice and educating the general populace; that great experiment has been the consumer-focused internet. It's still not perfect but it's getting there, and the risk-management technologies and procedures that designers of consumer-facing applications have to consider are the very ones that designers of so-called "internal systems" should now focus on.
A top-of-the-head list of "design principles" for a WAN-accessible data architecture:
- "Need to know" access controls applied to all system interactions, alongside active monitoring and logging of all data access requests using system-independent tools such as Splunk.
- Apply the "need to know" principle again to what data to store in the first place, and then to what data to transfer and share – i.e. only share detailed data if it's absolutely necessary, and use summaries/aggregates where possible. This might mean a return to task-specific systems and data stores, but for highly confidential data it may be the only way.
- Encryption, encryption, encryption. Including the use of VPNs (even tiny businesses could use something like Hamachi) and encrypted data partitions on PCs (especially laptops!).
- Store this encrypted data in professionally run, staff-vetted, secure and replicated central data centres, e.g. Amazon S3.
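The "share aggregates, not detail" principle above can be sketched in a few lines of Python. The record fields, names and postcodes here are entirely hypothetical – the point is that the function exposes only a count per area, never the names or amounts underneath:

```python
from collections import Counter

# Hypothetical detailed records: never share these directly.
records = [
    {"name": "A. Smith", "postcode": "SW1A 1AA", "benefit": 81.20},
    {"name": "B. Jones", "postcode": "SW1A 1AA", "benefit": 76.50},
    {"name": "C. Brown", "postcode": "EH1 1YZ", "benefit": 81.20},
]

def claimants_per_area(rows):
    """Share only a claimant count per postcode district,
    not the underlying names or payment amounts."""
    return Counter(r["postcode"].split()[0] for r in rows)

summary = claimants_per_area(records)
print(summary)  # Counter({'SW1A': 2, 'EH1': 1})
```

The recipient gets exactly what they need to know – how many claimants per district – and a lost disk of summaries exposes nobody.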
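And for the "encryption, encryption, encryption" principle: even a small shop can encrypt records before they ever touch a disk or a courier's satchel. A minimal sketch using the third-party `cryptography` package's Fernet recipe (the record contents are hypothetical, and in practice the key would live in a key-management service, not next to the data):

```python
from cryptography.fernet import Fernet

# Generate a key once and store it separately from the data
# (e.g. in a key-management service or on a hardware token).
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"name:A. Smith,children:2"  # hypothetical confidential record
token = cipher.encrypt(record)        # this ciphertext is safe to post on a CD
assert cipher.decrypt(token) == record  # only the key holder can read it back
```

A disk full of Fernet tokens lost in the post is an embarrassment; a disk full of plaintext records is a catastrophe.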