A new computing environment is adding layers of complexity to networks. Internet applications, service architectures, virtualization, and mobile devices are making data center operations more flexible, but also erasing traditional boundaries. The recent ransomware attack on Ottawa Hospital drives home the growing demand for data protection. Everyone wants secure transactions and protected storage, but shifting technologies require that all data center elements integrate properly while remaining secure. That is increasingly difficult, but there are effective ways to keep your information on lockdown.
Control Physical Access
There is so much to the network side that some people overlook the physical security that keeps unauthorized persons away from hardware. This normally includes locked doors, badge or keycard access, security cameras, and locked cages for servers holding sensitive data. Data centers may also be divided into separate rooms with different security levels: one for testing and staging, one for development, and the highest security reserved for production servers.
Zoning allows administrators to focus the majority of their time and energy on securing and monitoring the most mission-critical components of the network. By the time a machine or application is moved to production, administrators should have a clear picture of its user and application credentials and permissions, which machines it will communicate with, and which ports it uses. The development and testing zones are preparatory phases for the third zone, the production or mission-support subnetwork, where multiple layers of protection are implemented.
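To illustrate, promotion to production can be gated on a simple manifest that records exactly those facts for each application. The sketch below is a minimal Python illustration; the `AppManifest` class, its field names, and the example hosts and service account are all hypothetical, not part of any specific product.

```python
from dataclasses import dataclass, field

@dataclass
class AppManifest:
    """Facts an admin should know before an app is promoted to production."""
    name: str
    service_account: str                              # credential the app runs as
    allowed_peers: set = field(default_factory=set)   # machines it may talk to
    allowed_ports: set = field(default_factory=set)   # ports it may use

    def permits(self, peer: str, port: int) -> bool:
        """True only if the manifest allows traffic to this peer and port."""
        return peer in self.allowed_peers and port in self.allowed_ports

# Example manifest for a hypothetical billing service.
billing = AppManifest(
    name="billing-api",
    service_account="svc-billing",
    allowed_peers={"db01.prod.internal"},
    allowed_ports={5432},
)

print(billing.permits("db01.prod.internal", 5432))   # expected: True
print(billing.permits("dev-box.test.internal", 22))  # expected: False
```

Anything the manifest does not permit can then be treated as a policy violation when the app reaches the production zone.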
Coordinate Communication Between Security Devices
Coordinating traffic between security devices allows for visibility into real-time data flows between the data center and client systems, as well as between virtual machines. Application visibility is essential to maintaining safe operations: malware is often masked as something else, so all data must be validated. Although much of this traffic uses SSL (Secure Sockets Layer) encryption, administrators should have enough application knowledge to recognize what normal traffic looks like. They can then set up rules for inbound and outbound traffic to better monitor for suspicious activity. Traffic between virtual machines can be much harder to monitor, but inter-machine firewalls are coming onto the market to help shore up this blind spot.
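As a rough sketch of such rules, observed flow records can be checked against an allow-list of expected destinations and ports, with anything outside the list flagged for review. This is a minimal Python illustration, assuming flow records have already been collected as dictionaries; the record fields, addresses, and allow-list entries are all made up for the example.

```python
# Expected (destination, port) pairs, derived from knowledge of normal traffic.
ALLOWED_FLOWS = {
    ("10.0.1.5", 443),   # app server -> HTTPS front end
    ("10.0.2.9", 5432),  # app server -> database
}

def flag_suspicious(flows, allowed=ALLOWED_FLOWS):
    """Return flow records whose destination/port pair is not on the allow-list."""
    return [f for f in flows if (f["dst"], f["port"]) not in allowed]

observed = [
    {"src": "10.0.1.2", "dst": "10.0.1.5", "port": 443},
    {"src": "10.0.1.2", "dst": "203.0.113.7", "port": 6667},  # unexpected port
]
print(flag_suspicious(observed))  # only the 203.0.113.7 flow is flagged
```

In practice the allow-list would be generated from the application knowledge described above rather than written by hand.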
Scan for Vulnerabilities
All new applications and machines should be scanned for weaknesses a hacker might exploit. Busy data center managers can always turn to IT managed services providers: qualified engineers can simplify the process by implementing a series of checks on server performance, evaluating which hardware and software need monitoring, and identifying probable failure points, along with other security measures such as supervising data flows and transfers and regularly reviewing key log files.
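Regular review of key log files can be partly automated. The sketch below, a minimal Python illustration, counts repeated SSH authentication failures per source address; it assumes the common OpenSSH "Failed password" message format, and the threshold and sample lines are invented for the example.

```python
import re
from collections import Counter

# Matches the common OpenSSH failure message and captures the source address.
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?\S+ from (\S+)")

def repeated_failures(lines, threshold=3):
    """Map source IP -> failure count, for sources at or above the threshold."""
    counts = Counter()
    for line in lines:
        m = FAILED_LOGIN.search(line)
        if m:
            counts[m.group(1)] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}

sample = [
    "sshd[101]: Failed password for invalid user admin from 203.0.113.9 port 22 ssh2",
    "sshd[102]: Failed password for root from 203.0.113.9 port 22 ssh2",
    "sshd[103]: Failed password for root from 203.0.113.9 port 22 ssh2",
    "sshd[104]: Failed password for alice from 198.51.100.7 port 22 ssh2",
]
print(repeated_failures(sample))  # expected: {'203.0.113.9': 3}
```

A result like this is exactly the kind of probable failure point that a managed services engineer would surface during a routine log review.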
Good habits may come from following the progress of the Canadian government’s ambitious project to consolidate 485 data centers into only seven. Regular monitoring and critical thinking help keep enterprises adaptable to whatever changes or technologies come along. Looking at the data center and its data backups not just as a collection of components, but as a distinct and unified system in its own right, will prompt the right questions to keep information flowing safely and reliably.