Safe in the cloud
Johannesburg, South Africa, June 03, 2011
Hands off cloud computing? Many companies shy away from needs-based, variable-cost IT resources, fearing that their data may fall into the wrong hands or may not always be fully available. Specialised service providers keep the risks under control by using dedicated clouds.
Up to now, demand for and corporate acceptance of cloud computing have been limited, and security concerns are one reason. To ensure that the cloud doesn't develop into a hurricane, companies need a mix of new protective functions designed for the cloud and security strategies that have proven their worth in classical outsourcing. When a company connects to the data centre of an ICT service provider, data is transmitted via a tunnelled network rather than via the public Internet. In a virtualised environment of this kind, customers maintain full ownership of their data.
Making sure that "private" is just that
In a private cloud, companies also profit directly from the service provider's know-how in the necessary security measures. Every day, hundreds of experts devote their time exclusively to preventing attacks of all kinds and to developing appropriate responses when attacks occur. Their efforts soon pay dividends for companies battling highly trained attackers on the network. What's more, major ICT service providers deliver the same services to a multitude of customers and, thanks to economies of scale, can deploy technologies that a single company alone could hardly afford. Even mobile devices such as laptops, PDAs, and smartphones, and the work processes they support, can be securely integrated into the cloud.
Customised measures to reflect the actual risk situation
In the first step into the cloud, companies and service providers must exchange information on the required level of security. ICT specialists must inform customers of the risks they face and how these can be combated. In cloud computing it is not sufficient simply to plug new security gaps as they arise. Needs-based security for virtualised infrastructures calls for the co-ordinated interplay of network, system management, and security components. This involves, inter alia, multiple lines of defence, network encapsulation, and measures to prevent the uncontrolled spread of malware. A holistic view is essential; it is not enough to consider individual components in isolation.
The final result is a custom solution that includes all necessary security measures. As and when needed, companies can request technologies from their providers on a modular basis. If subsequent risk analyses reveal the need for new services, the security measures in place can be supplemented and modified at any time.
Clear separation of applications and data
Despite many parallels with conventional outsourcing, security in the context of server virtualisation poses a number of special challenges, above all in data protection. Since multiple companies access IT resources on the same systems in a data centre, it is essential to ensure that no company can see another's information. The data and applications of each and every customer must be ring-fenced. This is done by means of virtual local area networks (VLANs): a separate connection to the server is automatically set up for each company. The server provides as many individual access points as there are customers. The VLANs are managed at a central switch where all the network cables converge. The switch automatically allocates each VLAN to a specific customer; if the customer cannot be identified, no access is granted. Attempted espionage and data-manipulation attacks are thus condemned to failure.
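The allocation logic described above can be sketched in a few lines. This is a minimal illustration only: the class and method names (CustomerSwitch, assign_vlan, connect) are invented for this sketch, and real switches implement VLAN separation in hardware, not in application code.

```python
# Sketch of per-customer VLAN allocation at a central switch.
# All names here are hypothetical; real switches do this in firmware.

class CustomerSwitch:
    def __init__(self):
        self._vlans = {}       # customer id -> dedicated VLAN tag
        self._next_tag = 100   # arbitrary starting VLAN tag

    def assign_vlan(self, customer_id: str) -> int:
        """Allocate a dedicated VLAN tag for a new customer."""
        if customer_id not in self._vlans:
            self._vlans[customer_id] = self._next_tag
            self._next_tag += 1
        return self._vlans[customer_id]

    def connect(self, customer_id: str) -> int:
        """Grant access only to known customers; deny all others."""
        if customer_id not in self._vlans:
            raise PermissionError(f"unknown customer {customer_id!r}: access denied")
        return self._vlans[customer_id]

switch = CustomerSwitch()
tag_a = switch.assign_vlan("customer-a")
tag_b = switch.assign_vlan("customer-b")
assert tag_a != tag_b   # each customer is ring-fenced on its own VLAN
```

The key point the sketch captures is the last rule in the article: an unidentified caller is refused outright rather than being given any default access.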
Users determine the cloud's location
Another aspect, one of no relevance in classical outsourcing, is controlling where the cloud extends. In a public cloud à la Amazon, users do not know on which systems, in which data centre, or in which country the provider stores their data. This can have serious consequences: if the data crosses national borders, it may no longer comply with the applicable data-protection and industry-specific provisions.
In the private T-Systems cloud, users decide at which data centre their data will be stored. A "twin-core strategy" also provides a fallback option: all data is mirrored to a second data centre that likewise has cloud-computing capability. Stored data is synchronised automatically during live operation. If a server fails at the main site, or a construction worker accidentally cuts a data line, the "twin" seamlessly assumes the functions of the primary data centre so that the customer's business operations continue without interruption.
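The twin-core behaviour, mirrored writes during normal operation and a seamless switch to the second site on failure, can be illustrated with a short sketch. The names (DataCentre, TwinCore, the two site names) are invented for this example and do not describe T-Systems' actual implementation.

```python
# Illustrative sketch of a "twin-core" setup: every write is mirrored
# to a second data centre, which serves reads if the primary fails.
# Class and site names are hypothetical.

class DataCentre:
    def __init__(self, name: str):
        self.name = name
        self.online = True
        self.store = {}

class TwinCore:
    def __init__(self, primary: DataCentre, twin: DataCentre):
        self.primary, self.twin = primary, twin

    def write(self, key, value):
        # Synchronise both sites during live operation.
        for site in (self.primary, self.twin):
            if site.online:
                site.store[key] = value

    def read(self, key):
        # Fall back seamlessly to the twin if the primary is down.
        site = self.primary if self.primary.online else self.twin
        return site.store[key]

cloud = TwinCore(DataCentre("site-a"), DataCentre("site-b"))
cloud.write("order-4711", "confirmed")
cloud.primary.online = False        # e.g. a severed data line
result = cloud.read("order-4711")   # still served, now by the twin
```

Because every write reaches both sites before the failure, the read after the outage returns the same data with no interruption visible to the caller.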
If a company uses its service provider's network, the cloud services to be delivered, covering the data centre, networks, PCs, and the mobile devices of corporate users, can be set out in a contract. As a result, the quality and reliability of the services delivered can be accurately monitored and assessed. In its own MPLS (Multi-Protocol Label Switching) network, T-Systems can, if desired, give cloud applications priority over less critical traffic. This ensures that no delays occur, even if bottlenecks arise due to heavy network traffic.
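The effect of such traffic prioritisation can be shown with a toy scheduler: packets from higher-priority applications are forwarded ahead of less critical ones when the link is congested. The application classes and priority values below are invented for this sketch; an MPLS network expresses the same idea through traffic classes, not application code.

```python
# Toy illustration of prioritising cloud traffic over less critical
# applications under congestion. Priorities are invented for the sketch.

import heapq

PRIORITY = {"cloud": 0, "email": 1, "backup": 2}  # lower value = sent first

def schedule(packets):
    """Return packets in the order a priority-aware network forwards them."""
    # The arrival index keeps ordering stable within a priority class.
    queue = [(PRIORITY[app], i, app) for i, app in enumerate(packets)]
    heapq.heapify(queue)
    order = []
    while queue:
        _, _, app = heapq.heappop(queue)
        order.append(app)
    return order

sent = schedule(["backup", "cloud", "email", "cloud"])
```

Even though the backup packet arrived first, both cloud packets are forwarded ahead of it, which is the behaviour the article describes for critical cloud applications.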
Conclusion: Virtualisation enables companies to drive down costs, improve processes, and save on resources. However, very few companies have the requisite technical know-how, let alone the human resources to implement cloud computing on their own with the required level of security. Service providers such as T-Systems are working at the front line of technology to develop secure cloud infrastructures at benchmark costs that are affordable for all companies.