At DT Cloud Services, the security of the cloud environment has the highest priority. Security directives in effect include physical, administrative and technical controls. DT Cloud Services applies strong measures to protect its organization and the systems in use and under development.
Security policies rely on responsibilities shared between DT Cloud Services and the customer. DT Cloud Services is responsible for protecting the infrastructure and for providing services that can be used securely. The customer depending on these services is responsible for implementing their own security requirements, including data integrity and compliance with applicable laws and regulations.
The scope of responsibility is specified for each service model (IaaS/PaaS/SaaS) based on the NIST definitions of cloud services with some specific deviations from the base models (https://csrc.nist.gov/publications/detail/sp/800-145/final).
Directives and Policies
DT Cloud Services cloud services are implemented according to secure design principles and policies based on recommendations and guidelines developed by the industry community and governmental organizations such as
Open Web Application Security Project (OWASP)
National Institute of Standards and Technology (NIST)
Cloud Security Alliance (CSA)
Center for Internet Security (CIS)
as well as in-house developed adaptations and security measures.
The guiding principles can be summarized as
Respect for least privilege
Minimization of attack surfaces
Following the "defense in depth" principle
Security by Design
The physical infrastructure of the DT Cloud Services cloud complies with the rigorous security standards of enterprise grade data centers across Europe operated by DTAG and its affiliates.
All DT Cloud Services infrastructure elements, such as servers, routers and switches, are located within highly available and protected data centers. High availability is ensured by architecture, redundant power supply and environmental conditioning, and protection through sealed premises with strict access control. The infrastructure is housed in windowless halls, protected by walls and cages, and monitored by CCTV and infrared motion detection at all times.
DT Cloud Services also controls its backbone network, so customers can rely on data being located and transported within EU borders, or even within a particular country.
Data center architecture
All of the backbone and data centers (including underlay and overlay networks and SDN) in the DT Cloud Services cloud are designed for high availability and integrity of infrastructure and confidentiality of control data and user data.
DT Cloud Services data centers are built according to a “zoning concept” with dedicated security areas.
The data centers are equipped with fire and smoke detectors, cooling and climate installations for monitored and controlled temperature and humidity, and flooding protection in accordance with local regulations. Power supply is protected by back-up generators providing several stages of resilience to mains power failure.
The cloud infrastructure is designed to remove single points of failure at every level and is implemented with redundancy in
transmission (fiber paths)
cloud control functions and user access functions
Each customer has a dedicated logical work space - the tenant - which the customer has exclusive access to. In multi-tenant cloud computing, different users can share cloud resources provided by a common infrastructure, but without sharing the tenant space and user data unless explicitly allowed in a special arrangement, such as a federation.
DT Cloud Services supports multi-tenancy on all cloud stack layers, which means that tenants are separated from each other from the application layer down to the hardware layer while sharing the hardware infrastructure (except for dedicated hosts and other dedicated elements). The multi-tenancy level and implementation depend on the cloud service model and the service.
Decommissioning of storage
Hardware replacement is part of the maintenance procedure. Data leakage from storage components such as HDDs/SSDs and other non-volatile memory is prevented by applying sanitization procedures before their decommissioning.
DT Cloud Services services are offered via RESTful API from the Internet. Access is protected on multiple layers by filtering and Transport Layer Security (TLS/HTTPS). Horizon GUI and OpenStack API endpoints are under DT Cloud Services control, implemented with high availability and protected by TLS with short-lived Let’s Encrypt certificates.
The infrastructure management layer is only accessible through a dedicated access solution with multi-factor authentication and state-of-the-art encryption mechanisms. Only authorized users (DT Cloud Services development and operation engineers) may access this layer according to the least-privilege model, and all network activities are monitored and tracked. Customers do not have access to the cloud infrastructure management layer.
Note that protection of services operated by customers is within the responsibility of customers themselves.
DT Cloud Services is constantly monitoring security by collecting, processing and analyzing operational data for intrusion detection in a distributed architecture through its Network Intrusion Detection System (NIDS).
The system is based on multiple information sources, including open-source intelligence (OSINT), data from internal databases, and network element data, to automatically detect security incidents, analyze them and produce a proper response.
DT Cloud Services Incident Response Platform collects and aggregates data from different sources to detect and analyze incidents (Figure 1).
OSINT - Open source intelligence data
Network scanners and network vulnerability scanners
SIEM - Compilation of data from internal databases, network element data and data from external public Internet services
SAST - Static application security testing, performs white box security scanning where accessible program source code is searched for patterns to detect program logic that can be harmful to the application
DAST - Dynamic application security testing, use of black box methods where the program source code may or may not be available. The tool is launched in a secured environment (sandbox) and observes the behavior (inputs and outputs) of the application to find patterns indicating vulnerabilities within the application (for example, missing security parameters in HTTP headers, or unexpected/insecure protocols).
CICD - Continuous integration and continuous delivery, results from continuous testing framework
IoC - Indicators of compromise, a set of patterns or features that indicate intrusion, such as events that could be the result of malicious attempts. DT Cloud Services collects IoCs from shared sources; they are usually anonymized and shared within the community to help others recognize malicious events faster.
DT Cloud Services is constantly logging potential security-related events, such as unauthorized access attempts, failed authentications and port scans. Together with data from internal and external sources, the information is automatically analyzed, and an alert is raised whenever harmful activity is detected.
DT Cloud Services provides protection against packet flooding - (Distributed) Denial-of-Service (DoS) - attacks against the cloud and its clients through a combination of packet analysis, Access Control Lists (ACL) and cloud telemetry, that is, cloud-native data collection from both physical and virtual hosts. In addition, deep packet inspection (DPI) and Network Behavioral Analysis are used depending on the attack scenario.
DT Cloud Services has a Computer Emergency Response Team (CERT) of personnel trained to respond to detected security incidents.
In the event of a detected security incident, DT Cloud Services will try to reach the affected customer over the defined point of contact. It is therefore important that customers keep their contact details updated.
DT Cloud Services IaaS follows a shared responsibility model where DT Cloud Services is responsible for the physical infrastructure and the hypervisor layers. Protection of services operated and maintained by customers is within the responsibility of the customers themselves (Figure 2).
DT Cloud Services provides the necessary network security tools for customers as a standard feature set, through OpenStack Security Groups for network-layer filtering or through managed services such as the Web Application Firewall (WebShield) product.
Keeping information secure in the cloud should be a top priority. This includes protection against unauthorized access as well as data loss due to technical malfunction or as a result of an attack.
A security policy should consist of
Policy for access and password management (least privileges)
Platform software upgrade plan
Data backup and restore plan
Identification of data that should be encrypted
Using secure connections when accessing a tenant is crucial to minimize the risk of eavesdropping and intrusion. General security recommendations include
Do not allow any computer to cache passwords and logins.
Make sure to log out from every site or account once the session is finished.
Avoid insecure Wi-Fi hotspots whenever possible, since such connections are vulnerable to eavesdropping.
On the tenant, access policies are supported by proper security group settings, and optionally by a Web Application Firewall. Security groups are normally used in implicit-deny mode, blocking all traffic by default and allowing only traffic specified on the allow list.
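As a sketch of the allow-list approach, assuming the OpenStack CLI and a hypothetical group name web-sg, a minimal security group might be built as follows:

```shell
# Create a security group; OpenStack denies all ingress traffic by default.
openstack security group create web-sg --description "Allow-list for web servers"

# Permit only HTTPS from anywhere and SSH from a trusted admin range
# (203.0.113.0/24 is a placeholder network).
openstack security group rule create web-sg --protocol tcp --dst-port 443 --remote-ip 0.0.0.0/0
openstack security group rule create web-sg --protocol tcp --dst-port 22 --remote-ip 203.0.113.0/24
```

Any traffic not matching one of these rules is dropped, which implements the implicit-deny model described above.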
To access virtual machines, customers are encouraged to use a Virtual Private Network (VPN) or SSHv2. It is recommended to configure SSH with key-based authentication at minimum and to disable root login, following the least-privilege principle. Using bastion (jump) hosts with logging capabilities is the best way to access tenants.
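As an illustrative (not prescriptive) hardening fragment, these recommendations translate into the following sshd configuration directives:

```
# /etc/ssh/sshd_config - example hardening fragment
PermitRootLogin no
PasswordAuthentication no
PubkeyAuthentication yes
```

After editing the configuration, the SSH daemon must be reloaded for the settings to take effect.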
SSH with key pair authentication is the common way to access VMs and to enable automation. For automation, the private keys are typically not protected by a passphrase, so the customer should store the keys securely and never distribute private keys outside secure domains.
With SSHv2, secure file transfer over SFTP is also supported. The SFTP protocol does not provide authentication by itself but expects the underlying protocol to implement it. For this reason, SFTP is commonly used as a subsystem of SSHv2.
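Generating a key pair for VM access is a one-line operation; the key name and user below are hypothetical:

```shell
# Generate an Ed25519 key pair for VM access. No passphrase is set here
# (for automation), so the private key must be stored securely.
ssh-keygen -t ed25519 -N "" -f tenant_key

# The public key (tenant_key.pub) is injected into the VM at launch;
# the private key then authenticates both interactive SSH and SFTP:
#   ssh -i tenant_key ubuntu@<vm-ip>
#   sftp -i tenant_key ubuntu@<vm-ip>
```

The same key pair serves both shell access and SFTP file transfer, since SFTP runs as a subsystem of the authenticated SSH session.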
Platform and application deployment
After launching a virtual machine, the customer is responsible for performing system and application hardening with security updates and features according to their own needs. A useful reference is the CIS Benchmarks for common operating systems and software platforms.
It is a straightforward task to set up TLS/HTTPS on a cloud-hosted service, for example using Let’s Encrypt certificates.
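For example, assuming a VM running nginx with a DNS name pointing at it and ports 80/443 open in the security group, certbot can obtain and install a certificate in one step (example.com is a placeholder domain):

```shell
# Obtain a Let's Encrypt certificate and configure nginx to use it.
sudo certbot --nginx -d example.com
```

Certbot also installs a renewal timer, so the short-lived certificates are refreshed automatically.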
For created images, the customer needs to ensure hardening with applicable security updates and take measures to minimize the attack surface. In particular, preserved images should not contain any sensitive information like passwords or private keys.
Although DT Cloud Services offers a highly redundant infrastructure and the possibility to take instant snapshots, you should always back up your most important data to separate storage, for example to a local secure server or a laptop computer. The backups should be made regularly, and a recovery procedure should be worked out to restore any corrupt or lost data.
The snapshot feature copies the state of a particular VM, but it does not guarantee application-level data consistency, and it depends on the cloud infrastructure itself.
A backup routine should take into consideration the type of data, backup frequency, and storage location. The solution can be based on backup software for local storage or cloud backup, and many different solutions exist addressing various needs.
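A minimal backup-and-restore rehearsal can be sketched with standard tools; the directory and file names below are hypothetical, and a real routine would also copy the archive to separate storage:

```shell
# Create some example data to protect.
STAMP=$(date +%F)
mkdir -p data restore
echo "customer record" > data/records.txt

# Back up: create a timestamped, compressed archive of the data directory.
tar -czf "backup-$STAMP.tar.gz" data

# Rehearse recovery: restore into a separate directory and verify the
# restored files against the originals.
tar -xzf "backup-$STAMP.tar.gz" -C restore
diff -r data restore/data
```

Rehearsing the restore step regularly is as important as taking the backup itself, since an unverified backup may turn out to be unusable when needed.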
For maximum protection, data should be encrypted both when at rest (in storage) and in transit. The communication with a tenant is always encrypted. Data at rest, however, is not automatically encrypted and encryption of sensitive data should therefore be considered. It should be viewed as a complement to network security and user-based access control.
The customer can encrypt data before uploading it to the cloud, keeping full control of the encryption process and the integrity of the encryption keys. The protection then depends on the strength of the chosen password or key, which should be difficult to guess in order to prevent unauthorized access. Data encryption also mitigates the risk of data leakage during migration within the cloud.
Best practice is to use well-known encryption standards for data at rest, for example the Advanced Encryption Standard (AES). Full disk encryption (FDE) can be implemented, for example, through the Linux Unified Key Setup (LUKS), where key management is the responsibility of the customer.
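A LUKS setup on an attached volume might look like the following sketch; the device name /dev/vdb and mapping name are assumptions, the commands require root, and luksFormat destroys any existing data on the device:

```shell
# Initialize LUKS on the attached volume and open it as a mapped device.
sudo cryptsetup luksFormat /dev/vdb
sudo cryptsetup luksOpen /dev/vdb securedata

# Create a filesystem inside the encrypted container and mount it.
sudo mkfs.ext4 /dev/mapper/securedata
sudo mkdir -p /mnt/securedata
sudo mount /dev/mapper/securedata /mnt/securedata
```

The LUKS passphrase (or key file) never leaves the customer's control, which is what keeps key management on the customer's side.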
The Linux-native GPG can be used for key-based file encryption on the client before storing it on the cloud, which can be done selectively for sensitive data.
Data encryption prevents data leakage during instance and storage migration within the cloud. Before releasing a virtual disk, the customer needs to ensure that data at rest is properly destroyed. This is easily enforced by storing only encrypted data and simply destroying the secret keys on the client. The same principles apply to both block and object storage.
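This crypto-erase idea can be illustrated with openssl as a stand-in for any AES-based tool; the file names and key are hypothetical:

```shell
# Generate a random key file and encrypt a file with AES-256 before upload.
openssl rand -hex 32 > disk.key
echo "sensitive payload" > secret.txt
openssl enc -aes-256-cbc -pbkdf2 -pass file:disk.key -in secret.txt -out secret.enc

# Decryption works while the key exists...
openssl enc -d -aes-256-cbc -pbkdf2 -pass file:disk.key -in secret.enc -out check.txt

# ...but destroying the key renders the ciphertext useless (crypto-erase),
# so the encrypted copy left on released storage cannot be recovered.
rm -f disk.key
```

Once the key file is gone, the ciphertext remaining on the virtual disk is effectively destroyed, without any need to overwrite the underlying storage.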
A bastion host acts as a gateway between the Internet and hosts on the private network in the tenant.