
On the Road to Ubiquitous Encryption: Are We There Yet?

In the not-too-distant future, all data, whether at rest or in transit on a network, sensitive or not, will be encrypted. However, the performance overhead associated with encryption and decryption remains one of the few roadblocks to enterprise-wide adoption of this technology.

Fortunately, new approaches such as on-chip acceleration are making the ubiquitous use of encryption feasible. Let’s review the factors associated with encryption of data accessed and processed by application, file and database servers to see how close we are to achieving zero performance overhead for full-time data encryption.

Encryption Steps

The encryption process in an enterprise typically involves the following use cases, all of which have differing performance implications.

  1. Initial encryption or transformation of data: In this case, when encryption is enabled for the first time, all pre-existing data at rest must be encrypted. Most sensitive file sets exist prior to the requirement to encrypt them. Thus, an integral part of implementing real-time encryption is encrypting legacy data sets for the first time, because even when a protection policy is in place, individual files are not protected until they have been encrypted.
  2. Rekeying: Increasingly, regulatory agencies are requiring periodic transformation or rekeying. This means decryption, re-encryption using a new key, and overwriting the original (see the sketch following this list).
  3. Clear copy of entire data set: Applications such as data analytics/data mining require a clear (unencrypted) image of an encrypted file set. This is analogous to initial encryption, in that the entire file set must usually be decrypted, and in that only one calculation is performed on each block of data.
  4. Ongoing encryption/decryption: Encrypting data that is newly created and being ingested into the storage system via the file system, database or a disk block manager, and decrypting data that is being read by an application or user authorized to access clear text.
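
To make the rekeying step concrete, the following sketch decrypts a data block with the old key and re-encrypts it with a new one. It is a minimal illustration only, assuming AES-GCM via the Python cryptography package; the rekey_block helper and the key handling shown are hypothetical, and a production implementation would also address key escrow, atomic overwrites and metadata authentication.

```python
# Hypothetical sketch of the rekeying step: decrypt with the old key,
# re-encrypt with a new key, and return the replacement ciphertext.
# Uses AES-GCM from the Python "cryptography" package; names are illustrative.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def rekey_block(old_key: bytes, new_key: bytes,
                nonce: bytes, ciphertext: bytes) -> tuple:
    """Decrypt one encrypted block with old_key and re-encrypt it with new_key."""
    plaintext = AESGCM(old_key).decrypt(nonce, ciphertext, None)
    new_nonce = os.urandom(12)              # fresh nonce for the new key
    new_ciphertext = AESGCM(new_key).encrypt(new_nonce, plaintext, None)
    return new_nonce, new_ciphertext

# Example: rotate a single block to a newly generated 256-bit key.
old_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
block = AESGCM(old_key).encrypt(nonce, b"legacy data at rest", None)

new_key = AESGCM.generate_key(bit_length=256)
nonce, block = rekey_block(old_key, new_key, nonce, block)   # overwrite the original
```

The same decrypt-then-encrypt pattern, applied block by block across an entire file set, is what makes rekeying roughly twice as expensive as initial encryption.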

Performance Overhead Impacts Service Level Agreements

IT service-level agreements (SLAs), which refer to the contracted delivery time or performance of a service, are commonly defined in large enterprises for lines of business and/or application owners. An SLA will typically have a technical definition in terms of mean time between failures (MTBF), mean time to repair or mean time to recovery (MTTR), I/O throughput and latency.

Application performance and business continuity are important requirements for enterprises, and both can be negatively impacted when system performance is degraded by high CPU utilization for processing unrelated to the application. Therefore, limiting encryption’s performance overhead is critical to maintaining SLAs.

Enter Chip Level Encryption

Until recently, encrypting data accessed and processed by servers necessitated reserving a portion of the available CPU bandwidth for encryption as part of the capacity planning exercise. As a result, system administrators often had to trade off performance for data security. Enabling encryption without proper planning can degrade application performance and consequently create a barrier to adoption.

In response to the growing need for ubiquitous encryption, all major CPU chip vendors have released or plan to release support for cryptographic acceleration in the hardware by way of dedicated instruction sets or on-chip compute engines. The following table summarizes the hardware chip level cryptographic acceleration technology from Intel and Oracle:

Vendor | Technology | Availability
Intel | Advanced Encryption Standard New Instructions (AES-NI) | 2010; shipping with the Xeon 56xx processor family (formerly code-named Westmere) and newer-generation processors
Oracle | Niagara 2 Crypto Provider (N2CP) offload engine | 2010; shipping with systems configured with UltraSPARC T2 and later processors
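
Before committing to a rollout, administrators may want to confirm that a given server actually exposes these instructions. The snippet below is a minimal sketch, assuming a Linux host, that checks for the "aes" CPU flag that Intel processors report in /proc/cpuinfo when AES-NI is present; the function name is illustrative. On Solaris/SPARC systems the equivalent check goes through the platform's own tooling rather than /proc/cpuinfo.

```python
# Minimal sketch: check whether the CPU advertises the AES-NI instruction set
# on Linux by scanning the flags line of /proc/cpuinfo. Illustrative only.
def has_aes_ni(cpuinfo_path: str = "/proc/cpuinfo") -> bool:
    """Return True if the 'aes' CPU flag (AES-NI) appears in /proc/cpuinfo."""
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                return "aes" in line.split()
    return False

if __name__ == "__main__":
    print("AES-NI available:", has_aes_ni())
```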

Getting Close to Zero Performance Overhead

CPU utilization is the most important factor affecting latency and I/O throughput, and therefore SLAs. Using finely tuned on-chip cryptographic instructions promises to reduce the performance impact of encryption to near zero. To assess the current effectiveness of on-chip encryption, benchmark tests were recently conducted using Intel processors equipped with AES-NI on Windows and Linux systems. The tests generated database transactional workloads that drove CPU utilization to nearly 100 percent. The following results illustrate the performance impact imposed by encryption under these conditions.

On Windows, the results indicate no encryption overhead for loads utilizing up to 80 percent of the CPU, and no more than approximately 2 to 3 percent overhead for loads utilizing more than 80 percent of the CPU’s resources.

On Linux, at 70 percent CPU utilization there was only a 2 percent performance overhead, while at 100 percent CPU utilization the overhead was no more than 5 percent.

It’s important to note that these results are specific to the transactional workload and the system configuration used in the benchmark. As a result, “mileage may vary” depending on the system configuration and the type of I/O workload. Nevertheless, the findings demonstrate that nearly all of the CPU cycles remain available for application data processing. The results were similar across two different operating systems, system configurations and workloads. Clearly the performance cost of encryption is approaching zero.
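
Because mileage does vary, a quick sanity check is to time bulk encryption on the host in question. The sketch below assumes the Python cryptography package, whose OpenSSL backend uses AES-NI automatically when the processor supports it; the buffer size, round count and function name are arbitrary, illustrative choices rather than the benchmark methodology described above.

```python
# Rough throughput probe: encrypt a 64 MB buffer repeatedly with AES-256-GCM
# and report MB/s. The cryptography package's OpenSSL backend uses AES-NI
# automatically when the processor supports it. Numbers are illustrative only.
import os
import time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def aes_gcm_throughput(buf_mb: int = 64, rounds: int = 5) -> float:
    key = AESGCM.generate_key(bit_length=256)
    aead = AESGCM(key)
    data = os.urandom(buf_mb * 1024 * 1024)
    start = time.perf_counter()
    for _ in range(rounds):
        nonce = os.urandom(12)
        aead.encrypt(nonce, data, None)
    elapsed = time.perf_counter() - start
    return buf_mb * rounds / elapsed      # MB encrypted per second

if __name__ == "__main__":
    print(f"AES-256-GCM throughput: {aes_gcm_throughput():.0f} MB/s")
```

Comparing the figure with and without hardware acceleration (for example, on an older processor) gives a rough sense of how much headroom the on-chip instructions buy back for application work.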

The Bottom Line

The availability of processor-level instructions for cryptographic acceleration delivers a dramatic performance improvement over earlier-generation processors, which relied on encryption algorithms executed entirely in software. This advancement is making the performance impact of encryption virtually transparent to application users. With data encryption becoming an indispensable component of a defense-in-depth strategy, the timing couldn’t be better.

Ashvin Kamaraju is vice president of product development and partner management for data security vendor Vormetric.
