Pervasive Encryption

In today’s treacherous world of constant hacking threats and the data breaches that follow, IT security and breach prevention have become a top-line concern for everyone in large IT organizations. With over six million data records lost or stolen every day – that’s almost 70 every second – and hackers behind over 75% of retail online login attempts, it’s no wonder. There is a sliver of good news: fewer than 5% of those breaches were “secure breaches,” where the data involved had been encrypted.

That makes IBM’s new pervasive encryption facilities on the z14 mainframe a welcome security enhancement, and one that comes right out of the box with the newest mainframe computers. The capability is covered in considerable detail on the IBM website, and on third-party expert blogs such as DancingDinosaur, SHARE and Value-4IT.

Pervasive encryption doesn’t come without a penalty, though – expect as much as a 5-to-7 percent hit on performance, system-wide (possibly higher for I/O- and CPU-intensive batch processing). That applies to every single transaction you run on your mainframe systems, and every time any of your programs reads data from disk. It also applies when one of your programs writes temporary data to disk for almost immediate retrieval – encryption and decryption take place for every such event.

You can implement pervasive encryption on z196- and z13-class systems, but the performance penalty will be considerably higher than 7 percent. So is that the cost of the security enhancement – an increase in MSU usage, or a decrease in performance? Well, yes and no.

Yes, if you don’t do anything about it; no, if you do. What can be done? Apply mainframe optimization solutions before you cut over to pervasive encryption. There are a handful of highly effective mainframe optimization technologies available, ranging from high-performance mainframe in-memory technology to SQL quality automation, and more.

In-memory technology

In-memory technology allows an application to replace repeated reads from disk with memory access, effectively bypassing the database overhead for those repeated reads. It requires no database changes and no changes at all to program logic – a low-risk and highly effective performance enhancement. The technology is particularly useful in batch processing, where the same data (e.g., interest rates, client names, IDs) may be accessed in every transaction, or many times within a single transaction. It is also useful in online transaction processing in some cases.
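To make the idea concrete, here is a minimal sketch in Python (purely illustrative, and not any vendor’s actual product) of loading a reference table into memory once and serving repeated lookups from that copy instead of going back to Db2 each time. The table name, columns and connection are assumptions for the example.

```python
# Minimal sketch of the in-memory idea (illustrative only): load frequently
# read reference data once, then serve repeated lookups from memory instead
# of re-reading the database on every transaction.
# RATE_TABLE, its columns and the connection are assumptions for the example;
# any DB-API connection (e.g., ibm_db_dbi for Db2) would work.

_rate_cache = {}  # in-memory copy of the reference data


def load_rates(conn):
    """Read the reference table once, up front, and keep it in memory."""
    cur = conn.cursor()
    cur.execute("SELECT RATE_CODE, RATE_VALUE FROM RATE_TABLE")
    for code, value in cur.fetchall():
        _rate_cache[code] = value


def get_rate(code):
    """Every repeated lookup is a memory access, not a database call."""
    return _rate_cache[code]
```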

SQL quality automation

SQL quality automation can help you discover undetected Db2 bottlenecks that have been running in your production systems long-term. This type of bottleneck differs from bottlenecks that develop over time, which can easily be identified by your favorite mainframe monitoring tool. If a bottleneck has been in production since day one, no standard monitoring tool is going to discover it months or years later. The technology is also useful in guarding against the introduction of new, inefficient SQL code.
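As a rough sketch of the concept (real SQL quality tools analyze access paths, catalog statistics and much more), an automated check can flag well-known inefficiency patterns before new code reaches production. The rules below are illustrative examples only, written in Python for brevity.

```python
# Very simplified sketch of SQL quality checking: flag well-known
# inefficiency patterns in statements before they reach production.
# Real tools go much further (access paths, statistics, host variables);
# these rules are illustrative only.

import re

RULES = [
    (r"SELECT\s+\*", "Avoid SELECT * - name only the columns you need"),
    (r"WHERE\s+\w+\s*(<>|!=)", "Negated predicates often defeat index access"),
    (r"WHERE\s+\w*\(\s*\w+\s*\)\s*=", "Function on a column can block index use"),
]

def check_sql(statement):
    """Return a list of warnings for one SQL statement."""
    findings = []
    for pattern, message in RULES:
        if re.search(pattern, statement, re.IGNORECASE):
            findings.append(message)
    return findings

print(check_sql("SELECT * FROM ORDERS WHERE YEAR(ORDER_DATE) = 2017"))
```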

Benefits

These technologies can recoup the encryption hit, and save you a little more on top of that. Further, the hit to your batch processing applications – some of which can take a 30 percent penalty if you run pervasive encryption on a pre-z14 system – can be mitigated as well. The in-memory technology, specifically, was designed to help control high-intensity transaction processing, so that 30 percent hit may very well be avoided too, and in some cases you’ll be able to save even more.

If ETL is being used to replicate mainframe data to other platforms on a regular basis, augmenting it with a multi-platform replication solution with enhanced CDC can take a chunk of resource usage off your mainframe plate. It will also take a chunk out of network resource usage, if that is an ongoing challenge.
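In the simplest possible terms (this is conceptual code, not any particular replication product), the CDC approach ships only the rows that have changed since the last synchronization point, rather than re-extracting entire tables the way a periodic bulk ETL job does. The change-log layout below is an assumption for illustration.

```python
# Conceptual sketch of the CDC idea versus periodic full extracts:
# ship only rows changed since the last checkpoint instead of the whole table.
# The change-log structure is hypothetical, for illustration only.

def full_extract(source_rows):
    """Traditional ETL: every run moves every row."""
    return list(source_rows)

def cdc_extract(change_log, last_checkpoint):
    """CDC: move only inserts/updates/deletes logged after the checkpoint."""
    return [change for change in change_log
            if change["sequence"] > last_checkpoint]

# Example: for a 1,000,000-row table with 250 changes since the last sync,
# full_extract() would move 1,000,000 rows; cdc_extract() moves 250.
```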

When IBM brought pervasive encryption into the mainframe world with the z14 introduction in 2017, it was at exactly the right time – just as international governance organizations began implementing new regulations, and as responsible service providers looked for ways to respond in kind. IBM tells us that pervasive encryption simplifies mainframe security – that’s true, but don’t take it to mean more than it does.

Takeaway

It’s important to remember that your mainframe data isn’t secure just because it’s on a mainframe – you have to act to make it secure – and pervasive encryption is just one more tool that IBM is giving you. It’s up to you to implement it properly, and it’s not the be-all and end-all. You still have front-end systems and other systems that access your mainframe, and it’s up to you to implement other measures that will suitably secure or harden them. Without that, you’re still dangerously exposed.

If you haven’t implemented pervasive encryption yet, you’re falling behind and leaving yourself and your customers unnecessarily exposed; you should be doing it right away. And if you want to get off on the right foot, why not optimize your mainframe systems first, to ensure that the new security measures don’t impact your SLAs? That would be a smart, best-practices plan.

Regular Planet Mainframe Blog Contributor
Allan Zander is the CEO of DataKinetics – the global leader in Data Performance and Optimization. As a “Friend of the Mainframe”, Allan’s experience addressing both the technical and business needs of Global Fortune 500 customers has provided him with great insight into the industry’s opportunities and challenges – making him a sought-after writer and speaker on the topic of databases and mainframes.
