(What will be the impact on the mainframe datacenter?)
GDPR changes the rules
The EU will be introducing the General Data Protection Regulation (GDPR) in May of this year – it is the most important change in data privacy regulation in 20 years. GDPR replaces the European Data Protection Directive 95/46/EC and was designed to harmonize data privacy laws across Europe. It is meant to protect the privacy of all EU citizens and to reshape the way organizations across the EU manage the data of private citizens. The new regulation improves on the inconsistent data protection laws currently in force across the EU and ensures the secure, free flow of data.
Despite fair warning, many companies in the UK and the rest of the EU are in danger of non-compliance. The number of companies at risk of GDPR non-compliance is much higher in the US. GDPR essentially gives people – human customers – more say over what companies can do with their personal data. Your customers will have more control over how their personal data is used. Think about what Google, Facebook and other companies have been doing with customer data for the last 10 or 20 years – they have been swapping and selling customer information to whoever will pay for it – as if the information belonged to the company.
GDPR is going to reverse that situation – customers (human beings) will now own their personal data, and companies will be allowed to use that data only if the customers permit that use. Consent to use personal data must be given by the customer to the company using the data – opt-out scenarios and even pre-selected check boxes will no longer be acceptable. Customers will have the right to demand that a company delete personal data, change personal data, and even ask how it is being used – and answers will be required. Further, customers will now have the “right to be forgotten” – if a customer withdraws consent for use of personal data, the company must delete it.
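The consent-withdrawal flow described above can be sketched in a few lines. This is a toy illustration only – the class and method names (`CustomerStore`, `withdraw_consent`, `erase_customer`) are hypothetical, standing in for a real customer database and its erasure procedures:

```python
# Toy sketch of handling a "right to be forgotten" request.
# All names here are illustrative, not a real API.

class CustomerStore:
    """In-memory stand-in for a real customer database."""

    def __init__(self):
        self._records = {}   # customer_id -> personal data
        self._consent = {}   # customer_id -> consent flag

    def add(self, customer_id, data, consent=True):
        self._records[customer_id] = data
        self._consent[customer_id] = consent

    def withdraw_consent(self, customer_id):
        # Withdrawing consent obliges the company to erase the
        # personal data it held on that basis.
        self._consent[customer_id] = False
        self.erase_customer(customer_id)

    def erase_customer(self, customer_id):
        self._records.pop(customer_id, None)

    def has_record(self, customer_id):
        return customer_id in self._records


store = CustomerStore()
store.add("cust-42", {"name": "Alice", "email": "alice@example.com"})
store.withdraw_consent("cust-42")
print(store.has_record("cust-42"))  # False: the record was erased
```

In a real system, erasure would also have to propagate to every downstream copy of the data – which is exactly why the centralized data management discussed below matters.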
Perhaps the most serious challenge for companies is the complexity of their own data. Companies use their customer data in many different ways, and it is used by many departments within a company – sales, marketing, support, etc. The various departments often process the data for their own purposes, and therefore possess their own versions of customer data. Managing this data complexity will be a problem for GDPR compliance.
If a company outsources storage or processing of personal customer data to a service provider (for example, a cloud service provider), it is still strictly bound by GDPR rules – and so is the third party. It does not matter whether a cloud service provider knows that its customers are using its service to process personal data – ignorance is no defense.
Companies that fail to comply with GDPR – willingly or inadvertently – face potential fines of up to 4% of global revenue. So is it a big deal if US companies are in danger of defaulting on GDPR? The answer is yes: organizations not based in the EU are still subject to GDPR if they offer goods or services to EU-based customers, and those penalties will apply to the US company.
So what will be the cost of GDPR compliance? Some large organizations expect their GDPR compliance to cost over $10 million. That cost may include changes to business processes, as well as increased staffing to manage GDPR compliance properly. For example, an appointed Data Protection Officer (DPO) – an enterprise security leadership role required by GDPR – must fully understand when and where transfers of customer data occur. Changes to business processes will undoubtedly include how data is transferred throughout data centers and between partner companies.
GDPR compliance solution: a mainframe-centric approach
How do you comply with the new GDPR requirements? Well, how long is a piece of string? GDPR doesn’t specify how to comply; it only insists that you do comply, so it’s going to be up to you and your organization to determine how you get there. For large organizations with multiple complex datacenters, it might be a good idea to start with common-sense best-practices.
An obvious best-practices business process solution is to ensure that GDPR-level customer data management is handled centrally. This, combined with data replication, is a way to ensure that there is one single source of an organization’s GDPR truth. In fact, central GDPR data management is really a must-have. One way to achieve it would be to make (or retain) existing mainframe systems as the system of record – this makes sense because that is where most customer data resides anyway. Plus, it only makes sense to use a zero-downtime system as the system of record.
Such a process would allow data replication to be initiated only from the system of record to other systems – never in the other direction, and never allowing daisy-chained replication. Non-system-of-record systems should never be allowed to replicate data to other systems, but would be allowed to update the system of record.
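The replication rules just described reduce to two simple policy checks. The sketch below is purely illustrative (the system names and functions are hypothetical, not part of any real replication product):

```python
# Illustrative policy check for the replication rules described above:
# replication may flow only outward from the system of record, and
# non-system-of-record systems may update the system of record but
# may never replicate to one another (no daisy-chaining).

SYSTEM_OF_RECORD = "mainframe"

def replication_allowed(source, target):
    """Replication is permitted only when it originates at the system of record."""
    return source == SYSTEM_OF_RECORD and target != SYSTEM_OF_RECORD

def update_allowed(source, target):
    """Any system may send updates *to* the system of record."""
    return target == SYSTEM_OF_RECORD

print(replication_allowed("mainframe", "crm"))   # True
print(replication_allowed("crm", "warehouse"))   # False: daisy-chaining
print(update_allowed("crm", "mainframe"))        # True
```

Encoding the policy this explicitly means every data movement can be vetted against it before it happens, rather than reconstructed after the fact.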
In this way, a DPO can be sure that changes to the system of record (addition or removal of customer recorded details or complete customer records) can be tracked and managed. Without this level of data management, a DPO’s job will be much more difficult, especially in large dispersed data and multi-platform environments, or in partner-company relationship environments.
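The tracking a DPO needs might take the form of an append-only audit log over the system of record. A minimal sketch, with hypothetical field names chosen for illustration:

```python
# Illustrative sketch: an append-only audit log so a DPO can trace
# every addition or removal of customer data in the system of record.
import datetime

audit_log = []

def record_change(action, customer_id, field=None):
    audit_log.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "action": action,          # e.g. "add" or "remove"
        "customer_id": customer_id,
        "field": field,            # None means the whole record
    })

record_change("add", "cust-42")                  # whole record created
record_change("remove", "cust-42", field="email")  # one detail removed
print(len(audit_log))  # 2
```

With replication flowing only from the system of record, this one log is sufficient to answer a customer's "how is my data being used?" question for the whole estate.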
Most large organizations running mainframe systems also have a variety of other computing platforms running in multiple distributed data centers, and transport their data between databases and platforms on a regular basis, as part of their normal business processes. There is no getting around that.
The potential to lose track of that data is significant and growing – hence the need for improved data transport control. Fortunately, there are high-performance, multi-platform, multi-database, multi-data-type data replication solutions available right now, ready to facilitate this capability.
GDPR compliance will be possible for any company running any and all IT platforms. The only questions are how fast you can get your house in order, and whether you will choose the right tools and implement the right processes.