Machine Learning

In today’s data-driven economies there is ever more pressure to drive financial returns and payback on IT investments, increasing the need for IT organizations to deliver both analytics and digital transformation with practical and meaningful business outcomes.

The need for instant services has increased exponentially due to ‘consumer-driven’ services such as application stores, digital music, social media, e-commerce, FinTech and even IoT. This consumerization of technology drives innovation within the enterprise to meet user expectations, but the challenge, as always, is doing so in a highly available and secure manner.

Big Data / Analytics / Machine Learning / AI / Cloud have all been breakthrough technologies, maturing into acceptance in a relatively short period of time. The question many may pose is whether the underlying hardware and systems have kept pace with this change. On-premises systems are sometimes frowned upon in this cloud-first world, but with the huge volumes of business data being managed there is a reason why the IBM Mainframe is considered an evergreen machine.

A recent IBM announcement confirmed that IBM Machine Learning, part of the Watson portfolio, is coming to the z/OS platform. The cognitive IBM Watson service is now available directly against the data stores within your mainframe environment, without relocating your data. This is a massive advance in data analytics capability: the service can now be brought to YOUR infrastructure without any data replication.

The significance of machine learning, speed and capacity

Many of the largest data stores reside on mainframe systems all around the globe, and for many reasons this technology has outperformed x86 systems and steadfastly remained the operational core of many industries worldwide.

Today, data has become the fuel that industries rely on to be competitive and innovative. This is challenging because moving data from large repositories to a cloud or analytics provider, usually offsite, can be difficult due to volume, network latency and processing costs.

A good example of where this is expected to pay off is the banking and finance industries, where insight has to be as close to real time as possible. This near real-time capability encompasses areas such as fraud detection, transactional processing and batch processing.

Any organization embarking on a major machine learning or analytics project today will of course be looking at the ability to create, refine and deploy analytical models at high volume, with the ability to train those models and optimize the data for analytics. Doing this within the existing mainframe environment is a huge leap towards mainframe infrastructure becoming a “cloud-of-clouds” within its own on-premises or datacenter-hosted environment.
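
As a rough illustration of that create, train and deploy cycle, the sketch below fits and scores a simple model with scikit-learn. The library, synthetic features and risk-scoring framing are my own assumptions for illustration; they are not part of IBM’s announcement or of any z/OS-specific API.

```python
# Minimal sketch of the create -> train -> deploy-and-score cycle.
# The model type, features and data are illustrative assumptions only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Create: stand-in for transactional features pulled from an operational data store.
X, y = make_classification(n_samples=10_000, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Refine/train: fit the model where the data lives.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Deploy/score: in production this step would run against live transactions.
scores = model.predict_proba(X_test)[:, 1]
print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")
print(f"First five risk scores: {np.round(scores[:5], 3)}")
```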

The leap forward for an organization using this “cloud-of-clouds” model is data residency, which may have a bearing on all industries currently grappling with changing data privacy laws in the US, the European Union’s GDPR, or indeed data sovereignty requirements under in-country legislation.

Controlling data in one place, and allowing a true command and control structure to be optimized using technology such as APIs, will permit the mainframe to truly be the central heart of the data economy. We are now looking at the final goal of integrating machine learning and analytics in place, and perhaps at a new age for this technology.
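
To make the API point concrete, the snippet below scores a transaction over a hypothetical REST endpoint exposed where the data resides. The URL, payload fields and token are invented for illustration and do not describe any specific IBM interface.

```python
# Hypothetical example of in-place scoring via a REST API.
# The endpoint, payload fields and auth token are assumptions only.
import requests

SCORING_URL = "https://mainframe.example.com/ml/v1/models/fraud/score"  # hypothetical
payload = {"transaction": {"amount": 1250.00, "currency": "GBP", "channel": "online"}}
headers = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

response = requests.post(SCORING_URL, json=payload, headers=headers, timeout=5)
response.raise_for_status()
print("Risk score:", response.json().get("score"))
```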

Mainframe environments have proven to be innovative, resilient, scalable and, more recently, open (open-source ready), and now they have cognitive functionality with machine learning capabilities. What other system can boast the same? In a word, none, in my opinion!

I have worked continuously in the Financial Services Industry (primarily on the IT side) for over thirty years.
During this time I have worked first-hand on major industry initiatives both in the U.K. and in the USA, such as TALISMAN, TAURUS, CREST, the Bank of England’s CGO, Counterparty/Client/Settlement Risk Reporting, CHAPS, Model A and B type Clearing, Intra-Day Payment Netting, Capital Gains Tax Reporting, Regulatory Reporting, Trading Interfaces (from DOT through to FIX APIs and beyond), Multi-Instrument and Multi-Currency systems, Direct Market Access and Custodian Services.
In short, I have been pretty much continuously involved with various types of FinTech for the longest time.
