Part 4 of 4: The Value and Future

In Part 1: The Basics, Part 2: The Good, and Part 3: The Bad, I explained how I did a proof-of-concept port of an enterprise application from Amazon Web Services on x86 to Linux on System z in 2017. The good news was that I got to the point I needed to; the bad news was that it was more than difficult to get there. But why did I go there in the first place?

The vendor of the enterprise application was targeting the Financial Services industry for its initial deployments. That is the primary customer base for IBM System z. Their beta customer runs z/OS transaction processing via CICS, but wants to authenticate customers using this vendor's product running on Amazon Web Services. In order for CICS to call out to the AWS cloud, it has to invoke WebSphere on z/OS, which in turn calls the vendor's service on AWS. The vendor's application has to do its job of authenticating the user and get the answer all the way back to CICS in less than 18 seconds so the transaction doesn't time out. It's a really powerful use of the vendor's application, and it is valuable to both the consumer and the financial institution in avoiding potential fraud or cybersecurity scams.
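To make that timing constraint concrete, here is a minimal sketch of a time-bounded call-out to an external authentication service, using the standard java.net.http client available since Java 11. The endpoint URL, JSON payload, and 15-second deadline are all hypothetical assumptions of mine; the real path runs from CICS through WebSphere on z/OS to the vendor's service. The idea is the same either way: the remote call must complete with headroom left inside the 18-second transaction budget.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

public class AuthCallout {
    // Hypothetical endpoint; the real service name and payload belong to the vendor.
    private static final URI AUTH_SERVICE = URI.create("https://auth.example.com/v1/verify");

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newBuilder()
                .connectTimeout(Duration.ofSeconds(3))   // fail fast if the cloud endpoint is unreachable
                .build();

        HttpRequest request = HttpRequest.newBuilder(AUTH_SERVICE)
                .timeout(Duration.ofSeconds(15))         // leave headroom under the 18-second CICS timeout
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"customerId\":\"12345\"}"))
                .build();

        long start = System.nanoTime();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        // The transaction can only proceed if the authentication answer came back in time.
        System.out.printf("HTTP %d in %d ms: %s%n",
                response.statusCode(), elapsedMs, response.body());
    }
}
```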

Java and Analytics run better on z/OS

I was told this vendor wrote all of their code in Java, so I immediately began planning to get it running within z/OS, since Java runs so well there, especially on z14 systems. I also knew that within the 18-second window allotted to the AWS call, only three biometric/analytic tests could be completed on behalf of the consumer. I hypothesized that if the vendor's app ran within z/OS, perhaps up to ten analytic tests could be completed, given its outstanding analytics and Java performance. However, once I learned how many open source middleware programs were required and how complex porting them to z/OS would be, I chose Linux on System z as the target of the port.

Linux on z as a private cloud has more value than a public cloud

Using RDMA as the memory-based communication between the z/OS and Linux LPARs, I knew it would take a bit more time than running inside z/OS, but much less time than going out to a public cloud, so I hypothesized that eight analytic tests could be done instead of the three on AWS. And regardless of a z/OS or Linux on z implementation, the vendor agreed that the software price would be the same as on AWS. The net is that z would deliver additional analytic value, and given its hardware and software integrity and reliability, it would offer better security and business resilience than any public cloud provider.
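The 3-versus-8-versus-10 hypothesis is really just a latency-budget calculation. The sketch below works through it with entirely hypothetical numbers (the per-test compute time, call-setup overhead, and round-trip latencies are my own illustrative assumptions, not measurements): the fixed 18-second CICS window is reduced by a one-time call-setup overhead, and whatever remains is divided by the per-test cost, which grows with the distance between CICS and the analytics.

```java
public class AnalyticsBudget {

    /** How many sequential analytic tests fit inside the transaction window. */
    static int testsThatFit(double windowSec, double setupOverheadSec,
                            double perTestComputeSec, double perTestRoundTripSec) {
        double usable = windowSec - setupOverheadSec;
        double perTest = perTestComputeSec + perTestRoundTripSec;
        return (int) Math.floor(usable / perTest);
    }

    public static void main(String[] args) {
        final double window = 18.0;   // CICS transaction timeout, in seconds
        final double compute = 1.5;   // hypothetical compute time per analytic test

        // All overhead and latency figures below are illustrative assumptions, not measurements.
        System.out.println("Inside z/OS:        "
                + testsThatFit(window, 0.5, compute, 0.2) + " tests");   // ~10
        System.out.println("Linux on z (RDMA):  "
                + testsThatFit(window, 2.0, compute, 0.5) + " tests");   // ~8
        System.out.println("Public cloud (WAN): "
                + testsThatFit(window, 6.0, compute, 2.5) + " tests");   // ~3
    }
}
```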

So that's what I set out to prove. Sadly, just as I got close, the vendor changed their business strategy. They received a significant new round of venture capital investment, signed up several new financial firms to try their code, and decided to stick with their current cloud plan and stay off the mainframe, for now.

I still believe my hypotheses about the performance and value were correct, but the activity ended just before I was able to prove them. The exercise did, however, confirm that the product could be brought to the mainframe successfully.

Docker inside z/OS? That would simplify things!

But what else is possible? I said in Part 3 that Docker containers are not portable across architectures. However, they are portable within the same architecture. There are prototypes underway for Docker to run within z/OS. Given the way Docker works on other platforms, that would imply that any Linux on z containers could run unmodified within z/OS. If Docker for z/OS were to run on a zIIP processor, there would be no software license hit for z/OS. If that all comes to pass, it could bring significant transaction and analytic value inside z/OS, greatly simplify the system management requirements for these kinds of hybrid workloads, improve overall security, resilience, and performance, and reduce operational costs. I would hope that a public announcement of this capability is not too far in the future.
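The "portable within the same architecture" rule is easy to check on any Docker host today. Below is a small, hypothetical Java helper that shells out to the standard docker image inspect command and compares the image's architecture to the one you expect: s390x for Linux on z and, if the prototypes come to fruition, for Docker inside z/OS as well. The image name is just an example; the point is that an s390x image should run unmodified on any s390x Docker engine, but not on an x86 one.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ImageArchCheck {

    /** Returns the architecture an image was built for, e.g. "s390x" or "amd64". */
    static String imageArchitecture(String image) throws Exception {
        Process p = new ProcessBuilder(
                "docker", "image", "inspect", "--format", "{{.Architecture}}", image)
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line = r.readLine();
            if (p.waitFor() != 0 || line == null) {
                throw new IllegalStateException("docker image inspect failed for " + image);
            }
            return line.trim();
        }
    }

    public static void main(String[] args) throws Exception {
        String image = args.length > 0 ? args[0] : "s390x/ubuntu"; // example image name
        String arch = imageArchitecture(image);
        System.out.println(image + " was built for: " + arch);
        // An image built for s390x should run unmodified on any s390x Docker engine,
        // whether that engine lives in a Linux on z guest or, eventually, inside z/OS.
        System.out.println("Runs on an s390x host? " + "s390x".equals(arch));
    }
}
```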

Savings and Operational Strengths

That, my IT friends, is a win for everyone. Whatever downside comes with a slightly more complex development environment is quickly outweighed by a greatly reduced operational expense and stronger operational benefits than any alternative architecture can demonstrate. This type of workload would also make for a very compelling end-to-end benchmark comparison. So while I didn't succeed in getting the enterprise application to market, that was because of a business decision rather than a technological impediment. And the business decision was tactical, based on the vendor's new financials.

I learned a lot and documented many of the shortcuts I took and the setup required to make this development effort possible. I'm happy to share the experience if you'd like to undertake your own development effort. While I thought the end of the project was a failure, its unintended consequence, thanks to the efforts of the great Linux on z community identified in Part 2, is that the porting will be easier for everyone who follows.

Bibliography

LinuxONE and Linux on z Systems Open-source Team

LinuxONE Developers Works

Neale Ferguson’s pre-built Docker containers for z

GitHub repository of s390x open source scripts. From this page, search for the package you are interested in.



Jim Porell is a Rocket Principal Software Architect, focusing on new functions for System, Storage and Security products from IBM. His primary focus is the architecture of the OMEGAMON family of monitoring agents. Prior to joining Rocket, he was an independent consultant and retired IBM Distinguished Engineer. He held various roles as Chief Architect of IBM's mainframe software and led zSystems Business Development, as well as marketing of Security and Application Development for the mainframe. His last IBM role was Chief Business Architect for Federal Sales. Jim held a TS/SCI clearance for the US Government, was a member of the US Secret Service Electronic Crimes Taskforce in Chicago, and co-authored several security books. He has done cybersecurity forensic work at a number of retail and financial firms and government agencies, and created a methodology for interviewing customers to help large enterprises avoid security breaches. Jim has over 43 years of experience in Information Technology.
