Written by Eric Newcomer, CTO of WSO2, this article discusses the state of cloud native development today and the need for further innovation.
Cloud native application development is an approach to building, running, and improving apps based on well-known techniques and technologies for cloud environments. As more organizations adopt cloud-first strategies, this is certainly the direction enterprises are taking. Why? Because it allows them to build and run scalable applications in modern, dynamic environments such as public, private, and hybrid clouds. Having said that, adopting a different software development approach is not easy, and I know from my own time on the customer side that cloud native development is problematic for enterprises.
Harnessing the right skills, knowledge and approach
I worked for several years at Citi, a multinational institutional and consumer bank, and during that time I was directly involved in cloud migration activities. As a result, I experienced first-hand the huge challenge of acquiring the right skills, knowledge, and approach to understand what is different about cloud native development. The cloud is a completely different infrastructure to deploy applications on, and organizations need to design and develop code specifically intended for that infrastructure to get the best out of it.
I likened this difference to when we were working on application user interfaces for Web browsers, and then mobile devices came along, and we naively tried to just port those web-browser-based apps onto the mobile devices. And they didn’t really respond very well to the different characteristics of that hardware. So we ended up having to develop native apps specifically intended for these devices because they have different characteristics than a PC. This was especially the case with the navigation modes, the touch screen, the size of the screen, biometric authentication, available location and other device information, phone integration, and the inherent multimedia capabilities.
Developing for cloud native is similar in that the infrastructure for cloud computing runs on commodity servers, which means there are hundreds or thousands of PC-level servers you deploy your app across, instead of old app servers and mainframes where you could put all your code into one image or one big application and run it all together. To get the most out of this type of infrastructure, developers must break the application up into microservices that are designed to run individually on these smaller PC hardware units. And then these applications must talk to each other over the network to get all the functions together.
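To make the idea concrete, here is a minimal sketch of that decomposition: one small "inventory" service owning its own data behind an HTTP endpoint, and the network call another service (say, an orders service) would make to use it. The service names, port, and data are hypothetical, chosen purely for illustration.

```python
# Minimal sketch of two cooperating microservices: an "inventory" service
# exposing one HTTP endpoint, and the client call an "orders" service would
# make over the network. Names, port, and stock data are hypothetical.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

STOCK = {"widget": 12, "gadget": 3}  # in-memory stand-in for a real datastore


class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Path looks like /stock/<item>; this service owns only its own data.
        item = self.path.rsplit("/", 1)[-1]
        body = json.dumps({"item": item, "count": STOCK.get(item, 0)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass


def start_inventory_service(port=8077):
    # Each microservice is deployed and scaled as its own small unit.
    server = HTTPServer(("127.0.0.1", port), InventoryHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


def check_stock(item, port=8077):
    # What another service does: talk to inventory over the network, not
    # via an in-process function call as in a monolith.
    with urlopen(f"http://127.0.0.1:{port}/stock/{item}") as resp:
        return json.loads(resp.read())


server = start_inventory_service()
print(check_stock("widget"))
server.shutdown()
```

The point of the sketch is the shape, not the code: each function boundary in a monolith becomes a network boundary here, which is exactly why these services can then be deployed across many small commodity servers.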
Transitioning from monolith to microservices
That’s a very different paradigm from the old one. The industry is moving from monoliths to microservices, and it can take some time to understand why applications need to be broken up into smaller units of work to run correctly on the new cloud infrastructure. I used to train teams at Citi as part of our distinguished engineer program, and one of the areas I focused on was the principles behind microservices and the cloud that make them different for applications and software.
Today, all the latest IT trends come out of this environment: big data, microservices, and DevOps, which is, in effect, the automation required for infrastructure deployment. All these technology characteristics come from the shift in software that is necessary to take advantage of that environment and run correctly in it. It takes a while to understand that it’s not just a port from mainframe to Linux, or from some proprietary version of Unix like Solaris to Linux; it is a different computing paradigm, where you have to stitch all these smaller units of work together. The units of work are smaller because the computers are small, and it takes developers time to internalize how to do this correctly. And it’s not just the technical challenge of having to think differently about your applications and data; it’s also the development culture.
If we look at some of the leading examples of companies who’ve done this, such as Netflix and Amazon’s retail Web site, they also pioneered a practice called the two-pizza teams or autonomous teams who take responsibility for the entirety of a smaller project, from requirements through to deployment and support. The idea is that smaller teams create more effectiveness and take on responsibility for smaller functions. The work of small teams can progress independently from other teams, as long as the interfaces among functions are strongly governed.
This provides what are called “bounded contexts” that allow small teams to work autonomously to develop and deploy an application feature and still plug into all the other features of the larger application without impacting them when making an update and deploying it to production.
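One way to picture a strongly governed interface between bounded contexts is as a published, versioned contract that a team's payloads must conform to. The sketch below uses a plain dictionary as a stand-in for a real contract format such as an OpenAPI schema; the field names and team names are hypothetical.

```python
# Sketch of a governed interface between two bounded contexts. The
# "contract" dict stands in for a versioned schema (e.g. OpenAPI);
# field and service names are hypothetical.
PAYMENTS_CONTRACT_V1 = {"order_id": str, "amount_cents": int, "currency": str}


def conforms(payload: dict, contract: dict) -> bool:
    """True if payload has exactly the agreed fields with the agreed types."""
    return set(payload) == set(contract) and all(
        isinstance(payload[k], t) for k, t in contract.items()
    )


# The orders team can change its internals freely; only the published
# payload shape is fixed, so the payments team is not broken by an update.
payload = {"order_id": "A-100", "amount_cents": 2599, "currency": "USD"}
print(conforms(payload, PAYMENTS_CONTRACT_V1))
print(conforms({"order_id": "A-100"}, PAYMENTS_CONTRACT_V1))
```

The design point is that governance lives at the interface: as long as a team's output passes the shared contract check, everything behind that boundary is the team's own business.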
This is a big change for companies with IT systems that predate the cloud, which typically have separate operations and development functions. In the cloud, there is no operations department. Instead, developers perform the operations work using APIs and automated build, test, and deploy pipelines.
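The "developers own operations" point can be sketched as code: the path to production becomes a scripted pipeline the team runs itself, rather than a hand-off to a separate department. The stage functions below are hypothetical stand-ins for real build, test, and deployment tooling.

```python
# Sketch of a developer-owned build/test/deploy pipeline. Each stage
# function is a hypothetical stand-in for real tooling (container builds,
# test suites, a cluster's deployment API).

def build_image() -> str:
    # e.g. invoke a container build and return the resulting image tag
    return "orders-service:1.0"


def run_tests(image: str) -> bool:
    # e.g. spin up the image and run the automated test suite against it
    return True


def deploy(image: str) -> str:
    # e.g. call the platform's deployment API with the new image
    return f"deployed {image}"


def pipeline() -> str:
    # The gate is automated: a failing test run blocks deployment,
    # with no manual operations hand-off in between.
    image = build_image()
    if not run_tests(image):
        raise RuntimeError("tests failed; deployment blocked")
    return deploy(image)


print(pipeline())
```

In practice these stages would live in a CI/CD system rather than one script, but the structure is the same: build, verify, deploy, all triggered by the developers themselves.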
I have described a huge culture difference that is often more difficult for companies to overcome than the technical issues, and it requires a different set of developer skills. If we think about new developers coming into the industry, this raises the question of whether universities need to change the way they educate computer engineers.
Bridging the gap between education, technology and business
At Citi, I ran a small innovation lab team at the company’s Cornell Tech office, where we worked with the institution itself. Cornell Tech is a graduate school offering MBA programs alongside master’s and PhD degrees in computer science, oriented to technology careers. The idea of the colocation was that the businesses located on campus would help students create and oversee projects based on real-life business requirements, and the students would help come up with creative ideas to solve those problems.
Technology students therefore had business involvement with companies and also took business classes, while MBA students were likewise involved with companies and took programming classes. Each year, Cornell Tech hosted a formal challenge in which companies would submit a business problem for students to work on. These projects had cross-pollinated teams, where computer scientists would have an MBA student on their team and vice versa.
Through this new approach to graduate education, Cornell Tech was consciously trying to bridge that classical gap between business and technology. Other universities are following this model as well.
Tim O’Reilly, the founder of O’Reilly Media, famously said he thinks that coding will be like reading and writing someday and become one of the basic skills schools teach. We also are seeing the industry embracing no-code and low-code tools. Everybody’s got their own definition of what this means, but I think the core premise is to make it easier for more users to be able to work with software and help develop applications.
Therefore, what we have is two clear trends: computer literacy is on the rise in universities and schools, while literacy in business problems is improving among computer science people; and there is the trend toward abstraction, making it easier for people to interact with code.
All of this will undoubtedly assist in the current transition toward cloud native computing models and practices.
Why everything is becoming code
Here at WSO2, our goal is to look at how we can support companies with digitization, where so much revolves around creating code that embodies a company’s unique value for its customers and partners. As a result, more people in the company need to be involved with code in one way or another, whether that’s specifying requirements, undertaking quality assurance, or helping to verify that what’s being coded is what’s needed. Today, I think there is a real recognition that software is becoming the world, or everything is becoming code, which requires new approaches, culture, and ways of thinking.