NoOps in a serverless world

Freeing developer talent from the reliance and restrictions of operational management is the Holy Grail for the smoother, faster app deployment required to ride the tide of digital disruption. In this article, Maurizio Canton, VP Customer Success at TIBCO Software, explores why keeping innovation and solutions coming thick and fast, in line with business demands, calls for a frictionless infrastructure and a working culture free of bureaucracy, layers and restrictions.

It’s a requirement that has spawned a DevOps culture now spreading across the enterprise, driven by the core aim of raising the quality and consistency of delivered software through agile processes and continuous integration. The approach has changed the game by tackling the traditional friction between developers who want to release software often and operations teams for whom infrastructure stability remains the core focus.

Yet in our endless pursuit of quicker gains and efficiencies, for some it is a route that doesn’t go far enough. Inevitably, the need for those building the apps to also manage their server infrastructure continues to compromise the speed of development and deployment. Furthermore, as the digital transformation narrative reminds us time and time again, nothing stands still for long. The window of opportunity is expanding: driven by customer demands, the scale and complexity of the cloud continues to rise exponentially, making it ever more difficult to identify issues in the environment. As a result, operations teams need increasingly specialised skills to comprehend critical changes and respond appropriately.

Against this backdrop, it is perhaps not surprising that the level of automation used throughout enterprise IT is moving up a gear. It now goes beyond the infrastructure setup, configuration and software deployment that have largely defined its core purpose in the DevOps environment to date.

Fuelled by the convergence of more sophisticated machine learning and the hyper-automation of cloud computing, the latest incarnation is set to permeate deeper into the enterprise. In doing so, it brings us to the cusp of another revolution in enterprise architecture in the form of NoOps. As its moniker suggests, this approach promises a fully automated IT environment entirely abstracted from the underlying hardware infrastructure, enabling even faster app development and turnaround times for business requests. And the ramifications run deeper: whereas DevOps was intrinsically about the merging of, and collaboration between, development and operations, NoOps signals a greater divide, as each focuses on its core strengths and operates more independently once again.

As machine learning does more of the heavy lifting, operational tasks are infused with intelligent automation that can make decisions about known problems and handle traditional infrastructure and security management tasks. Consequently, people are freed from an even greater share of this work to focus on innovation, business outcomes and extracting more value from data to support the business model.
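
As a rough illustration of what ‘decisions about known problems’ can look like in practice, the sketch below shows a simple auto-remediation loop: a health metric is polled, and when it breaches a threshold a known fix is applied without human involvement. The metric source, threshold and restart_service action are hypothetical placeholders rather than any particular vendor’s API.

```python
import random
import time

# Hypothetical stand-ins for a real monitoring feed and remediation action.
def read_error_rate(service: str) -> float:
    """Pretend to sample the service's current error rate (0.0 to 1.0)."""
    return random.random() * 0.1

def restart_service(service: str) -> None:
    """Pretend to apply a known fix, e.g. restarting or rescheduling the service."""
    print(f"[auto-remediation] restarting {service}")

ERROR_THRESHOLD = 0.05  # assumed breach point, for illustration only

def remediation_loop(service: str, checks: int = 5) -> None:
    """Poll the metric and apply the known fix whenever the threshold is breached."""
    for _ in range(checks):
        rate = read_error_rate(service)
        if rate > ERROR_THRESHOLD:
            restart_service(service)
        else:
            print(f"[auto-remediation] {service} healthy (error rate {rate:.3f})")
        time.sleep(1)

if __name__ == "__main__":
    remediation_loop("checkout-api")
```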

It’s a world in which software and software-defined hardware are provisioned dynamically, fuelled by the agility of serverless computing, which removes the need to deal with maintenance, updates, scaling or capacity planning. This drives experimentation and brings greater agility to application development across a variety of uses, spanning ecommerce, mobile back-ends, streaming data analytics, chatbots and artificial intelligence.
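
To make the serverless model concrete, here is a minimal sketch of what such a function might look like, assuming an AWS Lambda-style Python handler (other platforms follow similar patterns). Note what is absent: there is no server, scaling or capacity code anywhere, because the platform provisions compute per invocation.

```python
import json

def handler(event, context):
    """Entry point invoked by the serverless platform for each request;
    provisioning, scaling, patching and capacity planning happen elsewhere."""
    # 'event' carries the request payload; its exact shape depends on the
    # trigger (HTTP gateway, queue message, stream record, ...).
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local smoke test; in production the platform calls handler() directly.
if __name__ == "__main__":
    print(handler({"name": "serverless"}, None))
```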

We are already seeing growing evidence of the traction. A recent Cloud Foundry global survey of 600 IT decision-makers found that 19 per cent of respondents were already using serverless computing, with another 42 per cent planning to evaluate it within the next two years.

As always, the approach will demand the right tools to bring the vision to fruition. Notably, this includes Function-as-a-Service (FaaS) architectures that integrate serverless computing platforms with machine learning frameworks. There is also a need for lightweight platforms, untethered to any particular public cloud, that can build lightweight microservices for serverless architectures with rapid deployment and scaling. This will slash costs and minimise the security risks that can arise when legacy data stores are exposed via APIs.
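
As a rough sketch of the ‘untethered’ idea, the example below keeps the business logic in a plain function with no provider-specific imports, exposed here through a standard-library HTTP endpoint; the same function could just as easily be wrapped by a FaaS handler on any cloud. The service name, inputs and port are illustrative only.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def convert_price(amount: float, rate: float) -> dict:
    """Business logic kept free of any cloud-provider SDK, so the same
    function can sit behind a FaaS handler or run in any container."""
    return {"amount": amount, "rate": rate, "converted": round(amount * rate, 2)}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Illustrative fixed inputs; a real service would parse the request.
        body = json.dumps(convert_price(100.0, 0.79)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Minimal local entry point; the port choice is arbitrary for the sketch.
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()
```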