Observability trends evolve as market must tackle cybersecurity with automation

In the age of mainframe computing, stacks were static, applications were easy to track, and the monolithic infrastructures supporting them were simple to map. Everything was visible, and application performance monitoring was a manageable task.

Today, APM isn’t as clear-cut. Solutions need to deal with cloud-native applications that exist in a nebulous and dynamic environment, where monitoring needs to be flexible, scalable and happen in real time. It is no longer enough for APM solutions to report on errors and log incidents. Companies require holistic, modern solutions specifically developed to cope with complex and obscure cloud environments across the entirety of the IT spectrum.

In response, APM solutions have expanded from preconfigured monitoring dashboards to built-in tools that allow users to flexibly dig into what’s happening as it’s happening. This new category of advanced APM solutions is known as “observability.”

Observability has been gaining traction since 2019, according to analyst and RedMonk co-founder James Governor, who described it as “a new frontier for user experience, systems and service management in web companies and enterprises alike.” “Technology providers in the APM, log management and distributed tracing categories are all positioning themselves as observability plays,” he said.

The forces driving the need for observability

The time frame for the switch from APM to observability parallels the disruptions caused by COVID-19, and this is no coincidence. The pandemic catalyzed the digital shift already underway, making cloud adoption an operational mandate and ushering in an experience-centric era where data insights drive business decisions.

Implementing and managing the technology and talent required to deliver those insights is the task of DevOps teams. However, in a newly digital company, many members of the IT department are transitioning from a waterfall mindset and lack experience with agile development methods. These teams must juggle secure data distribution to a growing number of departments with their primary objective: keeping the technology on which the business relies up and running.

“Reports of business transformations shrinking from two years to two months during the pandemic are impressive, but the exceptional speed also presents the opportunity for devastating security breaches and problematic application and infrastructure interactions,” Charlotte Dunlap, principal analyst for application platforms, enterprise technology and services at GlobalData PLC, told theCUBE.

The move from monolithic apps to microservices can be overwhelming, according to Dunlap. A major challenge is the almost-impossible task of monitoring and optimizing applications running within containerized environments such as Kubernetes. Due to this, “participants of DevOps efforts are forced to reexamine how they implement observability earlier in the application lifecycle to improve insight in the underlying infrastructure,” she said.

Digital transformation can cause operational inefficiency

It’s impossible to miss the irony — the digital transformation originally intended to increase efficiency has inadvertently created an optimization nightmare.

The knock-on effects of this are significant. First, the inability to see what is happening with applications leads to over-provisioning in an effort to avoid downtime by “playing it safe.” This leads to spiraling cloud costs, which are wasteful both from a budget standpoint and environmentally.

Then, there is the issue of impact on the overall user experience. Keeping every part of the tech stack running smoothly is critical for business success in a world where customers expect pages to load within two seconds … or they’re gone.

“Today, it’s much harder to understand the customer experience, because it’s difficult to get a full picture of all the data. What we mean by that is user data, application data and infrastructure data are all fragmented,” said Dave Vellante, theCUBE and Wikibon analyst, in an article on the evolving APM market.

The third effect has been commanding headlines across the world: cybersecurity. In 2021, it took companies an average of nine months to identify and contain a breach: 212 days to detect the intrusion and a further 75 to contain it. This is time during which the cyberthieves have free rein to explore a company’s data and help themselves to whatever they find.

Ops teams have addressed the need to secure mission-critical data resources by buying point tools to monitor specific applications or areas within their systems. According to 451 Research’s “Voice of the Enterprise” survey data, a third of enterprises use more than eight vendors for workload monitoring and incident response. This indicates that “the needs of a significant number of enterprise IT shops presently are too varied to be suitably served by only a handful of vendors,” James Sanders, research analyst for 451 Research, part of S&P Global Market Intelligence, told theCUBE.

However, in the complex world of containers, the failure of even one small, non-critical element can cause, at best, unacceptable downtime and, at worst, infiltration by bad actors and potential data loss or ransom. This means that despite the current fragmented market, the need is there for full-stack observability solutions that automate application monitoring across the entirety of information technology on which a company’s operations are based.

Next-gen APM solutions incorporate AI/ML

Observability across the infrastructure — data storage, compute, network, containers and databases — and through the CI/CD pipeline to client-side code and applications, even the code on which the company’s websites are built, is “the holy grail” of modern APM solutions. This vision of a one-for-all solution merges the tasks of application performance monitoring, optimization and security under one umbrella.

“The grail solution takes all this disparate data, ingests it, transforms it and connects all the dots – across clouds and on-premises, then shapes it with machine intelligence to create an organic systems view and proactively tells you a problem is coming, and even how to fix it … or just fixes it for you,” Vellante stated.

A range of companies have stepped up to offer solutions that, to one extent or another, fill this need, including Gartner Inc. Magic Quadrant leaders Datadog, Dynatrace, Cisco AppDynamics and New Relic, as well as cloud-native startups and potential market disruptors such as BigPanda, Cribl, Moogsoft and StormForge, the flagship platform of Gramlabs Inc.

However, the “holy grail” level of insight Vellante describes isn’t possible without automation. So, as well as leading the APM market, these solutions fall under Gartner’s definition of artificial intelligence for IT operations by “combining big data and machine learning to automate IT operations processes, including event correlation, anomaly detection and causality determination.”
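The anomaly-detection piece of that Gartner definition can be illustrated with a minimal sketch. The rolling z-score approach below is a common textbook technique, not any particular vendor's implementation; the metric name and thresholds are illustrative assumptions:

```python
from statistics import mean, stdev

def detect_anomalies(series, window=10, threshold=3.0):
    """Flag points that deviate from the trailing window's mean by more
    than `threshold` standard deviations (a rolling z-score check)."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Steady latency readings (ms) with one sudden spike at index 12.
latency = [101, 99, 102, 100, 98, 103, 101, 100, 99, 102, 101, 100, 450, 101]
print(detect_anomalies(latency))  # → [12]
```

Production AIOps platforms layer event correlation and causality analysis on top of detectors like this one, so that a latency spike, a failing container and a noisy log stream surface as one incident rather than three alerts.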

One recent notable market disruption came from StormForge, which marked an industry first by closing the observability loop between data generated in production and pre-production. With the announcement of Optimize Live, StormForge has added automated optimization for production environments to its existing pre-production Kubernetes optimization platform Optimize Pro.
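The idea behind closing that loop can be sketched in a few lines: feed observed production usage back into a resource recommendation instead of hand-setting requests. This is a simplified illustration of the general pattern, assuming percentile-plus-headroom sizing; it is not StormForge's actual algorithm or API:

```python
def recommend_resources(cpu_samples_millicores, headroom=1.3):
    """Closed-loop sizing sketch: derive a container CPU request from
    observed production usage, adding headroom above the 95th percentile."""
    ordered = sorted(cpu_samples_millicores)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return {"cpu_request_m": round(p95 * headroom)}

# Observed per-minute CPU usage (millicores) for a container.
samples = [120, 135, 110, 140, 500, 130, 125, 138, 128, 132]
print(recommend_resources(samples))  # → {'cpu_request_m': 182}
```

Using a percentile rather than the peak keeps one transient spike from inflating the request, which is exactly the over-provisioning problem described earlier.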

Marking StormForge’s importance as moderately high in her intelligence report, “StormForge Disrupts Powerful DevOps Space With Observability Actionability,” Dunlap stated: “Solutions by companies like StormForge are receiving newfound attention among IT operations teams for their ability to address a major pain point among enterprises grappling with deploying modern application architectures into multicloud environments while requiring continuous delivery capabilities.”

Observability, AIOps and security merge

In 2020, Gartner put the AIOps market size at $1.5 billion and estimated it would rise at a compound annual growth rate of 15% over the next five years. Global Market Insights was more optimistic, putting the market at $2 billion in 2020 and projecting a global CAGR of over 20% to reach a market size of approximately $10 billion by 2027. However, both these estimates were low when compared with Research and Markets’ 2021 figures, which showed the 2021 market for AIOps platform sales already at $7.09 billion and a forecast of $27.26 billion by 2028.

If this estimate is correct, AIOps will eclipse the APM market as a whole, which Emergen Research estimates will grow at a steady CAGR to reach $15.4 billion by 2028. But the total addressable market for intelligent observability solutions is forecast to become even larger. Market leaders predict a period of consolidation for the observability and security markets and are positioning products accordingly, according to Kevin Burns, chief financial officer of Dynatrace, and Shay Banon, founder and chief executive officer of Elastic.

“The friction between DevOps and security is high,” Burns stated in December. “If we can minimize that friction there and help both the organizations innovate faster by putting secure code into production, that’s ultimately the goal, and that’s where we see the convergence.”

The inevitability of this market mash-up is such that analysis firm Statista Inc. even puts a dollar figure on it, predicting the combined cybersecurity and observability market will be worth $28.26 billion by 2024.

Disrupting the observability market

The massive opportunity presented by this market consolidation is encouraging innovations that take advantage of advances in data management and intelligent technology to straddle the security-observability divide.

The trend toward automation has birthed a new philosophy that is being embraced by many observability innovators: data as code. The movement applies coding practices to data management, allowing companies to analyze and share data in the same way teams handle code during the software development process.
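In practice, "data as code" means expressing expectations about data as version-controlled, reviewable checks that run in CI, just like unit tests for software. A minimal sketch, with an illustrative schema invented for the example:

```python
# Expectations about a dataset, kept in version control alongside code.
SCHEMA = {"user_id": int, "latency_ms": float, "region": str}

def validate(record, schema=SCHEMA):
    """Return a list of violations; an empty list means the record passes."""
    errors = [f"missing field: {k}" for k in schema if k not in record]
    errors += [
        f"wrong type for {k}: expected {t.__name__}"
        for k, t in schema.items()
        if k in record and not isinstance(record[k], t)
    ]
    return errors

good = {"user_id": 7, "latency_ms": 41.5, "region": "us-east-1"}
bad = {"user_id": "7", "latency_ms": 41.5}
print(validate(good))  # → []
print(validate(bad))
```

Because the checks are code, they can be diffed, code-reviewed and rolled back, giving data pipelines the same change-management discipline as application source.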

Disruption is coming from all sides as market forces collide, cultural methods for data management are revolutionized, and technology leaps forward to automate data-monitoring processes. This makes observability a much broader consideration than one limited to advances in intelligent technology that allow companies a greater level of insight into their data assets.

“As with so many important technology areas, observability is a mindset and set of practices as much as it is a technology,” Governor said.

With interest in observability solutions at the “peak of inflated expectations,” according to a recent Gartner Hype Cycle, a market forecast for “cloudy with a high chance of disruption” is a reasonable one. And as cloud computing becomes the dominant business model across the globe, the demand for holistic solutions that provide visibility into the complex environments created by the adoption of cloud-native technologies will continue to increase.

While there isn’t yet one “holy grail” optimization solution that can provide full-stack observability with automated security and optimization built in from the start, the currently fragmented observability and cybersecurity markets look to be moving toward consolidation, with more comprehensive and automated solutions in the works.
