The FinOps Foundation held its EU event in Barcelona on 11-14 Nov 2024, with AI in FinOps and FinOps for AI as key themes.
Omdia view
Summary
FinOps X Barcelona, 11-14 Nov 2024, opened with a keynote from FinOps Foundation CEO J.R. Storment, who highlighted three key trends in FinOps today. First, AI continues to make disruptive waves in all spheres, including FinOps, where it takes two forms: AI in FinOps (using AI to assist FinOps practice) and FinOps for AI (managing AI workloads in a cost-optimal way). Second, FinOps is broadening its scope to encompass IT consumption beyond the public clouds: SaaS, private cloud, and the data center. Third, FOCUS, the open standard for exchanging FinOps cost and usage data, is expanding, which is helping the FinOps tool community perform multi-cloud/multi-source data management. This article expands on these themes and covers my conversations at the conference.
Analyst view
The FinOps landscape
Perhaps the first question to ask at a conference run by a non-profit, open community, supported by the leading open source software body, the Linux Foundation, is where the open source is. The answer lies in the open FinOps Framework, which defines the practice and its scope (Figure 1), and in the open FOCUS specification for reporting FinOps cost and usage data. The premium FinOps tool providers and the CSPs consume these community efforts. The leading CSPs were, understandably perhaps, slow to start adopting FOCUS, but are now fully engaged, seeing the FinOps tool community as partners rather than competitors: cost savings typically give cloud users a better cloud experience, and those savings can translate back into new cloud investment.
Figure 1: FinOps Foundation Framework
Source: FinOps Foundation
The independent FinOps tool vendor landscape ranges from players who provide reporting and advice, whether through automation or through teams working with client spreadsheets, to solutions that provide autonomous FinOps processing, letting the tool make the decisions. Some vendors come from an IT asset management (ITAM) background and naturally already cover the greater scope that the FinOps Foundation is now moving into, such as encompassing SaaS, private clouds, and data centers.
AI in FinOps / FinOps for AI
AI in the form of the older, predictive machine learning technology is commonly used in FinOps tools for anomaly detection, forecasting, and other pattern recognition tasks. Exploiting the newer AI, i.e., Generative AI and large language models, is still on the roadmap for most of the smaller, independent FinOps vendors Omdia spoke with. In contrast, the major CSPs have already started incorporating this technology in their FinOps solutions.
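To make the anomaly detection use case concrete, the sketch below flags unusual daily spend with a simple rolling z-score. It is a minimal illustration of the kind of predictive technique these tools apply; the cost figures and the three-sigma threshold are assumptions, not drawn from any vendor's implementation.

```python
import pandas as pd

# Illustrative daily cloud spend (USD); the final value is a deliberate spike.
spend = pd.Series(
    [410, 395, 402, 408, 399, 405, 412, 980],
    index=pd.date_range("2024-11-01", periods=8, freq="D"),
)

# Rolling mean and standard deviation over the preceding days (current day excluded).
mean = spend.rolling(window=7, min_periods=3).mean().shift(1)
std = spend.rolling(window=7, min_periods=3).std().shift(1)

# Flag days whose spend deviates more than three standard deviations from recent history.
z_score = (spend - mean) / std
anomalies = spend[z_score.abs() > 3]
print(anomalies)  # expected to report the 980 USD spike on 2024-11-08
```

Real FinOps tools use more sophisticated models, but the principle of comparing current spend against a learned baseline is the same.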
Data center AI training workloads that create AI applications, and the challenge enterprises face in managing employees running queries against premium AI services, are ripe areas for FinOps to monitor. This is an area where multiple groups within a large enterprise may be trialing POCs and applications and would benefit from managing the costs in a coordinated way.
FinOps and its near relations
The analyst roundtable hosted by the FinOps Foundation discussed the relationship between the foundation and related initiatives: ITAM, which originated in the early 2000s, and Technology Business Management (TBM), launched in 2012, both predate the FinOps Foundation, launched in 2019. There is an overlap between all these initiatives, but the FinOps Foundation enjoys two clear advantages. First, there is the question of scale. FinOps started with the public cloud and the urgent challenge early adopters faced when receiving large, unmanaged bills, and the scale of public cloud adoption made it relevant quickly. Being able to perform granular billing segmentation by user, so that organizations could carry out internal chargeback, was a vital benefit.
Second, the open data standard FOCUS allowed the vendor and CSP community to communicate in the same language (with a little translation), which helped FinOps gain traction when other initiatives like ITAM and TBM already existed. At this conference, an expansion of the data fields in FOCUS was announced, based on feedback from the community, which gives the standard greater strength.
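As a minimal sketch of what consuming FOCUS-formatted data can look like in practice, the snippet below rolls up billed cost by provider and service category across two exports. The column names (ChargePeriodStart, ProviderName, ServiceCategory, BilledCost) are drawn from the FOCUS specification, but the file names and the two-provider scenario are illustrative assumptions.

```python
import pandas as pd

# Hypothetical FOCUS-format exports from two providers, already converted to CSV.
frames = [pd.read_csv(path, parse_dates=["ChargePeriodStart"])
          for path in ("aws_focus.csv", "azure_focus.csv")]
costs = pd.concat(frames, ignore_index=True)

# Because both files share FOCUS column names, a single multi-cloud roll-up
# by provider and service category needs no per-provider translation layer.
monthly = (
    costs
    .assign(Month=costs["ChargePeriodStart"].dt.to_period("M"))
    .groupby(["Month", "ProviderName", "ServiceCategory"])["BilledCost"]
    .sum()
    .reset_index()
)
print(monthly)
```

The point of the standard is exactly this: one schema, so the aggregation logic does not change as new billing sources are added.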
FinOps culture
One of the topics that came up when speaking with delegates at the conference was the visibility and adoption of FinOps within organizations. If the CFO's office held responsibility for FinOps, it led to the "people who say no" syndrome that afflicted security for many years; after all, killing a product or project reduces its costs to zero, which is amusing but of no value to the business. The common view was that if the CIO, CTO, or even the COO held responsibility for FinOps, adopting the tools and culture was more likely to succeed. Putting FinOps reporting in front of developers as they work was considered key.
Platform engineered FinOps
In the keynote, J.R. Storment also raised the idea of shifting FinOps left: from production operations, where the application has already been built and bad, costly decisions may be too late to avoid, to the earlier build stage, where FinOps can be brought to bear on decisions before they have a negative long-term impact, and then even earlier to the design stage, with even greater impact on long-term costs. Putting FinOps in front of developers as they create and build applications is the answer, through the integration of FinOps monitoring into GitOps or platform engineering solutions: what may be called platform-engineered FinOps. The idea is to make FinOps-related decision-making easier and to give developers that information in real time while working.
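What a shift-left check might look like inside a delivery pipeline is sketched below. This is a hedged illustration, not any vendor's product: the allowed instance families, budget guardrail, and the resource dictionary format parsed from an IaC plan are all assumptions made for the example.

```python
# Illustrative pre-merge FinOps gate: compare requested resources against a
# simple cost policy and fail the pipeline before expensive choices ship.
ALLOWED_INSTANCE_FAMILIES = {"t3", "m6i", "r6i"}   # assumed policy values
MAX_MONTHLY_ESTIMATE_USD = 2_000                    # assumed budget guardrail

def check_resource(resource: dict) -> list[str]:
    """Return a list of findings for one requested cloud resource."""
    findings = []
    family = resource["instance_type"].split(".")[0]
    if family not in ALLOWED_INSTANCE_FAMILIES:
        findings.append(f"{resource['name']}: instance family '{family}' is outside policy")
    if resource["estimated_monthly_usd"] > MAX_MONTHLY_ESTIMATE_USD:
        findings.append(f"{resource['name']}: estimate ${resource['estimated_monthly_usd']:.0f} exceeds guardrail")
    return findings

if __name__ == "__main__":
    # In a real pipeline this list would be parsed from the IaC plan output.
    requested = [
        {"name": "api-server", "instance_type": "m6i.large", "estimated_monthly_usd": 140},
        {"name": "batch-node", "instance_type": "x2idn.16xlarge", "estimated_monthly_usd": 9_800},
    ]
    problems = [f for r in requested for f in check_resource(r)]
    for p in problems:
        print("FINOPS CHECK:", p)
    raise SystemExit(1 if problems else 0)
```

Run as a pipeline step, a non-zero exit code blocks the merge, putting the cost feedback in front of the developer at the moment the decision is made.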
Warner Leisure Hotels engineers Madoc Batters and Rich Young gave an insightful talk on integrating generative AI technology with their GitOps and infrastructure-as-code solutions running on AWS. The generative AI element was provided by Amazon Bedrock, which is fully managed by Amazon and lets users select a foundation LLM from the ones AWS supports (they used Anthropic) and augment it with private data through fine-tuning and retrieval-augmented generation. Crafting prompts was a key part of the project, which started in November 2023 and was operational by September 2024.
As developers build applications, the FinOps solution provides real-time advice, such as flagging over-provisioning, under-provisioning, incorrect compute instance selection, and memory sizing issues, for example in serverless Lambda functions. The Warner engineers reported an 84% reduction in over-provisioned resources, a 96% reduction in under-provisioned resources, 73 recommendations actioned in two months of operation, and around 14% monthly cost savings.
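The Warner team did not walk through code, so the following is only a hedged sketch of how a right-sizing prompt can be sent to an Anthropic model on Amazon Bedrock via boto3. The model ID, region, prompt wording, and the summarized usage report are illustrative assumptions; any retrieval-augmented context would come from a separately configured knowledge base rather than the call shown here.

```python
import json
import boto3

# Hedged sketch: ask a Claude model on Bedrock for right-sizing recommendations
# based on a summarized usage report. Model ID and prompt are illustrative.
bedrock = boto3.client("bedrock-runtime", region_name="eu-west-1")

usage_summary = "lambda order-processor: memory 3008 MB, max observed 412 MB, 2.1M invocations/month"
prompt = (
    "You are a FinOps assistant. Given this usage summary, suggest "
    f"right-sizing actions and estimate the saving:\n{usage_summary}"
)

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model choice
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": [{"type": "text", "text": prompt}]}],
    }),
)
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```

In a production setup the recommendation text would be posted back into the pull request or developer portal rather than printed.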
Vendor conversations
Omdia spoke with representatives from AWS, Broadcom VMware, Flexera, and ProsperOps. AWS will make fresh announcements on its FinOps offerings at AWS re:Invent Las Vegas, 2-6 Dec 2024. Flexera predates the FinOps movement and has a heritage in the ITAM world, which fits well with the scope expansion of the FinOps Foundation.
ProsperOps offers an autonomous FinOps solution that can be embedded in user processes, making decisions in real time without humans in the loop. ProsperOps says automation has the advantage when it surpasses human performance, and it uses the effective savings rate (ESR) metric to assess this, quoting an achieved ESR of 40%+ against an industry norm of under 20%. On the shift-left theme, ProsperOps points out that waiting for usage optimization to fully finish at the end of the lifecycle is not as effective as combining usage and rate optimization from the start of the lifecycle.
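For readers unfamiliar with the metric, ESR broadly expresses net rate-optimization savings as a share of what the same usage would have cost at on-demand prices. The arithmetic below is illustrative, with made-up numbers rather than ProsperOps data.

```python
# Illustrative ESR arithmetic (made-up numbers, not ProsperOps figures).
on_demand_equivalent = 100_000  # what the month's usage would cost at on-demand rates (USD)
actual_billed = 58_000          # actual spend after commitment discounts, including any waste (USD)

esr = (on_demand_equivalent - actual_billed) / on_demand_equivalent
print(f"Effective savings rate: {esr:.0%}")  # 42%, above the 40%+ level ProsperOps quotes
```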
Arrow Electronics will be the sole global provider of VMware Tanzu CloudHealth: the majority of the Tanzu CloudHealth business goes through managed service providers, and Arrow has a good relationship with, and understanding of, this market. Common across all the vendors I spoke with is that machine learning for anomaly detection and forecasting is already implemented in their FinOps solutions, while the use of LLMs is newer. VMware Tanzu CloudHealth uses an LLM to provide recommendations for reducing costs. It also has a GenAI chatbot called Intelligence Assist that helps with report building, bridging the skill gap between SQL experts and FinOps experts. Care is taken to ensure the LLM only sees public information. Tanzu CloudHealth supports the major CSPs and has always been a data center FinOps tool, so the FinOps Foundation's scope expansion fits well with its heritage. Tanzu CloudHealth also supports "bring your own data" from any cloud source.
An opportunity yet to be explored is the use of private data to augment LLMs for application in FinOps services. However, customers are wary of granting wide-ranging access to their data, so this use case is more likely to be served by on-premises FinOps solutions.
Appendix
Further Reading
Omdia Universe: FinOps, 2024. OM123051, Roy Illsley, July 2024.
Author
Michael Azoff, Chief Analyst, Cloud and Data Center Practice