ServiceNow is the latest major enterprise technology vendor to share its plans for leveraging generative AI capabilities in enhancing its workflow digitization and automation platform.

Omdia view

Summary

It is hard to escape the hype surrounding generative artificial intelligence (AI). ChatGPT has become a beacon of this movement, capturing the interest and imagination of businesses and consumers alike. Perhaps ChatGPT’s most notable contribution is that it has enabled millions of people to get hands-on with advanced AI capabilities, helping to educate and inspire them. Inevitably, the success of ChatGPT has energized the business world around the potential of generative AI to transform operations and digital experiences. However, balancing caution with optimism has become important as questions have emerged around how the technology should be governed and how to secure sensitive data ingested by the large language models (LLMs) that underpin generative AI.

ServiceNow is the latest major enterprise technology vendor to share its plans for leveraging generative AI capabilities in enhancing its workflow digitization and automation platform. At its Knowledge 2023 event, ServiceNow made numerous announcements focused on how generative AI would enhance its platform and alleviate some of the security and governance concerns that business leaders have associated with the technology.

ServiceNow’s AI strategy is guided by a desire to help businesses realize value from intellectual capital

ServiceNow has invested in AI for over a decade, and the vendor is keen to stress how AI will augment human decision-making. At the recent Knowledge 2023 event in Las Vegas, ServiceNow CEO Bill McDermott stressed that intellectual capital will be the primary source of value creation for modern businesses as they differentiate and digitally transform. McDermott noted that large-scale AI systems have read more text than any human could in a lifetime, but these systems will continue to make mistakes, which is why humans and machines must work together. ServiceNow recognizes that the efficiency and economic benefits generative AI can offer are becoming better understood, but agility is necessary in this high-stakes game. Governance surrounding how these systems are developed and used is vital to ensure the correct guardrails are in place to deliver long-term value and mitigate risk.

ServiceNow will employ a bimodal approach to generative AI

ServiceNow believes that generative AI will add significant value to its platform by augmenting and enhancing digital workflows within organizations, particularly in IT, HR, legal, finance, supply chain, and customer service departments. ServiceNow will implement generative AI within its platform in two ways. First, businesses can “bring their own LLM” by integrating third-party generative AI models from providers such as OpenAI and, eventually, Google into the platform. This is the most common approach among enterprise technology vendors, but it has also been met with concerns about governance and data security: general-purpose LLMs provide the most value when they ingest data unique to an organization and its industry, yet it is often unclear how that sensitive data will be used. This is where ServiceNow’s second approach applies: domain-specific LLMs trained on confidential data that are secured and remain private. This enables businesses to train suitable open-source models on the use cases and scenarios that matter to their organization. Together, these approaches help ServiceNow alleviate data governance concerns and could enable the vendor to develop a range of out-of-the-box, function- and industry-specific LLMs that, over time, act as templates for businesses to adopt and customize quickly. The two-fold approach lets organizations combine third-party generative AI services with highly specialized, domain-specific LLMs informed by proprietary knowledge and customer needs.
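To make the bimodal pattern concrete, the short Python sketch below illustrates the two integration modes under discussion: a general-purpose hosted LLM reached over the network versus a domain-specific open-source model served privately inside the organization. The class names, request payload, and routing rule are hypothetical illustrations, not ServiceNow APIs, and the gpt2 checkpoint stands in for whatever domain-specific model an organization would actually train.

```python
# Hypothetical sketch of the bimodal pattern described above; none of these
# names are ServiceNow APIs.
from abc import ABC, abstractmethod
import requests


class LLMBackend(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class HostedGeneralLLM(LLMBackend):
    """'Bring your own LLM': calls a third-party provider over the network."""
    def __init__(self, api_url: str, api_key: str):
        self.api_url, self.api_key = api_url, api_key

    def complete(self, prompt: str) -> str:
        # Generic JSON payload; a real provider defines its own request schema.
        resp = requests.post(
            self.api_url,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"prompt": prompt},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["text"]


class PrivateDomainLLM(LLMBackend):
    """Domain-specific open-source model served inside the organization,
    so confidential data never leaves its boundary."""
    def __init__(self, model_name: str = "gpt2"):  # gpt2 is only a stand-in checkpoint
        from transformers import pipeline  # local inference via Hugging Face transformers
        self.generator = pipeline("text-generation", model=model_name)

    def complete(self, prompt: str) -> str:
        return self.generator(prompt, max_new_tokens=128)[0]["generated_text"]


def route(prompt: str, contains_sensitive_data: bool,
          hosted: LLMBackend, private: LLMBackend) -> str:
    """Send sensitive workloads to the private model; route the rest to the hosted one."""
    return (private if contains_sensitive_data else hosted).complete(prompt)
```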

IT service management will be the first domain-specific LLM

Given the heritage of ServiceNow’s platform, IT service management (ITSM) will be the focus of the vendor’s first domain-specific LLM, mainly because ITSM remains the platform’s most broadly adopted use case. Employee technical support is also a well-understood application that ServiceNow can augment with generative AI. Examples include helping agents summarize incidents and auto-populate case records quickly, recommending resolution steps, and using natural language to develop new workflows.
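As a simple illustration of the incident-summarization use case, the sketch below builds a prompt from an incident record and hands it to whichever LLM backend an organization has configured. The field names and the summarize_incident helper are hypothetical and are not ServiceNow’s data model or API.

```python
# Hypothetical illustration of incident summarization; the record fields and
# summarize_incident() helper are not ServiceNow's actual data model or API.
def summarize_incident(incident: dict, llm) -> str:
    """Build a prompt from an incident record and ask an LLM for a short summary."""
    prompt = (
        "Summarize the following IT incident in two sentences for a support agent.\n"
        f"Short description: {incident['short_description']}\n"
        f"Category: {incident['category']}\n"
        f"Work notes: {incident['work_notes']}\n"
        "Summary:"
    )
    return llm.complete(prompt)  # any backend exposing a complete(prompt) method


example = {
    "short_description": "Users cannot log in to the VPN",
    "category": "Network",
    "work_notes": "Restarted the VPN gateway; certificate expired last night.",
}
# summary = summarize_incident(example, PrivateDomainLLM())  # reuses the earlier sketch
```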

Powering workflow development with generative AI

Hugging Face (founded in 2016) is an open machine-learning (ML)/AI hub that develops and publishes open-source models, datasets, and ML-powered apps for its community of users. In September 2022, ServiceNow and Hugging Face announced the BigCode Project, an initiative for developing state-of-the-art LLMs for code openly and responsibly.

In May 2023, ServiceNow and Hugging Face announced StarCoder, a code LLM capable of generating new application and workflow code from natural language. In a press release announcing StarCoder, ServiceNow and Hugging Face commented that “the StarCoder model is designed to level the playing field so developers from organizations of all sizes can harness the power of generative AI and maximize the business impact of automation with the proper governance, safety, and compliance protocols.” Using natural language to develop apps and workflows has huge potential to broaden and transform how users and ServiceNow’s vast ecosystem of partners build added value on top of the platform. The rapid pace of digital change means application and workflow development must be both fast and appropriately governed. StarCoder is a significant example of how ServiceNow intends to balance digital innovation with a robust, governed approach to bringing generative AI capabilities to its platform.
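For readers who want to experiment, the snippet below shows one way to prompt StarCoder for code from a natural-language description using the Hugging Face transformers library. It assumes you have accepted the model’s license on the Hugging Face Hub, are authenticated with an access token, and have the accelerate package plus enough GPU memory for a roughly 15B-parameter model; the prompt itself is an invented example.

```python
# Illustrative only: generating code from a natural-language comment with StarCoder.
# Requires: accepting the bigcode/starcoder license on the Hugging Face Hub,
# a Hugging Face access token, and the accelerate package for device_map="auto".
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "bigcode/starcoder"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

# Describe the desired logic in natural language and let the model complete it.
prompt = (
    "# Python function that returns open incidents older than seven days\n"
    "def stale_incidents(incidents):"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```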

ServiceNow has delivered a robust strategy and approach to leveraging generative AI, but enterprise education is vital

The generative AI enhancements announced by ServiceNow, notably its bimodal approach and its partnerships with NVIDIA and Hugging Face, are mature and advanced, and Omdia feels they hold much promise. These capabilities can deliver significant value to customers by providing a natural, intuitive way for people with varying skillsets and levels of platform familiarity to interact with and get more from the ServiceNow solution. However, as with any application of generative AI, enterprises need to become well versed in the implications of using this technology. The factors business leaders must consider span people and technology, ranging from perception challenges, such as employees fearing AI will take their jobs, to data privacy and security concerns. A responsible and inclusive strategy for leveraging generative AI, guided by broader regulatory measures, will be important for businesses.

Appendix

Further reading

“Knowledge 2023 highlights why ServiceNow is at the core of the digital workflow revolution” (June 2023)

2023 Trends to Watch: Workplace Transformation and Hybrid Work (December 2022)

Market Fundamentals: Business Mobile Convergence (BMC) and the Future of Work (April 2023)

Digital Partner Opportunities: Ten insights and recommendations from Omdia’s 2022 future of work survey (July 2022)

Market Landscape: Fundamentals of Employee Experience Management (April 2022)

Author

Adam Holtby, Principal Analyst, Workplace Transformation

askananalyst@omdia.com