The AI balancing act

‘Technical debt’ is on the rise. Generative AI is part of the problem, but it holds promise as a new means to solve it.

Sanjay Gupta, Senior Writer

Jul 11, 2024 · 5 MIN READ

Technology leaders today face a tricky balancing act with AI.

They’re racing to deploy next-gen apps and solutions they expect to deliver big returns, especially those built on generative AI. Like all forms of technology innovation, however, AI comes with hidden costs.

It’s called technical debt, a part of risk management that CIOs have kept tabs on since Y2K put it on the map in 1999. Simply put, it describes all of the anticipated (and unanticipated) costs businesses incur when deploying new technology: fixing software bugs that couldn’t be dealt with at launch, installing patches after previously unknown vulnerabilities emerge, and modernizing legacy tech infrastructure. “Debt” accrues when companies leave any of those issues unaddressed.

Technical debt is on the rise: Years of corporate investment in digital transformation, mass migration to cloud software and systems, increasingly complex tech stacks, and now AI-generated software code have all helped run up a big tab.

In the U.S., technical debt is estimated to cost $1.5 trillion annually in operational and security expenses, according to one study. Research by Protiviti shows that CIOs already spend 30% of their IT budgets managing technical debt, and according to STX Next, 9 in 10 CTOs ranked it as their biggest challenge in 2024.

To take advantage of new generative AI capabilities, tech leaders must constantly address any technical debt that gets accrued. You can't put it off.

Benjamin Wesson

Vice President of AI, Freshworks

Generative AI is the key new variable in the equation. While early adopters take on new risks and future debt with every gen AI deployment, the technology holds promise as a solution to the problem. Gartner predicts that by 2027, companies will use generative AI tools to create replacements for legacy apps and reduce modernization costs by 70%.

In the meantime, tech leaders must navigate carefully. “The challenge with generative AI is that it's moving so fast, and there aren't standards across all of the models,” says Benjamin Wesson, vice president of AI at Freshworks. “To take advantage of new generative AI capabilities, tech leaders must constantly address any technical debt that gets accrued. You can't put it off.” 

Next-gen technical debt

Managing the unknown future risks associated with generative AI is made harder by the technology’s multi-modal complexity.

Large language models perform their magic in six different media: text, software code, images, audio, video, and 3D/virtual content. Each content type adds more complexity and risk. “Generative AI takes you to an entirely new orbit of challenges when it comes to technical debt,” says Vinod Subramanyam, managing director of Brillio, a global tech consultancy with expertise in AI strategy.

There are ways, however, to tame the beast, according to experts. Insights from Subramanyam and Wesson, along with other research we conducted on technical debt, suggest a variety of tactics for getting debt under control without slowing the pace of AI innovation.


Here are several takeaways and recommendations:

  • Ensure AI models integrate with legacy systems. Two critical areas of debt risk for IT leaders are data governance and integration of gen AI code with legacy systems, says Subramanyam. “It is easy to get started on the gen AI journey, but companies that have not invested enough in data quality, integration, and governance will face big challenges over time.” As a foundational first step, tech leaders and others must ensure that their AI model integrates well into their existing IT landscape, Subramanyam says.

  • Set up “LLMOps” to manage operations and data. Plotting out a dedicated governance framework for generative AI applications is another important measure, says Subramanyam. For example, IT leaders should consider replicating the structure and rigor that DevOps and AIOps teams provide in more conventional software deployments and applying them to LLM work. “It's important for companies to expand their DevOps and AIOps initiatives into what I call LLMOps or XOps—a backbone to streamline data and monitor operations.”

  • Use API “wrappers” to insulate code from continual LLM risks. When customer- or employee-experience development teams adopt LLMs, they’re exposed to occasional breakdowns and breaking changes, which can accrue new technical debt overnight. Wesson suggests that internal AI development teams create an API wrapper: a stable internal interface that shields application code from delays, errors, and changes in the underlying models (a minimal sketch follows this list). “For any company adopting LLMs, given the rapid pace of change, finding ways to insulate application developers from the breaking changes is paramount,” says Wesson. “We don’t want the CX or EX development teams to slow down or refactor their code every time an LLM breaks something.”

  • Get ahead of critical risks when fine-tuning language models. There are many obvious reasons to fine-tune LLMs so that they deliver better quality, accuracy, and tone. Far less obvious, Wesson explains, are the serious risks that fine-tuning often creates, opening a company up to new attack vectors. “Even something that's seemingly benign, like training a model on a data set that's domain-specific to your company, can be exploited,” he says. “CIOs should take notice and be wary. While fine-tuning may seem like the obvious solution, it has ramifications that follow from it.”
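To make the wrapper idea concrete, here is a minimal sketch in Python. The names (LLMProvider, InternalLLMClient, complete) are hypothetical rather than an actual Freshworks or vendor API; the point is that application code depends only on a stable internal surface, while provider-specific adapters absorb breaking changes behind it.

# Minimal sketch of an internal LLM wrapper. All names are illustrative,
# not an actual Freshworks or vendor API.
from dataclasses import dataclass
from typing import Optional, Protocol


class LLMProvider(Protocol):
    """Anything that can turn a prompt into text; concrete adapters wrap real vendor SDKs."""

    def generate(self, prompt: str) -> str: ...


@dataclass
class InternalLLMClient:
    """The stable surface application teams call; swapping models or providers
    means changing the adapter wired in here, not refactoring calling code."""

    provider: LLMProvider
    max_retries: int = 2

    def complete(self, prompt: str) -> str:
        last_error: Optional[Exception] = None
        for _ in range(self.max_retries + 1):
            try:
                return self.provider.generate(prompt)
            except Exception as exc:  # outage or breaking change in the underlying model
                last_error = exc
        raise RuntimeError("LLM call failed after retries") from last_error

The design choice mirrors Wesson’s point: when a model’s behavior or interface changes, only the adapter behind complete() needs attention, and the teams building customer- and employee-facing features keep shipping.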


The most important rule of thumb on which Wesson and Subramanyam both agree is simple: Be as proactive as possible in identifying new AI risk factors that will contribute to future debt. At the same time, leaders should also explore some of the more promising applications that use generative AI to tackle different forms of technical debt. 

Based on recent reports, here are three such strategies worth considering:

  • Code analysis: San Francisco-based Databricks (a Freshworks customer) is using generative AI to quickly analyze and understand its legacy code base—something its CIO says has eased the pain of engineers, according to a recent Wall Street Journal report. Understanding what the code does is one of the important ways to keep technical debt in check.

  • Refactoring or updating old code: Wayfair, the online furniture retailer, is using generative-AI-based coding tools to update old code, according to the same report. Automating that refactoring work frees developers to shift their focus back to more strategic projects.

  • Code documentation: Many companies struggle to keep the documentation for their application code up to date manually, a recurrent theme in managing technical debt. Generative AI can help streamline that process, according to one report, automating the creation and updating of documentation for new as well as legacy apps (see the sketch below).
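As an illustration (not drawn from the reports above), a short script like the following can draft documentation for an undocumented legacy function. It assumes the official OpenAI Python SDK with an OPENAI_API_KEY set in the environment; the model name and prompt are placeholders, and any generated text should be reviewed by an engineer before it is committed.

# Illustrative sketch: asking a hosted LLM to draft docs for a legacy function.
# Assumes the official OpenAI Python SDK and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

LEGACY_SNIPPET = """
def calc(p, q, f):
    return sum(x * f for x in p) - q
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; substitute whatever model your team has approved
    messages=[{
        "role": "user",
        "content": "Write a concise docstring and a short explanation for this "
                   "legacy Python function:\n" + LEGACY_SNIPPET,
    }],
)

print(response.choices[0].message.content)  # a draft, to be reviewed before it is committed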

Technology leaders who can walk the tightrope between AI innovation and technical debt management stand a better chance of making it across without incurring more unforeseen debt.

“It’s kind of the Wild West with these LLMs right now,” says Wesson. “It is such a breakneck pace that everybody needs to be cognizant and invest additional effort to not carry technical debt.”