October 7, 2025

By Conor Jensen, Americas Field Chief Data Officer, Dataiku

When it comes to developing, using, and maintaining AI systems in research and business settings, there's no denying that companies are casting a critical eye on costs and how they are likely to change. Between inflation, recession, the war in Ukraine, and other geopolitical and economic circumstances, companies want the best possible understanding of what costs are today, and how they may look in the future.

What Are the Costs of Developing, Using, and Maintaining AI?

We do know that the major drivers of cost in creating and maintaining AI break down into three categories: the cost of data and database systems, the compute costs of developing and running models, and personnel costs. These categories can vary by industry, but outside the major R&D centres for AI (i.e., technology companies), the compute costs for development should be the smallest unless projects and teams are mismanaged.

The cost of data and database systems is variable, especially given the move to the cloud. Data for these systems tends to be curated and actively worked on, so it is not eligible for the "cheap" storage tiers that cloud vendors provide. While it is much cheaper to house such data today than it was in the past, this has also encouraged the tendency to keep all data as though it were equally valuable, driving up overall costs instead of keeping and utilising only the right data.

Personnel costs are the most variable and highest climbing portion of the costs, especially given the sharp rise in salaries for qualified data scientists and AI/ML engineers over the last three or four years. This is a trend that shows little sign of abating and will continue to drive up costs for companies as they seek to attract and retain the appropriate talent.

What Can Help Lower AI Costs?

It would be naïve to ignore the fact that AI initiatives represent a cost in and of themselves. Even if you know what use cases you need to tackle to cut costs, you won’t be able to benefit from them if you don’t have the right systems in place to move quickly and efficiently on AI initiatives.

The reality is that the AI project lifecycle is rarely linear, and there are different people involved at every stage, which means lots of potential rework and inefficiencies along the way.

One trend towards success is continued investment in training and tools that allow knowledge workers with deep domain and data knowledge, but without programming or technical expertise, to do work that historically required specialist data science tools. By upskilling existing workers, companies can both accelerate their journey and lower their overall costs, rather than investing heavily in hiring large populations of external data scientists and practitioners.

Here’s another way of looking at it: Leveraging AI for cutting costs requires massively increasing the number of cost-saving use cases being addressed across the organisation. This, in turn, requires empowering anyone (not just technical people on the data team) to leverage the work done on existing AI projects to spin up new ones, potentially uncovering untapped cost savings use cases.

Choosing AI Use Cases That Demonstrably Boost Cost Reduction Efforts and Maximise Efficiency

Now is the time to focus on use cases that reduce costs. This may include machine learning (ML) models that drive more efficient staffing, and predictive maintenance models that reduce the cost of maintenance across the board.

Use cases should also focus on critical business functions. Advanced churn prediction, also known as uplift modelling, is a proven way to boost cost savings. When it is several times more expensive to acquire new customers than to retain existing ones, using ML models to identify likely churners and predict who will respond positively to marketing efforts can be a big cost differentiator.
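To make this concrete, here is a minimal sketch of one common approach to uplift modelling, the "two-model" method: train one churn model on customers who received a retention offer and one on those who did not, then rank customers by the difference in predicted churn probability. The data, feature names, and model choice below are illustrative assumptions, not a description of any particular product.

```python
# Hypothetical two-model uplift sketch on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 4))                        # customer features (synthetic)
treated = rng.integers(0, 2, size=n).astype(bool)  # received a retention offer?

# Synthetic churn outcome: the offer reduces churn for some customers
logits = 0.5 * X[:, 0] - 1.0 * treated * (X[:, 0] > 0)
churned = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

# One model per group: treated (got the offer) vs. control (did not)
m_treat = LogisticRegression().fit(X[treated], churned[treated])
m_ctrl = LogisticRegression().fit(X[~treated], churned[~treated])

# Uplift: estimated reduction in churn probability if we target the customer
uplift = m_ctrl.predict_proba(X)[:, 1] - m_treat.predict_proba(X)[:, 1]
top_targets = np.argsort(uplift)[::-1][:100]       # best candidates for offers
```

Ranking by uplift rather than raw churn probability focuses spend on customers whose behaviour the offer actually changes, rather than on customers who would churn (or stay) regardless.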

Controlling Maintenance and Infrastructure Costs

It’s also important to pay attention to AI maintenance costs. Putting a model into production is an important milestone, but once a model is developed and deployed, the challenge becomes regularly monitoring and refreshing it to ensure it continues to perform. Conditions and data change, and models need to be maintained with this in mind or businesses risk having their models become less effective, or even harmful.

Good AI project maintenance can save costs and ensure models are delivering value for money. MLOps has emerged as a way of controlling the cost of maintenance, shifting from a one-off task handled by a different person for each model — usually the original data scientist who worked on the project — into a systematised, centralised task.
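One systematised monitoring check that often sits at the heart of such MLOps routines is drift detection on input features. The sketch below uses the Population Stability Index (PSI) to compare a feature's training distribution against recent production data; the thresholds (~0.1 to warn, ~0.25 to act) are common rules of thumb, not standards, and the data is synthetic.

```python
# Minimal drift check: Population Stability Index (PSI) on one feature.
import numpy as np

def psi(expected, actual, bins=10):
    """PSI between a baseline sample and a recent sample; higher = more drift."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Floor empty buckets at a tiny proportion to avoid log(0)
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(1)
train = rng.normal(0.0, 1.0, 10_000)     # feature as seen at training time
stable = rng.normal(0.0, 1.0, 10_000)    # production data, same distribution
shifted = rng.normal(0.8, 1.2, 10_000)   # production data after drift

psi_stable = psi(train, stable)          # small: model likely still fine
psi_shifted = psi(train, shifted)        # large: investigate or retrain
```

Running a check like this on a schedule, for every deployed model, is exactly the shift from a one-off, per-person task to a centralised one that the paragraph above describes.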

From a data and compute cost point of view, companies may see only nominal savings over time; if they aren't careful, the amount spent could instead increase dramatically as larger data stores and more powerful servers become available.

Companies that manage their infrastructure carefully, especially those using hybrid environments, should be able to turn this around and see real cost savings. They'll do so by storing the right things locally, moving the right things to the cloud, and managing their compute strategy: using local systems for recurring workloads and the cloud for variable and "spiky" workloads, like the development of new models.
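The steady-vs-spiky split above comes down to simple arithmetic: always-on local capacity has a fixed cost, while on-demand cloud scales with hours used. The prices and hours below are illustrative assumptions for a back-of-envelope comparison, not real vendor rates.

```python
# Back-of-envelope comparison of local vs. cloud cost for a workload.
LOCAL_MONTHLY = 2_000.0   # fixed monthly cost of owned/leased capacity (assumed)
CLOUD_PER_HOUR = 6.0      # on-demand rate for a comparable instance (assumed)

def cheaper_venue(hours_per_month: float) -> str:
    """Return which venue is cheaper for a workload of the given usage."""
    cloud_cost = hours_per_month * CLOUD_PER_HOUR
    return "local" if LOCAL_MONTHLY < cloud_cost else "cloud"

# A recurring scoring job running round the clock (~720 h/month) favours
# local capacity, while a spiky model-development burst (~80 h/month)
# favours pay-per-use cloud.
```

Real decisions also weigh data egress fees, staffing, and hardware depreciation, but the break-even framing is the same.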

Auditing and Re-evaluating Your Data and Analytics Stack From a Cost Perspective

Like those of many organisations today, your data and analytics stack may have been built over years or decades, and may have been shaped more by integration requirements from other technology investments than by the business goals that matter most.

A cost-cutting mindset and a technology audit can help analyse technical stack investments more closely and highlight disparate tools, missed automation opportunities, or any application that isn't delivering. This includes looking at processes, such as automation triggered when the underlying data, or a model or AI system in production, has changed significantly.

An audit may help companies determine whether their data and analytics stack is underperforming and interfering with efficiency and cost-cutting measures.

Prioritising Efficiency in AI

Companies are navigating today's economic volatility, and understandably it's a time for tight P&L scrutiny and cost-cutting initiatives. It's also a time to review all existing and future technology investments. AI investment should not be exempt from this, and companies will no doubt cast a close eye on costs versus results.

The reality is that AI can absolutely be leveraged to cut costs, and organisations committed to AI projects have it in their power to ensure that AI has an actionable path to make business run faster and more efficiently.