As we do every year, we’ve heard from folks in the I&O industry who share their thoughts on which areas will thrive and which might not survive in 2025 and beyond. Here are some of their predictions for next year.

       

Aislinn Wright, VP of product management at EDB
In 2025, enterprises will increasingly adopt hybrid cloud solutions to gain the flexibility of cloud services while retaining greater control over data and total cost of ownership (TCO).

Organizations will prioritize hybrid models that blend cloud agility with the control of private cloud environments, allowing them to optimize costs, meet regulatory requirements, and manage data sovereignty more effectively. This trend will drive demand for platforms that seamlessly integrate cloud and multicloud capabilities, offering the best of both worlds for mission-critical workloads.

 

Liz Fong-Jones, field CTO at Honeycomb
Gartner recently projected a major uptick in IT spending in 2025. Cloud cost remains top of mind for many organizations; having visibility into that cost through observability helps ensure companies are spending their resources as efficiently as possible. On the basis of hype and the herd effect, generative AI is going to account for a large portion of that increase in IT spending, but I’d caution leaders to carefully measure how much technical debt they are introducing as they use AI to write code or add generative AI features to their products. It will be important for organizations to run disaster game days to ensure they can still debug and understand the code that’s been added to their product.

 

Phillip Merrick, co-founder and CEO of pgEdge
In the wake of major vendor outages such as the CrowdStrike incident, CIOs and regulators alike are focusing on cloud concentration risk. Truly business-critical applications need to be resilient against both regional and cloud-wide failures, which requires a fully distributed database architecture supporting seamless failover from one cloud region to another, even across clouds. Expect regulators to start scrutinizing cloud provider dependencies at the application level.
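To make the failover idea concrete, here is a minimal sketch of a client that tries regional PostgreSQL endpoints in order until one responds; the hostnames and the psycopg2 driver are illustrative assumptions, not a description of pgEdge’s product.

```python
import psycopg2  # illustrative driver choice; any PostgreSQL client works similarly

# Hypothetical regional endpoints for the same logically replicated database.
REGIONAL_ENDPOINTS = [
    "db.us-east.example.com",      # primary region
    "db.eu-west.example.com",      # second region, same cloud
    "db.other-cloud.example.com",  # different cloud provider entirely
]

def connect_with_failover(dbname, user, password):
    """Try each regional endpoint in turn and return the first healthy connection."""
    last_error = None
    for host in REGIONAL_ENDPOINTS:
        try:
            return psycopg2.connect(
                host=host, dbname=dbname, user=user,
                password=password, connect_timeout=3,
            )
        except psycopg2.OperationalError as exc:
            last_error = exc  # region unreachable; fall through to the next one
    raise RuntimeError("All regions unreachable") from last_error
```

In a production setup the failover decision would usually live in a connection pooler or the database layer itself rather than application code, but the ordering-and-retry pattern is the same.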

 

Tina Tarquinio, VP of product management at IBM Z and LinuxONE 
In 2025, we will see businesses shift to a fit-for-purpose approach to AI using dedicated hardware, particularly in mainframes that handle high-volume transactional data. These hardware accelerators, delivered on-chip and in external cards, enable the use of traditional AI models along with encoder-based large language models (LLMs) of the user’s choice, improving large-scale, real-time data analysis and insights for industries like banking and insurance. Because this approach keeps AI workloads on premises, it also strengthens security, resiliency, and compliance management for mainframe operators in regulated sectors, while empowering them to unlock new levels of efficiency and insight and setting a new standard for predictive outcomes.

 

Karthik Sj, general manager of AI at LogicMonitor
As AI models grow in complexity and scale, the demand for computing power and energy will soar. Nuclear energy, a clean and reliable source of power, will emerge as a critical solution for powering AI data centers. It will not only address the energy needs of AI but also reduce carbon emissions and contribute to a sustainable future. Nuclear power will help maintain uninterrupted operation of AI data centers and can scale to meet the needs of large-scale AI infrastructure.

 

Bill Wisotsky, principal technical architect at SAS 
Quantum computing is set to make significant advancements in error mitigation and correction, substantially increasing the number of computational qubits. This progress will continue to revolutionize the data and AI industry. The fields of quantum machine learning, quantum optimization, and quantum chemistry and biology stand to benefit the most. Quantum computing will also advance in its hybrid development, with Quantum Processing Units (QPUs) being further integrated with CPUs, GPUs, and LPUs. QPUs will be employed for specialized problem classes or formulations. This hybridization will inspire new approaches to classical algorithms, leading to the development of superior quantum-inspired classical algorithms.

 

Derek Ashmore, application transformation principal at Asperitas
Serverless is a type of cloud service that doubles down on the value of the cloud in general: with serverless, DevOps engineers not only avoid managing a physical server, they also don’t have to provision or monitor any kind of operating system environment. They simply deploy applications as serverless functions. While not all apps are good candidates for a serverless approach, expect to see more and more organizations taking advantage of serverless for use cases where it’s appropriate in 2025.
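As a rough illustration of what “simply deploy applications as serverless functions” looks like in practice, here is a minimal AWS Lambda-style handler in Python; the event shape and the function’s purpose are assumptions made for the example.

```python
import json

def handler(event, context):
    """Minimal Lambda-style entry point: no server or OS to provision or monitor.

    The platform invokes this function per request and scales it automatically.
    """
    # Assumes an API Gateway proxy-style event with optional query parameters.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The developer uploads only this function; capacity, patching, and the runtime environment are the provider’s problem.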

 

Andrew Shikiar, CEO and executive director of FIDO Alliance
By the end of 2025, we expect one in four of the world’s top 1,000 websites to make passkeys available. The new sign-in method is already being used by hundreds of millions of consumers through major brands such as Amazon, Apple, eBay, Google, Microsoft, Shopify, TikTok and Uber. Just two years after passkeys were first announced, consumer awareness has risen by 50%, with 57% of consumers familiar with passkeys in 2024. Data shows that when consumers know about passkeys, they use them: as availability and consumer education continue next year, we expect willingness, adoption, and demand to keep rising for this more secure and user-friendly password replacement.

 

Scott Woody, CEO and founder of Metronome
As more businesses begin to test and rely on AI agents to support their business, such as using them to handle customer service questions, software makers that supply the agents will get paid based on successful outcomes, not just a monthly or usage-based fee for their software. This change will continue the upending of pricing for many fast-growing software companies. The rise of “outcome-based pricing” for AI agents is being led by companies like Zendesk, which charges per ticket resolved by its AI agents. More companies will be forced to follow suit as their customers demand the ability to pay only for successful outcomes rather than broad, perhaps ineffective, AI solutions.
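To illustrate the billing shift, here is a small sketch contrasting flat usage-based billing with outcome-based billing per resolved ticket; the rates and field names are made up for the example and don’t reflect any vendor’s actual pricing.

```python
from dataclasses import dataclass

@dataclass
class AgentUsage:
    tickets_handled: int
    tickets_resolved: int  # only these count under outcome-based pricing

def usage_based_invoice(usage: AgentUsage, price_per_ticket: float) -> float:
    """Traditional model: the customer pays for every ticket the agent touches."""
    return usage.tickets_handled * price_per_ticket

def outcome_based_invoice(usage: AgentUsage, price_per_resolution: float) -> float:
    """Outcome model: the customer pays only for tickets the agent actually resolves."""
    return usage.tickets_resolved * price_per_resolution

usage = AgentUsage(tickets_handled=1_000, tickets_resolved=620)
print(usage_based_invoice(usage, 0.50))    # 500.0, regardless of results
print(outcome_based_invoice(usage, 1.00))  # 620.0, tied to successful outcomes
```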

 

Ori Saporta, co-founder and VP of engineering at vFunction
Software complexity will become the bottom line: Enterprises must fix bad architecture or pay the price. Far too many organizations run bloated, complex Frankenstein systems they barely understand and can no longer sustain. The mounting pressure to increase reliability and prevent costly outages will drive companies to gain a deeper understanding of their applications and put a critical focus on optimizing their software architecture. Bad architecture carries many costs: skyrocketing cloud bills, increased carbon emissions, engineering team burnout, and more.

 

J.J. Kardwell, CEO of Vultr
AI will become smarter and more dependable in the next year, but businesses will require agile, scalable, open, composable ecosystems to unlock its full potential, something Big Tech’s cloud titans aren’t capable of delivering. Enterprises will increasingly look to alternative cloud providers to supply the kind of infrastructure that supports the rapid deployment of new AI models without skyrocketing overheads. These open ecosystems will supplant the monolithic, rigid, and costly single-vendor paradigm that has disproportionately favored enterprises operating closer to the traditional tech heartlands, leveling the playing field for AI innovation across all regions of the world.

 

Randall Degges, head of developer and security relations at Snyk
As AI-driven coding tools become mainstream in 2025, injection attacks are set to make a strong comeback. While AI accelerates development, it frequently generates code with security weaknesses, especially in input validation, creating new vulnerabilities across software systems. This resurgence of injection risks marks a step back to familiar threats, as AI-based tools produce code that may overlook best practices. Organizations must stay vigilant, reinforcing security protocols and validating AI-generated code to mitigate the threat of injection attacks in an increasingly AI-powered development environment.
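To ground the input-validation point, here is a small example of the pattern reviewers should watch for in AI-generated code: SQL built by string interpolation, which is injectable, versus a parameterized query. The tables and values are invented for illustration; Python’s standard sqlite3 module stands in for any database driver.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def find_user_unsafe(name: str):
    # Typical AI-generated shortcut: user input interpolated straight into the query.
    # An input like "' OR '1'='1" returns every row instead of one.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name: str):
    # Parameterized query: the driver treats the input strictly as data, not SQL.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # leaks the whole table
print(find_user_safe("' OR '1'='1"))    # returns nothing, as intended
```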

 

Abhishek Gupta, principal data scientist at Talentica Software
As individual agents become more sophisticated and resource-intensive, hosting them on a single machine will no longer be viable. Instead, agents will operate across interconnected networks, coordinating tasks in real-time to deliver cohesive outputs. This shift will enable scalable, resilient multi-agent ecosystems, allowing organizations to harness distributed computing power to achieve unified results from agents operating from diverse locations. This trend will redefine how organizations approach complex problem-solving, making distributed agent collaboration the new standard in automation and AI-driven workflows.
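As a loose sketch of the coordination pattern described here, the following example fans subtasks out to agents running concurrently and merges their outputs into one result. The single-process asyncio setup and the task names are assumptions standing in for agents distributed across a real network.

```python
import asyncio

async def agent(agent_id: str, task: str) -> str:
    """Stand-in for an agent on its own host working on one subtask."""
    await asyncio.sleep(0.1)  # simulate remote computation / network latency
    return f"{agent_id} finished '{task}'"

async def coordinate(tasks: list[str]) -> list[str]:
    """Fan subtasks out to agents in parallel and gather a combined result."""
    workers = [agent(f"agent-{i}", task) for i, task in enumerate(tasks)]
    return await asyncio.gather(*workers)

results = asyncio.run(coordinate(["parse documents", "summarize findings", "draft report"]))
print(results)
```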