As investor scrutiny intensifies heading into 2026, the AI trade is entering a more selective and disciplined phase. Early enthusiasm around scale, compute access and technical breakthroughs is giving way to tougher questions about sustainability, margins and real-world deployment.
For Andrew Sobko, Founder and CEO of Argentum AI, this shift marks a fundamental change in how AI companies are being evaluated and which are most likely to emerge as long-term winners.
The narrative, Sobko explains, is moving “from ‘we have access to compute’ to ‘we convert compute into durable revenue and margins’.” In previous cycles, the ability to demonstrate powerful models or secure large volumes of GPU capacity was often enough to attract capital. But today, that’s no longer sufficient.
“Unit economics matter again,” he says. Investors are increasingly focused on “inference cost curves, gross margin trajectory and payback periods – not just model demos.”
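The unit-economics lens Sobko describes can be made concrete with a back-of-envelope payback calculation. The figures below are hypothetical, purely for illustration:

```python
# Back-of-envelope unit economics for an AI service.
# All figures are hypothetical, for illustration only.

def payback_months(cac: float, monthly_revenue: float, gross_margin: float) -> float:
    """Months to recover customer acquisition cost (CAC) from gross profit."""
    monthly_gross_profit = monthly_revenue * gross_margin
    return cac / monthly_gross_profit

# Example: $12,000 to acquire a customer paying $2,000/month at 60% gross margin
months = payback_months(cac=12_000, monthly_revenue=2_000, gross_margin=0.60)
print(round(months, 1))  # 10.0 months
```

Shrinking the payback period, in this framing, means either cutting acquisition cost or improving the gross margin on each unit of compute sold.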
This reflects a broader recalibration across technology markets, where proof of commercial performance is taking precedence over technical promise alone.
A Clearer Divide in AI Infrastructure
This shift is also reshaping how AI infrastructure companies are valued. According to Sobko, the market is “separating ‘capacity builders’ (Capex heavy) from ‘capacity unlockers’ (asset-light, marketplace, utilisation-driven).” Rather than rewarding companies that simply build more capacity, investors are paying closer attention to how effectively existing resources are deployed and monetised.
Reliability has also become a critical differentiator here. “Outages + concentration risk are pushing buyers toward redundancy and multi-sourcing,” Sobko notes, adding that “investors follow that demand.”
Indeed, as AI systems become embedded in business-critical workflows, resilience and supply diversity are no longer optional.
What Does the Next Generation of AI Winners Look Like?
Looking ahead, Sobko sees several characteristics that define companies best positioned for the next phase of AI-driven innovation.
One is compute efficiency. He highlights “compute-efficient winners: teams that deliver the same outcome with less GPU time,” pointing to smaller, faster models and more efficient serving stacks as a competitive advantage.
Enterprise readiness is another key factor. Sobko emphasises the growing importance of “security, auditability, compliance, SLAs especially for regulated industries.” As AI adoption deepens across sectors such as finance, healthcare, government and telecommunications, infrastructure that can meet regulatory and operational requirements is increasingly valuable.
He also points to orchestration as a core opportunity. “Software that routes workloads across heterogeneous supply to optimise cost, latency and sovereignty” is becoming essential as enterprises look to balance performance, compliance and geographic constraints.
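The routing problem Sobko describes can be sketched in a few lines: pick the cheapest provider that satisfies latency and data-residency constraints. Provider names, prices and weights below are invented for illustration:

```python
# Minimal sketch of workload routing across heterogeneous compute supply,
# trading off cost and latency under data-sovereignty constraints.
# Provider names and all figures are hypothetical.

from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    cost_per_hour: float   # USD per GPU-hour
    latency_ms: float      # typical round-trip latency
    region: str            # where the data is processed

def route(providers, max_latency_ms, allowed_regions,
          cost_weight=1.0, latency_weight=0.01):
    """Return the best compliant provider by a weighted cost/latency score."""
    eligible = [p for p in providers
                if p.latency_ms <= max_latency_ms and p.region in allowed_regions]
    if not eligible:
        raise ValueError("No provider satisfies the constraints")
    return min(eligible, key=lambda p: cost_weight * p.cost_per_hour
                                       + latency_weight * p.latency_ms)

providers = [
    Provider("us-cloud", cost_per_hour=2.50, latency_ms=120, region="US"),
    Provider("eu-cloud", cost_per_hour=3.10, latency_ms=40, region="EU"),
    Provider("eu-edge",  cost_per_hour=4.00, latency_ms=15, region="EU"),
]

# A workload whose data must stay in the EU and needs sub-50 ms latency:
best = route(providers, max_latency_ms=50, allowed_regions={"EU"})
print(best.name)  # eu-cloud
```

A production orchestrator would add live pricing, capacity signals and failover, but the core trade-off — filter on hard constraints, then optimise a weighted score — is the same.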

Discipline, Distribution and Staying Power
As markets become more selective, capital discipline is separating durable businesses from weaker players. Sobko argues that successful companies “avoid build-a-data-center traps unless they have locked-in demand.” Instead of expanding infrastructure prematurely, such companies focus on utilisation and efficiency before committing to large capital investments.
Distribution and customer retention are equally important. “Real pipelines, renewals and expansion beat headline launches,” Sobko says. In a more sober investment environment, sustained customer adoption matters far more than attention-grabbing product announcements.
How Are Investors Assessing AI Businesses?
Investor evaluation frameworks have evolved alongside the market.
Sobko points to revenue quality as a core signal, including “ARR, net revenue retention, multi-year contracts, concentration risk.” These indicators provide insight into whether growth is repeatable and diversified, rather than dependent on short-term demand.
Cost structures are under similar scrutiny.
Investors are closely examining “gross margin and compute COGS: inference margin, cost per token/task, utilization, capacity commitments.” Alongside this, there is increasing emphasis on “proof of demand,” such as “signed LOIs, deployments in production, reference customers, procurement progress.”
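The cost metrics listed above — cost per token, utilization and inference margin — reduce to simple arithmetic. A toy calculation, with all numbers invented for illustration, might look like:

```python
# Hypothetical inference-margin arithmetic of the kind investors examine:
# effective cost per token served, and the gross margin on inference revenue.
# All figures are made up for illustration.

def cost_per_token(gpu_hour_cost: float, tokens_per_hour: float,
                   utilization: float) -> float:
    """Effective cost per token, accounting for idle capacity."""
    return gpu_hour_cost / (tokens_per_hour * utilization)

def inference_margin(price_per_token: float, cost: float) -> float:
    """Gross margin on each token sold."""
    return (price_per_token - cost) / price_per_token

# A GPU costing $2.40/hour serving 1.2M tokens/hour at 50% utilization:
c = cost_per_token(gpu_hour_cost=2.40, tokens_per_hour=1_200_000, utilization=0.50)
m = inference_margin(price_per_token=0.00001, cost=c)
print(f"{m:.0%}")  # 60%
```

Note how utilization enters the denominator: the same hardware at 25% utilization would double the cost per token and wipe out most of the margin, which is why investors track it alongside headline pricing.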
Here’s Where the Strongest Opportunities Are Emerging
Looking forward, Sobko highlights several areas of the AI ecosystem that appear particularly well-positioned. One is “AI infrastructure marketplaces,” which are “turning compute into a liquid, price-discovered resource (spot/reserved/credits).” These platforms address inefficiencies in how compute is allocated and priced.
Another key opportunity lies in “enterprise inference at scale,” particularly for “latency-sensitive, compliant deployments” in regulated industries. As AI shifts from experimentation to production, demand for reliable inference infrastructure continues to grow.
Finally, Sobko points to the emergence of a “second-life GPU economy,” focused on “monetising ‘retired’ but still powerful fleets for inference/batch workloads,” which he says “will mean big supply unlock.”
As the AI trade matures, the message is clear: the next wave of winners will not be defined by who builds the biggest models, but by who can deliver efficiency, reliability and real economic value at scale.