Verda raises $117M to scale AI cloud infrastructure and expand globally

27 April 2026

Ruben Bryon, founder and CEO of Verda (photo: Verda)

Helsinki-based AI infrastructure company Verda has raised $117 million to scale its AI cloud platform and expand into the UK, US and Asia, as demand for high-performance computing continues to accelerate.

The round comprises equity funding led by Lifeline Ventures, with participation from byFounders, Tesi, Varma and other investors, alongside debt financing from Nordic financial institutions. The company, formerly known as DataCrunch, is positioning itself as a provider of high-performance AI cloud infrastructure at a time when access to compute is becoming a critical constraint.

The funding comes amid sustained demand for high-performance compute, driven by the scaling of generative AI and enterprise deployments. Access to GPUs—specialised chips used to train and run AI models—has emerged as a bottleneck, prompting companies to explore alternatives to traditional cloud providers.

Building AI infrastructure for scale

Founded in 2020, Verda focuses on delivering high-performance compute on demand, aiming to reduce the operational and procurement barriers that can slow down AI development.

The company manages the full stack of infrastructure, from physical servers and data centres to the tools developers use to build and deploy AI systems. This vertically integrated approach is designed to simplify access to AI compute while maintaining performance and flexibility.

“We're building the next generation of AI cloud infrastructure for pioneering teams across the globe. This funding allows us to double down on development and accelerate our expansion across Europe, the US and Asia. We're seeing very strong momentum and we're excited for what's ahead,” says Ruben Bryon, founder and CEO of Verda.

The company recently rebranded from DataCrunch to Verda, signalling a broader ambition beyond compute provisioning towards a more integrated AI infrastructure platform.

Expansion driven by demand for AI compute

The new capital will support Verda’s expansion beyond its existing operations in Finland, with planned launches in the UK and the US, alongside further growth in Europe and Asia.

This expansion reflects a wider shift in the AI market. As organisations move from experimentation to production, infrastructure requirements are increasing in scale and complexity. Access to reliable compute capacity is becoming a strategic concern, particularly as demand concentrates around a limited number of global providers.

Verda plans to hire more than 100 people by the end of the year, building both technical and commercial capabilities to support its international growth.

A vertically integrated AI cloud model

A defining feature of Verda’s strategy is its vertically integrated operating model. The company controls multiple layers of the AI infrastructure stack, from data centre operations to developer-facing tools.

“A big part of our success comes from being vertically integrated, handling everything from physical infrastructure to the application layer. We have a dedicated AI Lab team that works directly with customers and uses those insights to drive our product decisions. We're proud of what we've built so far, but we know we're only just getting started,” Bryon says.

This model allows Verda to align infrastructure performance more closely with customer needs than more fragmented approaches allow. Direct engagement through its AI Lab provides feedback loops that inform product development across the stack.

At the same time, vertical integration introduces operational complexity, particularly as the company expands across regions with different regulatory and technical requirements.

Renewable energy and efficient data centres

Verda’s data centres are based in Finland and run on 100 percent renewable energy, leveraging the region’s access to clean electricity and favourable cooling conditions.

The environmental impact of AI infrastructure is becoming an increasingly visible issue, as large-scale compute workloads consume significant amounts of energy. Locating data centres in regions with abundant renewable resources can reduce both operational costs and carbon footprint.

Finland’s climate also enables more efficient cooling, limiting the need for energy-intensive systems. This combination supports Verda’s positioning as a provider of scalable and energy-efficient AI infrastructure.

Financial performance and NVIDIA partnership

Verda reports that its operations are already cash flow positive, with its revenue run rate more than doubling to over $60 million in the first quarter of 2026.

This level of growth indicates strong demand for its AI cloud services, as organisations seek infrastructure capable of supporting increasingly complex AI workloads.

The company is an NVIDIA Preferred Partner, giving it access to advanced GPU hardware and closer collaboration with one of the leading suppliers of AI chips. Its customer base includes Nokia, 1X, ExpressVPN and Freepik, reflecting a mix of enterprise and technology-focused clients.

Access to specialised hardware remains a defining factor in the AI infrastructure market, influencing both performance and scalability.

Positioning in a competitive AI infrastructure market

Verda operates in a market dominated by hyperscale cloud providers, alongside a growing number of specialised GPU cloud platforms focused on AI workloads.

Its vertically integrated model positions it between these categories, combining infrastructure ownership with developer-facing tools. This approach offers an alternative to standardised cloud services, particularly for organisations seeking more control over performance, cost or deployment environments.

However, scaling this model internationally is both operationally demanding and capital-intensive. Expanding data centre capacity, securing GPU hardware and maintaining service reliability across regions require significant investment, even with new funding.

In markets such as the US, Verda will compete directly with established providers that benefit from scale, existing customer relationships and extensive infrastructure networks.

From AI models to infrastructure control

The rise of companies like Verda reflects a broader shift in the AI landscape: from model innovation to infrastructure control.

As more organisations deploy AI in production, access to compute is becoming as critical as the models themselves. This is reshaping where value sits in the technology stack, placing greater importance on the systems that support, manage and scale AI workloads.

For customers, this shift introduces new considerations around performance, cost and dependency on infrastructure providers. For regions such as Europe, it aligns with ongoing efforts to strengthen digital capabilities.

Outlook

With fresh capital, international expansion plans and a growing customer base, Verda is entering a phase of accelerated growth.

Its strategy—combining vertical integration, renewable energy infrastructure and customer collaboration—provides a clear approach to competing in the AI infrastructure market. Execution will depend on its ability to scale operations across regions while maintaining performance and reliability.

The funding round reflects continued investor interest in AI infrastructure as a foundational layer of the technology stack. As demand for compute continues to rise, companies that can deliver it efficiently and at scale are likely to play an increasingly central role in the development of AI systems.

Further reading on MoveTheNeedle.news:

AI is running out of power. Can nuclear startups like Aalo keep it alive?

Multiverse: How a Spanish startup is using quantum ideas to make AI cheaper and greener