Photo courtesy Qlik.

Product of the Day

Qlik Connect 2025: New format breaks data barriers

Qlik’s new Open Lakehouse rewires enterprise data architecture for the AI era, writes ARTHUR GOLDSTUCK.

The most expensive problem in enterprise data is not bad data, but slow data. The very infrastructure designed to make business smarter has often become a bottleneck. It is typically rigid, bloated and incapable of keeping pace with the ravenous appetite of modern AI. 

At the Qlik Connect 2025 conference in Orlando, Florida, this week, data analytics leader Qlik threw down a gauntlet to traditional data architecture, betting big on open formats and real-time scale, with the launch of Open Lakehouse.

For decades, enterprises have depended on data warehouses to centralise information for business intelligence (BI). Yet, as AI models demand wider, fresher datasets at faster speeds, those same warehouses are revealing their limits. Their costs spiral as data grows. Their structure resists the messiness of real-world systems. And their vendors, too often, lock customers into proprietary tools. Qlik, best known for its roots in analytics and more recently its absorption of Talend’s data integration prowess, is challenging the status quo with a new data backbone that goes beyond BI, into AI.

Built natively into the Qlik Talend Cloud, Open Lakehouse is a fully managed architecture powered by Apache Iceberg, a high-performance open table format rapidly emerging as the industry’s answer to the inflexible silos of the past. It offers real-time ingestion of millions of records per second, automated optimisation that promises up to five times faster query speeds, and native compatibility with engines like Spark, Trino, Snowflake and SageMaker.
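To make the idea concrete, the sketch below is illustrative only and not Qlik’s own tooling: the catalog name, storage bucket and schema are assumptions. It shows how an Apache Iceberg table might be created through Spark so that any other Iceberg-aware engine can read the same files.

```python
# A minimal sketch of the "open table format" idea: one engine writes an
# Iceberg table to shared object storage, and any Iceberg-aware engine can
# read it without copying the data.
# Assumptions: the Iceberg Spark runtime jar is on the classpath, and the
# catalog name ("lake") and bucket are placeholders, not Qlik configuration.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-sketch")
    # Enable Iceberg's SQL extensions and point a catalog at object storage.
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3://my-bucket/warehouse")
    .getOrCreate()
)

# Create an Iceberg table with hidden partitioning on the timestamp column.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.sales.orders (
        order_id BIGINT,
        amount   DOUBLE,
        ts       TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(ts))
""")

# Row-level, incremental writes are part of the format itself.
spark.sql("INSERT INTO lake.sales.orders VALUES (1, 42.5, current_timestamp())")
```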

“Performance and cost should no longer be a tradeoff in modern data architectures,” said Qlik CEO Mike Capone. “With Qlik Open Lakehouse, enterprises gain real-time scale, full control over their data, and the freedom to choose the tools that work best for them.”

That freedom is central to what makes Iceberg attractive. Unlike traditional file formats, which are fine for static analytics but stumble under real-time demands, Iceberg was designed from scratch to support fast, incremental updates at petabyte scale. It allows for zero-copy data sharing, granular versioning, and true separation of storage and compute – features that feel almost tailor-made for AI workflows.
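Those versioning and sharing properties show up directly in Iceberg’s SQL surface. The following hedged example continues the sketch above, reusing its assumed Spark session and table names:

```python
# Continuing the earlier sketch: Iceberg records every table version as a
# snapshot, so consumers can query historical states without copying data.
# (The "spark" session and "lake.sales.orders" table are the same assumptions.)

# Inspect the table's snapshot history via Iceberg's metadata tables.
spark.sql(
    "SELECT snapshot_id, committed_at FROM lake.sales.orders.snapshots"
).show()

# Time travel: read the table exactly as it looked at an earlier moment.
spark.sql("""
    SELECT count(*) FROM lake.sales.orders
    TIMESTAMP AS OF '2025-05-01 00:00:00'
""").show()
```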

David Navarro, data domain architect at Toyota Motor Europe, described it in almost existential terms for large organisations. 

“We urgently need interoperability between diverse business units and partners, each managing its own technology stack and data sovereignty,” he said. “Apache Iceberg is emerging as the key to zero-copy data sharing across vendor-independent lakehouses, and Qlik’s commitment to delivering performance and control in these complex, dynamic landscapes is precisely what the industry requires.”

It’s a shift that speaks to broader market forces. According to ISG Software Research, lakehouse architectures are gaining traction across enterprises that once swore by warehouse-centric thinking. 

“Qlik Open Lakehouse, which leverages open standards such as Apache Iceberg, is well-positioned to meet the growing demand for real-time data access and multi-engine interoperability,” said Matt Aslett, Director of Research for Analytics and Data at ISG. “Enterprises are increasingly adopting lakehouse architectures to unify data across on-premises and cloud environments.”

The numbers back this up. Internal benchmarks from Qlik claim that the new platform delivers up to five times faster queries and slashes infrastructure costs by as much as 50 percent. Much of this is thanks to its always-on optimiser, which automates traditionally laborious tasks like file compaction, clustering, and data pruning – tasks that, if mishandled, can severely degrade performance. By integrating these features natively into a managed platform, Qlik aims to give data engineers fewer late nights tuning queries and more time delivering insight.
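For a sense of what that optimiser takes off engineers’ plates, these are the kinds of maintenance jobs that open-source Iceberg exposes as Spark procedures and that teams otherwise schedule and tune by hand. The sketch is illustrative of the open-source mechanics, not of Qlik’s managed optimiser, and reuses the assumed session and table from the earlier examples:

```python
# The open-source equivalents of the housekeeping a managed lakehouse automates.
# These are standard Iceberg Spark procedures; a managed platform decides when
# and how aggressively to run them on your behalf.

# Compact many small data files into fewer, larger ones for faster scans.
spark.sql("CALL lake.system.rewrite_data_files(table => 'sales.orders')")

# Expire old snapshots (and the files only they reference) beyond a retention window.
spark.sql("""
    CALL lake.system.expire_snapshots(
        table => 'sales.orders',
        older_than => TIMESTAMP '2025-04-01 00:00:00'
    )
""")
```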

Qlik’s decision to support Bring Your Own Compute (BYOC) – where customers use their own Amazon Web Services (AWS) environments rather than Qlik’s infrastructure – adds a layer of control that many CIOs now demand.

Still, for all its promise, the Open Lakehouse is not a silver bullet. Iceberg, while powerful, is still relatively young compared to traditional databases. It relies on a growing but not yet universal ecosystem of compatible tools. And while Qlik’s managed platform abstracts away much of the complexity, teams will still need to understand how best to design Iceberg schemas and ingestion pipelines to take full advantage of its capabilities.

Even so, the launch reframes Qlik as a foundational infrastructure player in the era of AI.

  • The Open Lakehouse is currently in private preview and will be generally available in July 2025. For now, Qlik is courting early adopters.