
Apache Iceberg Emerges as the Open Table Format Standard for APAC Enterprise Data Lakehouses

Apache Iceberg becomes the APAC open table format standard as Snowflake, Databricks, and AWS adopt it. APAC enterprises building data lakehouses should standardise on Iceberg — it prevents vendor lock-in and enables multi-engine data access across the modern data stack.

By AIMenta Editorial Team

Original source: Apache Software Foundation


Apache Iceberg has emerged as the dominant open table format for enterprise data lakehouses, with Snowflake (via Polaris), Databricks (via a Delta Lake compatibility layer), AWS (via S3 Tables), and Google BigQuery all adopting Iceberg as a first-class supported format in 2026. This convergence on a single open table format resolves the fragmentation that previously forced APAC enterprises to choose between Databricks-native Delta Lake and Snowflake-native external tables when building hybrid data lakehouse architectures.

For APAC data engineering teams, the Iceberg standardisation has two significant implications. First, data stored in Iceberg format on cloud object storage (S3, Azure Data Lake Storage, GCS) can now be queried directly by any major query engine (Snowflake, Spark, Trino, Dremio) without format conversion or replication, enabling multi-engine data architectures that were previously too complex to operate. Second, the open format prevents cloud vendor lock-in for APAC enterprises' most valuable data assets: Iceberg-formatted data can move between cloud providers without the proprietary format conversion costs that previously made switching data platforms expensive.
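As a rough sketch of what multi-engine access looks like in practice (the catalog, schema, table, and column names here are hypothetical, and the exact catalog registration depends on each engine's Iceberg configuration), the same physical Iceberg table on object storage can be queried from Spark and from Trino without copying data:

```sql
-- Spark SQL, assuming an Iceberg catalog has been registered as `lakehouse`
SELECT customer_id, order_total
FROM lakehouse.sales.orders
WHERE order_date >= DATE '2026-01-01';

-- Trino, reading the same table files through its Iceberg connector,
-- here assumed to be configured under the catalog name `iceberg`
SELECT customer_id, order_total
FROM iceberg.sales.orders
WHERE order_date >= DATE '2026-01-01';
```

Both engines resolve the table through a shared catalog and read the same Parquet data and Iceberg metadata files directly from object storage, which is what makes the no-replication, multi-engine pattern possible.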

APAC enterprises at earlier stages of data infrastructure modernisation should consider Iceberg as the default format for new data lake construction — the ecosystem maturity and multi-cloud support now justify standardisation for any organisation building data infrastructure intended to last 5+ years. APAC-headquartered technology companies (Southeast Asian super-apps, Australian fintech, Korean e-commerce) are already moving large-scale data estates to Iceberg as the open standard becomes viable for production workloads.


Tagged
#open-source #apache-iceberg #data-lakehouse #apac #snowflake #databricks #data-engineering
