StreamNative Unveils New Architectural Paradigm Uniting Streaming and Lakehouses
StreamNative, founded by the creators of Apache Pulsar, is introducing Lakestream, a new architectural paradigm for lakehouse-native streaming, alongside the launch of Ursa For Kafka (UFK)—a native Apache Kafka service that puts the Lakestream vision into practice.

According to the company, Lakestream is a new architectural paradigm that unifies data streaming and the lakehouse, not by building better bridges between them, but by making them one and the same. Just as the lakehouse paradigm proved that data warehouses and data lakes don’t need to be separate, Lakestream proves that streaming and the lakehouse don’t need to be separate either.

The core insight: traditional streaming systems push interoperability up to the protocol layer, creating data silos for each protocol. Lakestream pushes interoperability down to the storage and catalog layers—achieving unification through a shared lakehouse-native storage foundation and unified metadata catalog, the company said.

In practice, this means a Kafka topic and an Iceberg table can be the same object: no movement, no connectors, no waiting. The Lakestream architecture that makes this possible is built on three layers: cloud-native stream storage that writes directly to object storage in open formats (Iceberg, Delta Lake); a Lakestream Catalog that federates with Databricks Unity Catalog, Snowflake Horizon Catalog, and AWS S3 Tables; and stateless protocol servers that let Kafka, Pulsar, and other protocols all write to the same underlying storage, the company said.

As the first major proof point of the Lakestream architecture, StreamNative is simultaneously launching Ursa For Kafka (UFK), a native Apache Kafka service entering Limited Public Preview.

UFK is an Apache Kafka 4.2+ fork powered by Ursa, StreamNative’s lakehouse-native stream storage engine. With UFK, every native Kafka topic is simultaneously a lakehouse table, queryable from Spark, Snowflake, and Databricks with zero code changes, no connectors, and no ETL.
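As an illustrative sketch of what "a topic is simultaneously a table" means from the lakehouse side (the catalog and topic names below are hypothetical, not from the announcement): a Kafka producer writes to the topic as usual, and the same data is immediately queryable as an Iceberg table, with no connector or sink pipeline in between.

```sql
-- Sketch only: "lakestream_catalog", "kafka", and "orders" are
-- illustrative names. The topic a Kafka producer writes to is exposed
-- directly as an Iceberg table to any engine wired to the catalog.
SELECT order_id, amount, event_time
FROM lakestream_catalog.kafka.orders          -- the Kafka topic, as a table
WHERE event_time > current_timestamp() - INTERVAL 1 HOUR;
```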

“We’ve spent years building toward this moment. Lakestream is the realization that streaming and the lakehouse aren’t two systems that need to be connected, they’re one system that was never designed to be unified—until now. Ursa For Kafka proves that even the world’s most popular streaming protocol can become a native lakehouse citizen without sacrificing anything. The protocol stays the same. The data becomes lakehouse-native. The boundary simply disappears,” said Sijie Guo, CEO and co-founder, StreamNative.

Key capabilities of Ursa For Kafka include:

  • Native Apache Kafka Protocol — Apache Kafka 4.2+ fork; every existing Kafka client, tool, and connector works with zero code changes
  • Lakehouse-Native Storage — Every topic stored as Iceberg or Delta Lake tables on object storage; query streaming data from Spark, Snowflake, Databricks, and Trino
  • Up to 95% Cost Reduction — Leaderless architecture eliminates cross-AZ replication, validated at 5 GB/s sustained throughput
  • Zero-Connector Lakehouse Integration — No Kafka Connect, no materialization pipelines, no sink connectors; Kafka topics ARE lakehouse tables
  • Catalog Integrations — Works with Databricks Unity Catalog, Snowflake Horizon Catalog, and AWS S3 Tables out of the box
  • Mixed Storage Flexibility — Run cost-optimized (lakehouse-native) and latency-optimized (disk-based) topic profiles in the same cluster
  • Available on AWS and GCP — With Azure expansion planned

StreamNative will open source Ursa and key Lakestream components in the coming months, reflecting the company’s belief that this architectural paradigm belongs to the community rather than any single vendor, the company said. Apache Pulsar continues to be fully supported on StreamNative Cloud for mission-critical messaging and streaming workloads.

Ursa For Kafka enters Limited Public Preview today, available on AWS and GCP through StreamNative Cloud. UFK works with existing Kafka clients version 0.9 and above; no code changes are required.
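A minimal sketch of the client-compatibility claim (the bootstrap address below is a placeholder, not a real cluster endpoint): an existing application would point at a UFK cluster using ordinary Apache Kafka client settings, with nothing UFK-specific on the client side.

```python
# Sketch only: the bootstrap address is a hypothetical placeholder.
# Every setting here is stock Apache Kafka client configuration,
# illustrating that no client-side changes are needed to target UFK.
producer_config = {
    "bootstrap_servers": "ufk-demo.example.streamnative.cloud:9093",  # placeholder
    "security_protocol": "SASL_SSL",   # standard Kafka client option
    "sasl_mechanism": "PLAIN",         # standard Kafka client option
    "acks": "all",                     # standard durability setting
}

# With a reachable cluster, the usual client construction would follow, e.g.:
#   from kafka import KafkaProducer
#   producer = KafkaProducer(**producer_config)
#   producer.send("orders", b'{"order_id": 1}')
```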

For more information about this news, visit https://streamnative.io.
