3D + AI: Inside Mawari’s Plan to Stream Immersive Experiences in Real Time

At DePIN Day Singapore, Luis Oscar Ramirez, founder and CEO of Mawari, took the stage to unveil how his company is reshaping the future of spatial computing.

His presentation felt less like a technical talk and more like a glimpse into the next internet revolution: an immersive compute network where 3D graphics and AI inference converge at the edge to deliver real-time augmented experiences.

From 2D Screens to Immersive Reality

“Media has always lived inside frames,” Ramirez began. “TVs, phones — we’ve been watching the world through rectangles. Now the world itself becomes the screen.”

That vision — of the environment becoming the medium — lies at the core of Mawari’s Immersive Compute Network: a decentralized GPU architecture that renders and streams 3D content directly to AR and XR devices.

Mawari’s system allows an avatar or digital being to be captured, rendered, and streamed in milliseconds. In a demo with T-Mobile, a performer wearing a motion-capture suit transmitted movement, facial data, and voice to Mawari’s GPUs. Within 32 ms, that data was mapped onto a 3D avatar and streamed live to an XR headset.

“Rendering and AI inference happen on the same GPU, at the edge,” Ramirez explained. “That’s the only way to make real-time interaction possible.”
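The demo’s timing can be made concrete with a toy latency budget. Only the 32 ms end-to-end figure comes from the talk; the per-stage split below is purely an illustrative assumption:

```python
# Toy end-to-end latency budget for a motion-capture -> edge-render -> XR stream
# pipeline. Only the ~32 ms total is from the talk; the per-stage split is an
# illustrative assumption, not Mawari's actual breakdown.

PIPELINE_MS = {
    "capture_and_encode": 6,     # mocap suit + face/voice packetization
    "uplink_to_edge": 4,         # 5G uplink to a nearby GPU node
    "render_and_inference": 14,  # avatar rendering + AI on the same edge GPU
    "encode_and_downlink": 6,    # video encode + stream to the XR headset
    "decode_and_display": 2,     # headset decode and compositing
}

total = sum(PIPELINE_MS.values())
print(f"end-to-end: {total} ms")  # 32 ms, matching the demo figure
for stage, ms in PIPELINE_MS.items():
    print(f"  {stage:>22}: {ms:2d} ms")
```

The point of the exercise: once the whole budget is ~32 ms, there is no room for a round trip to a distant data center, which is why rendering and inference must share a nearby GPU.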

The iPhone Moment for XR

Just weeks before the event, Meta unveiled its new generation of Ray-Ban smart glasses — a moment Ramirez called “the iPhone moment for XR.”

After years of slow growth, spatial computing finally has consumer hardware ready for scale — and that’s where Mawari’s 8-year head start matters.

“We started building this in 2017,” Ramirez said. “The XR interface is the natural front-end for AI. Together, they define the next trillion-dollar industry.”

With 4 patents in 3D streaming and distributed XR compute, Mawari is positioning itself as the missing infrastructure layer between Web2 and Web3 — the bridge between today’s content platforms and tomorrow’s immersive internet.

Why the World Needs an Immersive Compute Network

For XR to reach mainstream users, spatial content must stream as seamlessly as video does today.

“When you press play on Netflix, you get a 4K movie instantly,” Ramirez said. “For XR, it must be the same — you wear your glasses, press play, and the world loads around you.”

That simplicity is blocked by two bottlenecks:

  1. Technical: 3D content is massive — terabytes of data that consumer devices can’t store or download.
  2. Infrastructure: slim AR glasses can’t handle local compute; rendering must move to the edge.
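The first bottleneck is easy to see with back-of-envelope arithmetic: shipping a full 3D scene to the device is hopeless, while streaming a rendered view of it behaves like ordinary video. The asset size and bitrates below are illustrative assumptions, not figures from the talk:

```python
# Why stream rendered frames instead of shipping 3D assets? A back-of-envelope
# comparison. The scene size and bitrates are assumptions for illustration.

asset_size_gb = 1000   # a "terabytes of data" scene, per the talk's framing
link_mbps = 100        # a fast consumer connection

# Option A: download the raw scene before anything can be shown.
download_s = asset_size_gb * 8_000 / link_mbps  # GB -> megabits, then / Mbps
print(f"download the scene first: {download_s / 3600:.1f} hours")

# Option B: render at the edge and stream the result, like video.
stream_mbps = 50  # roughly a high-quality 4K video stream
print(f"stream a rendered view: {stream_mbps} Mbps, playback starts immediately")
```

Under these assumptions the download takes over 22 hours, while the streamed view starts as fast as a Netflix title — which is exactly the comparison Ramirez draws.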

Mawari solves both by offloading 3D rendering and AI inference to nearby GPUs distributed across its DePIN-based network. The result is hyper-local, low-latency immersive streaming, with infrastructure partners spanning Qualcomm, T-Mobile, and Samsung.

From Japan to the World

Mawari began in Japan, where it partnered with telecom giants and cultural institutions to test XR at scale.

Its collaboration with the Mutek Japan digital arts festival became a national sensation: audiences paid to experience a live, fully streamed 3D dance performance powered by Mawari’s technology.

“96% of attendees said it was worth it,” Ramirez shared.

“People are ready to pay for high-quality XR — if the quality is there.”

Mawari’s success led to partnerships with Nankai Railways in Osaka to deploy digital assistants in train stations and airports, and to launch virtual entertainment arenas featuring virtual YouTubers — digital performers who interact with audiences in real time.

That market is already worth $3 billion in Japan and Korea, and Mawari’s new vTubeXR platform brings these creators to mixed reality.

Users can attend concerts, then meet virtual artists one-on-one — paying from $20 up to $500 for personalized sessions.

Scaling XR with DePIN

So why build on decentralized infrastructure?

Because hyperscalers can’t deliver what XR needs: low-latency, edge-based GPU compute.

In 2021, when Fortune 500 clients asked Mawari if it could scale to 100,000 users, the team realized that AWS and Google Cloud had no GPUs near the edge.

That limitation sparked the birth of Mawari Network, a decentralized compute layer purpose-built for XR workloads.

“Centralized clouds weren’t designed for real-time spatial content,” Ramirez said.

“We needed hyper-local compute near users — and that’s what DePIN makes possible.”
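The “hyper-local” requirement isn’t marketing; it falls out of physics. Even ignoring all processing, light in fiber covers only about 200 km per millisecond, so distance alone sets a floor on round-trip latency. The distances below are illustrative assumptions:

```python
# Why edge proximity matters: propagation delay alone, ignoring all processing.
# Light in optical fiber travels at roughly 200,000 km/s (about 2/3 of c),
# i.e. ~200 km per millisecond one way. Distances below are illustrative.

FIBER_KM_PER_MS = 200

def rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

for label, km in [("edge node, same city", 20),
                  ("regional cloud zone", 500),
                  ("distant hyperscale region", 5000)]:
    print(f"{label:>26}: {rtt_ms(km):5.1f} ms RTT")
```

Against a ~32 ms end-to-end budget, a distant region burns 50 ms on the wire before a single frame is rendered, while a same-city edge node costs a fraction of a millisecond — the gap DePIN-style distributed GPUs are meant to close.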

Mawari launched what it calls a Decentralized Infrastructure Offering (DIO) — essentially a compliant node-sale model that allows anyone to contribute compute capacity and earn rewards.

Within weeks of launch:

  • 149,000 nodes were reserved globally
  • 30,000 wallets connected

The network runs on an L3 Orbit chain built on Arbitrum, with infrastructure partners including Caldera (rollup-as-a-service) and Halborn (security auditing).

The Bridge Between Web2 and Web3

With over $17 million raised from investors including Samsung Next, 1kx, and Anfield, Mawari blends traditional enterprise backing with Web3 scalability.

Ramirez describes the model simply:

“We use Web3 to scale our service — not to define our economy.”

In other words, decentralization isn’t the product; it’s the engine powering a new kind of media — one where every pixel, every AI response, every avatar gesture is streamed in real time from the edge.

A Glimpse of the Immersive Internet

As the keynote ended, Ramirez showed the future he’s been building since 2017:

AI-powered avatars, digital concerts, XR assistants at train stations, and millions of users connected through a decentralized GPU mesh.
