Mixed-reality implementation boundaries
Meta Passthrough Stack
Meta Passthrough Stack is a public source map for a specific mixed-reality problem: what a Quest app can render, sense, and process when it uses passthrough. The project keeps compositor passthrough, app-visible camera frames, environment depth, scene understanding, patent context, and local implementation notes separate, so design claims do not outrun the APIs.
Direction
What the stack separates
Passthrough is not one thing. A compositor passthrough layer can make the real room visible behind virtual content without giving the app raw camera pixels. The Passthrough Camera API is a different surface: it exposes forward-facing RGB camera frames for computer-vision and machine-learning use cases under device, OS, permission, and policy constraints. Environment depth is separate again, and is mainly useful for occlusion, raycasting, and grounding virtual content in the room.
That separation matters for implementation and for public claims. The local notes draw on Meta and Khronos documentation, sample projects, view-synthesis papers, and patents. Patents and research systems are treated as design signals, not as evidence that a shipped headset exposes a hidden capability.
The practical thread is Rust and Unity experimentation around what can be built with the exposed stack. The public page keeps that at the level of documentation and source anchors; private implementation notes stay in the working repos.
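The separation above can be sketched as a small data model. This is an illustrative sketch, not a Meta API: the names below (`Surface`, `exposes_raw_rgb`, `permission_gated`) are invented for this note and label affordances, not SDK types.

```rust
/// Illustrative model of the three surfaces the notes keep separate.
/// None of these names come from the Meta SDK.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum Surface {
    /// Compositor draws the room behind virtual content; the app
    /// never sees the pixels.
    CompositorPassthrough,
    /// Passthrough Camera API: forward-facing RGB frames delivered
    /// to the app under device, OS, permission, and policy constraints.
    PassthroughCamera,
    /// Environment depth, used for occlusion, raycasting, and
    /// grounding virtual content in the room.
    EnvironmentDepth,
}

/// Does the app process ever receive raw RGB camera pixels?
pub fn exposes_raw_rgb(s: Surface) -> bool {
    matches!(s, Surface::PassthroughCamera)
}

/// Is a runtime camera permission involved? (A sketch of the policy
/// boundary described in the docs, not an exhaustive rule.)
pub fn permission_gated(s: Surface) -> bool {
    matches!(s, Surface::PassthroughCamera)
}
```

Collapsing these surfaces is exactly the error the page warns about: `exposes_raw_rgb(Surface::CompositorPassthrough)` is `false` even though the user sees the room.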
Current work
- OpenXR and compositor-layer passthrough semantics
- Passthrough Camera API access, permissions, samples, and privacy boundaries
- Depth API, occlusion, raycasting, and scene-understanding separation
- Passthrough+, neural view synthesis, and patent lineage as design context
- Implementation notes for mixed-reality prototypes without overclaiming API access
Connected projects
- Viscereality for Quest-based research systems and operator tooling
- Viscereality Companion for the public tooling surface
- Plasmatic Multitudes for weakly bounded embodiment and mixed-reality body design
Boundary
Why the distinction matters
Mixed-reality writing often collapses passthrough visibility into camera access. This project keeps those apart. Seeing the room through a headset, compositing virtual content over it, receiving camera frames, receiving depth, and inferring scene structure are related but different affordances.
The public usefulness of the archive is therefore corrective: it blocks vague claims before they turn into design assumptions. It also gives experimental work a cleaner checklist for what must be tested on-device rather than inferred from marketing language, patents, or old forum answers.
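One way to read that checklist rule: a capability claim only graduates into a design assumption once it has been exercised on hardware, never on the strength of a patent, demo, or old forum answer alone. A minimal sketch, with invented names (`Evidence`, `Claim`, `usable_as_design_assumption`):

```rust
/// Where a capability claim comes from (an invented taxonomy for this note).
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum Evidence {
    PatentOrForum,    // design signal only; never proof of a shipped API
    OfficialDocs,     // documented developer surface, still untested here
    VerifiedOnDevice, // actually exercised on a headset
}

pub struct Claim {
    pub text: &'static str,
    pub evidence: Evidence,
}

/// The corrective rule: only on-device verification supports a design
/// assumption; documentation marks something as worth testing, no more.
pub fn usable_as_design_assumption(c: &Claim) -> bool {
    c.evidence == Evidence::VerifiedOnDevice
}
```

Under this rule a patent-backed claim such as "wide-FoV neural view synthesis ships on device" stays design context, while "the Depth API occludes virtual content" can become an assumption once it has actually been run on a headset.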
Public focus
- Compositor passthrough separated from raw camera access
- Depth and scene APIs treated as distinct sensing layers
- Official docs preferred over rumor, demos, and patent extrapolation
- Private prototype details kept out of the public source map
References
Current references
These are the public anchors currently defining the Meta Passthrough Stack page. Official API documentation defines the developer surface; papers and patents provide design context only.
Developer surface
- Meta. "Passthrough Camera API Overview." Meta Horizon documentation, updated April 21, 2026.
- Meta. "Depth API Overview." Meta Horizon documentation, updated March 20, 2025.
- Khronos Group. "XrCompositionLayerPassthroughFB." OpenXR 1.1 reference page.
- Oculus Samples. "Unity Passthrough Camera API Samples." Public sample repository.
View-synthesis and patent context
- Chaurasia et al. "Passthrough+: Real-Time Stereoscopic View Synthesis for Mobile Mixed Reality." Proceedings of the ACM on Computer Graphics and Interactive Techniques (2020).
- Xiao et al. "NeuralPassthrough: Learned Real-Time View Synthesis for VR." arXiv (2022).
- Meta Platforms Technologies. "Display System with Machine Learning Based Stereoscopic View Synthesis over a Wide Field of View." Patent publication family.
- Justia Patents. "US Patent 12,231,615." Patent index page for ML-based stereoscopic view synthesis.