Rust for XR and Meta Quest experiments

Rusty XR

Rusty XR is the Rust foundation I use to make Quest and XR experiments easier to repeat, inspect, and connect to outside tooling. It gives the work shared contracts, diagnostics, sensor bridges, camera and depth models, and small reference examples, so experiments can move from one-off prototypes to reusable workflows. On Mesmer Prism, what matters is how it fits into the broader practice: it is the technical spine behind my Quest source mapping, companion apps, and research systems.

Purpose

What it is good for

Rusty XR is good for the parts of XR work that need to survive beyond a single prototype: shared schemas, repeatable diagnostics, camera and depth reasoning, sensor integration, runtime profiles, and small reference examples that can be exercised by companion tools. It makes the technical layer more portable, so the same ideas can support Viscereality, Meta Quest source mapping, and future Rust-native XR experiments.
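As a minimal sketch of what a shared runtime profile contract could look like, the Rust snippet below uses hypothetical names (`RuntimeProfile`, `manifest_line`) rather than Rusty XR's actual schema; the idea is that a profile rendered as a stable record can be logged, diffed, and verified by companion tools across machines:

```rust
// Hypothetical sketch of a shared runtime profile contract.
// Field names are illustrative, not Rusty XR's real schema.
#[derive(Debug, Clone, PartialEq)]
pub struct RuntimeProfile {
    pub name: String,      // e.g. "depth-probe"
    pub refresh_hz: u32,   // target display refresh rate
    pub passthrough: bool, // whether compositor passthrough is requested
}

impl RuntimeProfile {
    /// Render the profile as a stable single-line record that a
    /// companion tool could log or diff across repeated runs.
    pub fn manifest_line(&self) -> String {
        format!(
            "profile={} refresh_hz={} passthrough={}",
            self.name, self.refresh_hz, self.passthrough
        )
    }
}
```

A stable textual form is one way to keep runs comparable: two machines producing the same manifest line are, at least at this level, running the same experiment.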

The current emphasis is Meta Quest because it is where the practical need is sharpest. Quest experiments involve headset builds, operator tools, raw camera questions, depth surfaces, display casting, and diagnostics. Rusty XR gives those pieces a common language so the work can stay coherent as it moves between research prototypes, documentation examples, and field tools.

Useful for

  • Keeping Quest experiments reproducible across machines and collaborators
  • Sharing runtime profiles, diagnostics, and catalog metadata with companion tools
  • Testing camera, depth, passthrough, and final-display workflows with clearer source labels
  • Connecting sensor streams, visual primitives, and operator-side evidence capture

Broader fit

Quest examples

Source-aware XR experiments

A lot of Quest work becomes confusing when every visual surface gets called "passthrough". Rusty XR helps keep the experimental vocabulary clean: compositor passthrough, app-visible camera input, environment depth, casting, and MediaProjection each describe a different kind of evidence. That distinction matters when I am trying to build a tool, debug capture behavior, or explain what a prototype can actually access.
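The distinction above can be made concrete as a type. This is a sketch under assumed names (the enum and its method are not Rusty XR's real API): each variant labels a different kind of visual evidence, and a helper makes the key question explicit, namely whether the running app can read the pixels at all:

```rust
// Hypothetical source label; variant names are illustrative.
// Each variant names a different kind of visual evidence a
// Quest experiment can produce.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum VisualSource {
    CompositorPassthrough, // blended by the compositor; app never sees pixels
    AppCamera,             // camera frames the app itself can read
    EnvironmentDepth,      // reconstructed depth surfaces
    Casting,               // final display mirrored to another device
    MediaProjection,       // Android-level screen capture
}

impl VisualSource {
    /// Whether the running app can read this source's pixels
    /// (an assumption of this sketch, stated for illustration).
    pub fn app_readable(self) -> bool {
        matches!(self, VisualSource::AppCamera | VisualSource::MediaProjection)
    }
}
```

Encoding the label in the type system means a capture path or a diagnostics export cannot quietly conflate compositor passthrough with app-visible camera input.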

That source-aware approach also supports the companion apps. A Windows or Android operator tool can install a build, launch a profile, cast the final display, capture a screenshot, and export diagnostics while Rusty XR supplies the shared contracts and reference behavior behind the run.
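The operator workflow just described can be sketched as an ordered run with evidence recorded at each step. `RunLog` and `operator_run` are hypothetical names for this sketch, not Rusty XR's actual contracts; a real tool would do device work at each stage, while the sketch only records the order:

```rust
// Illustrative operator-side run log; names are assumptions,
// not Rusty XR's real types.
#[derive(Debug, Default)]
pub struct RunLog {
    pub steps: Vec<String>,
}

impl RunLog {
    pub fn record(&mut self, step: &str) {
        self.steps.push(step.to_string());
    }
}

/// Walk the operator workflow in order: install, launch, cast,
/// capture, export. Each step is recorded as evidence of the run.
pub fn operator_run(log: &mut RunLog) {
    for step in [
        "install-build",
        "launch-profile",
        "cast-display",
        "capture-screenshot",
        "export-diagnostics",
    ] {
        // A real tool would perform the step here; the sketch
        // only records that it happened and in what order.
        log.record(step);
    }
}
```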

Current themes

  • Rust-native XR examples that can be launched and verified from companion tools
  • Camera and depth experiments that keep source, display, and diagnostic evidence distinct
  • Shared runtime profiles for repeatable Quest runs
  • Diagnostics and visual proof that make headset behavior easier to discuss remotely
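For the diagnostics theme above, one minimal shape is an event that carries its source label alongside the observation, so exported evidence stays quotable in remote discussion. The struct and format below are assumptions of this sketch, not Rusty XR's real diagnostics format:

```rust
// Hypothetical diagnostic event; names and line format are
// illustrative, not Rusty XR's actual export schema.
pub struct DiagnosticEvent {
    pub t_ms: u64,      // milliseconds since run start
    pub source: String, // e.g. "environment-depth"
    pub note: String,   // human-readable observation
}

impl DiagnosticEvent {
    /// One line per event keeps exports easy to diff, grep, and
    /// quote when discussing headset behavior remotely.
    pub fn to_line(&self) -> String {
        format!("[{} ms] {}: {}", self.t_ms, self.source, self.note)
    }
}
```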

Connected projects

References

Project docs

Mesmer Prism keeps the conceptual overview here. Implementation details, onboarding, release notes, and command-level workflows belong in the dedicated project documentation.

Rusty XR

Companion tooling