Current XR research system

Viscereality

Viscereality is my main bioresponsive VR line: a Quest-based research platform that links breath, cardiac rhythm, oscillator dynamics, and visual structure to study regulation, coherence, and embodied interaction as qualities that can be designed rather than merely measured.

Direction

What the project is doing

The public-facing version of Viscereality sits between a study instrument and a design system. The point is not to add generic wellness visuals to a headset. The point is to build an environment where breath pacing, interoceptive feedback, oscillator-driven motion, and visual state changes can be aligned tightly enough to become experimentally meaningful.

That means the project now has two equally important surfaces. One is the participant-facing runtime on Quest, where breathing sources, heart-rate signals, coherence metrics, and visual structure are coupled in real time. The other is the operator-facing surface, where installation, launch, monitoring, telemetry, and study-shell control are kept on the Windows side so research sessions remain stable and reproducible.
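The participant-facing coupling can be pictured as a small per-frame loop: a noisy breath sample is smoothed, mapped to a phase in a paced cycle, and that phase drives a visual parameter. This is a minimal illustrative sketch, not the shipped runtime; the function names, smoothing constant, and the cosine mapping are assumptions for exposition.

```python
import math

# Illustrative sketch of one tick of a breath-to-visual coupling loop.
# All names and constants here are hypothetical, not the project's API.

def smooth(prev: float, sample: float, alpha: float = 0.1) -> float:
    """Exponential moving average to de-noise a raw sensor sample."""
    return (1.0 - alpha) * prev + alpha * sample

def breath_phase(t: float, period_s: float = 10.0) -> float:
    """Phase in [0, 1) of a paced breathing cycle (10 s = 6 breaths/min)."""
    return (t % period_s) / period_s

def visual_scale(phase: float, base: float = 1.0, depth: float = 0.25) -> float:
    """Map breath phase to a particle-field scale: expand through the
    cycle and contract back, with smooth cosine easing at the turns."""
    return base + depth * 0.5 * (1.0 - math.cos(2.0 * math.pi * phase))
```

Keeping the sensor smoothing, the pacing model, and the visual mapping as separate steps is what lets each be swapped or logged independently when the coupling has to be experimentally meaningful rather than merely pleasant.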

Active lanes

  • Breath-linked visual interaction and respiratory pacing
  • Cardiac biofeedback and coherence-oriented training surfaces
  • Oscillator-driven particle systems and structured visual symmetries
  • Quest study tooling, transport, monitoring, and operator-shell design
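For the oscillator-driven lane, one plausible core is a Kuramoto-style update, where each oscillator drifts at its natural frequency while a mean-field term pulls phases toward alignment, and an order parameter summarizes how synchronized the field is. This is a textbook sketch under that assumption, not the project's renderer; the constants and the choice of mean-field coupling are illustrative.

```python
import math

# Kuramoto-style coupled oscillators: a minimal sketch of the kind of
# dynamics that could drive particle phases. Illustrative only.

def kuramoto_step(phases, natural_freqs, coupling, dt):
    """Advance each phase by its natural frequency plus a mean-field
    coupling term pulling it toward the other oscillators."""
    n = len(phases)
    new_phases = []
    for theta, omega in zip(phases, natural_freqs):
        pull = sum(math.sin(other - theta) for other in phases) / n
        new_phases.append((theta + dt * (omega + coupling * pull)) % (2 * math.pi))
    return new_phases

def order_parameter(phases):
    """Degree of synchrony in [0, 1]; 1.0 means fully phase-locked."""
    n = len(phases)
    re = sum(math.cos(t) for t in phases) / n
    im = sum(math.sin(t) for t in phases) / n
    return math.hypot(re, im)
```

The order parameter is useful beyond rendering: it gives a single scalar that can be coupled back to breath pacing or logged as a session-level measure of how structured the visual field was.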

Connected projects

Translation

Why it matters

Viscereality is where a large part of my writing practice becomes concrete. It is the place where questions about embodiment, regulation, permeability, atmosphere, and designed transformation have to survive real runtime constraints: headset thermals, sensor reliability, operator burden, and the fact that participant experience changes when the control surface is unstable.

That is also why the project spans more than one repository. The Unity runtime, the Windows companion surface, and the manuscript line each stabilize different parts of the same system. The work only really holds together when those layers can be described in one vocabulary.

Current public surface

  • Standalone Quest runtime with oscillator-driven particle rendering
  • Breath, heartbeat, coherence, and runtime-config pathways
  • Windows operator shell for launch, monitoring, installation, and study guidance
  • Publication line across immersive systems, breath interaction, and coherence training
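Coherence-oriented metrics of the kind listed above are often computed as the fraction of heart-rate-variability spectral power concentrated in a low-frequency resonance band around 0.1 Hz. The sketch below shows that idea with a naive DFT; the band edges and the ratio definition are common conventions assumed here for illustration, not the project's actual metric.

```python
import math

# Sketch of a coherence-style score: fraction of HRV spectral power in
# the ~0.04-0.26 Hz band. Band edges and naive DFT are assumptions.

def power_spectrum(signal, fs):
    """Naive DFT power spectrum; returns (freqs, powers) for positive
    frequencies below Nyquist. O(n^2), fine for short sketches."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [x - mean for x in signal]
    freqs, powers = [], []
    for k in range(1, n // 2):
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(centered))
        im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(centered))
        freqs.append(k * fs / n)
        powers.append((re * re + im * im) / n)
    return freqs, powers

def coherence_ratio(signal, fs, band=(0.04, 0.26)):
    """Fraction of total HRV power that falls inside the target band."""
    freqs, powers = power_spectrum(signal, fs)
    total = sum(powers) or 1.0
    in_band = sum(p for f, p in zip(freqs, powers) if band[0] <= f <= band[1])
    return in_band / total
```

A slow 0.1 Hz oscillation in the HRV signal scores near 1.0, while power scattered outside the band pulls the score toward 0, which is what makes the ratio usable as a real-time training target.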

Reference surface

Current references

These are the main public-facing papers and project surfaces currently defining the Viscereality line.

Papers and public outputs

  • Fejer et al. "Breathing Space: Spatial Mapping of Breath and Cardiac Biofeedback for Affective State Representation and Coherence Training in Viscereality." AlpCHI proceedings (2026).
  • Fejer et al. "Viscereality: A Bio-responsive VR System for Breath-Based Interactions and Coupled Oscillator Dynamics to Augment Altered States of Consciousness." Mensch und Computer workshop proceedings (2025).
  • Barton et al. "The Restorative and State Enhancing Potential of Abstract Fractal-Like Imagery and Interactive Mindfulness Interventions in Virtual Reality." Virtual Reality (2024).
  • Pinilla et al. "Affective Visualization in Virtual Reality: An Integrative Review." Frontiers in Virtual Reality (2021).

Public systems and research surfaces

  • Viscereality project site. viscereality.org is the main public-facing description of the system and its current experiential framing.
  • Alius Research project page. aliusresearch.org/viscereality.html situates the project within its broader research context.
  • Mesmer Prism GitHub surface. github.com/MesmerPrism is the public index for the runtime, tooling, and related research code that support the wider project line.