Out of Fear – Psychological Horror VR

Designed to push the boundaries of VR immersion, Out of Fear focuses on creating psychological dread through high-fidelity spatial audio and realistic object manipulation.

  • University of Salford
  • Unity HDRP
  • C#
  • Spatial Audio Design
  • Post-Processing Profiles
Project Summary

Out of Fear is a technical exploration of the "Psychological Horror" sub-genre, specifically designed for VR. The project focuses on moving away from cheap "jump scares" in favour of sustained atmospheric tension. Key technical pillars include the implementation of spatial audio to guide player movement and physics-based tactile interactions (like opening drawers and inspecting items) to deepen immersion within a Unity-based environment.

Project Brief

Objective

The primary objective of this project was to explore the intersection of psychological horror and virtual reality, moving away from traditional jump-scares to focus on sustained atmospheric dread. Developed as a solo endeavour at the University of Salford, the goal was to create a highly immersive environment where player agency is driven by tactile interactions and sensory feedback. By prioritising spatial audio and physical presence, the project aimed to demonstrate how environmental storytelling can be used to manipulate player psychology and maintain tension within a 360-degree digital space.

My Role

As the sole developer, I managed the entire production pipeline, from initial conceptualisation and research to final technical implementation in Unity. My responsibilities encompassed technical design, where I built a robust VR interaction framework, and environmental art, where I focused on atmospheric lighting and level layout to enhance the sense of isolation. Furthermore, I handled the sound engineering and performance optimisation, ensuring that the complex spatial audio triggers and high-fidelity visuals remained performant and comfortable for a seamless VR experience.

Technical Showcases
Systems

Adaptive Voice Recognition

To break the fourth wall of VR immersion, I engineered a custom microphone-driven interaction layer. This system transforms the player's voice into a primary input, allowing the game's AI and environment to react dynamically to speech, volume, and breath.

  • Real-time Acoustic Monitoring: Integration of Unity's Microphone API to track player decibel levels and specific keyword triggers.
  • Threat Modulation: Real-time noise-gate logic drives AI awareness; screaming or heavy breathing increases the "Stress Value" and triggers aggressive encounters (see the sketch after this list).
  • Dynamic Audio Feedback: Correlates microphone input with spatialised 3D audio stingers, creating a feedback loop where the environment "answers" the player's vocal cues.
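
The sketch below shows one way the decibel tracking and noise-gate logic can be wired up with Unity's Microphone API. It is a minimal illustration: the class name, thresholds, and StressValue field are assumptions for demonstration, not the project's exact implementation.

using UnityEngine;

// Minimal sketch of a microphone-driven "Stress Value" loop (illustrative only).
public class AcousticMonitor : MonoBehaviour
{
    private const int SampleWindow = 256;
    private AudioClip micClip;
    private readonly float[] samples = new float[SampleWindow];

    [SerializeField] private float noiseGateDb = -30f;     // ignore quiet ambience below this level
    [SerializeField] private float stressPerSecond = 0.5f; // how fast loud input raises stress

    public float StressValue { get; private set; }

    private void Start()
    {
        // null = default microphone; 1-second looping buffer at 44.1 kHz
        micClip = Microphone.Start(null, true, 1, 44100);
    }

    private void Update()
    {
        if (micClip == null) return;

        int pos = Microphone.GetPosition(null) - SampleWindow;
        if (pos < 0) return;

        // Pull the most recent window of samples from the looping mic clip
        micClip.GetData(samples, pos);

        // RMS of the window, converted to decibels
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / SampleWindow);
        float db = 20f * Mathf.Log10(Mathf.Max(rms, 1e-7f));

        // Noise gate: only loud input (screams, heavy breathing) raises stress,
        // while silence slowly lets it decay back towards zero.
        if (db > noiseGateDb)
            StressValue += stressPerSecond * Time.deltaTime;
        else
            StressValue = Mathf.Max(0f, StressValue - stressPerSecond * 0.25f * Time.deltaTime);
    }
}

AI behaviours can then read StressValue (or subscribe to threshold crossings) to decide when to escalate an encounter.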
Rendering

Atmospherics & Optimisation

I engineered a high-performance lighting pipeline designed to deliver cinematic horror visuals within strict VR constraints. By balancing baked global illumination with optimised volumetric effects, I achieved deep atmospheric immersion while maintaining the rock-solid frame rates essential for player comfort.

  • Smart Lighting: Combines baked and real-time shadows to keep the scene dark and moody without hurting frame rate (see the sketch after this list).
  • VR Optimisation: Fine-tuned colour grading and texture settings so the image stays clear and the risk of motion sickness is reduced.
  • Performance Fixes: Optimised 3D models and scripts so the game holds its frame rate even on older hardware.
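
As a rough illustration of the baked/real-time shadow balance described above, the sketch below caps how many lights may cast real-time shadows around the player at any one time. The component name, radius, and budget values are assumptions for demonstration, not the project's actual settings.

using UnityEngine;

// Illustrative shadow budget: only a few lights near the player cast
// real-time shadows; the rest have shadow casting switched off.
public class ShadowBudget : MonoBehaviour
{
    [SerializeField] private Transform player;
    [SerializeField] private Light[] managedLights;       // lights eligible for real-time shadows
    [SerializeField] private float realtimeRadius = 10f;  // distance within which shadows are considered
    [SerializeField] private int maxRealtimeShadows = 3;  // hard cap per frame

    private void LateUpdate()
    {
        int active = 0;
        foreach (Light l in managedLights)
        {
            bool close = Vector3.Distance(l.transform.position, player.position) < realtimeRadius;
            bool allow = close && active < maxRealtimeShadows;

            // Nearby lights within budget cast soft real-time shadows;
            // all other managed lights stop casting shadows entirely.
            l.shadows = allow ? LightShadows.Soft : LightShadows.None;
            if (allow) active++;
        }
    }
}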
Level Design

Map Creation

I translated hand-drawn concepts into a fully realised 3D environment, focusing on player flow and atmospheric storytelling. The level was designed to guide players naturally through the world using light, shadow, and environmental cues.

  • Concept to 3D: Developed the map layout from initial sketches to ensure a balanced and intentional player path.
  • Environmental Storytelling: Placed props and lighting to tell a story about the world without using text or dialogue.
  • Spatial Pacing: Carefully designed "bottlenecks" and "open areas" to control the player's stress levels and pace through the environment.
  • Optimised Layout: Structured the environment to minimise hidden geometry, keeping the game running smoothly in VR.
Code Snippet
Software Engineer

AI-Responsive Voice Interaction System

This module leverages the UnityEngine.Windows.Speech library to bridge real-world audio with in-game logic. It transforms the player's voice into a gameplay tool, enabling both environmental interaction and an immersive "echo" mechanic from AI entities.

  • Proximity-Based Action Execution: Uses a spatial search algorithm (Physics.OverlapSphere) to translate specific vocal "kill phrases" into immediate gameplay consequences, such as destroying nearby enemy objects.
  • Dynamic Speech Mimicry: Implements a speechHistory buffer that allows AI entities to "remember" phrases and repeat them back to the player, creating a haunting, reactive atmosphere.
  • Hybrid Input Support: Integrates manual text input fallbacks alongside voice recognition to ensure accessibility and testing reliability without breaking the gameplay loop (a condensed sketch follows below).
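
The condensed sketch below ties the pieces named above together: KeywordRecognizer from UnityEngine.Windows.Speech, a Physics.OverlapSphere proximity check, a bounded speechHistory buffer, and a typed-input fallback. The phrase list, tag name, radius, and method names are illustrative assumptions, not the shipped implementation.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Condensed sketch of the voice interaction module (illustrative values throughout).
public class VoiceInteraction : MonoBehaviour
{
    [SerializeField] private string[] killPhrases = { "go away", "leave me alone" };
    [SerializeField] private float effectRadius = 5f;

    private KeywordRecognizer recognizer;
    private readonly Queue<string> speechHistory = new Queue<string>();

    private void Start()
    {
        // Listen for the configured "kill phrases" via the Windows speech API.
        recognizer = new KeywordRecognizer(killPhrases);
        recognizer.OnPhraseRecognized += args => HandlePhrase(args.text);
        recognizer.Start();
    }

    // Hybrid input support: typed text from a debug or accessibility UI is
    // routed through the same path as recognised speech.
    public void SubmitTypedPhrase(string phrase) => HandlePhrase(phrase);

    private void HandlePhrase(string phrase)
    {
        RememberPhrase(phrase);

        // Proximity-based action execution: destroy tagged enemies near the player.
        foreach (Collider hit in Physics.OverlapSphere(transform.position, effectRadius))
        {
            if (hit.CompareTag("Enemy"))
                Destroy(hit.gameObject);
        }
    }

    private void RememberPhrase(string phrase)
    {
        // Bounded speechHistory buffer that AI entities can query to "echo"
        // the player's own words back at them.
        speechHistory.Enqueue(phrase);
        if (speechHistory.Count > 8)
            speechHistory.Dequeue();
    }

    public string GetPhraseToEcho() =>
        speechHistory.Count > 0 ? speechHistory.Peek() : null;

    private void OnDestroy()
    {
        if (recognizer != null)
        {
            if (recognizer.IsRunning) recognizer.Stop();
            recognizer.Dispose();
        }
    }
}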

Contact Me
Want a custom build or deep dive? Reach out and I can share raw captures and profiling reports.