JONAS MARTINY

VIRTUAL DJ SHOW

BACHELOR PART II

A live virtual DJ performance in Unreal Engine 5 with a MetaHuman DJ in a music-reactive, real-time environment.

  • (Client)

    Bachelor Thesis II

  • (Software)

    Unreal Engine 5

    SpeedTree

    Gaea 2

    Substance Painter

    DaVinci Resolve

    MotionBuilder

    OptiTrack

  • (Year)

    2023

  • (Services)

    Art Direction

    Virtual Production

    Environment Design

    Content

    Motion Capture

Project Overview


This second part of my Bachelor’s thesis expanded the project into a live virtual DJ performance realized in Unreal Engine 5. Real-time performance data from an OptiTrack motion-capture system was streamed directly into a custom-built digital environment. The DJ was captured photogrammetrically and recreated as a MetaHuman, performing within a dynamic world that transitioned seamlessly from sunset to night. Music-reactive lighting and multiple cinematic camera perspectives, including a tracked virtual camera and a virtual drone, captured and shaped an evolving audiovisual experience that merged live performance with real-time technology.
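
The music-reactive lighting, like the rest of the show logic, lived inside Unreal Engine. Purely as an illustration of the idea, not the project's actual implementation, the sketch below shows a minimal Unreal Engine 5 C++ actor that maps an externally supplied, normalized audio amplitude onto a point light's intensity, smoothing it so the light pulses with the music instead of flickering. The class name, property defaults, and the assumption that the amplitude arrives as a 0..1 value from a separate audio-analysis step are all placeholders.

    // MusicReactiveLight.h: minimal sketch of an audio-driven light (illustrative only)
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "Components/PointLightComponent.h"
    #include "MusicReactiveLight.generated.h"

    UCLASS()
    class AMusicReactiveLight : public AActor
    {
        GENERATED_BODY()

    public:
        AMusicReactiveLight()
        {
            PrimaryActorTick.bCanEverTick = true;
            Light = CreateDefaultSubobject<UPointLightComponent>(TEXT("Light"));
            RootComponent = Light;
        }

        // Called by whatever analyses the music (audio envelope follower, Blueprint,
        // external controller); expects a normalized 0..1 amplitude per update.
        UFUNCTION(BlueprintCallable, Category = "Show")
        void PushAmplitude(float NormalizedAmplitude)
        {
            TargetAmplitude = FMath::Clamp(NormalizedAmplitude, 0.f, 1.f);
        }

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);

            // Smooth toward the latest amplitude so the light pulses rather than flickers.
            SmoothedAmplitude = FMath::FInterpTo(SmoothedAmplitude, TargetAmplitude, DeltaSeconds, SmoothingSpeed);

            // Map the smoothed amplitude onto the light's intensity range.
            Light->SetIntensity(FMath::Lerp(MinIntensity, MaxIntensity, SmoothedAmplitude));
        }

    private:
        UPROPERTY(VisibleAnywhere)
        UPointLightComponent* Light = nullptr;

        UPROPERTY(EditAnywhere, Category = "Show")
        float MinIntensity = 500.f;     // placeholder values, tuned per fixture

        UPROPERTY(EditAnywhere, Category = "Show")
        float MaxIntensity = 20000.f;

        UPROPERTY(EditAnywhere, Category = "Show")
        float SmoothingSpeed = 8.f;

        float TargetAmplitude = 0.f;
        float SmoothedAmplitude = 0.f;
    };

In practice this kind of mapping is often wired up in Blueprints with the engine's audio analysis tools; the C++ version above is only meant to make the music-to-light mapping explicit.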


Responsibilities


I handled all creative and environmental aspects: conceiving and building the virtual environment, implementing the day-to-night transition, developing the music-reactive lighting system, and operating the tracked virtual camera during the live performance. My collaborator, Moritz Till, managed the technical execution, the live streaming of the performance data, and the overall camera system implementation.
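
The sunset-to-night transition mentioned above was likewise driven in-engine. As a rough sketch of the approach, under assumed names and timing values rather than the project's actual setup, the actor below rotates the level's directional "sun" light over the length of the set and fades its intensity down as night falls.

    // SunsetController.h: illustrative sketch of a timed sunset-to-night transition
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "Engine/DirectionalLight.h"
    #include "Components/LightComponent.h"
    #include "SunsetController.generated.h"

    UCLASS()
    class ASunsetController : public AActor
    {
        GENERATED_BODY()

    public:
        ASunsetController()
        {
            PrimaryActorTick.bCanEverTick = true;
        }

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);
            if (!Sun || TransitionDuration <= 0.f)
            {
                return;
            }

            // Advance the transition over the length of the set.
            Elapsed = FMath::Min(Elapsed + DeltaSeconds, TransitionDuration);
            const float Alpha = Elapsed / TransitionDuration;

            // Tilt the sun from a low evening angle to below the horizon.
            const float Pitch = FMath::Lerp(StartPitch, EndPitch, Alpha);
            Sun->SetActorRotation(FRotator(Pitch, SunYaw, 0.f));

            // Fade the sunlight out as night falls.
            if (ULightComponent* LightComp = Sun->GetLightComponent())
            {
                LightComp->SetIntensity(FMath::Lerp(SunsetIntensity, NightIntensity, Alpha));
            }
        }

    private:
        // The level's sun (directional light), assigned in the editor.
        UPROPERTY(EditAnywhere, Category = "Show")
        ADirectionalLight* Sun = nullptr;

        UPROPERTY(EditAnywhere, Category = "Show")
        float TransitionDuration = 1800.f;  // seconds; placeholder for the set length

        UPROPERTY(EditAnywhere, Category = "Show")
        float StartPitch = -5.f;            // sun low over the horizon (placeholder angle)

        UPROPERTY(EditAnywhere, Category = "Show")
        float EndPitch = 20.f;              // light direction tilted upward, i.e. sun below the horizon

        UPROPERTY(EditAnywhere, Category = "Show")
        float SunYaw = 0.f;

        UPROPERTY(EditAnywhere, Category = "Show")
        float SunsetIntensity = 8.f;        // rough lux-scale placeholders

        UPROPERTY(EditAnywhere, Category = "Show")
        float NightIntensity = 0.05f;

        float Elapsed = 0.f;
    };

A full transition would typically also animate sky, fog, and post-process settings, or be sequenced in a Level Sequence; the sketch isolates only the core idea of rotating and dimming the sun.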