The main objective of the project was to create a customizable management platform for live theatrical face-sync performances, blending real actors, AR characters, and real and virtual backgrounds into one combined show. The immersive AR app enabled actors to control, animate, and impersonate virtual characters, streamed each actor's facial features and emotions onto an avatar, and provided general oversight of the theatrical process (scene rotation and changes, cameras, angles, and lighting selection), while supporting multiple output screens and multiple actors performing simultaneously.
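The core of the face-sync stream described above is forwarding per-frame facial capture weights onto a virtual character. The sketch below is a minimal illustration of that idea, not the project's actual API: the `Avatar` class, `apply_face_frame`, and the blendshape names are all hypothetical, and a production system would receive ARKit-style coefficients over the network rather than from a local dictionary.

```python
# Illustrative sketch of streaming captured facial coefficients onto a
# virtual avatar. ARKit-style capture yields ~52 blendshape weights in
# [0, 1] per frame; the platform forwards them to the rendered character.
# All names here (Avatar, apply_face_frame) are invented for illustration.

class Avatar:
    """A virtual character whose morph targets mirror incoming weights."""

    def __init__(self, morph_targets):
        # Current weight of every morph target the character supports.
        self.weights = {name: 0.0 for name in morph_targets}

    def apply_face_frame(self, frame, smoothing=0.5):
        # Exponential smoothing reduces capture jitter between frames;
        # unknown channels from the capture device are simply ignored.
        for name, value in frame.items():
            if name in self.weights:
                old = self.weights[name]
                self.weights[name] = old + smoothing * (value - old)


avatar = Avatar(["jawOpen", "eyeBlinkLeft", "mouthSmileLeft"])
avatar.apply_face_frame({"jawOpen": 0.8, "eyeBlinkLeft": 1.0})
print(avatar.weights["jawOpen"])  # 0.4 after one smoothed frame
```

Smoothing is applied per channel so that a dropped or noisy capture frame does not make the avatar's face snap; the trade-off is a small amount of added latency.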
The platform creates a new way of presenting art and theatrical performance, allowing Technodramatists to rapidly expand their play portfolio and showcase technologically integrated theater, music, and interactive art narratives by:
Providing tools for fast character integration, so that open-source characters can be imported within a few clicks.
Offering a standardized mapping workflow that correlates an imported character's animation with the facial features of an actor.
Giving the theatrical operator easy real-time hotkey control over the performance.
Controlling and syncing facial movements onto 3D avatars via an iPhone, enabling actors, storytellers, and keynote speakers to capture all motion in real time with professional quality.
Playing pre-defined character body animation while the face animation is controlled in real time.
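The last point above is essentially layered animation: body channels are sampled from an authored clip while face channels are overridden by the live capture feed. A minimal sketch of that mixing step, with all clip and channel names invented for illustration (the source does not describe the actual implementation):

```python
# Illustrative layered-animation mixer: the body pose comes from a
# pre-defined clip, while face weights are overridden by live capture.
# Channel and clip names are hypothetical.

def sample_clip(clip, t):
    """Sample a clip (dict of channel -> list of keyframe values) at
    integer frame t, looping when the clip runs out of keyframes."""
    return {ch: keys[t % len(keys)] for ch, keys in clip.items()}

def mix_layers(body_pose, live_face):
    """Live face channels take priority over anything in the body clip."""
    pose = dict(body_pose)
    pose.update(live_face)
    return pose


walk_cycle = {"hip_rotation": [0.0, 5.0, 0.0, -5.0],
              "jawOpen": [0.0, 0.0, 0.0, 0.0]}   # neutral face in the clip

live_frame = {"jawOpen": 0.7}                    # actor speaking right now

pose = mix_layers(sample_clip(walk_cycle, 1), live_frame)
print(pose)  # {'hip_rotation': 5.0, 'jawOpen': 0.7}
```

Keeping the face layer strictly on top of the body clip means the operator can swap body animations with a hotkey without ever interrupting the actor's live facial performance.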