Virtual Humans

Marija- World’s First Virtual Human Citizen

Marija is an interactive, AI-powered virtual human assisting tourists who want to learn about Malta’s history and the Maltese Islands. Her knowledge base will grow over time, expanding her understanding of everything in Malta.

Marija- Emotion Samples

Realistic facial emotion animation exploring the uncanny valley and whether this virtual human can bridge the gap using different real-time renderers.

Dynamic Digital Human - Jasmen

The dynamic digital human pipeline produces a real, one-to-one human performance in VR on Oculus Quest 2; emotion and song test.

M-body- Proportionally Accurate Virtual Humans

Architected topologically consistent meshes that maintain matching vertex IDs across characters, enabling seamless motion transfer and correlation between differently proportioned characters and animations. This consistency supports the creation of accurate, consistent actions, aiding the development of AI models for gesture recognition and animation generation.
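
As an illustrative sketch only (not the production M-body pipeline), the benefit of shared vertex ordering is that per-vertex deformation measured on one character can be applied to another purely by index:

```python
import numpy as np

def transfer_deformation(src_rest, src_posed, dst_rest):
    """All arrays are (num_vertices, 3). Because every character shares the same
    topology, vertex i is the same anatomical point on each mesh, so the
    per-vertex offset transfers directly by index, with no correspondence search."""
    delta = src_posed - src_rest      # how each vertex moved on the source character
    return dst_rest + delta           # apply the same offsets to the target character

# Toy example with placeholder geometry (hypothetical vertex count).
rng = np.random.default_rng(0)
src_rest = rng.random((5000, 3))
src_posed = src_rest + 0.01           # stand-in for a pose deformation
dst_rest = rng.random((5000, 3))
dst_posed = transfer_deformation(src_rest, src_posed, dst_rest)
```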

M-body- Proportionally Accurate Virtual Humans

Created with a procedural character system, using photogrammetry scans to match exact proportions and mesh-to-joint offsets. Trained on this data, the model learns how variations in body proportions affect gesture dynamics and can animate characters more naturally, improving the accuracy of AI-driven animation and movement generation.
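
As a rough sketch of why proportions matter (hypothetical joint layout, not the actual training code), local joint translations can be rescaled by the ratio of bone lengths so the same gesture lands correctly on a differently proportioned body:

```python
import numpy as np

def bone_lengths(joints, parents):
    """joints: (num_joints, 3) rest-pose positions; parents[i] is the parent index (-1 for the root)."""
    return np.array([0.0 if p < 0 else np.linalg.norm(joints[i] - joints[p])
                     for i, p in enumerate(parents)])

def scale_translation_channels(anim_translations, src_joints, dst_joints, parents):
    """anim_translations: (frames, num_joints, 3) local joint translations.
    Each channel is scaled by the target/source bone-length ratio, so gesture
    amplitude follows the character's proportions rather than absolute size."""
    src_len = bone_lengths(src_joints, parents)
    dst_len = bone_lengths(dst_joints, parents)
    ratio = np.where(src_len > 0, dst_len / np.maximum(src_len, 1e-8), 1.0)
    return anim_translations * ratio[None, :, None]

# Tiny example: a 3-joint arm chain retargeted to a longer-armed character.
parents = [-1, 0, 1]
src = np.array([[0.0, 0, 0], [0.3, 0, 0], [0.6, 0, 0]])
dst = np.array([[0.0, 0, 0], [0.4, 0, 0], [0.8, 0, 0]])
anim = np.zeros((10, 3, 3))                    # 10 frames of local translations
anim_retargeted = scale_translation_channels(anim, src, dst, parents)
```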

M-body- Proportionally Accurate Virtual Humans

The high-quality dataset has been procedurally processed to address common motion-capture errors, including marker occlusion, inaccurate position solving, incorrect joint orientation, jitter, and retargeting artifacts, making it commercially usable and ready for integration into standard animation pipelines.
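
A simplified sketch of two of those clean-up passes (illustrative only, assuming marker channels stored as NumPy arrays with NaN for occluded frames):

```python
import numpy as np

def fill_occlusions(channel):
    """channel: (frames,) one marker coordinate, NaN where the marker was occluded.
    Fill the gaps by linear interpolation between the surrounding valid frames."""
    frames = np.arange(len(channel))
    valid = ~np.isnan(channel)
    return np.interp(frames, frames[valid], channel[valid])

def smooth_jitter(channel, window=5):
    """Simple moving-average low-pass filter to suppress high-frequency jitter.
    (A production pass would use a filter with better edge handling.)"""
    kernel = np.ones(window) / window
    return np.convolve(channel, kernel, mode="same")

# Example: one marker coordinate with noise and a short occlusion gap.
x = np.sin(np.linspace(0, 2 * np.pi, 200)) + np.random.normal(0, 0.02, 200)
x[50:60] = np.nan
x_clean = smooth_jitter(fill_occlusions(x))
```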

Survivorman VR - Marketing Video

Using Dynamic Digital Humans to animate and render low-cost animation for Mixed Reality on Meta Oculus. Survivorman VR won the Canadian Screen Award for Best Interactive Experience in 2024.

Dynamic Digital Human - Dominic Monaghan

The dynamic digital human pipeline produces a real, one-to-one human performance in VR on Oculus Quest 2; facial speech test.

M-body- Proportionally Accurate Virtual Humans

Improved Realism and Fluidity in Animation
The system addresses common issues like foot sliding, pose inaccuracies, and mesh intersection errors, significantly improving the fluidity and realism of generated animations.
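
As a rough illustration of one such fix (not the actual M-body solver), foot sliding can be suppressed by pinning the foot whenever it is in contact with the ground:

```python
import numpy as np

def pin_grounded_feet(foot_positions, ground_height=0.0, contact_threshold=0.02):
    """foot_positions: (frames, 3) world-space foot joint positions, Y-up assumed.
    While the foot stays within contact_threshold of the ground, hold it at the
    position where contact began, removing slide introduced by noisy capture or
    generated motion. A production fix would also blend the correction back up
    the IK chain; this is only the core idea."""
    fixed = foot_positions.copy()
    locked = None
    for f in range(len(fixed)):
        if fixed[f, 1] - ground_height < contact_threshold:    # foot is on the ground
            if locked is None:
                locked = fixed[f].copy()                        # remember the contact point
            fixed[f, 0], fixed[f, 2] = locked[0], locked[2]     # pin horizontal motion
        else:
            locked = None                                       # foot lifted, release the pin
    return fixed
```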

M-body- Proportionally Accurate Virtual Humans

Supports Real-Time Performance Capture
With the ability to correct motion capture errors dynamically, M-body.ai enables improved real-time gesture generation, making it ideal for virtual production and live animation workflows.
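
Live capture means the clean-up has to be causal, using only frames that have already arrived. A minimal sketch of that constraint (illustrative, not the production filter) is an exponential smoother applied per joint channel as the stream comes in:

```python
class CausalSmoother:
    """Exponential smoothing that only ever sees past frames, so it can run
    inside a live capture loop without waiting for future data."""
    def __init__(self, alpha=0.4):
        self.alpha = alpha          # higher alpha = more responsive, less smoothing
        self.state = None

    def update(self, sample):
        if self.state is None:
            self.state = sample
        else:
            self.state = self.alpha * sample + (1.0 - self.alpha) * self.state
        return self.state

# Usage: one smoother per joint channel in the live stream; 1.50 is a jitter spike.
smoother = CausalSmoother(alpha=0.4)
for frame_value in [1.00, 1.02, 0.97, 1.50, 1.01]:
    filtered = smoother.update(frame_value)
```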

M-body- Proportionally Accurate Virtual Humans

The dataset and tools are open-source, making them accessible for both academic research and commercial use while being compatible with industry-standard software like Unity and Unreal Engine.

Realtime Bilingual Sign-language Translation - Prototype​

MetaHuman Virtual Human demonstrating real-time American Sign Language (ASL) translation in airports. The goal is to improve communication for the deaf community by creating virtual humans that deliver clear travel instructions, fostering inclusivity and ensuring equitable access to services.

AMD Digital Human Testing - Machine Learning Libraries

Motion capture, animation processing, and retargeting used to build a training dataset for accurate mouth-shape and movement synthesis. Synchronized motion capture of the face, neck, and body allows the system to generate realistic, natural speech animations with precise lip sync and facial expressions.
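
A toy stand-in for that kind of model (hypothetical feature sizes and blendshape layout, nothing like the production network): a ridge regression that maps per-frame audio features to mouth blendshape weights learned from the synchronized capture:

```python
import numpy as np

# Hypothetical shapes: 1,000 synchronized frames, 26 audio features per frame,
# 8 mouth blendshape weights captured from the performer. Random placeholders here.
rng = np.random.default_rng(0)
audio_features = rng.random((1000, 26))
mouth_weights = rng.random((1000, 8))

# Ridge regression via the normal equations: W maps audio features -> blendshapes.
lam = 1e-2
A = audio_features
W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ mouth_weights)

# At runtime, each new audio frame produces mouth blendshape weights for the rig.
new_frame = rng.random(26)
predicted_weights = new_frame @ W
```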

Virtual Human - Metahuman Conformed Mesh

Conformed the MetaHuman mesh by aligning the rig to match the exact physical proportions of the character, ensuring a precise, lifelike digital representation. Photogrammetry-based skin textures and high-fidelity cavity and wrinkle detail maps were not included in the original prototype.

Realtime Character - Unity, Maya for Communication​

Created Kid Gru for a virtual human interactive prototype, where Gru could interactively communicate with visitors to check them in, review ticket information, and guide customers to their seats.

SideFX Houdini Digital Human - Camera Previz

This animation showcases a project made for SideFX Houdini, displaying facial capture and retargeting exported into a Houdini project.

4D animated Virtual Humans - LOD Prototype

Demonstrating the levels of detail in 4D-captured characters and the difference between captured animations and procedurally animated, reaction-based animations.
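
For context, level-of-detail switching is usually driven by how large the character is on screen; a minimal sketch with hypothetical distance thresholds:

```python
def select_lod(distance_to_camera, thresholds=(2.0, 6.0, 15.0)):
    """Return an LOD index (0 = full-resolution 4D capture, higher = cheaper mesh)
    based on camera distance. Thresholds are illustrative placeholders; a real
    system would use projected screen coverage and hysteresis to avoid popping."""
    for lod, limit in enumerate(thresholds):
        if distance_to_camera < limit:
            return lod
    return len(thresholds)

# Example: a character 8 metres from the camera falls into LOD 2.
lod_index = select_lod(8.0)
```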

M-body- Proportionally Accurate Virtual Humans

Enhanced AI-Driven Facial Animation
By capturing synchronized high-quality facial data, M-body.ai enables the creation of ML-driven facial animation models that deliver more natural, emotion-aligned expressions for virtual humans.

Realtime Bilingual Sign-language Translation - Prototype​

MetaHuman Virtual Human demonstrating real-time American Sign Language (ASL) hand detail for clear communication. Hand capture was done with Manus gloves, body capture with OptiTrack, and animation processing and corrections with Maya and Unreal Engine’s Control Rig.

Virtual Human - Dynamic Digital Humans

Demonstrating VR animated character performances using the high-fidelity, low-latency system on Oculus Quest. Highlights the performance and rendering quality as captured in virtual reality.

Stephan Kozak | Lakeview Animation, Toronto, Ontario | (416) 300 2103
