Design4real VR AR Agency

Tips for creating realistic VR avatars for the Metaverse: Uncanny Valley challenge

As the metaverse grows, so does the demand for engaging and lifelike VR avatars. In order to foster a sense of social presence and immersion, it is critical to avoid the Uncanny Valley effect in the design of these avatars. The Uncanny Valley phenomenon occurs when an avatar closely resembles a human, but still triggers uncomfortable feelings in users. This blog post explores strategies to create VR avatars that bridge the Uncanny Valley gap and enhance the metaverse experience.

Prioritize visual realism

Achieving visual realism is crucial for convincing VR avatars. However, striving for photorealism is not always the best approach. Instead, a balance between realism and stylization should be sought. The focus should be on the key elements that contribute to human recognition, such as facial features, eyes and skin texture. With detailed modeling, texturing and shading techniques, visually appealing avatars can be created without triggering the Uncanny Valley effect.

Realistic avatars

A realistic avatar offers many customization options to come as close as possible to your own appearance. Especially in a professional environment, where the credibility of the user plays a role, an avatar with realistic features is often the first choice. However, the realistic look also poses problems: the so-called "Uncanny Valley" effect describes the point at which an artificial avatar tries to look as real as possible but falls just short of that goal - which in turn makes it appear implausible or even creepy.

Hardware limitations play a role here, as do the limited options for creating a realistic avatar with little effort. One common method is to take a webcam photo of the user and project it onto a generic avatar. This serves its purpose, as the avatar is recognizable as a person, but the simple representation quickly breaks with the hoped-for realism. In addition, depending on the VR headset, the avatars usually float because leg tracking is missing, and due to the lack of face tracking their facial expressions appear relatively rigid and therefore alienating.


Avatars from photogrammetry data

A technically more complex approach is to generate avatars from photogrammetry data.

The creation of VR avatars from photogrammetry data is a complex procedure that is divided into several steps. First, a 3D scan of the person is created by photographing them from different angles. These photos are then processed using photogrammetry software to create a detailed 3D model. Finally, the model is converted into a VR-compatible format. This process makes it possible to create quite realistic avatars for VR applications.
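The three steps above can be sketched as a simple pipeline. This is a minimal illustrative sketch only: the function names, file names and photo-count threshold are assumptions, not the API of any real photogrammetry package.

```python
from dataclasses import dataclass

# Hypothetical sketch of the photogrammetry-to-VR-avatar pipeline.
# All names and formats here are illustrative assumptions.

@dataclass
class AvatarJob:
    photos: list            # input photos of the person from different angles
    mesh: str = ""          # reconstructed 3D model (step 2)
    vr_file: str = ""       # VR-compatible export (step 3)

def reconstruct_mesh(job: AvatarJob) -> AvatarJob:
    # Step 2: photogrammetry software turns the photo set into a dense 3D model.
    assert len(job.photos) >= 20, "need enough viewpoints for a full reconstruction"
    job.mesh = "subject_highpoly.obj"
    return job

def export_for_vr(job: AvatarJob) -> AvatarJob:
    # Step 3: decimate and convert to a VR-compatible format such as glTF/GLB.
    job.vr_file = job.mesh.replace("_highpoly.obj", "_avatar.glb")
    return job

job = AvatarJob(photos=[f"shot_{i:03d}.jpg" for i in range(40)])
job = export_for_vr(reconstruct_mesh(job))
print(job.vr_file)  # subject_avatar.glb
```

In practice each stage is handled by dedicated software; the sketch only makes the order of the steps explicit.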


3D scan apps for cell phones like Polycam

Scanning apps such as Polycam make 3D scanning of people on the iPhone much easier. The app uses the LiDAR sensor built into newer iPhone models to create detailed 3D scans of objects and environments.


Cartoon Characters from Ready Player Me

Cartoon-like avatars

Cartoon-like avatars do not usually have an Uncanny Valley problem, as they do not attempt to copy reality. Nevertheless, they offer many opportunities for individual design. For example, characteristic features of a user can be transferred to a cartoon-like avatar based on a photo - similar to what the Ready Player Me app already does. Alternatively, users can design their own avatar with the help of a character creator, as in "The Sims", for example. The result is an avatar with a style consistent with the other avatars, independent of external factors such as lighting and the quality of the user's webcam.

The cartoon-like style doesn't appeal to everyone, of course - a playful look in a professional setting, especially when presenting or demonstrating a realistic product, can put people off or make presenters seem less credible. Cartoon-like avatars also have no legs and limited facial expressions.

Mastering facial expressions and emotions

Facial expressions are crucial for human communication. To avoid the Uncanny Valley, it is important that VR avatars display a wide range of emotions that match their speech and interactions. Advanced animation and motion capture techniques enable natural and seamless facial expressions through face capturing. VR headsets such as the Meta Quest Pro and Pico 4 Enterprise support face and eye tracking. Particular attention should be paid to eye, eyebrow and lip movements, as these subtle cues play a significant role in the authenticity of the avatar's emotions. Facial motion capture can be used to transfer the facial movements of a real person to the digital avatar. There are face capture solutions, such as iClone and Unreal's MetaHuman Animator, that typically use an iPhone to record facial movements. With them it is possible to transfer realistic emotions to the characters with moderate effort. Automatic blinking also makes it possible to have a character blink realistically without much effort.
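At the data level, iPhone-based face capture delivers per-frame weights for named facial expressions (ARKit-style blendshapes) that must be mapped onto the avatar rig's morph targets. The following sketch shows that retargeting step; the avatar-side target names and the calibration gain are assumptions, not the naming of any specific tool.

```python
# Hedged sketch: mapping tracked blendshape weights (ARKit-style names as
# delivered by iPhone face capture) onto an avatar's morph targets.
# The avatar-side names and the gain parameter are illustrative assumptions.

ARKIT_TO_AVATAR = {
    "browInnerUp": "Brow_Raise_Inner",
    "eyeBlinkLeft": "Eye_Blink_L",
    "eyeBlinkRight": "Eye_Blink_R",
    "jawOpen": "Jaw_Open",
    "mouthSmileLeft": "Mouth_Smile_L",
}

def retarget(tracked: dict, gain: float = 1.0) -> dict:
    """Clamp each tracked weight to [0, 1] and rename it for the avatar rig."""
    out = {}
    for src, weight in tracked.items():
        dst = ARKIT_TO_AVATAR.get(src)
        if dst is not None:  # weights with no mapping are simply dropped
            out[dst] = max(0.0, min(1.0, weight * gain))
    return out

frame = {"jawOpen": 0.42, "eyeBlinkLeft": 1.3, "headYaw": 0.1}
print(retarget(frame))  # {'Jaw_Open': 0.42, 'Eye_Blink_L': 1.0}
```

Clamping matters in practice: raw tracking occasionally overshoots 1.0, which would distort the morph target if passed through unclipped.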

Meta Avatar
New avatars from Meta, now also with legs (picture: Meta)

Realistic animation of avatars and NPCs through motion capture (not real time)

Natural and fluid animation is crucial for the credibility of avatars. Choppy or robotic movements can instantly destroy the illusion of realism. Motion capture technology makes it possible to record the movements of real actors and gives avatars additional authenticity. Attention should also be paid to body language and gestures: these should accompany the avatar's speech and enhance the communication experience rather than detract from it. Afterwards, the motion capture animations can be reworked in 3D software such as iClone or MotionBuilder to achieve even more realistic results. For example, the character's gaze direction can be adjusted to the 3D world, or collisions with the 3D body can be removed.
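The gaze-adjustment step mentioned above boils down to a small piece of geometry: computing the yaw and pitch that turn the head toward a point in the 3D world. This sketch shows that calculation in isolation; the coordinate convention (y up, z forward) and function names are assumptions for illustration.

```python
import math

# Illustrative mocap clean-up step: re-aiming a character's gaze at a point
# in the 3D world. Assumes a y-up, z-forward coordinate convention.

def gaze_yaw_pitch(head_pos, target_pos):
    """Return (yaw, pitch) in degrees that orient the head toward target_pos."""
    dx = target_pos[0] - head_pos[0]
    dy = target_pos[1] - head_pos[1]  # up axis
    dz = target_pos[2] - head_pos[2]
    yaw = math.degrees(math.atan2(dx, dz))
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return yaw, pitch

# Head at eye height looking toward an object one meter right and one ahead:
yaw, pitch = gaze_yaw_pitch((0, 1.7, 0), (1, 1.7, 1))
print(round(yaw), round(pitch))  # 45 0
```

In a real pipeline these angles would then be blended with the captured head animation rather than applied as a hard override, so the correction stays subtle.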


No standard formats for avatars

Unfortunately, there are no standard formats for avatars, so an avatar that is optimized for one software solution cannot simply be integrated into another.


Realistic animation of avatars in real time

Things become more difficult when the movements of players wearing VR headsets are to be captured in real time using the headset's sensors and transferred to the player avatar. Meta has made great progress here recently, so that Meta's avatars can now even have legs in the virtual world. Furthermore, the very annoying overlaps between arms and upper body have been significantly reduced. On the hardware side, headsets such as the Quest 3, whose improved cameras and sensors can better track the position of the wearer's body, enable more realistic movements for real-time avatars. In addition to the hardware, the software - the Avatar SDK - is largely responsible for how the movements of the avatars are reproduced. Hardware manufacturers such as Meta and Pico offer their own solutions with different features, but there are also device-agnostic plugin implementations for Unreal and Unity.
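One reason real-time avatar movement is hard is that raw sensor data is noisy: applying tracked positions directly makes joints jitter. A common generic remedy, independent of any particular Avatar SDK, is exponential smoothing of the per-frame values. The sketch below shows the idea on a single tracked coordinate; the parameter value and the sample data are illustrative.

```python
# Hedged sketch: exponential smoothing of noisy per-frame tracking samples,
# a generic technique for reducing jitter in real-time avatar joints.
# Alpha and the sample values below are illustrative assumptions.

def smooth(samples, alpha=0.3):
    """Exponentially weighted average: higher alpha tracks faster, lower is smoother."""
    state = None
    out = []
    for s in samples:
        state = s if state is None else alpha * s + (1 - alpha) * state
        out.append(round(state, 3))
    return out

# Raw wrist height per frame (meters), with one jittery outlier:
raw = [1.00, 1.01, 1.20, 1.02, 1.01]
print(smooth(raw))
```

The trade-off is latency: the more aggressively you smooth, the more the avatar's limbs lag behind the player, so real SDKs tune this per joint.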

Focus on speech and audio integration

Voice is another important aspect of the metaverse experience. Make sure that the avatar's voice matches its appearance and conveys emotions appropriately. Integrating real-time voice modulation and lip sync can greatly enhance the realism of the avatar. It is better to use the voices of real people instead of relying on AI voices, which currently sound rather robotic and convey less emotion than real human voices. In addition, spatial audio techniques can help anchor the avatar's voice to its position in the virtual environment and further immerse users in the experience. With lip-sync technology such as AccuLips in iClone, audio files can be automatically translated into realistic lip movements on the avatar.

Customization and personalization

The ability to customize avatars can help users feel more connected to their virtual representations. Offer a diverse range of customization options for facial features, hairstyles, clothing, and accessories. Personalization not only improves user engagement but also mitigates the Uncanny Valley effect, as individual preferences for realism vary. Personal avatars can be created with programs such as Character Creator 4 or Unreal's MetaHuman. With the help of the Character Creator 4 plugin "Headshot" and its improved successor "Headshot 2", avatars can be generated from images and 3D scans that are nearly identical in appearance to their real-life templates. Precise face recognition and advanced AI technology allow for remarkable accuracy and detail, resulting in avatars that bear a striking resemblance to real people.

Iterate and collect user feedback

Creating realistic VR avatars is an iterative process. Continuously collect user feedback and observe how users interact with the avatars in different scenarios. Identifying specific elements that trigger the Uncanny Valley effect can help optimize the design and animations. It is important to actively involve users in the development process to create avatars that match community and customer preferences.


  • Avoiding the Uncanny Valley effect is critical for VR avatars in the metaverse.
  • Prioritizing visual realism and mastering facial expressions are important aspects.
  • Animation and movement make for believable avatars.
  • Focus on speech and audio integration improves the reality of avatars.
  • Customization and personalization enable individual user experiences.
  • Iteration based on user feedback refines the avatars.
  • Continuous advances in technology and innovation in the field of avatar integration open up new immersion possibilities in the metaverse.