FIWARE.OpenSpecification.WebUI.VirtualCharacters - FIWARE Forge Wiki


Name FIWARE.OpenSpecification.MiWi.VirtualCharacters
Chapter Advanced Web-based User Interfaces,
Catalogue-Link to Implementation Virtual Characters
Owner Adminotech Ltd., Lasse Öörni



Within this document you find a self-contained open specification of a FIWARE generic enabler. Please consult the FIWARE Product Vision, the website at http://www.fiware.org and similar pages in order to understand the complete context of the FIWARE platform.


Legal Notice

Please check the following Legal Notice to understand the rights to use these specifications.



Video games and film have long used systems which provide rendering, animation and interaction for virtual characters. However, virtual character technology is often proprietary, and assets created for one virtual character system may not be easily reusable in another.

This GE consists of an open standard and reference implementation for virtual characters on the Web, allowing for immersive, dynamic character interactions including multi-user experiences. Web applications will be able to create, display and animate virtual characters utilizing industry standard data formats. The characters can be composed of multiple mesh parts, e.g. to allow easily swappable parts such as upper or lower bodies, and attached objects such as clothing.

The virtual character functionality is implemented as a JavaScript library. The 3D-UI GE (which in turn uses the WebGL API) is utilized for the Entity-Component-Attribute based scene model, the hierarchical transformation graph and the actual rendering: a virtual character becomes part of the scene hierarchy and can be manipulated using the scene model's functions.

If a virtual character is created into a network-synchronized scene utilizing the Synchronization Server GE, its appearance, movements and animations will be transmitted to all observers. This is, however, not required, and the library can also function in purely local mode.

Basic Concepts

Virtual Character

An animatable 3D object, which may consist of one or more triangle mesh parts. For skeletal animation, it may contain a hierarchy of bones (joints), which are 3D transforms (position, rotation, scale) to which the object's vertices refer for vertex deformation (skinning). Any bone hierarchy can be used, so the library is not limited to humanoid characters, though helper functions can be provided for common tasks related to humanoid characters, such as turning the head bone to look at a particular point in 3D space.

Virtual Character Description

A lightweight description file utilizing JSON format, that lists the assets (mesh parts, materials and animations) used for instantiating a character (also referred to as an avatar) into the scene. The file also specifies how the parts are connected. Additionally, the character description can contain metadata, for example defining the name of the character's head bone. This metadata is utilized by the helper functions.
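The description file might look roughly like the following. Note that this is an illustrative sketch only: the actual schema, field names and asset paths are defined by the GE's specification, and every name below ("geometry", "attachTo", "headBone", etc.) is an assumption made for the example.

```json
{
  "name": "ExampleCharacter",
  "geometry": [
    { "mesh": "assets/body.mesh", "material": "assets/skin.material" },
    { "mesh": "assets/jacket.mesh", "attachTo": "Bip01_Spine" }
  ],
  "animations": [
    { "name": "walk", "src": "assets/walk.anim" },
    { "name": "wave", "src": "assets/wave.anim" }
  ],
  "metadata": {
    "headBone": "Bip01_Head"
  }
}
```

The "metadata" section is where helper functions would look up character-specific information such as the head bone name.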

Skeletal Animation

A dataset which describes the changes to individual bone transforms over time to produce, for example, a walking animation. Typically this involves changing the position and rotation of the bones, and less typically their scale. All the animation data pertaining to a specific value (for example the position or rotation of a specific bone) is commonly referred to as an animation track. An animation track consists of time–value pairs called keyframes. When playing back the animation, successive keyframes are interpolated to give the appearance of smooth motion.
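The keyframe interpolation described above can be sketched as follows. This is a minimal stand-in using linear interpolation on scalar values, not the GE's actual code; real tracks would interpolate vectors and quaternions.

```javascript
// Sample one animation track at a given time. A track is an array of
// { time, value } keyframes sorted by time; values between keyframes
// are linearly interpolated, and times outside the track are clamped.
function sampleTrack(keyframes, time) {
  if (time <= keyframes[0].time) return keyframes[0].value;
  const last = keyframes[keyframes.length - 1];
  if (time >= last.time) return last.value;
  for (let i = 0; i < keyframes.length - 1; i++) {
    const a = keyframes[i], b = keyframes[i + 1];
    if (time >= a.time && time <= b.time) {
      const t = (time - a.time) / (b.time - a.time); // 0..1 between keys
      return a.value + (b.value - a.value) * t;
    }
  }
}
```

For rotations, a production implementation would use quaternion spherical interpolation (slerp) rather than this componentwise lerp.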

Applying skeletal animation to the mesh vertices (the process of skinning or vertex deformation) requires each vertex of the mesh to specify which bones influence its ultimate position, and to what degree (blending weight).
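Conceptually, a skinned vertex position is the weighted sum of the positions produced by each influencing bone's transform. The sketch below simplifies bone transforms to pure translations to keep the example short; the names and structure are illustrative assumptions, not the GE's API. A real skinning implementation would apply full 4x4 bone matrices, usually on the GPU.

```javascript
// Compute a skinned vertex position from its bone influences.
// influences: [{ bone: index, weight: number }], weights summing to 1.
// boneTranslations: per-bone translation (stand-in for a full matrix).
function skinVertex(vertex, influences, boneTranslations) {
  let x = 0, y = 0, z = 0;
  for (const { bone, weight } of influences) {
    const t = boneTranslations[bone];
    // Each bone "moves" the vertex; the weight blends the results.
    x += weight * (vertex.x + t.x);
    y += weight * (vertex.y + t.y);
    z += weight * (vertex.z + t.z);
  }
  return { x, y, z };
}
```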

Multiple animations can be played back on a character simultaneously (animation blending). This requires the possibility to specify the priority of animations in relation to others: if two animations both try to control the same bone simultaneously, the one with higher priority takes effect. Animations can also be played back on only part of the bone hierarchy, and the magnitude of their influence (i.e. blending weight) can be controlled from "no effect" to "full effect". An example of partial playback would be to play a walk animation only on the leg bones of a character, while its hand bones play a different animation, for example a wave. This way the animations avoid disturbing each other.
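One way to resolve the priority and weight rules above for a single bone is sketched below. This is an assumed resolution scheme for illustration (highest priority wins, ties blended by normalized weight), not the GE's actual blending algorithm.

```javascript
// Resolve the final animated value for one bone from several active
// animation layers. Each layer carries a priority, a blending weight
// and the value it wants to apply to the bone.
function resolveBoneValue(restValue, layers) {
  if (layers.length === 0) return restValue; // no animation: rest pose
  const top = Math.max(...layers.map(l => l.priority));
  const winners = layers.filter(l => l.priority === top);
  const totalWeight = winners.reduce((sum, l) => sum + l.weight, 0);
  if (totalWeight === 0) return restValue; // all faded out
  // Blend the highest-priority layers by normalized weight.
  return winners.reduce(
    (sum, l) => sum + l.value * (l.weight / totalWeight), 0);
}
```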

Vertex morph animation

A simpler form of animation, which similarly stores changes over time, but instead of animating bone transforms, it directly animates the positions of individual mesh vertices. This requires storing a larger amount of data, but can be useful for animations where the desired result cannot be achieved with skeletal animation alone, for example facial animations.
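Evaluating a morph for one vertex reduces to blending between the base position and the morph target's position by an animated weight. The sketch below shows this for a single vertex and a single target; the names are illustrative assumptions.

```javascript
// Blend one vertex between the base mesh position and a morph
// target's position. weight = 0 gives the base mesh, weight = 1 the
// fully applied target; intermediate weights interpolate linearly.
function morphVertex(base, target, weight) {
  return {
    x: base.x + weight * (target.x - base.x),
    y: base.y + weight * (target.y - base.y),
    z: base.z + weight * (target.z - base.z)
  };
}
```

Multiple morph targets (e.g. several facial expressions) would each contribute their own weighted offset on top of the base position.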

Generic Architecture

This GE is implemented as a JavaScript library, which can be divided into the following parts:

  • Component implementations: AnimationController, Avatar. The Placeable and Mesh components are also needed, but are considered to be part of the 3D-UI GE implementation.
  • Character description file parser
  • Helper functions

Main Interactions

A character can be instantiated in two ways: by creating an Avatar component into an entity in the scene, which refers to a character description file and automatically instantiates the necessary components, or by manually creating the necessary components (Placeable, Mesh, AnimationController) into an entity in the scene.

Once instantiated, animations specified for the character can be enabled and disabled, and the individual properties of animations can be controlled:

  • Time position
  • Blending weight
  • Playback speed
  • Whether an animation loops or not
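The kind of per-animation control listed above can be sketched with a small self-contained stand-in. This is not the GE's actual AnimationController; all class, method and property names here are assumptions made for illustration.

```javascript
// Per-animation playback state: the four controllable properties
// listed above, plus an enabled flag.
class AnimationState {
  constructor(name) {
    this.name = name;
    this.enabled = false;
    this.timePosition = 0; // seconds into the animation
    this.weight = 1.0;     // blending weight, 0..1
    this.speed = 1.0;      // playback speed multiplier
    this.looping = true;   // whether playback wraps around
  }
}

// Minimal controller: enables/disables animations and advances the
// time position of all enabled animations each frame.
class SimpleAnimationController {
  constructor() { this.animations = new Map(); }
  addAnimation(name) { this.animations.set(name, new AnimationState(name)); }
  play(name, weight = 1.0) {
    const anim = this.animations.get(name);
    anim.enabled = true;
    anim.weight = weight;
  }
  stop(name) { this.animations.get(name).enabled = false; }
  update(dt) {
    for (const anim of this.animations.values()) {
      if (anim.enabled) anim.timePosition += dt * anim.speed;
    }
  }
}
```

A typical frame loop would call update() with the elapsed time, then let the rendering layer sample each enabled animation at its current time position and weight.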

The character's individual bones can also be controlled directly by setting their position, rotation or scale. To prevent animations and direct control from conflicting, doing this disables keyframe animation on the bone in question. Animation on a bone can be re-enabled when direct control is no longer necessary.
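The disable-on-direct-control behavior can be sketched as follows; the names are illustrative assumptions, not the GE's actual bone API.

```javascript
// A bone whose transform can be set directly. Setting it manually
// marks the bone as animation-disabled, so keyframe playback skips
// it until animation is explicitly re-enabled.
class Bone {
  constructor(name) {
    this.name = name;
    this.position = { x: 0, y: 0, z: 0 };
    this.animated = true; // keyframe animation applies by default
  }
  setPosition(pos) {
    this.position = pos;
    this.animated = false; // direct control overrides animation
  }
  enableAnimation() { this.animated = true; }
}
```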

A simple example of the interaction between the application and the GE is illustrated in the following sequence diagram:


Basic Design Principles

  • Genericity. The Virtual Characters GE should be usable for any animating characters, not just humanoids.

Detailed Specifications

Re-utilised Technologies/Specifications

Terms and definitions

This section comprises a summary of terms and definitions introduced during the previous sections. It intends to establish a vocabulary that will help to carry out discussions internally and with third parties (e.g., Use Case projects in the EU FP7 Future Internet PPP). For a summary of terms and definitions managed at overall FIWARE level, please refer to FIWARE Global Terms and Definitions

Annotations refer to non-functional descriptions that are added to declarations of native types, to IDL interface definitions, or through global annotations at deployment time. They can be used to express security requirements (e.g. "this string is a password and should be handled according to the security policy defined for passwords"), QoS parameters (e.g. maximum latency), or others.
AR → Augmented Reality
Augmented Reality (AR)
Augmented Reality (AR) refers to the real-time enhancement of images of the real world with additional information. This can range from the rough placement of 2D labels in the image to the perfectly registered display of virtual objects in a scene that are photo-realistically rendered in the context of the real scene (e.g. with respect to lighting and camera noise).
IDL → Interface Definition Language
Interface Definition Language
Interface Definition Language refers to the specification of interfaces or services. They contain the description of types and function interfaces that use these types for input and output parameters as well as return types. Different types of IDL are being used including CORBA IDL, Thrift IDL, Web Service Description Language (WSDL, for Web Services using SOAP), Web Application Description Language (WADL, for RESTful services), and others.
Middleware is a software library that (ideally) handles all network-related functionality for an application. This includes the setup of connections between peers, transformation of user data into a common network format and back, and handling of security and QoS requirements.
PoI → Point of Interest
Point of Interest (PoI)
Point of Interest refers to the description of a certain point or 2D/3D region in space. It defines its location, attaches meta data to it, and defines a coordinate system relative to which additional coordinate systems, AR marker, or 3D objects can be placed.
Quality of Service (QoS)
Quality of Service refers to non-functional properties of a communication channel, such as robustness, guaranteed bandwidth, maximum latency, jitter, and many more.
Real-Virtual Interaction
Real-Virtual Interaction refers to Augmented Reality setups that additionally allow users to interact with real-world objects through virtual proxies in the scene that monitor and visualize the state of the real world, and that can use services to change the state of the real world (e.g. switching lights on and off via a virtual button in the 3D scene).
A Scene refers to a collection of objects, which are identified by type (e.g. a 3D mesh object, a physics simulation rigid body, or a script object). These objects contain typed and named data values (composed of basic types such as integers, floating point numbers and strings) which are referred to as attributes. Scene objects can form a hierarchic (parent-child) structure. An HTML DOM document is one way to represent and store a scene.
Security is a property of an IT system that ensures confidentiality, integrity, and availability of data within the system or during communication over networks. In the context of middleware, it refers to the ability of the middleware to guarantee such properties for the communication channel, according to suitably expressed requirements of, and guarantees offered by, an application.
Security Policy
Security Policy refers to rules that need to be fulfilled before a network connection is established or for data to be transferred. It can for example express statements about the identity of communication partners, properties assigned to them, the confidentiality measures to be applied to data elements of a communication channel, and others.
Synchronization is the act of transmitting the changes in a scene over a network protocol to participants, so that they share a common, real-time perception of the scene. This is crucial to implementing multi-user virtual worlds.
Type Description
Type Description in the context of the AMi middleware refers to the internal description of native data types or the interfaces described by an IDL. It contains data such as the name of a variable, its data type, the hierarchical relations between types (e.g. structs and arrays), its memory offset and alignment within another data type, and others. Type Descriptions are used to generate the mapping of native data types to the data that needs to be transmitted by the middleware.
Virtual Character
Virtual Character is a 3D object, typically composed of triangle mesh geometry, that can be moved and animated and can represent a user's presence (avatar) in a virtual world. Typically supported forms of animation include skeletal animation (where a hierarchy of "bones" or "joints" controls the deformation of the triangle mesh object) and vertex morph animation, where the vertices of the triangle mesh are directly manipulated. Virtual character systems may support composing the character from several mesh parts, for example separate upper body, lower body and head parts, to allow better customization possibilities.
WebGL → (Web Graphics Library) is a JavaScript API for rendering 3D and 2D computer graphics in web browsers.