Artanim is a Swiss research institute specializing in motion capture technologies. Artanim's projects target two strategic axes of research. The first axis develops virtual and augmented reality applications with a focus on real-time interaction, using cutting-edge motion capture and 3D body-scanning technologies. The second axis is medical research, where motion capture combined with 3D medical imaging is used to better understand human joint structures and to improve the diagnosis and treatment of musculoskeletal disorders. Artanim also provides motion capture services for the creation of 3D animated movies, video games and (live) performances.
Vantage is Vicon’s flagship range of cameras. The sensors have resolutions of 5, 8 and 16 megapixels, with sample rates of up to 2,000 Hz, allowing you to capture fast movements with very high accuracy. The cameras also have built-in temperature and bump sensors, as well as a clear display, to warn you if a camera has been physically moved or has shifted due to thermal expansion. High-powered LEDs and sunlight filters also make Vantage the best choice for outdoor use and large volumes.
Shogun has been designed for the requirements and workflow of VFX and entertainment users. The focus is on saving time, which is achieved by doing as much as possible in real time. To this end, Shogun introduces unique features such as real-time actor calibration, automatic recovery of bumped cameras and unbreakable solving; the latter gives you a good skeletal solve even when a high proportion of markers are occluded, which again minimizes clean-up time afterwards.
Tracker has been designed for the requirements and workflow of engineering users who want to track the position and orientation of objects with as little effort and as low latency as possible. Perfect for many applications in robotics, UAV tracking, VR and human-machine interaction, Tracker lets you define what you want to track with a couple of mouse clicks, and then you can leave it tracking in the background. A simple SDK lets you connect the output data stream to your own software.
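The text does not show the SDK itself, so as an illustration only, here is a minimal Python sketch of the kind of client-side pattern a low-latency tracking consumer typically uses: keep only the newest pose and overwrite stale frames rather than queue them. The names (`RigidBodyPose`, `LatestPoseBuffer`) are hypothetical and are not the actual Vicon SDK API.

```python
from dataclasses import dataclass

@dataclass
class RigidBodyPose:
    # Hypothetical pose record; not the Vicon SDK's own data type.
    frame: int             # monotonically increasing frame number
    position: tuple        # (x, y, z) translation, e.g. in mm
    orientation: tuple     # rotation as a quaternion (x, y, z, w)

class LatestPoseBuffer:
    """Keep only the newest pose so the consumer never processes stale data."""

    def __init__(self):
        self._latest = None

    def push(self, pose):
        # Overwrite instead of queueing: this minimizes latency at the cost
        # of dropping intermediate frames, which is acceptable for live
        # object tracking where only the current pose matters.
        if self._latest is None or pose.frame > self._latest.frame:
            self._latest = pose

    def latest(self):
        return self._latest

# Simulated stream: three frames arrive, only the newest is kept.
buf = LatestPoseBuffer()
for i in (1, 2, 3):
    buf.push(RigidBodyPose(i, (0.0, 0.0, float(i)), (0.0, 0.0, 0.0, 1.0)))
print(buf.latest().frame)  # → 3
```

The overwrite-on-push design reflects a common trade-off in real-time tracking clients: for control loops and VR rendering, a fresh pose is worth more than a complete history.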
The aim of this project is to develop a multi-user immersive platform that combines a 3D environment – which can be seen and heard via a VR headset – with a real stage set. Users are captured by a motion capture system that allows them to see their own bodies, physically move around the virtual environment and interact with physical objects.
The solution developed addresses the following technical challenges: 1) a full-body animation is generated using inverse kinematics from a minimal number of markers, keeping user setup time as short as possible while ensuring good tracking accuracy; 2) the platform is multi-user, and each VR headset is connected wirelessly to the motion capture system; 3) physical props are accurately registered with their virtual counterparts, so interaction with objects feels seamless; 4) the position and orientation of the user’s head are tracked so as to minimise latency (which can cause discomfort in the VR headset) and to maximise positioning accuracy.
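Artanim's full-body solver is not described in detail here; as a toy illustration of the inverse-kinematics idea behind point 1, the sketch below solves a planar two-bone limb (e.g. shoulder-elbow-wrist) analytically from a single end-effector target, the kind of sub-problem a sparse-marker full-body IK system solves per limb. All function names and the 2D simplification are the author's assumptions, not the project's actual solver.

```python
import math

def two_bone_ik(tx, ty, l1, l2):
    """Analytic planar two-bone IK: given a target (tx, ty) and bone
    lengths l1 (upper) and l2 (lower), return joint angles (q1, q2)."""
    d = math.hypot(tx, ty)
    # Clamp the target distance to the reachable annulus [|l1-l2|, l1+l2].
    d = max(abs(l1 - l2), min(l1 + l2, d))
    # Law of cosines gives the elbow bend directly.
    cos_q2 = (d * d - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    q2 = math.acos(max(-1.0, min(1.0, cos_q2)))
    # Shoulder angle: aim at the target, corrected for the elbow offset.
    q1 = math.atan2(ty, tx) - math.atan2(l2 * math.sin(q2),
                                         l1 + l2 * math.cos(q2))
    return q1, q2

def forward(q1, q2, l1, l2):
    """Forward kinematics of the same chain, used to verify the solve."""
    ex, ey = l1 * math.cos(q1), l1 * math.sin(q1)
    return ex + l2 * math.cos(q1 + q2), ey + l2 * math.sin(q1 + q2)

# A reachable target is hit exactly by the recovered joint angles.
q1, q2 = two_bone_ik(1.0, 1.0, 1.0, 1.0)
hx, hy = forward(q1, q2, 1.0, 1.0)
print(round(hx, 6), round(hy, 6))  # → 1.0 1.0
```

A production solver would work in 3D, handle many chains sharing a root, and blend marker constraints with pose priors, but the per-limb geometry reduces to this same law-of-cosines computation.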