Real-Time Motion Capture and Virtual Reality Get One Step Closer
Remember how years ago virtual reality (VR) was going to be the next big thing? We were all going to wear those funky Daft Punk style visors and enter virtual worlds of wonder. I remember a couple of arcade games that used the basic technology. They were not very good.
Then VR fell by the wayside as the consoles took off and the graphics got better and better.
Yet for games such as Skyrim and the Halo series, the immersion VR could provide would be incredible.
Unsurprisingly, the technology is still being worked on, and the video below shows how far it has advanced. The real-time motion capture on display is incredible and would work wonders in a multiplayer video game.
They use 17 YEI 3-Space Wireless Sensors and three 3-Space Wireless Dongles for the motion capture.
The YEI 3-Space Sensor™ product line is a family of miniature, high-precision, high-reliability Attitude and Heading Reference Systems (AHRS) / Inertial Measurement Units (IMUs). Each YEI 3-Space Sensor uses triaxial gyroscope, accelerometer, and compass sensors in conjunction with advanced processing and on-board quaternion-based Kalman filtering algorithms to determine orientation relative to an absolute reference in real-time.
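The on-board filter itself is proprietary, but the core step it performs, advancing a quaternion orientation estimate from gyroscope rate readings in real time, can be sketched in a few lines. The following is a minimal illustration of that idea, not the sensor's actual algorithm; all function names are my own, and a real AHRS would additionally fuse the accelerometer and compass to correct gyro drift.

```python
import math

def quat_multiply(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

def quat_normalize(q):
    n = math.sqrt(sum(c*c for c in q))
    return tuple(c / n for c in q)

def integrate_gyro(q, gyro_rad_s, dt):
    """Advance orientation quaternion q by one gyro sample (rad/s) over dt seconds."""
    wx, wy, wz = gyro_rad_s
    # Quaternion kinematics: q_dot = 0.5 * q * (0, wx, wy, wz)
    q_dot = quat_multiply(q, (0.0, wx, wy, wz))
    q_new = tuple(qi + 0.5 * dqi * dt for qi, dqi in zip(q, q_dot))
    return quat_normalize(q_new)  # re-normalize to counter integration drift

# Simulate a steady 90-degree turn about the z axis, sampled at 1 kHz:
q = (1.0, 0.0, 0.0, 0.0)  # identity orientation
for _ in range(1000):
    q = integrate_gyro(q, (0.0, 0.0, math.pi / 2), 0.001)
# q now represents roughly a 90-degree rotation about z
```

At a 1 kHz sample rate the simple Euler integration above stays accurate; the pay-off of the quaternion representation is that it avoids the gimbal lock that plagues Euler-angle integration.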
The product family offers a breadth of communication, performance, and packaging options ranging from the ultra-miniature TSS embedded to fully integrated battery-powered wireless and data-logging versions.
Orientation can be returned in absolute terms or relative to a designated reference orientation. The proprietary multi-reference vector mode increases accuracy and greatly reduces and compensates for sensor error. The YEI 3-Space Sensor system also utilizes a dynamic sensor confidence algorithm that ensures optimal accuracy and precision across a wide range of operating conditions.
The YEI 3-Space Sensor system features are accessible via a well-documented open communication protocol that allows access to all available sensor data and configuration parameters using a variety of communication interfaces. Versatile commands allow access to raw sensor data, normalized sensor data, and filtered absolute and relative orientation outputs in multiple formats including: quaternion, Euler angles (pitch/roll/yaw), rotation matrix, axis angle, and two-vector (forward/up).
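Those output formats are all equivalent descriptions of one orientation, so a host application can convert between them. As a hedged illustration (using the common aerospace ZYX convention, not necessarily the exact convention the protocol uses), here is the standard quaternion-to-Euler conversion:

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to (roll, pitch, yaw) in radians,
    using the common ZYX (yaw-pitch-roll) aerospace convention."""
    roll = math.atan2(2 * (w*x + y*z), 1 - 2 * (x*x + y*y))
    # Clamp the asin argument to guard against rounding just outside [-1, 1]
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w*y - z*x))))
    yaw = math.atan2(2 * (w*z + x*y), 1 - 2 * (y*y + z*z))
    return roll, pitch, yaw

# A 90-degree rotation about the z axis should read as pure yaw:
w, z = math.cos(math.pi / 4), math.sin(math.pi / 4)
roll, pitch, yaw = quat_to_euler(w, 0.0, 0.0, z)
# roll ~ 0, pitch ~ 0, yaw ~ pi/2
```

Which representation you request usually depends on the consumer: quaternions compose and interpolate cleanly for animation rigs, while Euler angles are easier for humans to read in a debug view.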
The motion capture aspect could also be used to remotely control robots for tasks such as bomb disposal or surgery, or how about drones on Mars and further afield?
It could also be used to teach different subjects, with the teacher reaching many different people around the world.
Of course, you would probably need one of those 360 treadmills to get the whole real-time movement sorted out, but it is definitely one step closer to the whole cyberpunk setting.