IMPROVING HUMAN PERFORMANCE
TESLASUIT is a human-to-digital interface designed in the form of a full-body AR/VR suit. It integrates three systems: haptics, motion capture, and biometry, which together provide realistic immersion and accelerate mastery. TESLASUIT is compatible with all major game engines and has an open API and SDK to allow for integration into legacy simulated training environments.
Haptic Feedback
TESLASUIT's full-body haptic feedback system is built into the suit and can be engaged on actions, on demand, or in response to motion capture comparison. This electro-stimulation improves the learning experience by increasing immersion, fostering 360-degree awareness and engaging muscle memory.
Functionalities:
- 80+ channels
- Delivered via EMS/TENS using dry electrodes
- Calibration system
- Provides a wide range of realistic sensations
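As a rough illustration of how an application might drive per-channel haptic feedback, here is a minimal sketch assuming a hypothetical `HapticController` wrapper; the channel indices, intensity units, and method names are illustrative assumptions, not the actual Teslasuit SDK API.

```python
from dataclasses import dataclass

@dataclass
class HapticPulse:
    channel: int          # one of the suit's 80+ electro-stimulation channels
    intensity: float      # normalised 0.0-1.0, scaled by the per-user calibration
    duration_ms: int      # pulse length in milliseconds

class HapticController:
    """Hypothetical wrapper around a suit SDK; names are illustrative only."""

    def __init__(self, calibration: dict[int, float]):
        # Per-channel gain produced by the calibration step, so the same
        # logical intensity feels comparable across different users.
        self.calibration = calibration

    def play(self, pulses: list[HapticPulse]) -> None:
        for p in pulses:
            gain = self.calibration.get(p.channel, 1.0)
            level = min(1.0, p.intensity * gain)
            # A real integration would call the vendor SDK / game-engine plugin;
            # here we just print the command that would be sent.
            print(f"channel={p.channel} level={level:.2f} duration={p.duration_ms}ms")

# Example: an "impact on the right shoulder" event triggered by a game action.
controller = HapticController(calibration={12: 0.8, 13: 0.9})
controller.play([HapticPulse(12, 0.7, 120), HapticPulse(13, 0.5, 120)])
```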
Motion Capture
Integrated skeletal and 3D kinematic motion capture tracks human body interaction within the virtual training environment. This capture is essential to the delivery of correctly placed haptic experiences, and allows professionals to record baselines that users can compare against. Using motion capture in training improves motor skills by enabling haptic guidance and error augmentation, as illustrated in the sketch below.
Functionalities:
- 10 inertial sensors
- Tracks pose, positions, and movements
- Accurately transfers movements from the real to the virtual world
- Sophisticated drift reduction algorithms
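To make the baseline-comparison idea concrete, here is a minimal sketch assuming joint rotations are available as per-joint angles from the motion-capture stream; the joint names, thresholds, and the mapping from pose error to haptic intensity are placeholders for illustration, not the product's actual algorithm.

```python
# Compare a trainee's captured pose against an expert baseline and convert
# large deviations into a haptic cue on the corresponding body segment.
BASELINE = {"right_elbow": 92.0, "right_shoulder": 45.0, "left_knee": 170.0}  # degrees
JOINT_TO_CHANNEL = {"right_elbow": 21, "right_shoulder": 14, "left_knee": 55}

ERROR_THRESHOLD_DEG = 10.0   # ignore small deviations
MAX_ERROR_DEG = 45.0         # error at which the cue saturates

def error_cues(captured_pose: dict[str, float]) -> list[tuple[int, float]]:
    """Return (channel, intensity) pairs for joints that drift off the baseline."""
    cues = []
    for joint, target in BASELINE.items():
        error = abs(captured_pose.get(joint, target) - target)
        if error > ERROR_THRESHOLD_DEG:
            intensity = min(1.0, (error - ERROR_THRESHOLD_DEG) / (MAX_ERROR_DEG - ERROR_THRESHOLD_DEG))
            cues.append((JOINT_TO_CHANNEL[joint], intensity))
    return cues

# A frame where the right elbow is 30 degrees off the expert baseline.
print(error_cues({"right_elbow": 122.0, "right_shoulder": 47.0, "left_knee": 168.0}))
```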
Biometry
TESLASUIT's integrated biometric system gathers real-time data from users while training, which can be used to relay emotional state, stress level, and key health indicators. This enables interactive VR/AR training content that adapts to the trainee for personalised experiences, and measurement of key baselines to understand improvement or degradation over time.
Functionalities:
- Electrodermal activity (EDA)
- Electrocardiography (ECG)
- Detailed real-time performance data
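A minimal sketch of how the biometric stream could drive adaptive content, assuming heart rate (from ECG) and skin conductance (EDA) have already been extracted as scalar values; the stress score and difficulty rules below are illustrative placeholders, not the product's analytics.

```python
def stress_score(heart_rate_bpm: float, eda_microsiemens: float,
                 resting_hr: float = 65.0, resting_eda: float = 2.0) -> float:
    """Crude 0-1 stress estimate from deviations above the user's resting baselines."""
    hr_component = max(0.0, (heart_rate_bpm - resting_hr) / 60.0)
    eda_component = max(0.0, (eda_microsiemens - resting_eda) / 8.0)
    return min(1.0, 0.5 * hr_component + 0.5 * eda_component)

def adapt_scenario(score: float) -> str:
    # Personalise the session: ease off when the trainee is overloaded,
    # raise the pressure when they are comfortably within their envelope.
    if score > 0.75:
        return "reduce_difficulty"
    if score < 0.35:
        return "increase_difficulty"
    return "hold"

print(adapt_scenario(stress_score(heart_rate_bpm=118, eda_microsiemens=6.5)))
```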
NASA
One of our very first customers, NASA purchased our early Developer Kits for use on a number of projects at the Johnson Space Center labs in Houston, TX. The Teslasuit DevKits continue to enable engineers and physiologists in their search for new ways to train and study human cognitive and physical performance beyond our normal capabilities. In 2021 NASA will be working on a number of new research projects using the TESLAGLOVE.
ISS SPACE MISSION - STARDUST TECHNOLOGIES
Together with the French-Canadian company Stardust Technologies, TESLASUIT has successfully completed Phase I of a study on the effects of long-term space travel on the physiology and psychology of astronauts, and on the relief that VR simulation with physical sensations can provide. The study was initiated and sponsored by CNES, the French national space agency. The experiment was conducted using VR with TESLASUIT during a 42-parabola flight on a specially equipped AirZeroG plane to better understand body conditions under variable gravity, with simulated g-forces ranging from 2g to 0g. The experiment is now progressing to Phase II, which will take place in Q2 2021.
DTEK
DTEK companies produce coal and natural gas, generate electricity at thermal and renewable energy power plants, supply heating and electricity to end consumers, and provide energy services. A virtual reality training course was delivered for DTEK employees, based on a set of tasks relating to inspections and repairs at its power plants. In a recent test environment, TESLASUIT-enabled VR training showed a 30% reduction in errors at the re-test phase, confirmed by an independent neuroscientific study.
Functionalities: Motion Capture, Biometry, Haptic Feedback
SCHLUMBERGER
SCHLUMBERGER: onshore platform emergency training scenario. The demo application showcases user behavior and training for an oil rig emergency scenario delivered in virtual reality utilising TESLASUIT technologies. TESLASUIT was used to simulate the danger of oil spills in a safe-to-fail environment, and the haptic system was used to create awareness of the complications one would meet while going through a disaster recovery exercise.
Functionalities: Motion Capture, Biometry, Haptic Feedback
VOLVO
Volvo Cars' "ultimate driving simulator" uses the latest gaming technology to develop safer cars. Using cutting-edge technology from the real-time 3D development platform Unity and mixed reality experts Varjo, the simulator involves driving a real car on real roads. It combines life-like, high-definition 3D graphics, an augmented reality headset, and a full-body Teslasuit that provides haptic feedback from a virtual world while also monitoring bodily reactions. "Working together with great companies like Varjo, Unity and Teslasuit has allowed us to test so many scenarios that look and feel totally real, without having to physically build anything," says Casper Wickman, senior leader of User Experience.
TESLASUIT DEFENCE AND GOVERNMENT SERVICES
TESLASUIT can be deployed at the point of need, allowing trainees in multiple locations, from the comfort of their living room to a base or during a deployment, to train individual and collective skills, enabling effective training across active and dynamic missions.
- Group XR training: TESLASUIT used in an extended reality simulation allows training and monitoring of multiple individuals at the same time. This gives an opportunity to simulate various group and individual scenarios.
- Real-time postural correction: Within synthetic training it is difficult to correct the trainee; using motion capture and real-time data, an instructor can signal and correct the trainee immediately using haptic notification.
- Stress generation: TESLASUIT can act as a controlled stress generator with a biometric feedback loop to train and assess capabilities, developing resilience to stress in a safe-to-fail environment.
- Projectile simulation: Our technology can replicate shots, explosions and impact forces. The ability to physically connect with the simulated world improves immersion, allowing trainees to focus fully on the specific skills being trained.
SPECIAL FORCE TRAINING - XR TACTICS & TESLASUIT
Tactical training for police forces using advanced XR simulations for immersive training. Using TESLASUIT's full capabilities, a number of realistic environments can be simulated. The suit provides the ability to simulate bullet impacts, sustain injuries, and record biometrics in stressful, realistic situations for later review. VR and TESLASUIT-powered police force training scenarios provide the most immersive and realistic experience possible. Importantly for near real-life training, TESLASUIT helps build high resistance to stress in a controlled environment, reducing the possibility of internal organ damage due to the blood pressure and saturation changes that occur under duress.
Functionalities: Motion Capture, Biometry, Haptic Feedback
FIRE SERVICE TRAINING - INLIGO XR & RELYON NUTEC - NORWAY
INLIGO XR & RELYON NUTEC recently tested the TESLASUIT on a number of fire rescue training scenarios. RELYON NUTEC trains on average 250,000 personnel per year.
XTAL SIMULATION INTEGRATION
TESLASUIT is currently integrated into both fixed-wing and rotary flight simulation programs for a number of our partners worldwide.
SPORTS & REHABILITATION
TESLASUIT is currently being used in a number of academic sport science and rehabilitation research projects, including HSS, Johns Hopkins, Coventry University, the Reede Institute and many more.
LEWIS HAMILTON & F1 - FEEL THE DRIVE
Feel the Drive is the world's first multi-sensory fan experience, designed by combining prototype mixed reality technology, Lewis Hamilton's telemetry data and full-body haptics. A true data-driven immersive activation powered by TESLASUIT, twinned with prototype Lumus Vision XR smart glasses. Combining Teslasuit haptic technology and telemetry data, we were able to replicate Lewis Hamilton's acceleration, gear changes, steering and braking from his victorious, record-breaking 2018 Spanish GP pole lap.
LEWIS HAMILTON & F1 - FEEL THE DRIVE
To replicate the driving actions of Lewis Hamilton (7 data channels in total), we identified the muscle groups used for each action and used EMS and TENS to stimulate the corresponding muscle groups during the experience. We were able to replicate the forces from the following data channels:
- Speed
- Throttle
- Braking
- DRS control
- Steering
- Brake balance
- Gear change
Braking data channel: we replicated the inertia of the body by applying haptic forces to the shoulder and abdominal muscle groups, as in the sketch below. Steering replication was achieved by applying haptic sensations/forces to the deltoid, trapezius, and glenohumeral joints. This activation was the first time we mapped telemetry data to the suit to understand its true capabilities.
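The sketch below illustrates the general idea of mapping one telemetry channel (braking) to haptic intensities on the muscle groups named above; the channel-to-electrode mapping, scaling factors, and function names are assumptions for illustration, not the actual Feel the Drive pipeline.

```python
# Map a normalised braking telemetry sample (0.0-1.0) to haptic intensities on
# the shoulder and abdominal channels, approximating the inertia of deceleration.
SHOULDER_CHANNELS = [14, 15]
ABDOMINAL_CHANNELS = [30, 31]

def braking_to_haptics(brake_pressure: float) -> dict[int, float]:
    brake_pressure = max(0.0, min(1.0, brake_pressure))
    commands = {}
    for ch in SHOULDER_CHANNELS:
        commands[ch] = brake_pressure          # body pitching forward into the harness
    for ch in ABDOMINAL_CHANNELS:
        commands[ch] = 0.6 * brake_pressure    # weaker supporting cue on the core
    return commands

# One lap sample: heavy braking into a corner.
print(braking_to_haptics(0.85))
```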
VODAFONE USE CASE
The world's first demonstration of the power of 5G to transmit touch, using TESLASUIT haptic technology. Two players from the Wasps rugby team were able to run a training session despite being more than 100 miles apart. The impact of a rugby tackle made by Will Rowlands at the Ricoh Arena in Coventry was transferred via 5G to teammate Juan de Jongh on stage in London. Juan, in a specially developed haptic Teslasuit, was able to feel the force of the tackle in real time thanks to Vodafone's high-speed and super-low-latency 5G.
VODAFONE 2020 5G CONNECTED CAMPAIGN
"Using the power of 5G, we wanted to demonstrate the future and full potential of this technology in helping to maintain connections in those moments where we cannot be together." Nick Read, CEO, Vodafone.
TESLASUIT GLOVE
TESLASUIT GLOVE
The TESLASUIT GLOVE allows users to feel virtual objects using a combination of our award-winning technology. The lightweight exoskeleton glove combines four systems to deliver immersion like never before: haptics, force feedback, motion capture and biometry add even greater realism to virtual reality. The ability to feel and touch virtual objects opens up endless possibilities to accelerate mastery and amplify human performance in training, rehabilitation and simulated environments.
TESLASUIT GLOVE
Force Feedback:
- Spatial effect
- Resistance effect
- Vibration effect
Haptic Display:
- Touch effect
- Texture effect
- 3x3 haptic display per finger
Motion Capture:
- Finger position
- Wrist position
Biometry:
- Impedance measurement
- Pulse oximeter
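As a way to picture the 3x3 per-finger haptic display, here is a minimal data-structure sketch; the frame layout, finger names, and the `texture_stripe` helper are hypothetical, not the glove's actual API.

```python
from dataclasses import dataclass, field

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

@dataclass
class GloveHapticFrame:
    """One frame of intensities for the 3x3 display on each fingertip (0.0-1.0)."""
    cells: dict[str, list[list[float]]] = field(default_factory=lambda: {
        f: [[0.0] * 3 for _ in range(3)] for f in FINGERS
    })

    def texture_stripe(self, finger: str, row: int, level: float) -> None:
        # Drive a single row of the fingertip display, e.g. to suggest a ridge
        # sweeping across the finger as it slides over a virtual surface.
        self.cells[finger][row] = [level] * 3

frame = GloveHapticFrame()
frame.texture_stripe("index", row=1, level=0.8)
print(frame.cells["index"])
```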
TESLASUIT GLOVE - AEROSPACE
Aircraft Design:
- Virtual dashboard configuration
- User interface visualisation
- Interact with CAD designs
Pilot Training:
- Human factors
- Real-time biometry data
- Cognitive overload mitigation
Aircraft Maintenance:
- Practise complex repairs & assembly
- Reduce training time & increase user engagement
TESLASUIT GLOVE - AUTOMOTIVE
Vehicle Ergonomics:
- Effects on driver and passenger
- Simulate human interaction in cars under design
Virtual Prototyping:
- Evaluate new designs and features
- Enable real-time collaboration
- Cut time-to-market costs
- Rapidly ideate and iterate on designs
Training:
- Enable trainees to learn safely
- Onboard workers more quickly
- Increase effectiveness
TESLASUIT GLOVE - MANUFACTURING
TESLASUIT STUDIO
Full application development support for deploying Teslasuit and Teslaglove in XR environments.
Systems:
- EMS/TENS/FES (haptic feedback)
- Motion capture
- Biometry
Development:
- Unity
- Unreal
- CryEngine
- Android
- VBS
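To show how the three systems might come together in an engine-agnostic integration, here is a minimal training-loop skeleton; all class and method names are placeholders rather than the Teslasuit SDK's real interface, which is exposed through its open API and the engine plugins listed above.

```python
import time

class FakeSuit:
    """Stand-in for a suit connection so the skeleton below runs end to end."""
    def read_pose(self):
        return {"right_elbow": 95.0}
    def read_biometrics(self):
        return {"heart_rate": 82.0, "eda": 3.1}
    def play_haptics(self, channels, intensity):
        print(f"haptics on {channels} at {intensity:.2f}")

class TrainingSession:
    """Hypothetical skeleton tying motion capture, biometry and haptics together."""
    def __init__(self, suit):
        self.suit = suit

    def step(self):
        pose = self.suit.read_pose()          # motion capture: joint angles
        vitals = self.suit.read_biometrics()  # biometry: heart rate, EDA, ...
        # Scenario logic (normally in Unity/Unreal/CryEngine/VBS) decides which
        # haptic cues to fire; here a trivial rule fires a cue on elevated heart rate.
        if vitals["heart_rate"] > 80:
            self.suit.play_haptics(channels=[14, 15], intensity=0.4)

    def run(self, hz=10.0, seconds=0.5):
        for _ in range(int(hz * seconds)):
            self.step()
            time.sleep(1.0 / hz)

TrainingSession(FakeSuit()).run()
```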
READY TO ENTER THE XR ERA?
Email p.nickeas@teslasuit.io
For more information visit teslasuit.io
Get the latest TESLASUIT news: @teslasuit / @tesla-suit
TESLASUIT AWARDS