Smart Haptics 2020: 5 ideas for haptics in avatar tech

Jan 27 2021


XPRIZE’s avatar experts came together at the global haptics conference to discuss how haptics in avatar tech could transform our lives for the better 

Shaking a client’s hand at a meeting when you’re at home, sick. Hugging a loved one who lives on the other side of the world. Feeling the patter of snow on your skin when, outside, it’s a heatwave. Haptics – also known as “kinaesthetic communication” or “3D touch” – refers to the use of technology to simulate the various aspects of touch, and it can make all of this possible. 

Whether through applying forces, vibrations, or movements, haptic tech can allow us to experience things like the sensation of touch on our skin or the weight holding an object in our hand – but from thousands of miles away. 

Cool, right? But when you marry haptic tech with AR, VR, and AI to integrate it into avatar technology, the possibilities become more than just exciting – they begin to hold the potential to greatly benefit humanity. 

We could be looking at a future where we can send an advanced robotic avatar into space, or into a forest fire, or into a humanitarian crisis, and control it as though we were there ourselves. 

Discussing how big a role haptics can play in avatar tech in the future, David Locke – Executive Prize Director for the ANA Avatar XPRIZE – recently appeared at Smart Haptics 2020, a global conference that brings together the greatest minds working on haptics, from application to cutting edge innovation to opportunities. 

Locke was joined on a panel by Jacki Morie, Prize Technical Advisor; Ed Colgate, Co-Founder of Tanvas, Professor of Mechanical Engineering at Northwestern University, and ANA Avatar XPRIZE Judge; and Roberta Klatzky, Professor of Psychology and ANA Avatar XPRIZE Advisor. 

Below are 5 key takeaways from the discussion, including where we are with haptics in avatar tech today, and where we could be tomorrow if we dream big enough.


At XPRIZE, we’ve been interested in haptics since long before the global pandemic struck. The ANA Avatar XPRIZE was launched in 2018, and is, at its core, “focussed on connecting humans to humans”, explained David Locke, when opening the panel. “That’s what separates this prize from other competitions, the sense of presence in the avatar and the avatar acting as a conduit.” 

However, last year, in the wake of COVID-19, many of us found ourselves isolated and disconnected, and the power and potency of technology that can recreate human touch across vast distances were thrown into sharp focus. 

“I haven’t seen my elderly parents to give them a hug for all of 2020,” Ed Colgate told the panel. “This is where an avatar could come in, and where we could use smart haptics to make that hug feel convincing, even if it was remote. Across the pandemic, more and more of us turned to digital technology and with it, avatars, to stay in communication. But physical touch is incredibly valuable, and a robotic avatar that can convey this touch would be a powerful thing, on an emotional level and a wellbeing level.” 

2020 also highlighted the need for avatars in healthcare, Jacki Morie explained. “I haven’t had a stethoscope put on me by a doctor in over a year.” With doctors and nurses overworked and hospitals understaffed, avatars could one day be used as support staff – and, being robots, they would be immune to a virus like the one we’re living through. 


Don’t underestimate the power of tiny vibrations. Right now, bodysuits can already simulate an array of touch sensations using vibration across the body. “There are haptics illusions one can exploit, like the timing of vibrators can create the sensation of something moving along the skin and patterns too – you can use that to create the sense of standing in rain,” said Ed. 

Yet there are some parts of touch that are trickier to recreate. When asked what the biggest immediate challenges in haptics are, Ed pointed to two things. One was recreating the accuracy and dexterity of human hands and fingers. “That would require millimeter-scale detailed touch sensors. We don’t have the tech for that yet.” 

The other was whole-body haptics. “It depends on the task or the location, but I would love to see a system that would let me feel like I was exploring the Great Barrier Reef, for example. I think those experiences require whole-body systems. I care about the thermals, currents, wind… that type of tech. I don’t think we’ve quite got there yet, so that’s a big challenge.” 


OK, so you might have guessed already, but haptic tech can be complicated. The human body contains some of the most advanced technology in the world; simulating it was never going to be easy. 

We have a very complex network of nerves and sensors, Jacki explained. We can guess the weight of something or recognize pain. We can experience active touch and passive touch – meaning we can tell when we’re being poked, prodded, and squeezed, but we can also do this to others. We can explore the world with these abilities, but in order to do so, we are constantly sending and receiving information, and processing that information in real-time. 

“We’re at the very beginning of what haptics can and will be, but to really advance this technology, it will also need to be a two-way communication system.”

Roberta explained that what is so powerful about the human brain is that it tells us why we’re using our senses, as well as how. Our brain tells us when we’re trying to smell something as opposed to just sniffing through our nose for respiration, for example. All of our senses interact to provide the context for the instructions that tell our body how to function. 

“So we have to get into the hallmarks of what you want the avatar to be doing. That will affect the design, and it’s going to be important.”  


Through touch, humans are able to form bonds with one another – because touch can release hormones or “bonding chemicals” in the body. A question that came up for the panel was the extent to which we can replicate this with avatars. 

Crucially, building trust will partly be about designing an avatar system that’s convincingly human-like. Or as Jacki put it, achieving “those subtleties that make us feel like we have somebody behind the mask.” 

“We’re all familiar with the uncanny valley and the idea of robots being creepy,” Ed added, referring to the hypothesized relationship between an object – or in this case an avatar – and its level of resemblance to a human being, and how that affects our emotional response to such an object. In other words, he said, there’s a balance to be struck in design when it comes to functionality and instilling a sense of reliability, and that balance will be crucial for trust. 

Roberta offered another idea: “Something else occurred to me in terms of uncanny valley… we always think about the face of the avatar. But what about motion? Motion comes from the juxtaposition of limbs around joints – there are types of movement we can implement in an avatar so that it’s less like Frankenstein's Monster and there’s more human movement to create trust.”


Towards the end of the panel, David and Roberta touched on the idea that, in a way, many of the components that will be integral to avatar tech in the near future have already arrived. From cell phones to computers to Alexa, we’re surrounded by mini-robots in our homes by way of smart devices, so it’s just a matter of time until these technologies are integrated to form avatars. 

When Jacki threw a question to the panel about how long it will take for the use of integrated tech in robotic avatars to reach critical mass, Roberta said: “We’re on an incipient level but there is still a long way to go. There will be so many steps along the way where impacts are meaningful for human life, and we should be elated about all those steps as they come!” 

Ed estimated about twenty years, but said it will largely depend on how these technologies fit into markets. “Look at the Ansari XPRIZE for space travel – it paved the way for where we are today, but it took some time. With avatars, we’ve got a number of challenges, technical and also marketplace challenges; it’s going to take some real innovation on the business side as well.”

Telehealth looks like a leading contender to drive this forward, Ed added. “You can imagine going to your local health provider and into a pod where haptics is part of the experience, whether it’s shots, physical exams, or something else,” he explained. 

“I think it comes down to entrepreneurs and what they think we want and need. The thing that tends to get you over the hump is a great fit between someone’s burning need and your avatar technology. We’ll find out and I’m on the edge of my seat waiting to see where it’s going to go.”


Here at XPRIZE, we want to realize those possibilities. That’s why we launched the ANA Avatar XPRIZE. The $10M Prize is there to incentivize innovators and scientists from around the world to create a robotic avatar system that can transport human presence to a remote location in real-time. 

The winning avatar should be able to see, hear, and touch, accurately relaying all of this sensory information back to its operator. It could allow us to explore places we can’t reach, share skills, or deliver critical healthcare or disaster relief. 

However long it takes for avatar tech to hit the mainstream, one thing is for sure: The ANA Avatar XPRIZE will speed this process up. “When I first talk about the Prize, people usually look at me like I’m crazy,” laughed David Locke. “Or like I’m talking 50 years from now… but then when I actually explain it, you can see them thinking, ‘OK, I see where you’re going with this…’” 

David urged experts in haptics who want to pull this future forward to step up and get involved. “We’re working with a lot of partners in haptics development, and we have space for late-registering teams to sign up for the Prize until May 2021.” 

Teams come from all walks of life, David summarised: some have years of experience with haptics, some don’t. If the challenges and possibilities above excite you, reach out to collaborate. It could be, he said, that through a new train of thought comes a big, big breakthrough.

The $10M ANA Avatar XPRIZE aims to create an Avatar System that can transport human presence to a remote location in real-time. Find out more at

Ever wondered what exactly is an avatar? Find out more here.