What is Empathic Computing? Introducing the Empathic Computing Laboratory


The Empathic Computing Laboratory (ECL) researches how new technology can be used to create systems that enhance understanding between people, whether they are face to face or remote from one another. The laboratory is directed by Professor Mark Billinghurst and is located in the Auckland Bioengineering Institute at the University of Auckland and in the Australian Research Centre for Interactive and Virtual Environments at the University of South Australia.

The research of the laboratory sits at the convergence of three trends in computer interfaces and communication. “First, increasing internet bandwidth means that people can share more natural communication cues. Second, the development of 360 cameras and depth sensors means that people can capture their surroundings and experiences. Finally, computers can listen to us and watch us, and so understand our implicit actions,” Mark explains.

“The field of Empathic Computing sits at the overlap of the three areas of Natural Collaboration, Experience Capture, and Implicit Understanding.”

So, what does this look like?

Staff and students at the lab are exploring how emerging technologies like Augmented and Virtual Reality and physiological sensing can be combined to create new types of collaborative experiences.

“For example, researchers at the University of Auckland are developing technology using depth-sensing cameras to live-capture a person’s surroundings and then stream it to a remote user in Virtual Reality to make them feel like they are in the same place as the local person,” says Mark.
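As an illustration of what such a pipeline involves, here is a minimal Python sketch that back-projects a depth frame into a 3D point cloud of the kind that could be streamed to a remote VR viewer. The camera intrinsics, the synthetic depth frame, and the single-frame framing are assumptions made for the example, not the ECL's actual implementation.

```python
# Illustrative sketch only: turn one depth frame into a point cloud that a
# remote VR client could render. Intrinsics and depth data are assumed values.
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (in metres) to an N x 3 point cloud (pinhole model)."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                  # keep pixels with a valid depth

# Synthetic 640x480 depth frame standing in for a live depth-camera capture.
depth = np.random.uniform(0.5, 4.0, size=(480, 640)).astype(np.float32)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)

# In a live system this buffer would be compressed and sent each frame over a
# network transport (e.g. WebRTC or a socket) to the remote user's VR headset.
payload = cloud.astype(np.float32).tobytes()
print(f"{cloud.shape[0]} points, {len(payload)} bytes per frame")
```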

Another project explores how EEG (electroencephalography) hardware can be used to measure brain activity, looking in particular for periods of brain synchronisation between pairs of people doing the same task.

“When this happens, people report feeling more connected and being able to communicate more clearly. Recent studies at the ECL demonstrated brain synchronisation in VR for the first time,” Mark adds.
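One common way to quantify this kind of inter-brain synchrony is the phase-locking value (PLV) between band-passed EEG channels recorded from the two people. The sketch below is illustrative only; the alpha band, filter settings, and synthetic signals are assumptions rather than the specific measure used in the ECL studies.

```python
# Hedged sketch: phase-locking value (PLV) between one EEG channel from each of
# two people, a standard synchrony measure (not necessarily the ECL's exact method).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(sig_a, sig_b, fs, band=(8.0, 12.0)):
    """PLV of two equal-length signals within a frequency band (default: alpha)."""
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    phase_a = np.angle(hilbert(filtfilt(b, a, sig_a)))
    phase_b = np.angle(hilbert(filtfilt(b, a, sig_b)))
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))  # 0 = none, 1 = perfect

# Synthetic stand-ins for 10 seconds of one EEG channel per person at 256 Hz.
fs = 256
t = np.arange(0, 10, 1 / fs)
person_a = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
person_b = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(f"alpha-band PLV: {phase_locking_value(person_a, person_b, fs):.2f}")
```

Periods in which a measure like this stays high while a pair works on a shared task are the kind of episodes the studies above refer to as brain synchronisation.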

VR Brain Synchronisation Experiment

Mark explains that the ECL’s work also has commercial impact. ECL research has been spun out into a company called Envisage AR, which uses 360 video streaming to enable remote tourism and educational experiences.

“Through their web browser, people can be taken to remote locations, freely look around the live video feed, and point at objects of interest. Trials with the technology have been very positive, and a widespread roll-out should happen later in 2021.”

There are still many areas to explore in the field of Empathic Computing, and ECL staff and students are keen to find academic or industry partners who would like to explore this space together.


Mairi Gunn and Alaeddin Nassani are two of the researchers working at ECL. Get to know them and their research projects below. 


Mairi Gunn

Mairi Gunn is a PhD candidate based at the Elam School of Fine Arts and at the ECL. She is currently experimenting with captures of real people displayed in VR headsets, as cylindrical projections, and in AR headsets. “My aim is to bring people from different cultures together around a dining table.”

Mairi’s background is in cinematography and documentary filmmaking, and her main interest is in overcoming racism and constructing a common space through relationship building. She mainly works with Māori, although she is Pākehā.

She loves to collaborate with scientists and local people who have concerns and problems that they need to solve. “I am particularly keen to bring different stakeholders to the table and record the ensuing discussion in 360/3D video. Imagine how compelling a roundtable conversation between pig hunters, DOC rangers, scientists and local Māori could be?”

Mairi has been Artist in Residence at StaplesVR since 2017. “Aliesha Staples and I share many similar values, including supporting women and minorities in technology. I have been given a desk at her newly opened co-working space, Click Studios, access to high-end 360 stereoscopic camera equipment and the team’s knowledge. I also share her aspiration that NZ can become an international hub for XR production.”

Mairi believes that our adventurous and pioneering heritage sets New Zealand up well for navigating the XR terrain. “I would like to see XR helping people feel safer, be smarter and kinder. I hope that one day immersive VR content will be readily available in schools, universities and public libraries, that educational AR applications will be freely available to download, and that mixed reality will help people learn more safely, giving opportunity where it is cost-prohibitive to experience something entirely physically. It sounds cliché, but with XR, the potential is truly limitless.”

Get in touch with Mairi at mgun018@aucklanduni.ac.nz


Alaeddin Nassani works as a postdoctoral research fellow at the Auckland Bioengineering Institute where he splits his time between the ECL and the Augmented Human Lab (AHL) led by Associate Professor Suranga Nanayakkara. Before joining the University of Auckland, Alaeddin worked in the software industry for over 16 years in New Zealand and overseas. 

Alaeddin Nassani

At the AHL, Alaeddin’s main focus is on Kiwrious, a company created by Associate Professor Suranga Nanayakkara and his team at the Auckland Bioengineering Institute. Kiwrious develops electronic sensors that enable young people to engage in technology-enhanced science experiences. Check out their talk at the NZ XR Summit, a conference brought to life by the University of Auckland and UniServices.

At the ECL, Alaeddin works with Envisage AR on building a web-based 360-degree video-conferencing application. Co-founder and CTO Huidong Bai was also featured at the NZ XR Summit, and you can watch his talk here.

Alaeddin first came across AR/VR thanks to Mark Billinghurst’s MagicBook – a mixed reality interface that uses a real book to seamlessly transport users between reality and virtuality.

He completed his PhD at the HIT Lab NZ, directed by Professor Robert Lindeman. “My study focused on human-computer interaction with wearable AR and how we use it to share social experiences. During my PhD, I completed projects using AR devices such as the HoloLens, Google Glass and Google Tango.”

One of Alaeddin’s notable projects is Antarctica AR, an outdoor tablet-based AR experience that simulates expeditions in Antarctica. Learn more about the project here.

“It would be great to see XR being used to improve the quality of our lives rather than distracting from it. Applications of XR will soon be essential in many industries, such as health and education, and in collaboration in general. This demand will require easy access to the hardware and software needed to develop AR/VR, as well as technology that is mature and stable enough for wide adoption.”

Get in touch with Alaeddin at alaeddin.nassani@auckland.ac.nz