SUMMARY: Open gathering at VILA, November 27 2019, Aalborg University

On November 27 2019, VILA (Video Research Lab, a research community under DIGHUMLAB) hosted an informal open gathering on Big Video for all interested researchers from the humanities. This is a short summary of the afternoon.

The gathering was led by Jacob G. Davidsen, Associate Professor at Department of Communication and Psychology and community lead in VILA, and Paul B. McIlvenny, Professor at the Department of Culture and Learning, both Aalborg University.

Professor Paul B. McIlvenny opened the informal gathering with an introduction to Big Video. The screen shows a 360° video clip of an experiment about collaboration.

A massive potential to change the ways of working qualitatively with video data in the humanities

The purpose of the gathering was to demonstrate cutting-edge video technology and software tools developed by VILA and Big Video that hold promising potential to change and improve the ways in which humanities scholars work qualitatively with video data.

The technology and software presented at the gathering include:

  • advanced 360° stereoscopic cameras
  • action cameras
  • spy glasses
  • eye trackers
  • spatial audio microphones
  • Virtual Reality head-mounted displays
  • ground-breaking software tools such as AVA360VR and CAVA360VR, developed by VILA and Big Video

Read more about the tech and software below.

Specialised microphones like this one from Zylia can support and optimise research processes.

Locating sound sources with a microphone that can separate sounds such as voices into separate audio tracks

During the presentations, Jacob and Paul showed us a spectacular microphone from Zylia with 21 built-in microphone capsules.

The microphone is a third-order ambisonic microphone, which means that it records “spatialised sound”. Through algorithms, the recording can be reconstructed in terms of sound sources and their directions. This allows for an even more immersive experience when audio and video are united in this way and experienced in a 360° virtual reality environment.

The Zylia microphone was originally developed for bands to be able to separate audio input from various instruments. This feature is very valuable as it allows researchers to separate each speaker in human interaction onto different audio tracks. This feature makes the transcription process easier, especially the transcription of overlapping speech. At the gathering, we did a live experiment and demonstration of this.
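Zylia's third-order processing is proprietary, but the underlying principle of locating a sound source in an ambisonic recording can be illustrated with a simplified, first-order, horizontal-only sketch: the products of the omnidirectional channel (W) with the directional channels (X, Y) approximate the acoustic intensity along each axis, and their ratio gives the azimuth of the dominant source. The function names below are illustrative, not part of any Zylia software.

```python
import numpy as np

def encode_b_format(signal, azimuth_rad):
    """Encode a mono plane-wave source into first-order B-format (W, X, Y).

    Simplified horizontal-only encoding: W is the omnidirectional channel
    (scaled by 1/sqrt(2) per common ambisonic convention); X and Y carry
    the directional components.
    """
    w = signal / np.sqrt(2.0)
    x = signal * np.cos(azimuth_rad)
    y = signal * np.sin(azimuth_rad)
    return w, x, y

def estimate_azimuth(w, x, y):
    """Estimate source direction from the time-averaged intensity vector.

    The time averages of W*X and W*Y approximate the active acoustic
    intensity along each axis; arctan2 of the pair gives the azimuth
    of the dominant source.
    """
    ix = np.mean(w * x)
    iy = np.mean(w * y)
    return np.arctan2(iy, ix)

# Simulate a source at 60 degrees and recover its direction.
rng = np.random.default_rng(0)
source = rng.standard_normal(48000)  # 1 second of noise-like signal
true_az = np.deg2rad(60.0)
w, x, y = encode_b_format(source, true_az)
print(round(float(np.rad2deg(estimate_azimuth(w, x, y))), 1))  # → 60.0
```

Real microphone arrays like the Zylia go well beyond this: higher-order components sharpen the spatial resolution, and beamforming is then used to isolate each located source onto its own audio track.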

Professor Paul B. McIlvenny demonstrates the software AVA360VR. The orange silhouette on the screen represents his "head" or "viewpoint" within the 360° video while the two hand controllers mirror the ones he's holding in his hands. The video clip is a recording of an experiment about collaboration.

AVA360VR - the software that empowers users to be "sucked into their video data worlds"

At the gathering, the software tool AVA360VR was demonstrated. It is a tool developed by VILA and Big Video which is now under beta testing.

AVA360VR is short for Annotate, Visualise and Analyse 360° video in Virtual Reality.

AVA360VR is a unique and flexible tool for annotating, visualising and analysing 360° video in virtual reality. The tool empowers users – researchers, students and educators alike – to relive any 360° video-recorded situation in virtual reality (with a VR headset). 

You hold two hand controllers to navigate the software, and you wear a VR headset that allows you to orient yourself in the video recording simply by turning your head or looking up and down.

All participants were given the option to try out the software. Some commented that they lost track of time after entering the VR video; others said that they almost came to believe that they were actually “in” the VR room.

In addition, the tool allows users to work directly in the video with a broad range of features that are all relevant for researchers:

  • Adding annotations such as external images, videos and notes directly onto the video
  • Jumping from camera to camera
  • Exporting frames or video clips easily
  • Highlighting data objects
  • Animating notes or drawings so that they move along with (e.g. mobile) data subjects

The software will be released in the beginning of 2020.

A broad range of applications for AVA360VR

Originally, the Big Video team behind AVA360VR designed the software as an analytical tool for researchers who engage with huge and rich amounts of qualitative 360° video data.

In addition, the tool carries massive potential for supporting pedagogical and teaching practices. AVA360VR is ideal for teaching and training practitioners across sectors such as health and education, and stakeholders from these areas have already shown interest:

360° video recordings of authentic video from a nursery, an operating theatre or a classroom can be put to good use in a professional teaching context. Together students and their lecturers can dive into video data in VR and use this data to support their learning and training in the kinds of work practices and situations that they’ll encounter within their profession.

Associate Professor Jacob G. Davidsen (to the left + "codeslayer" on the screen) and Professor Paul B. McIlvenny (to the right + the orange silhouette on the screen) are co-inhabiting the 360° video with the software CAVA360VR.

CAVA360VR - doing data sessions together in virtual reality over vast distances

AVA360VR is being further developed into CAVA360VR. The “C” in CAVA360VR is short for “Collaborate”: it adds features that enable researchers to perform collaborative analyses while immersively inhabiting 360° videos together.

The team behind CAVA360VR has already attended a conference in Finland, the AFinLA Autumn Symposium 2019, remotely from Aalborg, Denmark, through virtual reality. For the demonstration of their software at the conference, they did a live “data session” with one of the participants at the conference venue while two others were connected remotely from Denmark.

CAVA360VR will go through beta testing in January 2020, and the expected release is early 2020.

New lead developer on the team since October 1, 2019: Meet Artúr

Artúr Barnabás Kovács has been working as the lead developer on AVA360VR and CAVA360VR since October 1, 2019.

Read the full interview with Artúr

The future is now – would you like more information about it?

VILA is looking to expand the user group of their research community. In other words, they would like to invite colleagues working with or interested in video analysis to become users of VILA.

As an affiliated user of VILA, you will have the opportunity to use our labs (e.g. xLab) and our equipment (e.g. GoPros, 360° cameras, camcorders, etc.), as well as draw upon our experience and expertise.

Moreover, you will have an opportunity to discuss your video-based research with fellow researchers in a friendly, supportive environment.

Are you interested in the software AVA360VR and CAVA360VR that the team is developing? Then stay tuned at vila.aau.dk, bigvideo.aau.dk or dighumlab.org, or any of our social channels for more information.

About Paul B. McIlvenny

Paul McIlvenny is a Professor at the Department of Culture and Global Studies at Aalborg University. He holds a PhD from Edinburgh University, Scotland, and is active in the following centres and research groups: Centre for Discourses in Transition (C-DIT), Centre for Mobility and Urban Studies and VILA (Video Research Lab, a part of DIGHUMLAB) at Aalborg University.

About Jacob Davidsen

Jacob Davidsen is an Associate Professor at the Department of Communication and Psychology at Aalborg University. He holds an MA in Information Science and is community lead in VILA (Video Research Lab, a part of DIGHUMLAB) at Aalborg University.
