Prague Quadrennial’s 36Q˚ exhibits The State of Darkness

The State of Darkness is exhibited at BLUE HOUR of the Prague Quadrennial’s 36Q˚, June 8-16, 2019.

The Enactive Virtuality Lab is represented at the event by associated team members Tanja Bastamow (Virtual Cinema Lab, Aalto ARTS) and Victor Pardinho (Sense of Space Oy); biosensor adaptation by Ilkka Kosunen.

Prague Quadrennial’s 36Q˚ (pronounced “threesixty”) presents the artistic and technical side of performance design concerned with the creation of active, sensorial and predominantly nontangible environments. Just like a performer, these emotionally charged environments follow a certain dramatic structure, change and evolve in time, and invite our visitors to immerse themselves in a new experience.

WORKSHOPS, MASTERCLASSES
Curated by Markéta Fantová and Jan K. Rolník
8 – 16 June
Small Sports Hall

Our global society seems obsessed with the fast-paced progress of technology and elevates rational, intellectual and scientific pursuits above arts that are intuitive and visceral in nature. And yet creative minds based in the arts are proving that boundless imagination paired with new technological advancements often results in original, highly inspiring and mind-expanding projects. Even though performance design does not need to use modern technology, and is often most inspiring when it relies on simple human interaction, we need to explore and experiment with the wide range of possibilities new technologies have to offer. PQ Artistic Director Markéta Fantová established 36Q˚ with those thoughts in mind and with a focus on the young, emerging generation of creatives.

Blue Hour

An experimental, interactive environment filling the entire space of the Industrial Palace Sports Arena will welcome visitors on 8 June and remain open until the end of PQ 2019. The project, based on intensive teamwork that brings together experienced artists and emerging designers in collaborative creation, will be led by renowned French visual new-media artist Romain Tardy. The curatorial team seeks to experiment with the shifting boundaries between the “non-material” or “virtual” and the “real” world, and to explore the capacity of performance design to enlist technology in cultural production.


EEVR #21 Community meeting @SuperNova Kino Dec 15

TIME: Dec 15, noon

LOCATION: SuperNova Kino, room 406, 4th floor, Narva mnt 27

 

An inspiring EEVR community event organised by MEDIT, including presentations, vivid discussions, and technical and artistic demos, with highlights by visiting Finnish media artist Hanna Haaslahti (middle) and producer Marko Tandefelt (right).

Announcement by Madis Vasser:

EEVR #21 will once again find itself in familiar territory on the fourth floor of the BFM school in Tallinn, but this time around our host is MEDIT – TLU Center of Excellence in Media Innovation and Digital Culture. We’ll be mixing film, photogrammetry, and some very interesting hardware. Everyone interested in VR/AR is very welcome! The event is free, but do click the attend button early if you plan to show up! Go to FB.

On the schedule:
* Hanna Haaslahti (http://www.hannahaaslahti.net/) – some cool photogrammetry projects
* Madis Krisman & Johannes Kruusma (Avar.ee) – some more cool photogrammetry projects
* Rein Zobel (MaruVR.ee) – VR Days 2018 recap etc

Demos:
* State of Darkness VR – Enactive Virtuality Research Group
* Magic Leap (courtesy of https://www.operose.io/)
* “Hands-on” with some prototype hardware (top secret)

 

Highlighting:
CAPTURED

Captured is a narrative simulation about social injustice where your digital double has a role to play. In the installation, people are captured as 3D Avatars who become actors in a scenario where individual freedom is taken over by collective instincts.

Team


Hanna Haaslahti is a Finnish media artist working with ideas from technological theater, the expanded image and interaction. She holds an MFA from the Media Lab at the University of Art and Design Helsinki (now Aalto University) and currently lives and works in Helsinki. She has been artist-in-residence at MagicMediaLab, Brussels (2000), Nifca NewMediaAir, St. Petersburg (2003), Cité Internationale des Arts, Paris (2008), and SculptureShock, organised by the Royal British Society of Sculptors, London (2015). She received an honorary mention at the Vida 6.0 art and artificial life competition (2003), was selected among the 50 best in the ZKM Medien Kunst Preis (2003), and has received the most prestigious Finnish media art award, the AVEK Award (2005).

 

Marko Tandefelt is a Helsinki-based concept designer, educator and musician with extensive experience in the art, design, media and technology fields. His interests include concept design, sensor-based interface prototyping, immersive multisensory cinema, and experimental visualization systems.

Marko lived in New York for more than 20 years, working at companies such as NEC R&D Labs, ESI Design, Antenna Design and the Finnish Cultural Institute. From 2007 to 2015 he was Director of Technology & Research / Senior Technology Manager at the Eyebeam Art & Technology Center, and from 2001 until 2016 he taught Master’s thesis courses in the Parsons School of Design MFA Design and Technology (MFADT) program in New York.

In his native Helsinki, Finland, Marko has worked since 2016 as a technology consultant and producer on various interactive projects, including Hanna Haaslahti’s real-time 3D body-scanning installation system “Captured”. He currently works at Kunstventures as a media art producer, concept designer and prototyper.

Marko holds a B.M. degree summa cum laude in Music Technology from NYU and a Master’s degree from the Interactive Telecommunications Program (ITP) at the NYU Tisch School of the Arts. He is a longtime member of ACM, AES, IEEE, SIGGRAPH and SMPTE, and has served as a paper reader and jury member for the SIGGRAPH and ACE conferences.


Enactive VR research project “The State of Darkness” – ICIDS2018 Dublin

In the VR-mediated experience of The State of Darkness, the participant meets face-to-face with a humanlike artificial character in an immersive narrative context. See the ICIDS 2018 Art Expo catalogue here.

The human mind and culture rely on narratives that people live by every day, narratives they tell one another, narratives that allow them to learn from others, for instance in movies, books, or social media. The State of Darkness, in turn, connects the notion of non-human narratives to the stories experienced by our virtual character, Adam B. Trained on an exhaustive range of the human facial repertoire, Adam B has gained control of his facial expressions when encountering humans.

Our concept builds on the idea of a symbiotic, interactive co-presence of human and non-human. Adam B experiences his own non-human narrative, which draws to some extent on the behavior of the participant yet is driven mainly by Adam B’s own life story, hidden from the participant and emerging within the complexity of Adam B’s algorithmic mind. The State of Darkness is an art installation where human and non-human narratives coexist: the former experienced and lived by our participant, the latter experienced by our artificial character Adam B, as they meet face-to-face, embedded in the narrative world of The State of Darkness.
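As a purely illustrative aside (not the installation’s actual implementation), the blending principle behind this symbiosis can be sketched in a few lines of Python: the character’s hidden, internal narrative state dominates, while the participant’s measured behavior only nudges the outcome. All names, weights, thresholds and expression labels below are hypothetical.

```python
# Hypothetical sketch of the symbiotic blending principle described above.
# Adam B's internal narrative state dominates; the participant's measured
# arousal only nudges the outcome. Weights, thresholds and labels are invented.

import random

def next_expression(internal_arousal, participant_arousal, internal_weight=0.7):
    """Blend the character's internal arousal (0..1) with the participant's
    measured arousal (0..1) and map the result to a facial expression."""
    blended = internal_weight * internal_arousal + (1 - internal_weight) * participant_arousal
    if blended < 0.25:
        return "withdrawn"
    if blended < 0.5:
        return "neutral"
    if blended < 0.75:
        return random.choice(["curious", "open"])
    return "tense"

# Example: the hidden life story dominates, so a calm participant only
# slightly softens the character's state.
print(next_expression(internal_arousal=0.8, participant_arousal=0.3))
```

In the installation itself, Adam B’s expressions emerge from a trained facial model and his hidden narrative logic; the sketch only illustrates the weighting idea.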

 

Above: Janet H. Murray meeting our Adam B face-to-face at the ICIDS 2018 Art Expo, Science Gallery, Trinity College Dublin (Dec 5)


Image above: Enactive scenographer Tanja Bastamow testing the installation at ICIDS 2018, Science Gallery, Trinity College Dublin, a day before the opening of the Art Exhibition on Dec 5.

Team

Idea, concept, director: Pia Tikka; Script & dramaturgical supervision: Eeva R Tikka; Enactive character design and production pipeline design: Victor Pardinho; Enactive scenography: Tanja Bastamow; Sound design: Can Uzer; Sound design II: Iga Gerolin; Technical 3D artist: Maija Paavola; Symbiotic creativity: Ilkka Kosunen; Machine learning consultation: Paul Wagner; Unreal Engine consultation: Ats Kurvet.

Team funding

Finnish Cultural Foundation, Huhtamäki Fund; Virtual Cinema Lab, Aalto University School of ARTS; DigiDemo, Promotion Centre for Audiovisual Culture, with Oblomovies Oy; VR equipment by Creative Lab, the Center of Excellence in Media Innovation and Digital Culture, and Empatica4 biotracking by the Digital Technology Institute, Tallinn University; Tikka & Kosunen: EU Mobilitas Pluss Top Researcher Grant (2017-2022), Estonian Research Council in association with Tallinn University.

For more information, contact: piatikka@tlu.ee

The International Conference on Interactive Digital Storytelling (ICIDS 2018), 5-8 December 2018, Trinity College Dublin, Ireland. The State of Darkness VR installation premiered in the ICIDS 2018 Art Exhibition, a platform for artists to explore digital media for interactive storytelling from the perspective of a particular curatorial theme: Non-Human Narratives. See https://icids2018.scss.tcd.ie

Aalto University News

 

Neuroadaptive dance project “Trisolde”

TRISOLDE – Neuroadaptive Gesamtkunstwerk: The Biocybernetic Symbiosis of Tristan and Isolde

Exploring the final frontier of human-computer interaction with a neuroadaptive opera… performed by the audience, dancers and computational creativity.

The team of “TRISOLDE” (Tiina Ollesk, Simo Kruusement, Renee Nõmmik, Ilkka Kosunen, Hans-Günther Lock, Giovanni Albini) performed at the festival “IndepenDance” in Göteborg, November 29-30 and December 1, 2018.

A symbiotic dance version of Wagner’s “Tristan and Isolde” in which the dancers control the music via body movements and implicit psychophysiological signals. The work explores the next step in this coming-together of human and machine: the symbiotic interaction paradigm, where the computer automatically senses the cognitive and affective state of the user and adapts appropriately in real time. It brings together many exciting fields of research, from computational creativity to physiological computing. Measuring the audience and using the audience’s reactions to modulate the orchestra is a new way of doing “participatory theatre”, where the audience becomes part of the performance.
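To give a rough sense of the biocybernetic loop described above, here is a minimal Python sketch that maps a smoothed physiological reading to a musical control value. The choice of heart rate as the signal, the value ranges and the 0-127 control scale are assumptions for illustration, not the production’s actual pipeline.

```python
# Illustrative only: smooth a stream of heart-rate readings and map the
# result to a 0..127 control value that an audio engine could use to
# modulate tempo or intensity. Ranges and the mapping are assumptions.

def smoothed(readings, alpha=0.2):
    """Exponential moving average over a stream of sensor readings."""
    avg = readings[0]
    for value in readings:
        avg = alpha * value + (1 - alpha) * avg
        yield avg

def arousal_to_control(heart_rate_bpm, low=60.0, high=120.0):
    """Map heart rate (bpm) to a clamped 0..127 control value."""
    norm = (heart_rate_bpm - low) / (high - low)
    norm = max(0.0, min(1.0, norm))
    return int(round(norm * 127))

# Example: a rising heart rate gradually pushes the control value up.
for hr in smoothed([72, 75, 80, 95, 110]):
    print(arousal_to_control(hr))
```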

“Tristan and Isolde” is widely considered both one of the greatest operas of all time and the beginning of modernism in music, introducing techniques such as chromaticism, dissonance and even atonality. It has sometimes been described as a “symphony with words”; the opera lacks major stage action, large choruses or a wide range of characters. Most of the “action” happens inside the heads of Tristan and Isolde. This provides amazing possibilities for a biocybernetic system: in this case, Tristan and Isolde communicate both explicitly (through the movement of the dancers) and implicitly, via the measured psychophysiological signals.

Dance artists: Tiina Ollesk, Simo Kruusement

Choreographer-director: Renee Nõmmik

Dramaturgy and science of biocybernetic symbiosis: Ilkka Kosunen

Composers for interactive audio media: Giovanni Albini, Hans-Gunter Lock

Video interaction: Valentin Siltsenko

Duration: 40’

This performance is supported by: the Cultural Endowment of Estonia, and the Enactive Virtuality Lab and Digital Technology Institute (biosensors), Tallinn University.

Presentation of the project: November 29-30 and December 1, 2018, at 3:e Våningen, Göteborg (Sweden), at the festival IndepenDance. The event is dedicated to the centenary of the Republic of Estonia and supported by the programme “Estonia100-EV100”.

PREMIERE IN TALLINN, FEBRUARY 2019 (see more at Fine 5 Theater)

 

Neurocinematics @ the Worlding the Brain Conference, Aarhus University

The Enactive Virtuality Lab presented its collaborative research with the Brain and Mind Lab of the Aalto University School of Science at the Worlding the Brain Conference at Aarhus University, Nov 27-29.

Image: The son (Juha Hippi) confronting his father (Vesa Wallgren). The short film The Queen (Kuningatar) was directed by Pia Tikka; production Aalto University in collaboration with Oblomovies Oy, 2013.

 


TITLE: Narrative priming of moral judgments in film viewing

Authors: Pia Tikka, Jenni Hannukainen, Tommi Himberg, and Mikko Sams

How does narrative priming influence the moral judgements of film viewers? In two studies, we focus on the evaluation of the rightness of the characters’ perceived actions and the acceptability of these actions, in relation to the viewers’ experience of sympathy and filmic tension.
Providing viewers with additional narrative information beforehand is an effective method of manipulating how they perceive and make sense of the film narrative. Our experimental data are collected from two different studies, one behavioral and one psychophysiological. In both experimental settings, two groups receive additional background information on either the male or the female character, while a third control group is not primed. All subjects view the same 25-minute drama film and reply to post-viewing questionnaires online.
Based on the data collected in the first experiment, using parallel mixed-methods analysis, we showed that narrative priming itself does not increase the spectrum of moral judgment statements or the acceptance of the characters’ wrongdoings; a more influential factor seems to be the type of action and its relation to generally accepted moral norms. Yet narrative priming increased the explanatory spectrum of the subjects, which showed, to some extent, a trend toward accepting or trying to understand actions that embody socio-emotionally complex situations. In the second, currently ongoing psychophysiological study (HR, EDA, EEG), we expect the explanatory spectrum collected via online questionnaires to correlate with the results of the first behavioral study. However, we also expect to show more priming-dependent and spatio-temporally film-event-dependent differences in arousal between the groups, indicating the influence of priming on the unconscious emotional and cognitive processes related to moral judgements.

Embodying Creative Expertise in Virtual Reality Zurich ZhDK May 29-31

In collaboration with BeAnotherLab (The Machine to Be Another), Lynda Joy Gerry taught the workshop “Embodying Creative Expertise in Virtual Reality” to Master’s in Interaction Design students at Zürcher Hochschule der Künste (ZhDK), as part of the course “Ecological perception, embodiment, and behavioral change in immersive design”, led by BAL members.

Image: Poster for the students’ final project presentation and exhibition.

Lynda specifically taught students design approaches using a semi-transparent video overlay of another person’s first-person, embodied experience, as in First-Person Squared. The focus of the workshop was on Leap Motion data tracking and measurement, specifically how to calculate compatibility and interpersonal motor coordination through a match score between two participants, and how to send this data over a network. The system provides motor feedback for imitative gestures that are similar in form and position, and also for gestures that occur synchronously (at the same time), ideally supporting both types of interpersonal motor coordination. Lynda taught students the equations and data inputs needed to compute the different match scores, and how to add interaction effects driven by this data. She also showed students how to implement Leap Motion hand tracking on top of stereoscopic point-of-view video and how to record user hand movements. On the 31st, students premiered their final projects at an event entitled “Scattered Senses”.
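As a rough illustration of such a match score (the workshop’s actual equations and Leap Motion data handling are not reproduced here), the Python sketch below combines frame-by-frame positional similarity with a small-lag synchrony term; the frame format, distance scale, weights and tolerances are all assumptions.

```python
# Hypothetical match score between two participants' palm-position streams:
# a positional-similarity term (imitation) plus a small-lag synchrony term.
# Frame format, distance scale and weights are assumptions, not the workshop's.

import math

def dist(p, q):
    """Euclidean distance between two 3D palm positions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def similarity(p, q, max_dist=300.0):
    """1.0 for identical positions, falling to 0.0 at max_dist (e.g. millimetres)."""
    return 1.0 - min(dist(p, q), max_dist) / max_dist

def match_score(frames_a, frames_b, lag=2, w_pos=0.6, w_sync=0.4):
    """frames_a/frames_b: equally sampled lists of (x, y, z) palm positions.
    Returns 0..1; higher means more imitative and more synchronous movement."""
    n = min(len(frames_a), len(frames_b))
    if n == 0:
        return 0.0
    pos = sum(similarity(frames_a[i], frames_b[i]) for i in range(n)) / n
    sync = sum(
        max(similarity(frames_a[i], frames_b[i + d])
            for d in range(-lag, lag + 1) if 0 <= i + d < n)
        for i in range(n)
    ) / n
    return w_pos * pos + w_sync * sync

# Example with two short, nearly identical movement traces.
a = [(0, 100, 0), (10, 110, 0), (20, 120, 0)]
b = [(0, 102, 0), (12, 111, 0), (22, 119, 0)]
print(round(match_score(a, b), 3))
```

The lag tolerance lets near-simultaneous gestures still count as synchronous, which is the distinction the workshop drew between imitation and synchrony.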

Mimic Yourself: Mo-cap Workshop Zurich ZhDK May 30 

On May 30th, Lynda Joy Gerry visited the Innovation Lab at Zürcher Hochschule der Künste (ZhDK) for a workshop entitled “Mimic Yourself.”

This workshop involved collaborations between psychologists, motion-tracking and capture experts, and theater performers. The performers wore the Perception Neuron motion capture suit within an OptiTrack system, and the data from the performers’ motion was tracked onto virtual avatars in real time. Specifically, the team had used the Structure Sensor depth-field camera to create photogrammetry scans of lab members; these scans were then used as the avatar “characters” in the virtual environment onto which the mocap actors’ movements were tracked. A screen was also programmed into the Unity environment so that it could move around the real world at different angles and on different three-dimensional planes, showing different views and perspectives of the virtual avatar being tracked relative to the human actor’s movements. Two actors playfully danced and moved about while triggering virtual effects with their tracked motion, specifically animating virtual avatars but also cueing different sound effects and experiences.
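For readers curious about the underlying mechanics, here is a minimal, hypothetical sketch of the real-time retargeting step: each frame, joint rotations arriving from a mocap stream are copied onto identically named joints of the scanned avatar’s skeleton. The joint names, data format and streaming source are assumptions; the actual setup used Perception Neuron and OptiTrack data inside Unity.

```python
# Hypothetical sketch: per-frame retargeting of streamed mocap joint rotations
# onto a scanned avatar's skeleton. Assumes both skeletons share joint names
# and comparable proportions; names and data format are invented.

from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    rotation: tuple  # quaternion (w, x, y, z)

def retarget(mocap_frame, avatar_joints):
    """Apply each streamed joint rotation to the avatar joint of the same name."""
    by_name = {joint.name: joint for joint in avatar_joints}
    for joint_name, rotation in mocap_frame.items():
        if joint_name in by_name:
            by_name[joint_name].rotation = rotation
    return avatar_joints

# Example frame for two joints of a hypothetical skeleton.
avatar = [Joint("Hips", (1, 0, 0, 0)), Joint("LeftArm", (1, 0, 0, 0))]
frame = {"Hips": (0.99, 0.14, 0, 0), "LeftArm": (0.71, 0.71, 0, 0)}
print(retarget(frame, avatar))
```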

Image above: Motion capture body suit worn by a human actor and tracked onto a virtual avatar. Multiple avatar “snapshots” can be taken to create visual effects and pictures. Images below: Creating a many-armed Shakti pose with avatar screen captures created through mocap.

The image above shows examples of photogrammetry scans taken with the Structure Sensor.

Aalto MA evaluation on Robots & Poetics

Examination of Johanna Lehto’s MA thesis “Robots and Poetics – Using narrative elements in human-robot interaction” for the Department of Media, New Media Design and Production programme, on 16 May 2018.

As a writer and a designer, Johanna Lehto sets out to reflect on the phenomenon of human-robot interaction through her own artistic work. To illustrate the plot structure and narrative units of the interaction between a robot and a human, she draws on Aristotle’s dramatic principles, applying Aristotelian drama structure to analyse a human-robot encounter as a dramatic event. Johanna made an interactive video installation presenting an AI character, Vega 2.0 (image). The installation was exhibited in Tokyo at the Hakoniwa exhibition on 22-24 June 2017 and at the Musashino Art University Open Campus festival on 10-11 June 2017.