Multisensory Experiences: Where the Senses Meet Technology (LCCN 2020932770, ISBN 9780198849629)

Most of our everyday life experiences are multisensory in nature; that is, they consist of what we see, hear, feel, taste, and smell.


English, 113 pages, 2020


Table of contents:
Cover
MULTISENSORY EXPERIENCES
Copyright
Dedication
CONTENTS
FOREWORD
PREFACE
ACKNOWLEDGEMENTS
1 The Exciting World of Multisensory Experiences
2 Fundamentals of Multisensory Experiences
3 The Human Senses Meet Technology
4 Beyond the Known and into the Unknown Future
5 Laws of Multisensory Experiences
APPENDIX: xSense Design Cards for Multisensory Experiences
REFERENCES


Multisensory Experiences

“A must-read for researchers and practitioners learning about multisensory experiences and creating them artificially. The authors provide a solid framework using real-life examples and connect a broad set of disciplines to dissect the complex ‘experience’ we live every day, and our expectations of the technologies that enable them. The book is engaging, thought-provoking, and easy to read, with ample examples resonating with readers, jolting their thinking, and resolving with satisfying closures. It covers the fundamentals of ‘what is an experience?’, its essential constituents, and the ‘why, what, when, how, who, and whom’ of the futuristic technologies enabling it.” Ali Israr, Research Scientist in Haptics, Facebook Reality Labs, USA

“It’s easy to overlook given our contemporary screen-based dependence on swiping and typing, but our experience of the world goes beyond sight and touch, to include smell, taste, and sound—an entire universe of rich perceptual experiences our senses create in concert with each other. Who better to serve as our guides to the opportunities and responsibilities that accompany multisensory design than Marianna Obrist and Carlos Velasco, with their very different backgrounds and shared sense of curiosity and playfulness.” Nicola Twilley, Co-host of Gastropod, USA

MULTISENSORY EXPERIENCES Where the Senses Meet Technology

Carlos Velasco Marianna Obrist


Great Clarendon Street, Oxford, OX2 6DP, United Kingdom

Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and in certain other countries.

© Carlos Velasco and Marianna Obrist 2020

The moral rights of the authors have been asserted. First Edition published in 2020. Impression: 1

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by licence or under terms agreed with the appropriate reprographics rights organization. Enquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above. You must not circulate this work in any other form and you must impose this same condition on any acquirer.

Published in the United States of America by Oxford University Press, 198 Madison Avenue, New York, NY 10016, United States of America.

British Library Cataloguing in Publication Data: Data available. Library of Congress Control Number: 2020932770. ISBN 978–0–19–884962–9. DOI: 10.1093/oso/9780198849629.001.0001

Printed and bound by CPI Group (UK) Ltd, Croydon, CR0 4YY

Links to third party websites are provided by Oxford in good faith and for information only. Oxford disclaims any responsibility for the materials contained in any third party website referenced in this work.

Dedicated to Carlos J. Velasco & Ana I. Pinzón, Sriram Subramanian, Josefine & Michael Obrist

CONTENTS

Foreword  ix
Preface  xiii
Acknowledgements  xvii
1. The Exciting World of Multisensory Experiences  1
2. Fundamentals of Multisensory Experiences  13
3. The Human Senses Meet Technology  29
4. Beyond the Known and into the Unknown Future  53
5. Laws of Multisensory Experiences  71
Appendix: xSense Design Cards for Multisensory Experiences  83
References  89

FOREWORD

In this unusual and wide-ranging work, Carlos Velasco and Marianna Obrist explore concepts we experience every day but don’t necessarily understand fully. We are familiar with the five senses (sight, sound, touch, taste, smell). Our neural systems transduce these physical phenomena into neural pulses that flood into our brains along many pathways and interact in ways that we don’t normally disarticulate. Interestingly, all these senses are translated into essentially similar neural signals, but these are processed in a complex and interconnected web of neurons producing what we call experience. It is well known that our brains adapt and respond to sensory input. These inputs stimulate the organs of the brain, which responds by reinforcing some paths, diminishing or even pruning others. Moreover, we now know that the brain remains plastic into adulthood. In some sense, there is a kind of eternal battle among the organs of the brain over neural territory. Mask inputs from one eye, for example, and the other eye begins to use neural territory once ‘assigned’ to the masked eye. Sound input is vital to learning to speak. Hearing sound causes the brain to adapt to understanding and generating it. A child born deaf may find it hard to generate speech for lack of the sensory adaptation the brain experiences simply from hearing sound (and speech). Even ‘silent’ thought produces neural signals that would otherwise cause us to voice our thoughts but for voluntary suppression. Indeed, mechanisms have been developed to detect neural signals that would normally cause muscle movement to produce speech, allowing a computer to estimate the sounds that would have been produced. This is probably as close to mind reading as we are ever going to get. When we speak of having a conversation with ourselves, we are literally correct.

The authors explore senses beyond the familiar five: temperature (thermoceptors) and pain (nociceptors), pressure, textures, shapes, and locating our body parts (proprioception). We know where our hands and feet are relative to the rest of our bodies. All of this adds up to a remarkable exploration of how the brain gathers, analyses, and interprets sensory signals, blending them together to produce the phenomenon of experience. When we recall experiences, we often regenerate the sensory experience. When I smell cigar smoke, I am transported back to my early childhood with my grandfather and mother during the Second World War. The olfactory memory regenerates sights, sounds, scenes, and other sensory phenomena. Our senses are linked, not only in real time, but also in recall.

We are learning more about our senses as our ability to measure phenomena improves. Taste is more than sweet, sour, bitter, salty, umami, and metallic, since it is also heavily combined by our brains with smell; for example, most of the taste of wine is olfactory rather than purely of the palate. We are also learning that there are neural circuits in the brain that literally function as spatial maps, allowing us to navigate to places we have been before. It is remarkable how quickly these maps can be constructed. It is my conjecture that these multisensory brain functions are deeply invested in another phenomenon that I will call binding. We remember or experience certain associations, some of which are very long term (e.g. our connection to our biological forebears and descendants) and some of which are very brief (e.g. my hotel room number, the parking spot in which I parked this morning). These bindings help us navigate a complex world, and sensory recall is a part of that process.

This book posits that it is possible to deliberately create multisensory experiences in the form of art and to explore how experience can be altered with subtle changes in sensory input. My son is a film editor who specializes in sound. In one experiment, he took the same video scene from a film he had shot, but played very different background music. The scene went from cheerful and uplifting to scary and threatening depending on the choices he made for background sound.
For me, this illustrated viscerally how experience is influenced by multimedia inputs. In one section of this book, the authors show a multisensory virtual reality ‘headset’ that turned the person wearing it into a kind of insane Darth Vader—I had to laugh. But the idea of artificially generating sensory experiences is not crazy. In Brave New World, Aldous Huxley posited ‘The Feelies’ as an extension of the cinema experience by adding haptic interfaces to the seats in the fictional theatre. Attempts to create olfactory experience have also been undertaken. I recall a ride at Disney World that involved a virtual visit to the ancient past when dinosaurs roamed the Earth. Puffs of volcanic and petrochemical smells wafted over the riders to evoke a sense of the early Earth. My dentist tells me that he makes use of touch, sight, and sound while doing his work in order to understand the condition of his patients’ teeth.

My wife wears two cochlear implants that artificially stimulate the auditory nerves in each of her cochleae. She lost her hearing at the age of three as a result of spinal meningitis that brought very high, life-threatening temperatures. The hair cells of the cochlea were destroyed, rendering her totally deaf. She remained that way for fifty years until she had her first cochlear implant in 1996. This then-experimental therapy used a speech processor about the size of today’s mobile phone. The device detected sound with a microphone, processed it to identify which frequencies were present and at what amplitudes, and then used that information to send signals to the cochlear implant, which transduced the incoming signals into electrical neural stimuli that the brain interpreted as sound. As she was post-lingually deaf, her brain already knew how to interpret these signals. Shortly after the speech processor was activated, she actually called me on the phone—for the first time in our then thirty years of marriage. Since that time, the technology has only gotten better, and with the most recent replacement of her implant and speech processor she has been able to enjoy complex music (think Bach), which earlier just sounded like a garbage disposal!

Understanding and appreciating multisensory experience can come in many forms. The famed chef José Andrés (whom I am privileged to call a friend) created a special restaurant he called the Mini-bar and later expanded with a new restaurant called Somni. In these restaurants, especially talented chefs produce tiny, bite-sized tapas that create unexpected multisensory effects. One dish is made with liquid nitrogen (I kid you not!) and after you pop it into your mouth, your outgoing nasal breath makes you look like a dragon! Another looks like a little palm-frond beach hut, but it is actually a deconstructed Caesar salad! Your eyes tell you one thing but your mouth and nose tell you otherwise. You will find many examples of these kinds of artificially generated multisensory experiences accounted for in the book.
There are stories about food consumption in zero gravity—how might that change the way in which we sense what we eat? Liquids definitely look different in those conditions! There is evidence that food tastes different depending on ambient sound and light, to say nothing of texture and appearance. I confess that I am not a blueberry fan because, for some reason, my brain objects to blue food. What about a purple potato? A blue tortilla? I remember doing an experiment in which I was blindfolded and told I was about to drink a glass of milk. It was actually grapefruit juice, and I spat it out thinking it was a glass of horribly spoiled milk! We have all likely experienced what the French call trompe l’oeil (‘fool the eye’) art: that which is actually flat looks three-dimensional. Thus, we might think of the deconstructed Caesar salad as a kind of trompe la langue and trompe l’oeil combined.

As I read this book, I also thought about another phenomenon, called synaesthesia, in which a person hears sound but also sees associated colours, or sees colours and hears sound. The cross-combining of the senses synthesizes experience in unusual ways. One can imagine deliberately trying to produce such effects artificially as an art form. When I listen to some music, I visualize shapes and colours to go with them—a kind of minor synaesthetic effect. I think you will find it an eye-opening experience to read what these authors have to say about the conflation of our senses produced by high-dimensional synthetic effects. There is nothing simple about our senses, and this book explains why.

Vinton G. Cerf

Vinton G. Cerf is Vice President and Chief Internet Evangelist for Google. He contributes to global policy development and the continued spread of the Internet. Widely known as one of the ‘Fathers of the Internet’, Cerf is the co-designer of the TCP/IP protocols and the architecture of the Internet. In 2004, Cerf was the recipient of the ACM Alan M. Turing Award (sometimes called the ‘Nobel Prize of Computer Science’). He has served in executive positions at MCI, the Corporation for National Research Initiatives, and the Defense Advanced Research Projects Agency, and on the faculty of Stanford University. His personal interests include fine wine, gourmet cooking, and science fiction.

PREFACE

Writing this book has been an experience—a very memorable one. But what makes it an experience, and why should you care?

This book aims to make the concept of multisensory experiences—those designed with the senses (e.g. sight, hearing, touch, taste, smell) in mind and which can be enabled through existing and emerging technologies—tangible and available to everyone. We hope that Multisensory Experiences empowers you to shape your own and other people’s experiences by considering the multisensory worlds in which we live. These worlds are increasingly transformed through technological advancements like novel multisensory devices and interfaces. Such multisensory technologies not only stimulate our eyes (think of screens) and ears (audio systems), but also consider how and what we touch, smell, and taste, among others, in our lives. Technology enables the creation of multisensory experiences that enrich and augment the way we interact with the world around us. This book takes you through a journey that goes from the fundamentals of multisensory experiences, through the relationship between the senses and technology, to what the future of those experiences may look like, and our responsibility in that future.

We first met in April 2013, when Carlos was doing his DPhil in Experimental Psychology at the University of Oxford and Marianna was a Marie Curie Fellow at the School of Computing at Newcastle University. We started discussing a study on taste experiences that could inform the design of future human–computer interfaces. This first encounter marked the beginning of a shared exploration of multisensory experiences enabled and transformed through technology. We expanded our work to other projects that included touch experiences, and from there to projects with other senses.

As part of this shared curiosity about what technology could enable us to do in the study and design of multisensory experiences, Carlos became the first person to join Marianna’s newly founded laboratory, the Sussex Computer Human Interaction Lab—in short, the SCHI (pronounced ‘sky’) Lab—at the School of Engineering and Informatics at the University of Sussex. We worked together for three months; Carlos then left the SCHI Lab and went to Malaysia (Imagineering Institute) and Singapore (Nanyang Technological University) to work on a number of research and industry projects on multisensory experiences. This event was playfully named by the lab’s members ‘the moment of the fallen angel’. After Carlos’ departure, Marianna continued setting up her lab and created an interdisciplinary team of researchers to work on touch, taste, and smell experiences for interactive technologies. Despite the geographical distance, we never lost touch, and thus our collaboration and shared passion for multisensory experiences continued and became even stronger over the years.

This book attempts to capture and share our curiosity for the senses, multisensory experiences, and multisensory technology with you, the reader.
It provides a general overview of multisensory experiences, their scope, and possible future scenarios for them. In the five chapters, we create a playground for inspiration and reflection on the known and unknown elements of multisensory experiences. In doing so, we bring two different perspectives and stories into the narrative of this book. Each of us grew up in a very distinct environment, and we have different disciplinary backgrounds. Carlos, whose background is in Experimental Psychology, was born in Bogotá, the capital city of Colombia, with a population of over seven million people. Marianna, on the other hand, grew up in Latzfons, a small municipality in South Tyrol in northern Italy with a population of about five thousand people, and her background is in Human–Computer Interaction. Although we have wandered along different personal and professional paths, we have shared most enthusiastically our appreciation, fascination, and excitement for multisensory experiences. After all those years of research and reflection together, we felt it was time to share our impressions and insights on multisensory experiences with a wider audience, going beyond academia. This is our aim with this book.

Chapter 1 presents the exciting world of multisensory experiences. In it, we describe how we sense and perceive the world around us with all of our senses. We depict a number of initiatives in academia and industry that consider the different senses and emerging technologies when designing experiences such as eating, flying, and going to an art gallery. We show how multisensory technologies stimulate our senses beyond what we typically see and hear and bring the physical and digital worlds closer together.

Chapter 2 describes the fundamentals of multisensory experiences. Here, we address questions like ‘What is an experience?’ We ask what role the senses play in our experiences, and whether it is possible to design specific experiences with the senses in mind. These questions have fascinated generations of philosophers, scientists, artists, and technologists. We present our own definition of multisensory experiences, building on the opinions of experts in different fields and industries, as well as our own take on the topic. Chapter 2 also provides key concepts about the senses and how to consider them when designing experiences. Traditionally, scientists have focused on studying each of the senses independently, although our senses constantly interact with one another while we perceive the world around us.

Multisensory experiences are shaped by the evolving technological advances that we witness every day. We have all forgotten our smartphone at home, and experienced moments when the Internet connection did not work or our computer refused to turn on.
And yet, we crave new ways of interacting with and through technology. Chapter 3 illustrates how the senses are increasingly meeting technology and how our experiences are moving from reality, to mixed reality (involving both physical and digital worlds), and ultimately to full virtuality (digital only). Chapter 3 presents a series of examples where technology enables the design of multisensory experiences along the reality–virtuality continuum.

As technology advances, and our understanding of the human senses grows, we will not only be able to enhance existing experiences, but also to create previously unimaginable ones. For example, we are just starting to understand how our senses change when we are in outer space. We, and several other researchers, are developing technology-enabled multisensory experiences that consider these changes to facilitate future space travel. Chapter 4 presents the scope for multisensory experiences in the years to come. It merges science and fiction and discusses the possibilities around designing multisensory experiences, as well as the challenges that might arise from them.

Finally, Chapter 5 synthesizes the current state of multisensory experiences and reflects on their implications for individuals and societies. In it, we postulate our three laws of multisensory experiences, which are based on reflections about who designs multisensory experiences, what experiences we design, why, when, and how we do it, and for whom we may or may not do it.

We hope you will enjoy reading this book as much as we enjoyed writing it!

Carlos and Marianna

ACKNOWLEDGEMENTS

Multisensory Experiences was possible thanks to the support of our families, friends, and colleagues. We start by expressing our gratitude to BI Norwegian Business School and the University of Sussex for their unparalleled support in our research and work. We are also grateful for the support from our national and European funding bodies, especially the European Research Council. Additionally, without the following people, this book would not have been possible.

Carlos thanks Carlos Jaime Velasco, Ana Illonka Pinzón, Diana Velasco, Cristina Velasco, and Juno, for their intellectual curiosity, constant encouragement, and support. Carlos also thanks ‘El Parche’ (Santiago Garcia-Herreros, Francisco Gómez, Eduardo Ordoñez, Maria Jose Pardo, Eduardo Perez, and Alejandro Plata) for all the conversations, discussions, and insights about human experiences. Likewise, Carlos thanks Charles Spence, his PhD supervisor and friend, for being an inspiration, an infinite source of encouragement, and for all the immersive conversations about multisensory experiences. Carlos also thanks Lawrence E. Marks, Andy T. Woods, Anton Nijholt, Olivia Petit, Alejandro Salgado, Charles Michel, and all of Asahi Breweries’ Research and Development team, for the many discussions and inspiration about multisensory experiences.

Marianna thanks Sriram Subramanian for being her better half and a constant reminder that she can go from good to great, and that nothing is impossible. Marianna also thanks her parents, Josefine and Michael Obrist, for their continuous support and belief in her since leaving their little farm in the Italian Alps, and her sister Elisabeth Obrist for her support in all she does. Marianna also thanks the original MO team at the University of Salzburg, who shaped her first years in academia and inspired her to venture out to Newcastle, where the Culture Lab members supported her initial studies on taste and smell, and her SCHI Lab team, who truly inspired and enabled many of the multisensory technology and research examples presented in this book. Special thanks go to Emanuela Maggioni and Patricia Ivette Cornelio Martinez for their encouragement and support. Marianna also thanks Stefanie Mueller, Pattie Maes, and Maggie Coblentz for inspiring conversations on food 3D printing, wearable olfactory systems, and food design for outer space.

We both thank Anil Seth, Ann-Sophie Barwich, Olivia Petit, Elizabeth Churchill, and Pablo Naranjo for helping us shape Chapter 2 with their expert views on experiences and the role of the senses in them. We know how busy they are, and we truly appreciate their time and input. We are also grateful to several people who provided us with thoughtful and constructive comments and suggestions at different stages and parts of the book. We thank especially Carlos’ mother Ana (all chapters), Florian Foerster (all chapters), Andreas Keller (book proposal), and Roberto Trotta (Chapter 3) for their comments and suggestions during the process of writing this book. We also express our gratitude to Ticha Sethapakdi, who did an amazing job in developing the cover and chapter illustrations. She worked with us closely and captured the ideas of each chapter brilliantly. All her illustrations truly enrich the experience of the book. A big thank you goes to Vinton G. Cerf for being an inspiration, not only for the computing community but also for the wider audience, and for writing the foreword of our book.

Lastly, we thank all the people who have inspired and supported us throughout our professional and personal lives. Even if we cannot list everyone here, please know that we are very grateful to every single one of you for being part of our lives.

CHAPTER 1

Illustration 1  Multisensory experiences involve all of our senses (e.g. sight, hearing, touch, taste, smell). Our senses do not work independently but instead, they communicate with one another when we perceive the world around us. By considering our multisensory worlds and their sensory elements, we can transform existing, and design novel, experiences.

Multisensory Experiences: Where the Senses Meet Technology. Carlos Velasco and Marianna Obrist Oxford University Press (2020). © Carlos Velasco and Marianna Obrist. DOI: 10.1093/oso/9780198849629.003.0001

The Exciting World of Multisensory Experiences

Men who are lovers of wisdom must be inquirers into many things indeed. – Heraclitus

Do you remember the first time you touched a pebble, ate a strawberry, or smelled a jasmine flower? You may not remember these specific first encounters, but it is likely that you recall the sensations of touching, tasting, and smelling those things, or at least a general sense of what the experience felt like. And perhaps, if the experience was particularly special (your first time at the beach) or dramatic (there was a vast thunderstorm at the beach), you may have a stronger recollection or impression of the moment. Those instances define us when it comes to the exploration of our surroundings, the world at large, and also how we experience future encounters and events. But how do experiences form?

1.1  How Do We Experience the World Around Us?

We perceive the world around us with all of our senses, that is, through what we see, hear, touch, taste, and smell (and more, as we discuss in Chapter 2). The senses, in turn, contribute to form the countless experiences that we have in our lives. We humans are equipped with multiple sensory channels, which, at any given moment, allow us to detect and process different kinds of information to form specific impressions about ourselves, others, and the world around us. Notably, most of our daily life experiences are multisensory in nature; that is, they involve many, if not all, of our senses. Furthermore, our senses have evolved to work symbiotically while we sense and perceive the world—i.e. they do not work independently.

Multisensory experiences are a central part of our everyday lives. However, we often tend to take them for granted, at least when our different senses function normally (say, when we have normal sight functioning) or are corrected-to-normal (when we use glasses). Yet closer inspection of any, even the most mundane, experience reveals the remarkable multisensory world in which we live. Consider the experience of eating a regular meal. At first, it may seem like an ordinary experience, although it is actually a fusion of the senses.1 We first eat with our eyes (Figure 1.1), but we are also exposed to countless sensory signals that influence our eating experience, such as food textures, tastes, and smells. And it does not stop there. Even the sounds that come both from the atmosphere in which we are immersed while eating and from our interactions with the food

Figure 1.1  The image on the left is a salad developed by Chef Charles Michel, which was inspired by Wassily Kandinsky’s Painting #2012, and used in a study to demonstrate that our eating experiences are influenced by what we see, that is, we eat first with our eyes. Figure reprinted from ‘Plating manifesto (II): the art and science of plating’ by Spence, Piqueras-Fiszman, Michel, and Deroy (2014)/CC BY 4. The image on the right showcases ‘TastyFloats’3 a contactless food delivery system that exploits principles of acoustic levitation to float pieces of liquid and solid foods in mid-air and creates futuristic tasting experiences that can also change our eating experiences.

(such as chewing or slurping) and the utensils we use to eat, can influence our eating experience. Not only the sensory ‘ingredients’ or cues specific to the food (e.g. tastes, smells), but also what happens at more or less the same time that we eat (e.g. background music), can be carefully considered in order to craft a given eating experience. In fact, in this book we suggest that crafting experiences, by carefully considering the senses, is something that can be done with any human experience. Importantly, current technological developments allow us to go further and better control sensory stimulation and delivery (the what and how of presenting specific sensory signals, e.g. light, sounds, smells) to attain the experiences that we want to create. We are at the dawn of a new era of innovation where technology meets the senses (multisensory technologies), which enables the creation of previously unthinkable experiences. For example, many people now control both lighting settings and sound delivery (radio, music) in their own homes through smart devices like Google Home or Alexa. Other, perhaps more experimental, initiatives at this stage include ongoing work on multisensory virtual reality (VR) and augmented reality (AR) devices. These devices not only involve graphics and sounds, but are also starting to integrate smell and touch inputs through the VR headset or other devices. Those technologies allow us to recreate contexts and situations that we know, but also to create completely new experiences that we have never had before, such as eating levitating food like Princess Amidala from Star Wars when she dines with Jedi Anakin Skywalker (Figure 1.1).3 What would it feel like to eat a levitating strawberry for the first time? More than ever before, by carefully considering different senses and their possible interrelations, it may be possible to design and shape specific human experiences.
The last few decades have seen an explosion of research on the role of the human senses in the formation and development of our experiences, as well as on novel multisensory technologies. Those technologies are increasingly becoming an extension of us, providing us with new and enhanced experiences. Our growing understanding of the senses and multisensory technologies, as well as the increased symbiosis between humans and technology, have both facilitated the development of the concept of multisensory experiences (sometimes also referred to as multisensory experience design). This concept is based on the idea that we are not just passive receivers present in, but instead can be active creators of, the multisensory worlds in which we live. In other words, we can carefully consider the different senses, the way in which they work together, and the available multisensory technologies to shape our experiences. We can explore sensory arrangements that can change the way we perceive and interact with our environment and ultimately form new experiences.

The concept of multisensory experiences is clearly distinct from experiences that naturally occur in our everyday lives, such as taking a walk in the woods or smelling a flower in the field. While such experiences are multisensory in nature, because they involve different sensory cues (the breeze, the colours, the textures, the smells), they are not multisensory experiences as we understand them in this book. For us, a multisensory experience is carefully designed by someone, like a walk in the woods that has been designed by a landscape architect in order to evoke specific impressions of the walk. Hence, when we talk about multisensory experiences, the ‘design’ part is implied. For this reason, we use the concepts of multisensory experiences and multisensory experience design interchangeably.

1.2  Multisensory Experiences in Academia and Industry

Multisensory experiences have found their way into both academia and industry. In the academic world, researchers from fields as diverse as computer science, engineering (especially human–computer interaction, HCI), psychology, marketing, and the arts have become increasingly interested in understanding, explaining, and creating guidelines for multisensory experiences.

Within the computing and engineering community, researchers are working towards developing an understanding of the human senses as means for interaction. For example, researchers are asking how we can use tactile and olfactory cues in a VR environment to create more realistic and immersive experiences. By answering this and other questions, and by developing new multisensory technologies, people may be able not only to see, e.g., a flower in a virtual world, but also to touch it and smell it. As we discuss later, new haptic (stimulating our sense of touch) and smell technologies increasingly enable the creation of those experiences.

In marketing, researchers are trying to understand how consumers perceive and interact with brands and brand touchpoints (e.g. packaging, ads), which are multisensory devices capable of transforming consumer experiences. Different senses are considered in order to facilitate the formation of specific impressions, judgments, and behaviours associated with brands. For instance, consumers’ judgment about the quality or taste of a drink can be formed based on the sound properties associated with its packaging (e.g. opening and pouring sounds). Sounds are diagnostic cues, and as such can guide the way in which we evaluate brands.4

Multisensory experiences have also become a key aspect, or even the centre, of many initiatives in industry (e.g. airlines, restaurants). For example, inspired by research on the idea that sounds can influence our eating experiences, Finnair, Finland’s largest airline, designs novel eating experiences for their flights (an initiative called ‘Hear the Taste’). In particular, in 2017, Finnair chef Steven Liu created three dish + sound food experiences (see a video about it here: https://www.youtube.com/watch?v=roSjcStVB2c). The sounds were based on field recordings from different Nordic countries (e.g. streams, crackling tree branches, wind) that included properties that would enhance the perception of the food. For instance, sweet corn and chicken soup with coriander oil was featured with bubbling sounds. Their suggestion was that this would boost freshness, and that harmonic frequencies would emphasize the sweet taste of the dish.

The initiatives in industry are, in many cases, based on intuition and perhaps creative exploration and experimentation. However, there is also a growing collaboration between artists, scientists, technologists, marketers, chefs, and other practitioners that aims to capitalise on our growing scientific understanding of the human senses when designing experiences. For example, through a collaboration between the Centre for Multisensory Marketing at BI Norwegian Business School (Oslo, Norway) and Le15 Café (Mumbai, India), chef Pablo Naranjo and one of us (Carlos) developed a multicourse dining experience called ‘Awaken Your Senses’.
This multisensory culinary experience was created to illustrate to the diners the role of the senses in eating experiences (see https://www.bi.edu/research/business-review/articles/2019/06/what-can-marketers-learn-from-chefs/). One of the dishes, ‘Synaesthesia’,* consisted of oak-smoked salmon ceviche, served with a spicy tiger’s milk made with home-made confit tomatoes, ginger, coconut milk, lime, and celery salt, topped with puffed flat rice and coriander. This dish, however, had a particular ingredient: music. Before the dish was served, saxophone musician Ryan Sadri tasted the dish and was asked to ‘play the sound of the dish’ on his saxophone. He improvised a piece that represented the textures, aromas, and taste of the dish (https://soundcloud.com/carlos-velasco-4/salmon-ceviche-by-ryan-sadri). When the dish was presented to the diners in ‘Awaken Your Senses’, it was accompanied by this music. Overall, the diners reported that the dish and music blended well together and enjoyed the flavour experience, seasoned by the music.

*  Synaesthesia is a condition present in around four to five percent of the population. It occurs when a stimulation of one sense leads to experiences in a second sense.5 For instance, some synaesthetes might hear music when tasting foods, or see colours when hearing music.

1.3  On How Multisensory Experiences Can Transform the Way We Experience Art

Beyond eating experiences, we can now also see museums and art galleries experimenting with the concept of multisensory experiences to engage their audiences in new ways. For instance, artists may ask questions such as: Can we augment a painting’s feel through textures, smells, and even tastes? How can the addition of sensory cues transform the way we experience a given art form?

An example project that aimed to address those questions is ‘Tate Sensorium’, a multisensory art exhibition at Tate Britain in London, England, that we co-created with a multidisciplinary team of researchers and practitioners. We designed the exhibition so that museum visitors could experience art through sight, hearing, touch, taste, and smell.6 Specific sensory elements (sounds, tactile sensations, smells, and even foods) were designed to augment the experience of four different paintings from the Tate collection (Figure 1.2). The intended multisensory experience was carefully crafted by the team based on the artists’ known intentions (artwork descriptions and documentation) and the advice of the art curator at the gallery. Each of the visitors was taken on a multisensory art journey through each of the paintings, described next.

Figure 1.2  Four paintings and their corresponding multisensory components (described below) used in Tate Sensorium (from left to right): Interior II by Richard Hamilton, Full Stop by John Latham, In the Hold by David Bomberg, and Figure in a Landscape by Francis Bacon. Photo: Tate.

1. Interior II (1964) by Richard Hamilton. The experience designed for this painting integrated smells and sounds. The sounds were presented using four speakers, one in each corner of the room, to create quadraphonic sound (digital surround sound). Smells were delivered using three Olfactive Spirit Pro perfume diffusers (http://www.signatureolfactive.com/), which were placed on the side walls of the room. Each of the scent diffusers delivered a specific smell: (a) the smell of the late 1940s (spicy carnation fragrances), which fits the look of the woman in the painting; (b) the solvent and glue aroma related to the materiality of the work shown at the back of the room in the picture; and (c) the smell of cleaning products related to the construction of the interior/parquet surfaces.

2. Full Stop (1961) by John Latham. Visitors experienced this painting together with sound and tactile sensations. For the latter, an ultrasonic mid-air haptic device was used to create tactile sensations on people’s hands.7 This is a novel technology that uses the mechanical properties of sound waves to create a pressure point in mid-air, without physical contact. It is also referred to as contactless haptic technology, as no attachments such as controllers or gloves are required to perceive the tactile stimulation, which is sometimes described as a feeling of dry rain on the hand. The soundscape, created by the sound designer, emphasized the interplay between the positive and negative space of the artwork, especially the painting’s duality of black and white. This was further highlighted and synchronized with the tactile sensation, designed as a combination of a circle-shaped pattern, mirroring the roundness of the painting, that changed in size and scale, and a rain-like pattern, which referenced artist John Latham’s use of spray paint.

3. In the Hold (c. 1913–14) by David Bomberg. This painting was experienced together with smells and sounds.
The sound was presented using four directional speakers. The sound stimuli were designed to bring the listener toward the painting, through two planes of sound. One plane addressed the geometry of the painting (David Bomberg’s quest for ‘pure form’), with acute angles and jagged sounds. The second plane explored the subject matter of the ‘hold.’ The smell stimuli had a similar function, with two scents integrated in two 3D-printed objects: one scent aimed to be shrill, to bring out the blue in the painting, while the other was a diesel and tobacco blend. Both scents were presented at low concentrations and were paired with the sounds.

4. Figure in a Landscape (1945) by Francis Bacon. Visitors experienced Francis Bacon’s painting with taste, smell, and sound stimuli. The sound was presented to the visitors via headphones. The taste stimulus was delivered in the form of a chocolate (praline) on a plinth, in a bed of tiny chocolate bits that evoked soil. This taste depicted the painting’s dark, harsh nature and the wartime era with multiple ingredients, namely, charcoal, sea salt, cacao nibs, and smoky Lapsang souchong tea. It also aimed to reference the London park setting and flashes of colour with burnt orange. The accompanying smells aimed to convey a sense of Hyde Park’s smellscape of grass, soil, and earth, but also a horsey scent. The sound stimuli referenced the colour palette, visual texture, and the place depicted in the painting.

Overall, the exhibition generated extensive publicity within the UK and worldwide and won the 2016 Design Week award in the exhibition category. Tate Sensorium is just one example demonstrating the beginning of a new way of thinking about art, the senses, and experiences, that is, multisensory experiences in art galleries and museums. Where else, do you think, could the careful consideration and integration of multiple senses transform existing human experiences?

1.4  Old Meets New in the Context of Multisensory Experiences

This is a very exciting time for understanding and designing multisensory experiences, further enabled by new advances in multisensory technologies. Not only are new ideas for multisensory experiences blossoming, but old ideas are also becoming possible and have been revisited in recent years. For instance, innovators such as Walt Disney envisioned film going beyond silent images and created novel audio-visual formats such as the classic film Fantasia, which revolutionized the world of audio-visual animation. However, Disney’s idea was to venture beyond visuals and audio.8 This proved difficult to develop during Disney’s lifetime, given the technical challenges of sense-specific functioning, delivery, and control. For example, how does a filmmaker synchronize a smell with a film scene and avoid multiple scents lingering in the movie theatre? There are, of course, still challenges associated

with these formats, such as those related to synchronizing, say, vibrations with movie scenes, so that they become part of the experience and not just a one-time gimmick.9 However, it is now common in many cities across the globe to see 4D theatres, which integrate vibrations, smells, and other sensory inputs into the movie experience.

We are witnessing a revival of old ideas due to new knowledge, novel technical possibilities, and an overall growing appetite to think beyond audio-visual experience design (Figure 1.3). One such old idea that reverberates today is Smell-O-Vision, a film concept from the 1960s that integrated smell in films. Smell-O-Vision’s first film, Scent of Mystery, was advertised with the following sequence: ‘First they moved (1895)! Then they talked (1927)! Now they smell (1960)!’ However, from its beginning, this concept was technically challenging, and the lack of consideration of the specificities of the sense of smell resulted in its failure.8

The proliferation of sensory stimulation devices and the rising control over and accuracy of scent delivery are providing designers, developers, and innovators with new tools to reconsider the role of smell in films and other media formats, such as mixed realities (VR/AR environments). Time will tell whether or not renewed initiatives around the potential of smell in media experiences will capture people’s interest. However, it is worth

Figure 1.3  Sensorama, a machine from the 1960s and a contemporary of Smell-O-Vision, which allowed the presentation of immersive 3D films with stereo sound, aromas, and even wind. Source: http://i.imgur.com/CIjMvQg.jpg.

noting that both our understanding of the senses and the development of new multisensory technologies are leading us to rethink the way in which entertainment experiences are conceived and delivered. It is also worth mentioning that the concept of multisensory experiences does not stop at entertainment but, as discussed throughout the book, has entered other areas of our daily lives, including education, science communication, health, and transportation, among others.

The exciting world of multisensory experiences is only starting to flourish. We are witnessing an explosion of knowledge on, and technological advances in relation to, the senses. Academics and practitioners alike are taking their chances and stepping into this exciting world to gain even more knowledge of how our senses work, develop powerful new multisensory technologies, and thus open up opportunities for multisensory experiences. The next chapters dive deeper into the world of multisensory experiences. They introduce the fundamentals of multisensory experiences, illustrate how such experiences are increasingly enabled through technology, and envision future developments and yet-unexplored potential around these experiences in light of current and future technological advances.

CHAPTER 2

Illustration 2  Think about how we interact with the world around us. How do we experience a sunflower? Through all of our senses. We can see the flower, touch it, smell it, and even taste its sweetness while listening to the sounds that a light breeze creates around it. We are not only passive receivers of multisensory experiences, though. Instead, we can actually shape them.

Multisensory Experiences: Where the Senses Meet Technology. Carlos Velasco and Marianna Obrist Oxford University Press (2020). © Carlos Velasco and Marianna Obrist. DOI: 10.1093/oso/9780198849629.003.0002

Fundamentals of Multisensory Experiences

. . . what is essential in the sensuous-perceptible is not that which separates the senses from one another, but that which unites them; unites them among themselves; unites them with the entire (even with the non-sensuous) experience in ourselves; and with all the external world that there is to be experienced.
– Erich M. von Hornbostel

We all use the word ‘experience’ to describe a range of events in our lives. For example, we say ‘it was a great experience’ when referring to the last concert we went to or to a memorable meal we ate in a restaurant. We also use the word ‘experience’ to describe knowledge that we have acquired and to describe people who are particularly skilled at something, e.g. ‘I have experience running companies’ or ‘She has twenty-five years of experience in the financial sector’. While in the latter case people usually agree that experience is a sort of synonym for knowledge, they usually struggle to come up with a definition of the former. What is an experience? What is similar between that concert and that meal that leads us to call them ‘experiences’? And what is the role of the senses in our experiences? This chapter discusses those questions and also introduces our own understanding of multisensory experiences.

To define multisensory experiences (see the box below), we need to understand what experiences are but also what the role of the human senses is in forming

The experts appear to agree with the idea that we interface with the world around us through our senses; they are therefore a gateway for acquiring information both from ourselves (our bodies) and from the world around us (our environments). Our experiences are not only determined by the physical (e.g. light) and chemical (e.g. flavour) information our senses capture from the environment, but are also formed through the value or meaning that our brain gives to such information (how it captures our attention, how it makes us feel, and what it means to us), which is, at the same time, influenced by previous experiences we have had. In that sense, each of our experiences is unique in a way, and different from those that others have, as they arise from the interactions that we have with our environments and the impressions that remain.

Some of the researchers working on human perception suggest that our brain is constantly trying to decipher what our internal (e.g. emotions) and external worlds (e.g. a given event) may be like, by developing best guesses of the sources of sensory information present in the events we encounter.3 It appears that these guesses, or impressions, are based on both the sensory information that we are presented with in a given event and our expectations about the event, which are based on previous encounters with such sensory information. For example, if you look at Figure 2.1 for a few moments,4 what do you see? Initially you may see just a series of black and white patterns without much meaning.

Figure 2.1  Seemingly nonsensical black and white patterns.

However, after looking at Figure 2.2, the effects of knowledge on perception will change what you see when you look again at Figure 2.1. Do you now see a sunflower field?

Figure 2.2 A sunflower field, from which the nonsensical black and white patterns in the previous image were created.

These two images (Figures 2.1 and 2.2) demonstrate that experiences can also be formed through the process of contrasting the sensory inputs associated with an event (the nonsensical black and white patterns from the first image) with our expectations of the most likely state of such an event in the world (using information learned from the second image, and applying it to the first). This is just an example related to our visual world, but there are many more senses that supply information from which our experiences are formed (see other examples here: https://www.ted.com/talks/anil_seth_your_brain_hallucinates_your_conscious_reality/transcript?language=en).
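This idea of perception as ‘best guessing’ is often described in terms of Bayesian inference: the percept depends on both the current sensory evidence and prior expectations built from past experience. As a purely illustrative sketch (ours, not the authors’; the probabilities are invented for the example), a toy model shows how the same ambiguous evidence can yield very different percepts under different expectations:

```python
# Toy illustration of perception as inference via Bayes' rule:
# the percept depends on current sensory evidence (likelihoods)
# AND prior expectations built from past experience (the prior).
def posterior(prior_h, lik_given_h, lik_given_not_h):
    """P(hypothesis | evidence) for a binary hypothesis."""
    p_evidence = prior_h * lik_given_h + (1 - prior_h) * lik_given_not_h
    return prior_h * lik_given_h / p_evidence

# Same ambiguous evidence (the black and white patterns), two priors:
# before seeing Figure 2.2, 'sunflower field' is an unlikely guess...
naive = posterior(prior_h=0.05, lik_given_h=0.6, lik_given_not_h=0.4)
# ...after seeing Figure 2.2, the expectation is strong, and the guess flips.
primed = posterior(prior_h=0.9, lik_given_h=0.6, lik_given_not_h=0.4)
print(round(naive, 2), round(primed, 2))  # → 0.07 0.93
```

The sensory evidence is identical in both calls; only the prior (the viewer’s expectation) changes, which is enough to move the ‘sunflower field’ hypothesis from implausible to near-certain.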

2.3  The Senses, and How They Form Our Experiences

Since the fourth century BCE, the five major senses have been identified as sight, hearing, touch, taste, and smell (the so-called Aristotelian senses).

However, today the scientific community widely accepts that we have more than the five basic senses. For example, proprioception is the sense that allows us to keep track of the position and movement of our body. Interoception, a lesser-known sense, allows us to recognize internal body cues like heartbeats or respiration. Some scientists even suggest that we can sense Earth’s magnetic field.5

Closer to home, however, we may subjectively experience touch as a single sense—something that we perceive through our skin—via the somatosensory system. Touch involves a number of receptors and sensations. The human body has specialized receptors to process temperature (thermoreceptors), pressure (mechanoreceptors), and pain (nociceptors). Thus, our sense of touch alone is a very complex system that helps us distinguish, for example, textures, shapes, and different levels of warmth and discomfort (what we consider annoying or painful). This kind of fine granularity of sensations can be found in relation to all our five senses, and beyond.

Each of our senses contributes collaboratively to our experiences. In other words, our senses do not act independently of each other, although we may sometimes subjectively experience it as such. Instead, they work together to shape how we perceive the world around us. Flavour perception is a good example of how several of our senses interact to form our experiences.6 Although we tend to think of taste as how we sense what we eat and drink, the sense of taste only processes a subset of elements associated with certain receptors on the tongue. At the very least, these include sweet, sour, bitter, salty, and umami (and perhaps others such as metallic and fatty). But the richness of flavour perception—what we experience when we eat any given food—also involves the sense of smell (also referred to as olfaction).
When we eat, a ‘smell path’ occurs in the mouth (called retronasal olfaction), and our brain integrates taste and retronasal olfaction—and perhaps other sensations such as burning/spiciness and mouthfeel—when we perceive the flavours that we experience in foods. You only need to pinch your nose while eating to realise how part of the experience disappears, because you are partially removing the sense of smell from it. In fact, because the smells of foods are so important for flavour perception, as are the tactile sensations that occur in the mouth, we often experience an illusion called ‘oral referral’.7 While we think that flavours come from the mouth, much of what we perceive actually comes via retronasal olfaction. Notably, our expectations about what we eat and drink also play a role in the value that our brain assigns to a given food. Just think of the difference in experience if you expect a sweet food but it is actually bitter (e.g. lemon sorbet that looks like chocolate ice cream), compared to when you know in advance that it is bitter.

It is clear that the senses interact with each other as we perceive and experience the world around us. Our brains then merge information from different senses to create our experiences. To further illustrate this point, imagine that you are walking through a field of sunflowers on a warm summer day (Figure 2.3). You may hear the breeze drifting through the sunflowers, your hands may touch the petals, and you may even be able to smell the scent emerging from the flowers. The light and bright colours of the sunflowers might create a pleasant experience, and if you eat the seeds, you may even taste their salty, nutty, gentle flavour. All those sensations together will form an experience that, potentially, leaves you with a cohesive memory of what the sunflower or the field of sunflowers feels like, and allows you to remember this moment again in the future. Can you describe all the sensory details of any experience you particularly like?

Figure 2.3  A sketch of the multisensory world of a sunflower field. We represent some of the information associated with the five ‘traditional’ senses and how they create an impression that forms this multisensory experience. All sensory information makes its contribution to this formation process. Moreover, what we hear can influence what we see, what we see can influence what we smell, and so on. And not only that, but in many cases, what we perceive is more than just the sum of the different sensory elements.8

However, note that if you take the sunflower out of the described context and try to relive that moment in a busy street in an urban environment, the experience may feel dramatically different. One reason for this is that the brain merges the different streams of sensory information, i.e. the scent of the flower with the surrounding light, the shapes, the colours, etc. Our brains then contrast this sensory information with our expectations and existing knowledge, which then helps us form the experience of the sunflower or the sunflower field.

Several concepts help us analyse and explain how our senses interact with each other while we perceive the world around us. Table 2.1 shows six key concepts that illustrate why it is important not only to consider each sense individually, but also to think about the way in which they interact with each other to form multisensory experiences.9 As we explain later, these concepts can be used to guide the reflection and design process around multisensory experiences. Table 2.1 also presents the possible implications that these concepts may have when it comes to multisensory experiences: sub-additive, non-additive, and super-additive effects.

Let us consider again the example of the sunflower field in order to illustrate these concepts from Table 2.1 and their possible implications for multisensory experiences. Suppose that you want to recreate the experience of a sunflower, but you do not have the actual sunflower. Perhaps you would first collect a series of elements in order to represent the experience: colours, shapes, textures, scents, and so on. You may place them, for example, in a large room so that people experiencing your ‘sunflower’ walk into the room (spatial congruence) and find such colours, shapes, smells, and so on, representing the sunflower. As they get closer, they may feel the warmth coming in through the windows, and then, when close to the elements, they may smell your chosen scents (all in temporal synchrony).
Probably the colours you pick are yellow, green, and brown/black, which are colours closely related to sunflowers. Similarly, you would probably choose shapes that resemble the way a sunflower looks (semantic congruence). Those colours and shapes likely aid you in delivering a sunflower experience (super-additive effect). Of course, you could use a different colour, e.g. pink, but the effect would probably be sub-additive, i.e. the experience of the sunflower would not be as compelling; instead, it would be something else (perhaps a variant experience of a sunflower). It is important to note, however, that sometimes using incongruent elements may result in a positive surprise. For example, a restaurant known for its innovative dishes may serve, say, an egg-looking food which is actually an ice cream!10 In such a case, the semantic incongruence can potentially elicit positive surprise.

If we return to the sunflower example, we can consider adding the sound of the breeze in a sunflower field, which may not affect the sunflower experience significantly (non-additive), as it is not necessarily specific to the sunflower, nor fully needed to experience it. The latter may also be an example of sensory dominance, where sound is not as dominant for the experience of the sunflower as what you see, touch, and smell. Note, though, that trying to develop this experience in a space that is very different from the sunflower’s natural context, such as a busy street in a big city, may be quite difficult—your senses may be overloaded to the point that it becomes difficult for your brain to filter what is relevant to the experience of the sunflower (sensory overload).

The sunflower example is only one of many multisensory experiences we can design for. The same analysis and reflection using the six concepts in Table 2.1 can apply to things like eating an apple, watching a movie, or reading a book. All experiences have spatial and temporal elements, crossmodal correspondences, and semantic aspects to them, which in a way define how they are formed. Perhaps in some experiences some senses dominate more than others. For example, when you are reading, sight may be salient, or when tasting wine, both taste and smell may be essential. However, it is important to remember that all the senses have the potential to shape our experiences.

Table 2.1  Concepts that help analyse and explain how the senses interact with each other while we perceive the world around us,9 and their possible implications for multisensory experiences.

Temporal congruence. Whether or not two or more senses are presented with information at the same time. Example: when you watch a movie whose sound is dubbed (the lips of the actor and the sound do not match), the experience of the movie does not feel as right as when you view the original.

Spatial congruence. Whether or not information presented to two or more senses comes from the same place (source in space). Example: when you hear the bell of a bicycle, you would usually attribute the sound to the bike that is closest to you.

Semantic congruence. Whether information presented to two senses shares the same identity or meaning. Example: if you taste tomato-flavoured crisps, it is likely that you associate them with the colour red (or perhaps green), because both flavour and colour map onto the concept/meaning of ‘tomato’.

Crossmodal correspondences. The associations/compatibility of features (such as colour brightness, shape curvature, sound pitch) across the senses. Example: most people associate sweet tastes with rounder shapes and sour tastes with angular ones. There are some surprising connections between features across the senses.

Sensory dominance. In certain situations, one sense may dominate over the others. Example: if you buy a coffee maker, it is likely that, at first, what you see is the most important element in how you experience it. With time and use, though, what you hear (the sound of the machine) may become more important in how you experience the coffee machine.

Sensory overload. When ‘too much’ sensory information is presented to one or more senses, we may be overloaded, which can negatively affect our experience. Example: imagine visiting a place like Times Square in New York City for the first time. At first, it is difficult to make sense of the environment because of all the lights, sounds, smells, etc. It may therefore be difficult to relive the experience of a sunflower field in this context, at least relative to other contexts.

Possible implications for multisensory experiences: for the first four concepts, incongruence can lead to less-compelling experiences (sub-additive), congruence or incongruence may sometimes leave an experience neither more nor less compelling (non-additive), and alignment across these concepts can lead to more compelling experiences (super-additive). Sensory dominance implies that not all senses are equally ‘weighted’ in all our experiences. Sensory overload implies that ‘too much’ sensory information can have detrimental effects on the experience.
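For readers who like to tinker, the congruence concepts above can also be read as a design checklist. The following sketch is our own hypothetical illustration, not part of the book or its tools: the `SensoryElement` structure and the two scoring functions are invented here simply to show how semantic and temporal congruence could be checked over a set of designed cues.

```python
from dataclasses import dataclass

@dataclass
class SensoryElement:
    sense: str        # e.g. 'sight', 'smell', 'touch'
    feature: str      # e.g. 'yellow petals', 'floral scent'
    meaning: str      # the concept the cue evokes, e.g. 'sunflower'
    onset_s: float    # when the cue is presented, in seconds

def semantic_congruence(elements, intended):
    """Fraction of cues whose meaning matches the intended impression."""
    return sum(e.meaning == intended for e in elements) / len(elements)

def temporally_congruent(elements, window_s=1.0):
    """True if all cues arrive within one short time window."""
    onsets = [e.onset_s for e in elements]
    return max(onsets) - min(onsets) <= window_s

# A designed 'sunflower' event: congruent sight and smell cues in synchrony,
# plus one semantically incongruent cue (a pink light) that lowers the score.
event = [
    SensoryElement('sight', 'yellow petals', 'sunflower', 0.0),
    SensoryElement('smell', 'floral scent', 'sunflower', 0.5),
    SensoryElement('sight', 'pink light', 'other', 0.2),
]
print(round(semantic_congruence(event, 'sunflower'), 2))  # → 0.67
print(temporally_congruent(event))                        # → True
```

Swapping the pink light for a yellow one would raise the semantic score to 1.0, mirroring the super-additive intuition; delaying one cue by several seconds would break temporal congruence even though each cue, taken alone, fits the sunflower.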

2.4  What Are Multisensory Experiences?

At this point, we hope that you have a better idea of what experiences are, what role the senses play in them, how our senses collaborate with one another, and some of the concepts that we may use to analyse, reflect upon, and design multisensory experiences. Considering all the concepts discussed in this chapter, we now revisit our own definition of multisensory experiences, which we presented at the beginning of this chapter. By stating our viewpoint on multisensory experiences, we hope to open up the space for thinking about how to understand and design them.

To start with, note that, in a way, all experiences are multisensory. What makes the concept of multisensory experiences different from an ‘experience’ is that, first, it places the human senses at the centre of how the experience formation process is understood, and second, it involves a certain level of

26 | W HAT AR E M U LTI S E N S O RY E XPE R I E N C E S ? intentionality. In other words, the sensory elements associated with an event are carefully crafted by someone in order to leave a given impression on another person or group of people (the receiver/s). Moreover, it is important to keep in mind that it is implicit in the definition of an event that it has a beginning and an end. We suggest that multisensory experiences are impressions formed by specific events where someone has carefully crafted the sensory elements in them. For example, to create the impression of a sunflower, colours, shapes, textures, and smells are considered. The senses and their interrelations, and what we know about them (see Table 2.1 for key concepts), are placed at the centre of the formation of the impression of the sunflower, even in the absence of a real flower. Multisensory experiences can be associated to something that we have ex­peri­enced before, like a sunflower. But they can also be something completely new to us, for example, the experience of smelling an alien flower on an alien planet, by using novel mixtures of smells and new emerging multisensory ­technologies. Multisensory experiences vary in terms of how the different sensory elem­ ents are combined in an event. Each sense and its overall congruency (whether it matches or not with the intended impression) has a role in creating a given overall impression. The concepts presented explain in part whether two or more sensory elements (such as a smell and a colour) will be in harmony or not. Importantly, harmony does not necessarily mean that the experience needs to be pleasant. A multisensory experience can also involve the impression of something unpleasant (say, the experience of a flooded sunflower field), but the senses may still be in harmony around that intended impression (the joy of a walk through a sunflower field versus a sunflower field in the context of climate change). 
In order to make our definition of multisensory experiences tangible and practical, we have put together a companion tool for this and the following chapters, which we call the xSense Design Cards (xSense, see Appendix). xSense is a tool designed to empower you to analyse and design multisensory experiences by yourself by considering the fundamental elements presented in this chapter. To illustrate how to use this tool, the examples of multisensory experiences that we present in Chapter 3 follow the xSense format.

As discussed in the introduction, multisensory experiences are increasingly influenced, transformed, and enabled through technology. As such, the sensory elements that are crafted in an event can be physical, digital, or a combination of both (something that we consider in xSense; see Chapter 3 for information about what technology means regarding our definition of multisensory experiences). Chapter 3 presents multisensory experiences in light of emerging multisensory technologies and provides eight examples of multisensory experiences, some created without technology but most enabled through it. We show you how multisensory experiences are now designed along the reality–virtuality continuum. In other words, most of our experiences currently take place in a sort of mixed reality, where physical and digital worlds merge. As such, the senses meet with technology, and new horizons open up for multisensory experience design.

CHAPTER 3

Illustration 3  Technology has changed how we experience the world, how we communicate, work, relax, and share experiences with each other. Some technologies such as virtual and augmented reality can transform what and how we experience, for instance, a sunflower. New sensory elements can be added, and novel forms of mixed reality can be created.

Multisensory Experiences: Where the Senses Meet Technology. Carlos Velasco and Marianna Obrist Oxford University Press (2020). © Carlos Velasco and Marianna Obrist. DOI: 10.1093/oso/9780198849629.003.0003

The Human Senses Meet Technology

The scent organ was playing a delightfully refreshing Herbal Capriccio—rippling arpeggios of thyme and lavender, of rosemary, basil, myrtle, tarragon; a series of daring modulations through the spice keys into ambergris; and a slow return through sandalwood, camphor, cedar, and new-mown hay (with occasional subtle touches of discord—a whiff of kidney pudding, the faintest suspicion of pig’s dung) back to the simple aromatics with which the piece began. The final blast of thyme died away; there was a round of applause; the lights went up.

– Aldous Huxley, Brave New World

Multisensory experiences are affected by today’s proliferation of, and advances in, technology. Technology has revolutionized how we communicate, work, relax, and share experiences. In effect, technology has not only changed the way in which we experience the world around us, but it has also become an experience in itself. Remember how you felt when you left your smartphone at home, when there was no Internet connection, or when your computer didn’t turn on and the screen stayed black. Experiencing those moments of being ‘online’ and ‘offline’ can be frustrating. It can disconnect us from others and, at the same time, remind us about what it means to be human, without extensions or augmented capabilities, in an increasingly digital world. Even events that until recently were only partially influenced by technology, such as eating, have become a playground for introducing technology to provide us with completely new dining experiences (Figure 3.1).




Figure 3.1  Traditional eating (left, free stock picture) and eating with a VR headset (right, from the ‘Tree by Naked’ restaurant, https://www.dw.com/en/tokyo-virtual-reality-restaurant-combines-cusine-with-fine-art/g-44898252).

How would it feel to eat your dinner wearing special glasses that make your vegetables look more appealing, or even to eat in zero gravity without the use of a plate (Figure 1.1)? It may sound like science fiction, but advances in levitation technology make it possible.1 For example, you could—like Anakin Skywalker in Star Wars: Attack of the Clones—levitate a piece of pear to impress your dinner date. Unexpected and novel experiences can become a reality through technological advances, and thus not only change the experience, but become the experience itself.

Considering the way in which new technologies are transforming our experiences, we must specify what technology means for our definition of multisensory experiences presented in Chapter 2 (and also for xSense in the Appendix). We may interact with the sensory elements in an event through technologies that stimulate our senses. As an example, let us imagine that some of your friends are very excited about their new project of growing sunflowers and they want you to experience them (Figure 3.2). They may invite you to either go through their sunflower field without any technology (event A) or to go through the same field with the aid of an app that uses augmented reality (AR) to provide a description of the different sunflower types in the field (event B). Another alternative is that they take you through a virtual reality (VR) tour around the sunflower field (event C). While there is no technology used in event A, in event B it transforms and augments the experience. In event C, the experience is not just enabled by, but created through, VR technology. Thus, technology can influence the event or become the means for creating the event itself.

Figure 3.2  Three events to experience a field of sunflowers. Event A (left): a walk in the sunflower field; Event B (middle): AR app use while walking through the sunflower field; Event C (right): experiencing a sunflower field in VR. Source: free stock picture from https://www.pexels.com/.

Advances in technology have made it possible to generate AR and VR solutions that allow humans to overlay computer graphics onto our immediate environment (in AR) and to explore fully immersive computer-generated worlds (in VR). We can now experience real and virtual environments with breath-taking graphics and high-fidelity audio. However, without stimulating other senses such as touch and smell, and in some cases even taste, those virtual and augmented environments may lack realism and the experiences may not be compelling enough. Above all, our ability to interact and engage with those environments would remain limited to a subset of our sensuous-perceptible capabilities. We believe that augmentation and virtuality are something that can involve all the senses in the context of multisensory experiences.

This chapter presents a selection of examples that illustrate how technology, particularly multisensory technologies, is used in the context of multisensory experiences. It explores examples in the physical reality we know, then goes beyond it to provide cases where multisensory technologies enable the creation of completely new realities that augment, immerse, and present novel experiences. As mentioned before, with multisensory technologies we refer to those technologies that stimulate several senses. In a digital and interactive world, we can see, hear, and feel, but perhaps also taste or smell.

3.1  Thinking About Multisensory Experiences and Technologies: Eight Examples Along the Reality–Virtuality Continuum

Since the advent of digital technologies, our experiences have moved from ‘real’ (physical reality) or offline to increasingly mixed reality (a combination of both real and digital reality) and ultimately ‘virtual’ (VR). In other words, our daily life experiences are becoming a product both of what we perceive from the physical world and of what we perceive through digital technologies.

This chapter presents a set of eight examples of multisensory experiences that are positioned along this reality–virtuality continuum (Figure 3.3). Each of the examples in this chapter is described following the structure of the xSense Design Cards (Appendix). We designed xSense as a tool to analyse, reflect upon, and guide the design of multisensory experiences. In other words, following our definition of multisensory experiences, xSense guides us through the process of multisensory experience design. It allows us to specify the ‘background’ for the ‘impression’ we want to create in a specific ‘event’. It also allows us to detail the ‘sensory elements’ that we may use, the ‘concepts’ that we ought to consider, and the ‘enabling technology’ that we may use or imagine using in the design of the multisensory experience.

Figure 3.3  Selection of eight multisensory experience examples situated at different points of the reality–virtuality continuum, from real through mixed reality (augmented reality and augmented virtuality) to virtual: (1) Wine Tasting, (2) Programmable Pasta, (3) Dark Matter, (4) Crystal Universe, (5) Meta Cookie, (6) Season Traveller, (7) FLY VR, (8) TREE VR.
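The card structure described above lends itself to a simple data record. The sketch below is purely illustrative and is not part of the book or of the xSense tool itself: the class and field names (`XSenseCard`, `is_purely_physical`, the sample sunflower values) are our own choices, mirroring the six fields a card captures.

```python
from dataclasses import dataclass

# A hypothetical record mirroring the six fields of an xSense Design Card.
# Field names are illustrative; the actual cards appear in the book's Appendix.
@dataclass
class XSenseCard:
    background: str
    impression: str
    event: str
    sensory_elements: list[str]
    concepts: list[str]             # e.g. spatial/temporal/semantic congruence
    enabling_technology: list[str]  # empty for a purely physical event

def is_purely_physical(card: XSenseCard) -> bool:
    """Cards with no enabling technology sit at the 'real' end of the continuum."""
    return not card.enabling_technology

sunflower = XSenseCard(
    background="Recreate the impression of a sunflower without a real flower",
    impression="A sunflower",
    event="Walking through a sunflower field",
    sensory_elements=["yellow colour", "petal texture", "floral smell"],
    concepts=["semantic congruence", "sensory dominance"],
    enabling_technology=[],  # like event A in Figure 3.2: no technology
)

print(is_purely_physical(sunflower))  # True
```

Adding, say, a VR headset to `enabling_technology` would move the same card toward the virtual end of the continuum, which is the distinction the eight examples below trace out.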

3.1.1  Wine Tasting3: What if Different Lights and Sounds Could Season What You Eat and Drink?

Background: This experience was developed to demonstrate the influence of ambient light colour and background soundscapes on wine taste perception (Figure 3.4). Multiple companies are already using multisensory design as a way to enhance their customers’ food and drink experiences and to differentiate them from those of their competitors. This specific research project was based on a collaboration between Oxford University researchers, the Campo Viejo wine brand, and Pernod Ricard. At the wine tasting event, participants were offered a glass of Campo Viejo Reserva 2008 (Rioja) in different rooms that varied in light colour and accompanying soundscapes.




Figure 3.4  Photos of the rooms in which the white (top), red (middle), and green (bottom) lighting were used for the multisensory wine tasting experience. Figure reprinted from ‘A large sample study on the influence of the multisensory environment on the wine drinking experience’ by Spence, Velasco, and Knoeferle (2014)/CC BY 4.0.

Impression: Specific taste notes in a wine, namely, sweet and sour.

Event: A multisensory wine tasting session staged at a central London location (The Southbank Centre) that included different tasting rooms and corresponding light and music atmospheres.

Sensory elements: Red atmospheric light to bring about the sweet notes of the wine, green atmospheric light to emphasize the sour notes of the wine, and white light for a more neutral taste. In addition, two soundscapes were used in this experience. The ‘sweet soundscape’ was legato, low in auditory roughness and sharpness, and highly consonant. The ‘sour soundscape’ was staccato, high in roughness and sharpness, and moderately consonant (the soundscapes can be accessed at: https://soundcloud.com/crossmodal/sets/tastemusic).

Concepts: The concept underlying the experience refers to crossmodal correspondences, particularly the associations between colour and taste, and music and taste. People tend to associate the sweet and sour tastes with colours and sounds, such that these can be used in a congruent or incongruent manner in order to influence the experience of a given taste note. Both the lights and soundscapes were presented in dedicated rooms (spatial congruence) and people sampled the wine while exposed to them (temporal congruence).

Enabling technology: Lightbulbs of different hues and a high-definition sound system.

Further information: The multisensory psychology of wine tasting: https://www.vice.com/en_us/article/ypwp77/the-multisensory-psychology-of-wine-tasting
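The light and soundscape pairings above can be written out as a small lookup, which makes the congruent combinations explicit. This is a sketch for illustration only: the 0–1 numeric scales are our own invention, since the study reports qualitative contrasts (legato vs staccato, high vs low roughness), and `room_setup` is a hypothetical helper, not code from the project.

```python
# Hypothetical parameterization of the lighting and soundscape pairings
# described above. The numeric 0-1 scales are invented for illustration.
SOUNDSCAPES = {
    "sweet": {"articulation": "legato", "roughness": 0.2, "sharpness": 0.2, "consonance": 0.9},
    "sour": {"articulation": "staccato", "roughness": 0.8, "sharpness": 0.8, "consonance": 0.5},
}

LIGHTING = {"sweet": "red", "sour": "green", "neutral": "white"}

def room_setup(taste_note: str) -> dict:
    """Return a congruent light/soundscape pairing for a target taste note."""
    return {
        "light": LIGHTING.get(taste_note, "white"),
        "soundscape": SOUNDSCAPES.get(taste_note),  # None for the neutral room
    }

print(room_setup("sweet")["light"])  # red
```

Swapping rows (red light with the sour soundscape, say) would give the incongruent conditions that such a study can contrast against the congruent ones.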

3.1.2  Programmable Pasta4: What if Your Pasta Changes its Shape While Cooking or Eating into Whatever 3D Form You Imagine?

Background: This experience was designed to allow users or chefs to customize their food cooking and eating experiences by ‘programming’, for example, the shape of Italian pasta (Figure 3.5). Researchers are exploring food design through edible material innovation, novel fabrication techniques, and computer-aided design to enrich eating experiences and save food storage space. A research project carried out at the HCII Institute at Carnegie Mellon University in Pittsburgh, Pennsylvania specifically explored the seamless integration of technology into




Figure 3.5  Visualisation of shape-changing pasta examples that transform from flat 2D into different 3D shapes. Photo credit: Ye Tao, Youngwook Do, Humphrey Yang, Yi-Chin Lee, Guanyun Wang, Catherine Mondoa, Jianxun Cui, Wen Wang, and Lining Yao.

the cooking process of flour-based foods such as Chinese dumplings, Japanese ramen, Mexican tortillas, French bread, or Italian pasta.

Impression: Shape-changing, flour-based food.

Event: Cooking or eating a meal that includes a pasta that transforms its shape from 2D to 3D.

Sensory elements: The pasta starts flat and transforms into any possible shape during the process of cooking or when presented to the diner, thus including visual movement. Imagine a chef pouring hot water (temperature) on pasta sheets in a soup bowl. While the water covers the pasta, the sheets transform from a flat 2D into a curly 3D shape. The pasta can have specific colours, shapes, textures, and flavours.

Concepts: The visual sense dominates the experience, and it comes with an element of surprise, given that your expectations, based on static pasta, are confronted with the movement of the transforming pasta. The shape-changing factor allows flexibility in terms of the identity or meaning of the pasta (semantic congruence). For example, it would be possible to custom-design a specific favourite pasta type (farfalle), a heart-shaped pasta for a loved one, or even a completely novel shape that serves a given purpose (to facilitate eating in the elderly or children).

Enabling technology: This experience is enabled through new fabrication and 3D printing techniques that allow control over the physical appearance of food while still using common food materials (protein, cellulose, or starch). Users can customize food shape transformations through pre-defined software where they can select the shape and tune the density, orientation, thickness, texture, and other properties of the food. Based on this design, they can then fabricate the designed pasta patterns using a 3D printer. The transformation process is triggered by water absorption during cooking, which can be part of the presentation at the diner’s table as described above.

Further information: Programmable Pasta video: https://vimeo.com/199408741
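The tunable properties named above (shape, density, orientation, thickness) can be thought of as a design record that the software validates before printing. The sketch below is a guess at what such a record might look like; the class name, field names, and value ranges are all our own assumptions, not the CMU project's actual data model.

```python
from dataclasses import dataclass

# Hypothetical design record for a shape-changing pasta pattern. The real
# software lets users pick a shape and tune density, orientation, thickness,
# and texture before printing; names and ranges here are our guesses.
@dataclass
class PastaPattern:
    target_shape: str      # 3D shape after cooking, e.g. "curl" or "farfalle"
    density: float         # relative strand density, 0..1
    orientation_deg: float # printed-strand orientation
    thickness_mm: float

    def is_printable(self) -> bool:
        """Toy sanity check before sending the pattern to the 3D printer."""
        return 0.0 < self.density <= 1.0 and self.thickness_mm > 0.0

farfalle = PastaPattern("farfalle", density=0.6, orientation_deg=45.0, thickness_mm=0.8)
print(farfalle.is_printable())  # True
```

The key design idea the text describes is that this pattern, once printed, encodes the transformation itself: hot water does the rest, with no actuator in the food.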

3.1.3  Dark Matter: What if Something Invisible, Like Dark Matter, Becomes Visible and Feelable?

Background: This experience was designed to explore new ways to communicate scientific concepts, especially the concept of dark matter, to the wider public. Dark matter makes up most of the matter in the universe, but it is invisible and can only be detected via its gravitational effects (https://home.cern/science/physics/dark-matter). Dark matter is often discussed in the general media, and the concept can be difficult to grasp for non-experts; hence, efforts are being made, through the use of technology, sensory experiences, and new ways of storytelling, to make it more accessible. Science communication aims to create personal responses to science and thus foster awareness, enjoyment, interest, opinion-forming, and understanding of scientific concepts like dark matter (Figure 3.6). This project was a research collaboration between the group of Professor Roberto Trotta at Imperial College London and the Sussex Computer-Human Interaction (SCHI) Lab at the University of Sussex, in the UK.

Impression: The concept of dark matter.

Event: An installation on dark matter in a science museum (part of The Great Exhibition Road Festival 2019).




Figure 3.6  Multisensory elements in the dark matter experience. (a) Fluorescent body outline indicating where people need to lie on a beanbag, (b) mid-air haptic box with fluorescent hand outline, (c) dark matter simulation projected inside the inflatable dome where the experience takes place, (d) wireless noise-cancelling headphones, (e) box containing popping candy pills, (f) mid-air haptic device used in (b), (g) scent-releasing device. Photo credit: (a, b, d, e, f) Roberto Trotta, (c) Aquarius Simulation rendering for planetarium by European Southern Observatory (from: https://www.eso.org/public/videos/aquarius_springel), (g) SCHI Lab.

Sensory elements: People step inside an inflatable dome with a friend, lie down on a beanbag, and wear headphones while staring into a simulation of the dark matter distribution in the universe. Key human senses (sight, hearing, touch, smell, and taste) are stimulated throughout the experience. The user can hear an artificially engineered sound, a storm-like but unfamiliar auditory sensation, that varies in intensity, pitch, and texture to represent the concepts of dark matter wind during an earth-year and its density profile in our galaxy. Moreover, unflavoured popping candy, which dissolves into a sweet taste inside the user’s mouth, creates a crackling effect inside the mouth and skull, amplified by the participant’s headphones.

Concepts: The narrative creates semantic congruence around the concept of dark matter through the different sensory elements, which are spatially and temporally coordinated. Temporal synchrony is also used throughout the experience. For example, the sound is presented in synchrony with the mid-air haptic sensations displayed on the palm of one hand. These elements are further extended with an automatic release of the scent of black pepper to boost memory retention through crossmodal properties such as freshness, coldness, and pungency.

Enabling technology: This installation is facilitated through the integration of multiple technologies, including an ultrasonic mid-air haptic device to create tactile sensations on people’s hands, a scent-delivery device to release the smell at specific moments, a projector to create the visual effect of the universe inside the dome, and noise-cancelling headphones to follow the audio narrative. All sensory stimuli are controlled and synchronized via a central computer. The mid-air haptic device was provided by Ultraleap and the scent-delivery device by OWidgets.

Further information: Dark Matter video: https://www.youtube.com/watch?v=Tlj-XNDl7sU and https://jcom.sissa.it/archive/19/02/JCOM_1902_2020_N01
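Central synchronization of this kind can be sketched as a timed command dispatcher: a single clock decides when each device (audio, haptics, scent) fires, which is what keeps the sound and the mid-air haptic pattern aligned. The installation's actual control software is not public, so the device names, commands, and timings below are invented for illustration.

```python
from typing import Callable

# A minimal sketch of a central control loop that keeps sound, mid-air
# haptics, and scent release in sync. Device names and timings are invented.
Event = tuple[float, str, str]  # (time in seconds, device, command)

def run_timeline(timeline: list[Event], send: Callable[[str, str], None]) -> int:
    """Dispatch commands to devices in time order; returns the count sent."""
    for _, device, command in sorted(timeline, key=lambda e: e[0]):
        send(device, command)
    return len(timeline)

demo: list[Event] = [
    (0.0, "audio", "play dark-matter wind"),
    (0.0, "haptics", "palm pattern A"),      # in synchrony with the sound
    (12.5, "scent", "release black pepper"),
]

log: list[tuple[str, str]] = []
run_timeline(demo, lambda device, command: log.append((device, command)))
print(log[-1])  # ('scent', 'release black pepper')
```

Giving two entries the same timestamp, as the audio and haptics rows do here, is the timeline analogue of the temporal synchrony concept from Table 2.1.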

3.1.4  Crystal Universe Experience: What if You Could Interact with the Universe Through Digitally Controlled Lights and Sounds?

Background: This experience was designed to create impressions and memories that do not naturally occur in our everyday lives. Here, a person is placed inside a Crystal Universe composed of varying lights and soundscapes (Figure 3.7). This is one of several art installations created for the general public by the Art Collective

Figure 3.7  Photo showing visitors wandering through the Crystal Universe created by the Art Collective teamLab. Photo credit: Patricia Ivette Cornelio Martinez.



TH E H U M A N S E N S E S M E ET TE C H N O LO GY

| 41

teamLab. The collective is known for its creative experimentation at the intersection of art and technology, which also engages different senses.

Impression: Wandering through the Crystal Universe.

Event: An art installation representing a light sculpture of the universe that people can interact with by using a mobile device.

Sensory elements: Lights varying in colour, hue, and brightness, accompanied by soundscapes. The sense of proprioception (movement) is also important, as visitors are required to move around the installation. The surface on which visitors walk is a mirror, which creates a characteristic tactile sensation.

Concepts: In this installation, there is spatial congruence in that all signals are aligned in the same location. There is also temporal synchrony between some of the lighting patterns and the soundscapes. Semantic congruence is observed in that both lights and sounds represent elements of the universe, such as stars and shooting stars, vastness, emptiness, wonder, and darkness.

Enabling technology: This is an augmented environment experience that uses thousands of suspended coloured LEDs in a 3D space. Sensors capture visitors’ movements and create changes in the installation. Visitors can also interact with the installation through an app that allows them to select the installation’s elements, such as the Crystal Universe’s colours and light patterns. This is enabled through teamLab’s interactive 4D vision display technology.

Further information: Crystal Universe website: https://www.teamlab.art/w/dmm-crystaluniverse/

3.1.5  Meta Cookie Experience5: What if You Could Season a Plain Food Digitally?

Background: This experience is designed to augment reality through the use of new technology. In this case, through a carefully engineered multisensory AR headset, users can modify the perception of a plain cookie such that it is interpreted as, say, a chocolate cookie (Figure 3.8). Researchers and practitioners alike have been interested in augmenting food experiences through emerging technology. In a project carried out at the University of Tokyo, Japan, researchers explored the augmentation of a cookie’s flavour through a multisensory AR experience called Meta Cookie, which superimposes visual and smell information on a plain cookie in order to change its perceived flavour.


Figure 3.8 Photo of the AR flavour display that augments different flavours onto a plain cookie. Photo credit: Takuji Narumi.

Impression: A flavoured cookie (though it is actually a plain cookie).

Event: Eating a digitally augmented plain cookie.

Sensory elements: The experience involves the senses of sight, smell, taste, and touch. There are several combinations of visuals and aromas available to choose from, including those associated with chocolate, lemon, almond, tea, strawberry, and maple. The cookie itself has a characteristic sweet taste and evokes a particular mouthfeel.

Concepts: Spatial, temporal, and semantic congruence are key in the creation of this multisensory experience. For example, spatial and temporal congruence are achieved by increasing the aroma intensity as the cookie is brought closer to the user. The look and aromas, on the other hand, are combined following a given flavour (semantic congruence), such as chocolate.

Enabling technology: The technology involves two cameras, a head-mounted display (HMD), and a multi-scent delivery device. The cookie has a marker that is recognized by the camera in order to superimpose a given visual appearance onto the cookie, seen through the HMD. An air pump sprays out the smell of the chosen cookie, increasing its concentration as the system ‘sees’ the cookie approaching the user’s nose.



TH E H U M A N S E N S E S M E ET TE C H N O LO GY

| 43

Further information: Meta Cookie video: https://www.youtube.com/watch?v=3GnQE9cCf84
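The proximity rule at the heart of Meta Cookie (closer cookie, stronger aroma) can be sketched as a simple distance-to-intensity mapping. The linear ramp and the 30 cm cut-off below are our own assumptions for illustration; the published system's actual calibration is not given in the text.

```python
# Sketch of the proximity rule described above: the closer the marked cookie
# gets to the nose, the stronger the sprayed aroma. The linear mapping and
# the 30 cm cut-off are assumptions, not the published system's values.
def aroma_intensity(distance_cm: float, max_cm: float = 30.0) -> float:
    """Map cookie-to-nose distance to a scent-pump intensity in [0, 1]."""
    if distance_cm <= 0.0:
        return 1.0
    if distance_cm >= max_cm:
        return 0.0
    return 1.0 - distance_cm / max_cm

print(aroma_intensity(30.0))  # 0.0  (cookie far from the nose)
print(aroma_intensity(15.0))  # 0.5  (half-way)
print(aroma_intensity(0.0))   # 1.0  (cookie at the nose)
```

Driving the pump this way gives the spatial and temporal congruence the Concepts field describes: the smell strengthens exactly where and when the visually augmented cookie approaches the mouth.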

3.1.6  Season Traveller6: What if You Could Immerse Yourself in a Virtual World Where You Can Easily Switch from Season to Season?

Background: This experience was designed to allow a person to travel through the four seasons (spring, summer, autumn, winter) in a fully virtual environment by stimulating multiple sensory modalities. Given the promising landscape of VR systems, researchers are looking for ways to make virtual worlds more compelling and realistic. Carried out at the National University of Singapore, Season Traveller was a research project that used a carefully engineered range of sensory cues integrated with a VR headset in order to explore the potential for convincingly recreating the different seasons (Figure 3.9).

Impression: The feeling of different seasons.

Event: Travelling through different seasons.

Sensory elements: This example recreates the four seasons in a virtual environment by integrating smell, tactile (temperature, wind), and visual elements.

Figure 3.9  Illustration of the multisensory headset including wind, thermal, and smell elements. Photo credit: Nimesha Ranasinghe.

For example, summer uses a lemon scent, mild wind strength, and heating. For the winter season, a mint scent is combined with medium wind strength and cooling.

Concepts: Spatial and temporal congruence are the way in which all sensory elements are joined together in the experience. All sensory elements are characteristic of, or evoke sensations associated with, each season (semantic congruence). For example, cooling is typically associated with winter, and while mint is not necessarily linked to winter (perhaps incongruent), it evokes the cooling sensation associated with this season.

Enabling technology: A novel wearable multisensory VR system that integrates temperature, tactile, and olfactory stimuli into a traditional audio-visual HMD. These functionalities are implemented through Peltier elements (thermal), a set of fans (wind), and micro air-pumps (smells), which are controlled by commands sent from the HMD via Bluetooth.

Further information: Season Traveller video: https://www.youtube.com/watch?v=vTqqhIaYeIE
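The season-to-cue pairings can be written out as a lookup table, with one row of actuator settings per season. The summer and winter rows follow the text; the spring and autumn values are illustrative guesses, as the text does not specify them, and `cues_for` is our own hypothetical helper.

```python
# The season-to-cue pairings from the text. Summer and winter follow the
# description above; spring and autumn values are illustrative guesses.
SEASON_CUES = {
    "summer": {"scent": "lemon", "wind": "mild", "thermal": "heating"},
    "winter": {"scent": "mint", "wind": "medium", "thermal": "cooling"},
    "spring": {"scent": "floral", "wind": "mild", "thermal": "neutral"},   # assumed
    "autumn": {"scent": "woody", "wind": "medium", "thermal": "neutral"},  # assumed
}

def cues_for(season: str) -> dict:
    """Return the scent, wind, and thermal settings for a season."""
    return SEASON_CUES[season.lower()]

print(cues_for("Winter")["thermal"])  # cooling
```

The winter row also shows the design point made under Concepts: mint is not literally wintry, but it is chosen because it reinforces the cooling sensation the season should evoke.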

3.1.7  FLY VR: What if You Could Sense Major Events in the History of Aviation by Flying Through Time and Space?

Background: This experience was designed to turn everyone into a time-travelling pilot who experiences the evolution of flying from its early days to a possible, though unknown, future (Figure 3.10). To celebrate British Airways’ centenary, the airline sponsored FLY, an interactive multisensory VR experience. The experience was built by immersive artist Charlotte Mikkelborg with a team of multiple-award-winning VR creators (NoveLab) and an Oscar-winning practical effects team (Moco FX). The overall lead of the project was taken by Picture This Productions, and OWidgets was responsible for creating the multisensory experience.

Impression: Flying through time and space.

Event: An installation in a gallery (Saatchi Gallery) on the evolution of flying.

Sensory elements: FLY is experienced inside a big interactive ‘egg’ (egg-shaped structure). The egg represents the place where the mystery of flight first




Figure 3.10 Illustration of the multisensory FLY VR experience on a motion platform, pilot seat, and integrated wind, thermal, and smell elements. Photo credit: FLY Nick Morrish/British Airways.

takes physical form. At the front of the egg is a conductive placenta. In response to human touch, the egg changes colour from cool blue to vital red. If enough people incubate the egg with their hands, the entire egg turns red and then white as it cracks open, releasing the life inside. Inside the big egg-shaped structure, movement, wind, temperature, and smells are combined to fully immerse a person in the wonders and innovation of flying, from the past through the present and into the future. For example, seven custom-designed smells were presented to the user, ranging from the leather, paint oils, and fireplace scents of Leonardo da Vinci’s workshop to an ocean breeze on the Wright Brothers’ beach.

Concepts: Temporal and spatial congruence are used to bind all sensory elements together to the act of flying. In addition, the sensory elements convey the concept of ‘flying’ through semantic congruence, as well as the concept of a given time in history, through the use of visuals, temperature, and smells associated with that time and space. The delivery of the smells was synchronized with visual scenes or triggered by people’s movements. For example, looking in the direction of a painting in Leonardo da Vinci’s laboratory triggers the delivery of the corresponding smell.

Enabling technology: Users stand on a large real-time motion platform and wear a VR headset while being stimulated with a fan (wind), a heater (temperature), and a multi-scent delivery device. All sensory elements are controlled and synchronized through one computer. The egg structure is equipped with LED strips, which are activated by a touch sensor at the front of the egg.

Further information: The website of the producer of FLY: http://www.picthisproductions.com/
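A gaze-triggered scent, like the painting example above, can be sketched as an angle test between the user's view direction and the direction of the target. The 20-degree threshold and the vector representation are our own assumptions about how such a trigger might work; the production system's logic is not public.

```python
import math

# Sketch of a gaze-based trigger like the one described above: release a
# scent when the user's view direction points closely enough at a target
# (say, a painting in the Da Vinci scene). Threshold value is an assumption.
def gaze_hits(view_dir, target_dir, threshold_deg: float = 20.0) -> bool:
    """True if the angle between view and target directions is small enough."""
    dot = sum(v * t for v, t in zip(view_dir, target_dir))
    norms = math.dist(view_dir, (0, 0, 0)) * math.dist(target_dir, (0, 0, 0))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))
    return angle <= threshold_deg

# Looking straight at the painting triggers the scent; looking away does not.
print(gaze_hits((0, 0, 1), (0, 0, 1)))  # True
print(gaze_hits((1, 0, 0), (0, 0, 1)))  # False
```

Tying scent release to gaze in this way is one concrete route to the spatial and temporal congruence the Concepts field describes: the smell arrives exactly when the matching visual is in view.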

3.1.8  TREE VR: What is it Like to Be a Tree in a World of Deforestation?

Background: TREE was designed to create empathy toward the challenges that deforestation brings to the world. With this piece, the creators wanted to make deforestation feel deeply personal. In TREE, climate change happens to the user. Beyond that, it is an intimate and solitary experience that hopefully increases respect for nature and how it functions (Figure 3.11). This is a multisensory experience installation created by the team of New Reality Co. in collaboration with, and with support from, a variety of companies and organizations, most notably the Rainforest Alliance.

Figure 3.11  Picture of a person becoming a tall Peruvian tree in a virtual forest environment, augmented through vibration sensations on the back, and smells throughout the lifespan of the tree. Photo credit: Ioannis Gkolias.




Impression: The lifecycle of a tree, in first-person perspective, in the context of deforestation. Event: An installation at various public festivals and gatherings such as the World Economic Forum, World Government Summit, Sundance Film Festival, TED, and Tribeca Film Festival. Sensory elements: Wind, vibrations on the user’s back, and smells are inte­ grated into a multisensory VR experience. Users feel, through visuals, vibra­ tions, and smells, as if they are a Kapok tree consistently and steadily growing from a seed to a tall tree, to later be cut down. This is achieved by dynamic lighting and shadows, and vibration patters, generated in real-time, as well as changes across three scents (Earth peat, foliage, living gun smoke). Concepts: Temporal, spatial, and semantic congruence bind all sensory elem­ ents. As a seedling, the user emerges from underground into the middle of a forest, smells the foliage scent, and at the end when the tree burns and is cut down, smells a smoke smell—a reminder of the hardship society is facing in light of increased deforestation (semantic congruence). Alongside the scent, the haptic feeling on the user’s back intensifies as the tree falls (temporal syn­ chrony). People are also visually represented in the virtual world, as their arms and hands become branches of the tree (spatial congruence). Enabling technology: The experience is enabled through a VR headset, a haptic feedback device (Subpac—a backpack that creates vibrations on the back), a multiscent delivery device (OWidgets system), and ambient heating and fan units. Further information: The official TREE VR website: https://www.treeofficial.com/

3.2  What do the Multisensory Experience Examples Teach Us? The first two examples describe an event where technology changes the con­ text of the event or the object itself. For instance, the light and sound in a room can affect how we taste a glass of wine (Section 3.1.1), as shape, like pasta trans­ forming from a 2D into a 3D shape in front of your eyes, can affect how we taste food (Section 3.1.2). The seamless integration of technology in these two examples makes them sit at the ‘reality’ end of the continuum (Box 3.1). Both events consist of real elements. The pasta example, however, transforms a

4 8 | LESSON S FROM MULTISEN SORY EXPERIENCE E X AM PLE S known reality—how pasta typically behaves when cooked—and creates a new ex­peri­ence. In the first two examples, technology remains in the background, while in the following two examples the technology augmentation of an event is much more visible. What if you make something invisible both visible and feelable, and thus create an experience that is otherwise not possible? Have you ever wondered what dark matter is and what it would be like to walk through the universe? Section  3.1.3 presents the design of a multisensory dark matter ex­peri­ence enabled through a combination of devices that stimulate touch and smell, alongside vision, hearing, and taste to make this imperceptible scientific concept perceivable. In Crystal Universe (Section  3.1.4), the presence of the person is changing the event, that is, what the users see, hear, and feel. Both examples are situated in the continuum’s augmenting reality space, enabling the formation of new and unknown impressions that do not naturally occur in nature/everyday life. Mixed realities are created through the use of multisen­ sory technologies that do not require any attachments to the user’s body. To illustrate a next stage of designing immersive multisensory experiences, further examples showcase advances in AR and VR technology. The user attaches devices to their body, such as a headset, and the creator can digitally control what the user sees and smells, thereby turning a plain cookie into a ‘choc­ olate’ cookie (Section  3.1.5). This example still has a real element, a cookie, involved in the event, while Season Traveller (Section 3.1.6) replaces real elem­ ents with a virtual environment and allows the user to travel across four seasons (winter, spring, summer, autumn) using multisensory VR technology. 
In this experience, the user sees scenes of a wintery landscape and feels a cold sensation on her skin, and in the next moment feels warmth and a lemony scent transport­ ing her into a summery scene. While the experience in the Meta Cookie example happens in augmented reality (the perception of a plain cookie’s flavour is changing through visual and smell cues delivered through the headset), the Season Traveller example is situated and takes place mostly in a virtual space. The experienced seasons are recreated in a virtual world in absence of many real elements of a physical environment (a wintery season). The virtual world is aug­ mented through digitally controlled temperature and scentscapes. In the same vein as Season Traveller, the final two examples are situated in the virtuality end of the continuum. FLY (Section 3.1.7) recreates a multisensory experience of flying with a motion platform simulating the change in direction while the user perceives wind, fragrance, and other sensory cues from the en­vir­ on­ment. Finally, TREE (Section 3.1.8) takes current multisensory ex­peri­ences



TH E H U M A N S E N S E S M E ET TE C H N O LO GY

| 49

further and not only allows the user to have a full-body experience, but also to become something other than human, e.g. a tree. Here the virtual world com­ pletely transforms the user’s perspective and presents a reality that would not be possible without the multitude of technological innovations along the reality– virtuality continuum. Not all of the experiences and multisensory technologies illustrated in these eight examples are designed based on the concepts of multisensory experiences presented in this book. Various efforts are motivated by engineering challenges of how to digitally create and recreate sensory effects, such as in Season Traveller, in which the user feels immersed in a wintery scenery without leaving home. The integration and understanding of the impact of those technological advances on the resulting experiences is not well understood to date. However, through the proliferation of multisensory technologies it is possible to make better use of the full potential of human sensory capabilities, and through those enabling technologies, to achieve the creation of more realistic, believable, and compelling impressions that will form novel multisensory experiences along the reality–virtuality continuum. The examples presented here are an illustration of recent representative efforts associated with multisensory experiences. There are many more examples in the making within academic and industry laboratories and advocated by ­artists and innovators. We encourage you, the reader, to venture out to discover more examples, and also to become more aware of your surroundings and the sensory cues you notice and experience in your everyday life. You can do so through the lens of xSense, our definition of multisensory experiences and related concepts presented in Chapter 2, and the emerging multisensory tech­ nologies you have encountered here. 
As a final reflection on the examples presented in this chapter, we point out that, in neither of the presented examples, could we see a clear indication of sensory overload (one of the concepts presented in Table 2.1). However, there are some more salient senses in some experiences than others (sensory dom­in­ ance). For example, vision appears to dominate the Programmable Pasta ex­peri­ ence, given that its focus is on how we see the pasta changing its shape. The principles of sensory dominance and sensory overload are concepts to be aware of when designing multisensory experiences. It is about finding the right bal­ ance between the different sensory elements in relation to the intended impres­ sion. In other words, it is unnecessary to stimulate all of the senses in a given event, although it is important to capitalise on the best possible configurations of the sensory elements in order to deliver a desired impression.

50 | TOWARDS THE FUTURE OF MULTISENSORY EXPERIENCES

3.3  Towards the Future of Multisensory Experiences We are only beginning to understand the design space for multisensory ex­peri­ ences and enabling technologies. As illustrated through the eight examples here, technology is no longer limited to what we see and hear (such as screens and audio systems), but digital, virtual, and interactive experiences are increas­ ingly engaging all our senses. As also discussed in Chapter 2, in many cases having two sensory cues (e.g. visual and auditory) or more (adding tactile or olfactory) results in novel experiences that are much more than just the sum of their parts (super-additivity). The feelings evoked by the visual and the auditory elements may be stronger when presented together than when each is presented independently. Similarly, tactile or olfactory stimuli can enhance visual ex­peri­ ences and/or create even more compelling and realistic experiences. It is important to mention that, while multisensory experiences have received some public interest, they are still not the norm. Potentially, some reasons for this include the lack of a common language and tools to design such ex­peri­ ences. For this reason, we believe that the xSense tool included in the Appendix may help address this challenge. We suggest that today is one of the best moments to design multisensory experiences in that both science and technol­ ogy are evolving faster than ever and providing us with a deeper understanding of our senses as well as multisensory technologies that can both stimulate and extend them. Nevertheless, there are still many questions and unknown answers. For example, what if your sense of smell could go beyond the reach of your sniffing abilities—for example, sensing the smell of a jasmine flower from afar based on a system that converts its visual properties to scents? 
What if technol­ ogy can augment our sensory abilities and enable us to have experiences that are not only moving us from reality to virtuality (on the continuum) but also those that transform us as humans in an increasingly computerised world? With the design and development of multisensory technologies, questions arise about the extent to which those technologies may change human daily life and become an extension and augmentation of human capabilities (e.g. phys­ic­al, perceptual, cognitive). The growing degree of integration between humans and technology, starting from today’s mixed reality spaces, makes us wonder how technology will keep changing us (Figure 3.12) and consequently also inform the design of humanoid robots and future artificial intelligence (AI) systems. Humanoid robots have come a long way since the WABOT-1, the first an­thropo­morph­ic (human-like) robot, was unveiled in 1973 at Tokyo’s Waseda



TH E H U M A N S E N S E S M E ET TE C H N O LO GY

| 51

Figure 3.12  Does this represent the evolution of humankind?. Source: https://wallpapersafari.com/w/NFzPRE.

University. The robot consisted of a limb-control system, a vision system, and a conversation system. The humanoids abilities to jump, see, converse, or sense their environment (see e.g. BostonDynamic’s Parkour Atlas https://www.youtube. com/watch?v=LikxFZZO2sk&feature=emb_logo) have massively improved over the last decades, especially through advancements in actuators (motors that help in motion and making gestures) and sensor technology (to sense the world around them). In recent years, the use of AI in humanoids has gained momen­ tum through the introduction of Sophia Hanson (https://www.hansonrobotics. com/sophia/), the world’s first robot citizen. She was introduced to the United Nations on October 11, 2017. On October 25th, she was granted Saudi Arabian citizenship, making her the first humanoid robot ever to have a nationality. While humanoid robots still have a long way to go before becoming part of our daily life, they are increasingly prepared for it, especially with the progress being made in the field of AI. Soon, humanoid robots will be equipped with sight, hearing, touch, smell, and even taste when it will, if at all, become necessary for them to fully function. Today, humanoid robots do not need to eat to survive. Hearing is, however, a key sense for robots in order to receive and interpret instructions. Moreover, touch sensors are used to prevent them from bumping into things, which is extended with a sensor to balance movement and maybe even a heat sensor to recognize potential danger. In the future, smell and taste sensors can help detect toxic gases and foods, which may not be harmful for the robot itself, but can be dangerous for humans, and could therefore be a meaning­ ful ability to warn humans in, for instance, collaborative working scenarios (e.g. space exploration). 
Finally, cameras not only provide the humanoids with the basic ability to ‘see’ and scan their environment, but that are also useful to detect and interpret facial expressions, especially when robots move from research and

52 | TOWARDS THE FUTURE OF MULTISENSORY EXPERIENCES factory contexts into private spaces such as homes or care homes (see CARESSES project as an example, http://www.caressesrobot.org). Detecting a person’s facial expressions to decipher their emotions is only one part of the interaction; adjust­ ing their own facial expressions to their human counterpart is an important sec­ ond part to achieve a more seamless interaction. To simulate and train a humanoid robot, the subtleties of human–human interaction will require much more research and innovation, especially as we humans, for example, often misinterpret other peoples’ emotions and facial expressions. And yet, it appears that humanoid robots are here to stay, particu­ larly with the advances of AI. In the future, we might find them exhibiting behaviour we consider impossible today and they will become part of our daily lives.7 This goes hand in hand with an acceleration of computing power and an increase in our understanding of the human sensory systems that will ul­tim­ ate­ly benefit the evolution of humanoids. Only time will tell what the future relationship between humans and technology will look like, although we can be assured that it will be both promising and challenging in the context of multi­ sensory ex­peri­ences. We dive deeper into these and other issues that concern the future of multisensory experiences in Chapter 4 and Chapter 5.

C H A P TE R 4

Illustration 4  When we visit exoplanets, we might encounter flowers or foods (if any) that we haven’t evolved to discriminate in terms of their toxicity, edibility, and so on. With multisensory technology, we may be able to make what is invisible to the senses visible, so that we can better navigate these new contexts.

Multisensory Experiences: Where the Senses Meet Technology. Carlos Velasco and Marianna Obrist Oxford University Press (2020). © Carlos Velasco and Marianna Obrist. DOI: 10.1093/oso/9780198849629.003.0004

Beyond the Known and into the Unknown Future Nothing in life is to be feared, it is only to be understood. Now is the time to understand more, so that we may fear less. – Marie Curie

A

s our scientific understanding of the senses grows, and our creative explorations and experimentations with novel multisensory technologies develop, we will be able to enhance existing experiences and create previously unimaginable ones. In other words, devices and prototypes can and will be created to enrich ‘traditional’ experiences. For instance, we will be able to facilitate remote dining experiences, where two people eat together while they are at different locations. In addition, technology will aim to go beyond its current status and develop novel experiences by, for instance, augmenting our bodily capabilities by adding two extra arms1 or a tail2 to our bodies. These sorts of development will allow us to sense and interact with our environments in novel, different, and perhaps augmented, ways. The possibilities of multisensory experiences with new technologies have been captured in science fiction. For example, in Star Trek: The Next Generation, the character of Geordi La Forge is an excellent example of the exciting vision of the future. La Forge is blind but is perfectly able to navigate his environment and experience the world around him. Cutting-edge technology allows La Forge to ‘see’. The ocular prosthetic implant device he wears over his eyes enables him to ‘see’ much of the electromagnetic spectrum—radio, infrared, and ultraviolet



BEYOND THE KNOWN AND INTO THE UNKNOWN FUTURE

| 55

waves. Although this is a snippet from a science fiction television series, history shows that fiction often inspires science to make that fiction become reality.3 Today, we may not be able to restore a person’s eyesight, or to deliver ‘superhero’ capabilities like seeing through objects and walls, but the intense and rapid evolution of technology over the last century will continue to surprise us and challenge our way of thinking. Indeed, researchers have been advancing systems, based on crossmodal correspondences (see Chapter 2), that map colour changes to sound properties to inform blind people about objects in their ­surroundings. This is a growing topic of research and practice which deals with the development of sensory substitution devices. These devices change information in one sense (sight) to another (hearing)—i.e. seeing through sound.4 This chapter discusses the scope for multisensory experiences in the years to come. While there are a number of possible future directions for technologyaugmented and enabled multisensory experiences, here we focus on space exploration, which we find particularly exciting. The chapter merges science and fiction and discusses the possibilities around multisensory experiences as well as the challenges that might arise from them in the specific scenario of designing eating experiences for future space travels. Can you imagine a future where human settlements have extended from Earth to outer space, perhaps to a new planet in a distant planetary system? What will this be like? How will we live, interact with our environment, and, fundamentally, how will we eat and experience food that comes from another place, and not Earth? We can observe global efforts to prepare humanity for life beyond Earth and although we do not know yet what the experience will be like, we know it will be different. 
We know, from present examples, such as eating in the international space station (ISS), that eating is experienced differently when compared to Earth. It is not just eating, but everything about food, in the ISS that has required adjustments in food production, transportation, storage, food format, eating means, and food types, among others. The next sections take a closer look at the case of food experiences in space. In these, we explore some exciting multisensory food experience concepts that we and other researchers have been proposing in order to deal with some of the challenges of combining food and space travels.

4.1  Designing Multisensory Experiences for Space Travels Space agencies like the National Aeronautics and Space Administration (NASA) in the United States, the Japan Aerospace Exploration Agency (JAXA), the

56 | D E S I G N I N G M U LTI S E N S O RY E XPE R I E N C E S FO R S PAC E Indian Space Research Organisation (ISRO), and China National Space Administration (CNSA) have ambitious plans to send probes and robots not only to comets, asteroids, and other planets and their moons, but also to eventually by 2024 to send humans back to the Moon, for the first time in the 2030s to Mars, and beyond. Not only is our curiosity driving these ambitions, but also it drives the potential developments and innovations that space exploration carries with it, as well as possible resources and potential economic interests. In addition, national space agencies as well as private endeavours such as Virgin Galactic and SpaceX are looking into commercial space flights, something which may make space more accessible for profit and non-profit organizations, as well as the general population.5 Space travels and exploration have captured the imagination of people for decades, and as technology advances, it is possible to get a glimpse of what it looks like through both the multiple interactive media contents (e.g. NASA’s app) that space agencies and astronauts create, through the contents created by private enterprises (SpaceX founder Elon Musk’s social media feeds), but also through experiences facilitated by technology and designed to evoke feelings associated with being in space (Box  4.1 shows an overview effect and a VR experience designed to evocate it on Earth). Surely, it will not stop here. At some point in the near future, it will be possible for non-astronauts to go to space, when private companies achieve the aims of their research and development programs. There are multiple challenges related to both space exploration and commercial space flights. This chapter focuses on one key challenge that applies to both: multisensory space food experiences.6 Regardless of where we are in the universe, we all need to eat if we want to survive! 
Eating is a unique event occurring typically several times a day that usually involves most, if not all, of our senses

Box 4.1 The overview effect The overview effect refers to the unique feeling of awe evoked by seeing the Earth from a unique perspective, i.e. from outer space, which leads to a cognitive shift and to deep changes in the astronaut’s awareness and perspective of Earth and an enhanced sense of responsibility about it (see an inspirational video on the effect here: https://www.youtube.com/watch?v=CHMIfOecrlo). Given the seemingly powerful effect, researchers have tried to replicate this experience through VR technology7. The upper part of the figure visualizes the step-by-step of the

58 | E ATI N G : A F U N DA M E NTA L E V E RY DAY LI F E E XPE R I E N C E (see Chapter 2). Moreover, our understanding of the role of the senses in food experiences has grown steadily in recent years and we have some ideas of what we can do with them to shape our eating experiences (Chapter 1 and Chapter 3).

4.2  Eating: A Fundamental Everyday Life Experience Currently in space, food sources are limited or non-existent, and yet astronauts (and future non-astronauts) require the correct quantity and quality of food to live. But beyond nutrition, food plays a much broader role in human society. Experiences go well beyond nutrition to affect almost every aspect of our human lives. Astronaut and former ISS captain Scott Kelly’s book Endurance8 captures at its start the broader role of foods for humans, in the context of current space exploration: “I’m sitting at the head of my dining room table at home in Houston, finishing dinner with my family [ . . . ] It’s a simple thing, sitting at the table and eating a meal with those you love, and many people do it every day without giving it much thought. For me, it’s something I’ve been dreaming of for almost a year. I contemplated what it would be like to eat this meal so many times, now that I’m finally here, it doesn’t seem entirely real. The faces of people I love that I haven’t seen for so long, the chatter of many people talking together, the clink of silverware, the swish of wine in a glass—these are all unfamiliar. Even the sensation of gravity holding me in my chair feels strange, and every time I put a glass or fork down on the table there’s a part of my mind that is looking for a dot of Velcro or a strip of duct tape to hold it in place. I’ve been back on Earth for forty-eight hours.”

Kelly wrote this book after a year on the ISS, his final space mission with NASA. Perhaps unsurprisingly, his book starts with a reflection about being around the dining table. In it, he highlights social (family and loved ones), environmental/ cultural (cutlery), and sensory elements (associated with gravity), among others, which we often take for granted when we eat. Food is a source of energy for humans, and it also plays key roles in our emotions, social interactions, and cultures, all of which shape individuals and societies. It may be difficult to imagine being in space for a year, but we can find parallel experiences that may facilitate putting ourselves in that position and seeing some of the key roles that food plays in our lives.



BEYOND THE KNOWN AND INTO THE UNKNOWN FUTURE

| 59

Put yourself in the following situation. You are selected, with a group of ten others, to go on three camping trips in the next ten years in a remote area of Norway’s Svalbard archipelago in the Arctic ocean. The trips will last for six days, six months, and eighteen months, respectively. Each trip involves the same sort of food that you usually bring with you when camping—some canned food, perhaps some snacks, but nothing too elaborate because you have to travel ‘light’. For the longer travels, every now and then a team will fly by the area where you are and drop additional food provisions. What would you miss most in each of the travels? Would you be fine living on ‘camping’ food regardless of the length of the trip? What would you miss most, relative to the food experiences that you have in your everyday life? Would you miss eating with somebody, somewhere, with some tools, and/or specific kinds of food? Many of us would not like to miss the opportunity to eat a bar of chocolate every now and then, to be able to cook and eat with our relatives, and/or to have special foods for special occasions (perhaps turkey during Christmas time). However, when camping, you probably do not have the chance to have all of these or have easy access to them. And for some of us, a camping trip of six months or a year and a half already appears long enough to live only on camping food! Although the duration of space journeys to the Moon and Mars can vary depending on several factors, e.g. orbits, going to the Moon may take approximately three days and going to Mars approximately six to eight months (Figure 4.1). These experiences are similar to camping in that probably the food options, as they stand today, will be relatively limited and mostly designed to keep space travellers nurtured. The more time passes, the more likely you will realize the value of different aspects of eating (eating with relatives, having access to special foods in specific events, and so on). 
Nevertheless, eating experiences in space are very different from camping. Travelling to your space destination means being confined in a relatively small spaceship that will be your home, as well as that of another few humans, and which will be where all your living takes place. Additionally, you may experience microgravity, or if technology allows, artificial gravity. You may be surrounded by the same spaceship and space landscapes for most of the time, and you will be exposed to the same people, who, while familiar to you, may not be as close as a family member or a close friend back on Earth. To make things more complicated, your senses may go through certain changes in space. For example, astronauts sometimes report that what feels flavourful on Earth sometimes feels bland in space.9

60 | E ATI N G : A F U N DA M E NTA L E V E RY DAY LI F E E XPE R I E N C E

6–8 months 6hr ISS International Space Station

Earth

3 years & 2 months

3 days

Transit (short - long transits)

Moon

Lunar and Planetary Surface

Mars

Figure 4.1  The image on the top shows the approximate duration of possible space flights. The lower figures show most important things in relation to eating that a group of 215 British participants reported when eating on Earth and those things that they would desire and would not want to miss on a space trip to the Moon and Mars10. Figures reprinted from “Space food experiences: Designing passenger’s eating experiences for future space travel scenarios’ by Obrist, Tu, Yao, and Velasco (2019)/CC BY.

In summary, designing food experiences for space travels requires thinking about nutrition, but also the sensory, emotional, environmental, and social aspects of eating, all of which support us as individuals and in society. Multisensory experiences, with the aid of technology, can inspire the design of future space travel concepts that enhance or augment food experiences, so that humans can be supported in enduring through space travels. With this, we do not want to minimize the importance of the functional aspects of space foods. Indeed, development of food for space travellers is not an easy task. Researchers have been working intensively on this for decades and there are still many unsolved challenges, including how to germinate and grow plants in space. Importantly, eating is a fundamental aspect of our lives and involves a number of additional elements that need to be considered to support space travellers in their journeys. We have recently contributed to address this challenge, specifically, how to design multisensory eating experiences for future space travels that account for



BEYOND THE KNOWN AND INTO THE UNKNOWN FUTURE

| 61

aspects beyond functional needs (food for health) and that also account for the experiential needs, e.g. the act of eating, the communal aspect of cooking. This may sound far-fetched and it is certainly not something that we need to develop immediately, but in the next decade we will increasingly be facing questions linked to human experiences beyond the basic need for survival in space, given the proliferation of space exploration and commercial space projects. The next sections describe the multisensory experience design concepts we have been co-developing that aim to tackle nutritional, emotional, environmental, and sociocultural aspects of eating during space travels. In particular, we present three design concepts upon which we have collaborated with our colleagues Yunwen Tu (Tutu Food Design) and Lining Yao (Carnegie Mellon University).10 As in Chapter 3, this chapter follows the same presentation format (xSense) of each concept.

4.2.1  Spice Bomb Mixing: What if You Could Season Space Food with Multisensory Rich Flavour Elements, Emotions, and Social Interactions? Background: This experience was designed to augment space travellers’ flavour experiences in a playful and collaborative way (Figure 4.2); they can mix foods together and thus enhance the emotional and social dynamic of the ­eating process. Spice Bomb Mixing is a flavour-enhancing seasoning design concept that aims to increase food flavour in the rather limited food landscape associated with space travels.10 While food is at the centre of this design, the idea is motivated by the fact that eating in space can be rather isolating and bland. Coupled with the reduced sensory perception of food in microgravity, this could negatively affect someone’s mood, eating behaviour, and consequently impact a person’s emotional wellbeing over time. Impression: Flavour richness in space. Event: Preparing and eating in space with others. Sensory elements: Your senses may work slightly different as they do on Earth due to the effects of microgravity. The sense of proprioception (movement) is important as you are possibly floating. There are different spice bomb modules (that contain flavours like black pepper, cumin, and yellow curry), which can each enrich the flavour of space food. These different flavours are visually rich in terms of colour and shapes and mixed in a mixing pod, which may also create a characteristic sound when the flavours are mixed.

62 | E ATI N G : A F U N DA M E NTA L E V E RY DAY LI F E E XPE R I E N C E

Figure 4.2  Top image illustrates space travellers mixing spice bombs and ingredients by shaking and throwing the mixing pod in microgravity. Bottom image illustrates the mixing pod where different spice bomb modules can be mixed. The longer a person shakes the pod, the stronger the food tastes. Figure reprinted from ‘Space food experiences: Designing passenger’s eating experiences for future space travel scenarios’ by Obrist, Tu, Yao, and Velasco (2019)/CC BY.

Concepts: All sensory elements are both temporally and spatially synchronous. The concept also capitalises on crossmodal correspondences between movement and flavour intensity. The stronger the mixing pod is shaken, the stronger the resulting flavour. The different flavours are presented visually through different shapes and colours, which may either represent a given taste (through crossmodal correspondences such as red for sweet) or a specific flavour or food (semantic congruence such as cilantro in green colour and cilantro shape).

Enabling technology: This experience requires minimal technology. The key devices used here are the spice bomb modules and the mixing pod.

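As a side note, the shake-to-flavour rule described above can be sketched in a few lines of code. The function and its parameters below are purely illustrative assumptions, not part of the published concept: flavour intensity grows with shaking time and saturates at a ceiling, mirroring the crossmodal correspondence between movement and flavour strength.

```python
import math

def flavour_intensity(shake_seconds, max_intensity=1.0, rate=0.15):
    """Map shaking time to released flavour intensity.

    A saturating curve: longer shaking releases more flavour,
    approaching (but never exceeding) `max_intensity`.
    All parameter values here are hypothetical.
    """
    return max_intensity * (1 - math.exp(-rate * shake_seconds))

# Short shake -> mild seasoning; long shake -> near-maximal flavour.
print(round(flavour_intensity(5), 2))   # mild
print(round(flavour_intensity(20), 2))  # strong, close to the ceiling
```

A saturating rather than linear mapping is one plausible design choice: it keeps vigorous mixing rewarding while preventing over-seasoning in microgravity.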
4.2.2  Flavour Journey 3D Printer: What if You Could Have a 3D Printed Multisensory, Multicourse Dining Experience in Space?

Background: This experience was designed to provide a space traveller with a range of customized multisensory food experiences. The traveller can design her desired food or ask for a recommendation from family, friends, or chefs back on Earth.10 The recipes can be customized in terms of flavours and nutrients and can be prepared with a food 3D printer (Figure 4.3). The Flavour Journey 3D Printer design concept could, for example, consist of a printed food in the form of a 'bar' that integrates several courses and allows a customized flavour

[Figure 4.3 labels: Order food → Develop flavor profile → Deliver recipes → 3D print food → Eat a meal → Communicate with chefs or family members]

Figure 4.3  Both images on the left illustrate the 3D printer, whilst the upper right image shows the logistics needed to prepare the experience, and the lower right image shows a food bar concept that captures the idea of the ‘flavour journey’. Figure reprinted from ‘Space food experiences: Designing passenger’s eating experiences for future space travel scenarios’ by Obrist, Tu, Yao, and Velasco (2019)/CC BY.

journey. This experience allows the space traveller to access familiar flavours and/or to be surprised by novel food experiences.

Impression: Customized flavour journey in space.

Event: Having a multicourse dining experience in space.

Sensory elements: In addition to the effects of microgravity on space travellers, there are different flavours and nutrients that can be used to create a specific flavour journey. These flavours are presented visually, and their shapes, colours, and textures can be controlled. The food involves multiflavoured, multifunctional, and multitextured characteristics that can be assembled bit by bit to create a customized multisensory dining experience. The food could be a bar that integrates several courses, namely starter, main, and dessert, in one print.

Concepts: All sensory elements are both temporally and spatially synchronous. In addition, space travellers may choose to align the cues in terms of a given food identity or meaning. Perhaps they may want Italian–Austrian style 'spaetzle', a Colombian 'empanada', or any other flavours associated with a typical dish from a given cuisine.

Enabling technology: This experience is facilitated by food 3D printing technology that allows the customization of nutrients and flavours. This technology is based on physical voxel fabrication through a printer that deposits multifunctional voxel spheres.11 A voxel is a three-dimensional pixel that can be used repeatedly to render a larger object (Figure 4.3 left). In other words, a voxel is a building block for layered manufacturing. Imagine a printer capable of making food consisting of tens or even hundreds of small sphere voxels that are generated in real time. Some spheres could be hard, some soft. Some could be infused with a particular aroma, some with a particular taste.
Multiflavoured, multifunctional, and multitextured food can be treated as a digital material and assembled in real time to create a personalized eating experience in space.
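The idea of food as a 'digital material' built from voxels can be made concrete with a small sketch. Everything below (the class name, the flavour attributes, the course layout) is an illustrative assumption, not taken from any actual food-printing system: a meal is modelled as an ordered sequence of voxel layers, each voxel carrying its own texture, taste, and aroma.

```python
from dataclasses import dataclass

@dataclass
class Voxel:
    """One printable sphere: the basic building block of a food object.

    The attribute names are hypothetical, chosen only to mirror the
    properties described in the text (hard/soft, taste, aroma).
    """
    texture: str   # e.g. 'hard' or 'soft'
    taste: str     # e.g. 'salty', 'umami', 'sweet'
    aroma: str     # e.g. 'herb', 'smoke', 'vanilla'

def flavour_bar(courses):
    """Assemble a 'flavour journey' bar as a list of voxel layers,
    one layer per course (starter, main, dessert)."""
    bar = []
    for course in courses:
        layer = [Voxel(**course["voxel"]) for _ in range(course["n_voxels"])]
        bar.append(layer)
    return bar

# A three-course bar: starter, main, dessert, printed as one object.
bar = flavour_bar([
    {"n_voxels": 20, "voxel": {"texture": "soft", "taste": "salty", "aroma": "herb"}},
    {"n_voxels": 40, "voxel": {"texture": "hard", "taste": "umami", "aroma": "smoke"}},
    {"n_voxels": 15, "voxel": {"texture": "soft", "taste": "sweet", "aroma": "vanilla"}},
])
print(len(bar), sum(len(layer) for layer in bar))  # 3 courses, 75 voxels in total
```

Treating each sphere as a data record is what makes the customization described above tractable: swapping a course, a texture, or an aroma is just a change to the recipe data, not to the printing hardware.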

4.2.3  Earth Memory Bites: What if You Could Simulate Your Memories of Earth Through Multisensory-Augmented Food Experiences during Space Travel?

Background: This experience was designed to support space travellers by alleviating psychological distress when they feel homesick, experience isolation, or generally wish for a change of scenery. The Earth Memory Bites design concept offers space travellers two options to connect with Earth through food.10 One option is to order a given Earth flavour (in the form of a bite) and select a familiar dining environment for a particular meal. The second option is to

the taste, aroma, and mouthfeel of 'Speck' (Italian smoked pork), as well as the extrinsic elements that accompany them (the look and sound of the Italian Alps), are aligned as a function of an Earth identity or meaning (South Tyrol). This semantic congruence could be based on any given region, culture, or specific moment or experience associated with Earth.

Enabling technology: This experience is enabled through a range of technologies: VR, projection mapping, an immersive sound system, multisensory technology to deliver smells and other environmental sensory elements (wind, temperature), and a food 3D printer. It is further extendable through the integration of intelligent systems, which can use AI and machine learning to create new food–environment pairings over time.

4.3  What Have We Learned about Multisensory Space Food Experiences?

Spice Bomb Mixing (Section 4.2.1) involves little technology and is designed to season food experiences with a variety of flavour components, emotions, and social interactions. This makes food a means to support the emotional and social aspects associated with eating, as well as a more satisfying end in itself.

Flavour Journey 3D Printing (Section 4.2.2) was designed to emulate what some chefs refer to as the 'flavour journey', and to enrich the experience of eating through a 3D-printed journey (starter, main, and dessert) of specialized foods. This concept allows the customization of food colours, shapes, and textures to either mimic or enrich known food experiences and/or to develop new food concepts. Considering the changes that our senses go through in space, this concept also aims at augmenting the experience of food through internal food elements (textures) and external ones (colour).

Earth Memory Bites (Section 4.2.3) offers symbolic, familiar foods that resemble key memories from Earth. These foods can be eaten in specialized multisensory, social, virtual environments that have been designed to enhance the dining experiences.

These concepts illustrate only initial steps for designing multisensory food experiences for future space travellers. Many more scenarios are possible that may augment eating experiences in space with and without the aid of technology. What impression should we design for, and what would be a worthwhile event to create a multisensory dining experience in space, perhaps a 'first birthday' away from Earth?




Where we eat and how we eat have changed in the evolution of humankind. We have gone from simple tools to facilitate eating all the way to digitizing our food experiences. Until space travel commences and we truly explore the universe and colonize other planets, we will not know exactly how eating will be different. However, we can start preparing for these changes by considering the way in which we will shape our human experience beyond Earth. As we venture into outer space, perhaps we might start appreciating food and its origin more, because it will be difficult to obtain, especially fresh foods like vegetables and fruits. In the initial transition away from Earth, we will crave the familiar and will need, in one way or another, to connect our experiences to Earth. However, as time passes, we will incorporate new elements from the environments and contexts to which we will assimilate. Designing multisensory food experiences in space can provide a means to merge the known and the unknown while retaining our 'Earthness'.

4.4  What Challenges Accompany the Opportunities of Designing Multisensory Experiences?

While we have thus far discussed the opportunities in designing multisensory experiences, designing these experiences could also produce unexpected threats, e.g. favouring some and harming others. What if all the data accumulated about how the senses work, as well as the novel sensory-enabling technologies currently in development, were possessed by a small and elite group of people? Going further, what if machines (humanoid robots, AI systems), and not humans, were in charge of designing multisensory experiences in the future? What if machines could control what experiences we have, e.g. decide what and how we eat?

Let us consider our scenarios of designing multisensory eating experiences in space in light of these questions. Imagine we are going on a space flight of long duration (at least two years). While we are passengers on the spaceship, multiple sensors are tracking our eating habits, preferences, and choices of Earth Memory Bites. All this information is fed into an intelligent system that refines its recommendations based on human input. The system is learning over time. While initially it will follow the pre-defined instructions for the food–environment pairings in the Earth Memory Bites concept, over time it will strive to optimize the multisensory experiences and even design new food–environment pairings that are personalized to the passengers. The system will learn the passengers' eating patterns and preferences, and may even know the 'meaning' of specific food–environment

pairings for specific purposes, e.g. an anniversary or Christmas dinner. It will also learn to detect and decipher specific emotional states through facial expressions and thus will be able to offer Earth Memory Bites to enhance the travellers' mood and elicit social interactions with other passengers. Not suddenly, but gradually, our Earth Memory Bites are transformed, and new multisensory experiences are carefully designed by a machine (i.e. a set of algorithms).

Now imagine that this machine evolves, transforms, and finally becomes a part of a group of exoplanet colonizers who have never experienced Earth, and whose eating experiences are fully computerised. What are some possible consequences of this scenario? How would we react to not deciding what we eat, or perhaps to being put in experiences that follow a 'how we should eat' procedure? Who decides, in the end, what we should eat, and how, via what multisensory experience? Will the decision rest with the individual, the machine, the spaceship captain, a doctor, or someone else? What would you feel comfortable with?

The evolution of humankind is based on change, so we should be 'OK' as the machine takes over designing our multisensory eating experiences, shouldn't we? Perhaps we would never notice the difference, as the change is made intelligently and by giving us the impression that we are in control. In fact, many everyday activities are already controlled by intelligent systems: those that facilitate our search behaviour on the Internet, that propose music and movies (such as on Spotify and Netflix) based on our past preferences, and the wide range of social media advertisement ecosystems reflected, for example, in Facebook ads based on your purchases on Amazon.
Machines are in a way already 'controlling' certain aspects of our everyday experiences, and we are often not even aware of it (or accept it because of convenience); all of this is reminiscent of Huxley's Brave New World. Huxley described a society that is given SOMA, a sort of drug designed to make people feel happy and, in one way or another, to escape their reality to more pleasurable contexts. In this world, people are also entertained with multisensory experiences such as the scent organ, a technologically advanced device that entertains by presenting scent stories that give pleasure to people and which perhaps also reduces any strong sentiment against the status quo (see epigraph, Chapter 3, for a description of a scent organ experience). Furthermore, in Brave New World, Huxley also describes 'The Feelies', a sort of cinema of sensations that involves not only sight and sound but also touch (which parallels today's 4D cinema experiences). Such dystopian novels and science fiction films give us plenty of examples that teach us to be cautious about the role of multisensory technologies and the design of multisensory experiences in the future.

But if we take an exponential leap forward, 1,000 or 100,000 years from now, we will likely have changed our way of eating with or without the




influence of machines, and our experience of the worlds around us (whether physical or virtual) will likely have transformed to an extent that we cannot yet imagine. Maybe we will even have found a way not to eat, but to gain our 'fuel' some other way, for instance through our skin. If so, what new roles would our senses of taste and smell play? Would touch, or a technology-augmented sense, become the dominant sense in accepting or refusing food as fuel to survive?

Despite all possible and imaginable futures, what is clear today is that multisensory experiences can inspire a range of design directions. As multisensory technologies become more integrated into everyday objects, surroundings, processes, and even people (such as through implants), the boundaries between reality and virtuality are increasingly blurred. Today, we may still be able to distinguish between humans and humanoid robots, but as technology advances, humanoids will become more human-like, and humans, through advances in bio-engineering, could evolve to become human+, a generation of humans that are more advanced in their sensory, cognitive, and physical abilities.

With the opportunities of human multisensory augmentation and the consequent transformation of multisensory experiences also comes the challenge of over-customization and optimization of experiences. In other words, if we control too much of what we can see, feel, hear, taste, or smell, we risk creating bubbles and subgroupings of people that resist diversity (reminiscent of the concept of epistemic bubbles, though with a sensory twist). Despite our best possible intentions, we may deprive our senses of the fullness and richness of experiences and facilitate a process of self-selection in terms of what dominates in the sensory world of different groups of people. We may also create sensory overload by providing ourselves with more sensory capabilities through implants and gene manipulation. We may sense and experience more than what makes us human today.
At what point do we interfere too much with what makes us human, and what is still considered acceptable? A key question is also how we come to define the real and the virtual if, from a first-person perspective, we reach a point where it becomes impossible to distinguish between reality and virtuality. Will there be a multisensory reality in the future, or will it all be an impression of reality carefully crafted by someone (or something) to provide us with multiple multisensory experiences? The essence of a 2017 TED talk by Professor Anil Seth (see Chapter 2) is that '. . . our experience of reality is a bunch of hallucinations we collectively agree on', although with the emphasis that '. . . we don't just passively perceive the world, we actively generate it'.12 We have agency in this world, and as such, we have the responsibility to help shape it.


4.5  Why and How Should We Think of Multisensory Experiences?

It is important to keep in mind that we are, to a certain level, in control of what multisensory experiences we design for, and we can utilize these opportunities for good. We can actively create and shape the future. The TREE VR experience in Chapter 3, which allows us to see the world through the eyes of a tree, hints at the fact that designing multisensory experiences can help tackle important challenges currently facing humankind, e.g. deforestation and climate change. Multisensory experiences enabled through technology can increase people's awareness of an important issue such as deforestation and, above all, elicit personal responses that would otherwise not exist when the problem is distant from daily life. When an individual can 'see' the world through the 'eyes' of a tree and 'grow' in a forest until she is 'cut down', the experience is powerful; perhaps the user will think and act differently in the future. This new memory and impression are enabled by multisensory technologies.

Similarly, multisensory technologies can make a difference to an individual's quality of life. As we age, changes in our sensory systems result in reduced hearing and sight capabilities, which are relatively well corrected via hearing aids and glasses. However, there is as yet no such aid for our other senses, despite the fact that we know that our senses of smell, touch, and taste also go through changes with age. We are less likely to enjoy the food we eat, as it is perceived as less flavourful, and we are less likely to enjoy the sweet scent of a sunflower as our sense of smell diminishes over time.13 In cases like these, multisensory technology can help create, for example, multisensory food experiences that, combined with social elements, enhance flavour perception, e.g. Spice Bomb Mixing, designed to create flavour richness in space food experiences.
The multisensory food experiences designed for space travel may benefit the elderly by positively changing their flavour experiences (making food more palatable), as well as children, by helping to train their flavour palate (creating a range of both familiar and unfamiliar flavour experiences). Whilst multisensory experiences with and without technology have the potential for doing 'good', they also have the potential to be misused. With this in mind, Chapter 5 discusses our perspective on the implications of, and responsibilities associated with, multisensory experiences.

CHAPTER 5

Illustration 5  As we make one step after another forward in multisensory experiences, discovering a range of opportunities, we also have to think responsibly about the impact of each step. Indeed, sometimes we may need to take a step back and reflect on what it is that we design, why, when, for whom, and how.

Multisensory Experiences: Where the Senses Meet Technology. Carlos Velasco and Marianna Obrist Oxford University Press (2020). © Carlos Velasco and Marianna Obrist. DOI: 10.1093/oso/9780198849629.003.0005

Laws of Multisensory Experiences

The secret of getting ahead is getting started. The secret of getting started is breaking your complex overwhelming tasks into small manageable tasks, and starting on the first one.
– Mark Twain

This book began with a description of our excitement for multisensory experiences and the growing endeavour to design them with technology. We are not the first, nor will we be the last, to be fascinated by the power of the senses and the opportunities to shape the impressions that we develop of the world around us. The senses are a fundamental part of what makes us human. At the same time, technology has also become a fundamental part of our life. Mobile phones have become one of the most evident examples of the use of technology to extend or augment our human capabilities. We can store our data and memories in the form of pictures, videos, and audio files. To date, though, technology cannot easily capture smells, or the touch of a loved one. However, a number of researchers are working to make that possible in the future.1 They are developing new devices and systems that allow us to interface with digital worlds through all of our senses (as illustrated in Chapter 3). Today, while this interface is still limited, it is, however, evident that many of our daily life experiences happen in mixed reality, where sensory aspects of the




physical and digital world merge seamlessly. Our physical reality is augmented, enhanced, and transformed through the integration of technology. This transformation comes with opportunities as well as challenges. This chapter revisits the journey we have taken you through in the first four chapters, and then shares our vision of the future of multisensory experiences: our Laws of Multisensory Experiences.

5.1  Stringing It All Together

Chapter 1 discussed our fascination with understanding and designing multisensory experiences, which have also intrigued philosophers, psychologists, artists, technologists, futurists, and many others. It showed that habitual activities such as eating or going to an art gallery involve rich multisensory worlds that we can enhance or modify to create completely new experiences. For example, it explained how Tate Sensorium, an exhibition in a traditional art gallery, brought together practitioners and academics to design art experiences that, aided by the use of new multisensory technologies, stimulated all the senses. This enabled not just seeing, but feeling, hearing, and even smelling and tasting art.

Why do we need multisensory experiences? Can we not just enjoy art through looking at it, or maybe by touching it (sculptures or objects)? Why would we want to smell or taste art? Human life experiences are multisensory in nature; that is, we interface with our environments with all of our senses. Indeed, perhaps our most impactful and memorable experiences involve multiple senses, such as eating or going to a concert. We wrote this book to illustrate the 'why' and to describe how we have agency, i.e. a level of control, when it comes to designing the multisensory worlds in which we live, and ultimately, the impressions that we develop from them.

But what are multisensory experiences? Chapter 2 approached this question first by discussing what experiences are and what role the senses play in them. To enrich this discussion, we posed these two questions to experts from different backgrounds. The experts agreed that the senses are a sort of gateway for acquiring information about ourselves (our bodies) and the world around us (our environments), and are therefore key to our experiences.
Following this discussion, it defined multisensory experiences as impressions formed by specific events whose sensory elements have been carefully crafted by someone. For example,

to create the experience of a sunflower, colours, shapes, textures, and smells can be considered and put together in a given event to deliver the sunflower impression. The senses are placed at the centre of the formation of the impression of the sunflower, even in the absence of a real flower. The chapter also presented some concepts for selecting, mixing, and matching the sensory elements available to design multisensory experiences, including temporal, spatial, and semantic congruence, crossmodal correspondences, sensory dominance, and sensory overload.

In order for the reader to become part of the imagination, creation, and formation of multisensory experiences, the chapter introduced the xSense Design Cards, a companion tool for this book, which is used in the subsequent chapters to analyse and reflect upon some exciting examples of multisensory experiences. xSense aims to provide an easy-to-use instrument that includes all the basic components of multisensory experiences in the form of different cards (see Appendix), so that their user can become that 'someone' who carefully crafts (or analyses) the sensory elements of specific events that give rise to the impressions aimed for. We encourage the reader to take a look and try designing a multisensory experience.

Chapter 3 looked at multisensory experiences in the context of new technologies. There are multiple initiatives to develop new ways to better connect and integrate our senses in physical and increasingly digital environments. These initiatives are enabling the design of multisensory experiences along the reality–virtuality continuum, which describes how multisensory experiences can range from reality, through mixed reality, to full virtuality. In other words, experiences can involve both physical and digital sensory elements.
To illustrate this, the chapter presented a collection of eight representative multisensory experience examples moving along the reality–virtuality continuum and used xSense to analyse them.

Chapter 4 showed how multisensory experiences may further be used to tackle future challenges facing humanity (from deforestation, through climate change, to space exploration). Additionally, using xSense, the chapter presented and analysed three multisensory experience concepts to overcome some of the challenges associated with space food and travel (limited food and companion supply/choice): Spice Bomb Mixing, Flavour Journey 3D Printer, and Earth Memory Bites. These were designed, through multisensory food experiences, to help space travellers address the sensory, emotional, social, and environmental challenges faced in space. These concepts helped to illustrate the role of




multisensory experiences in space exploration but, most importantly, to highlight the role they can play for individuals and society.

Chapter 4 reflected on the worries and fears that emerge through technological advancements and paved the way for this final chapter. This chapter looks at these worries, but also at the future opportunities for multisensory experiences in light of new technologies, through the lens of our definition of multisensory experiences. It raises specific questions that must be considered for multisensory experiences, which encompass the why (the rationale/reason), what (the impression), when (the event), how (the sensory elements), who (the someone), and whom (the receiver).

5.2  Considerations of Multisensory Experiences as per Technological Advances

Let us take a step back and consider the following point. Today we have the power to design, at least in part, the impressions that people have by crafting sensory elements increasingly enabled through technology. While this is definitely exciting, it is not as clear cut as we may initially think. Indeed, if we were fully accurate in delivering the experiences that we design for, the world would look a bit different, perhaps more like Huxley's Brave New World, where there is a direct mapping between sensory elements, events, and impressions. While in Huxley's novel people do not necessarily have much of a choice over some of their experiences, we do, and we should. Whoever designs a multisensory experience needs to remember that the impressions crafted can be received differently depending on a person's prior experiences, preferences, expectations, and choices. People are not passive receivers but active actors in the generation of experiences. Therefore, it is important to consider the receiver carefully and give them a choice.

At the same time, though, it is very likely that the accuracy with which we can map sensory elements, events, and impressions will increase and gain precision in the future. We are becoming better at developing multisensory technologies and devices that allow precise control of sensory elements. In addition, many of those devices are not unidirectional, i.e. they not only control and deliver sensory inputs, but also capture information through advanced sensing technology. For example, systems can detect your presence in a room, your facial expressions, and your physiological responses such as heart rate, just to mention a few. This information can guide the selection and control of sensory elements for specific events.

Let us consider the example of Philips Hue, a relatively popular system of lightbulbs, a controller, and an app that allows you to control the hue and brightness levels of your lights at home. With this system, you control the lighting levels at home to create, for example, a relaxing or invigorating atmosphere. But what if the Philips Hue system could automatically detect your mood through sensors in your home or in your phone and adjust the lighting to influence your mood? Now imagine that, over several months or years, you and many other people have been using this system, and its increasing computational power allows a better prediction of the impression you may want. So far, so good. However, what if the control moves away from you? What if AI is used to predict your wanted impression, or perhaps someone else, e.g. another human, controls the lighting system? What if you did not give permission (or did not realize you did) for the system or another human to control the lighting?

These questions not only apply to lighting systems but to all events in which the sensory elements can be controlled. Given that we are increasingly living in mixed realities, the devices and technologies we use involve a series of sensory elements that can be controlled by us, by others, and by machines. What if a critical life event is at stake, e.g. giving birth, finding your partner, going through surgery, or finding a job? What is common to all of those events is that technology is already, or is increasingly becoming, a key part of them. For example, dating methods for finding a partner have changed dramatically through technology. Dating apps have opened up new ways for people to meet potential partners. In many apps, users can see or even hear potential dates.
But what if multisensory technologies allowed users to share a drink over the Internet with a potential date, or perhaps increased the accuracy of the matching function based on scent compatibility (how much potential daters like each other's scent)?2 What if the first date with a potential partner were fully controlled by the matching system (the place, sounds, dress code, perfume, food)? In other words, what if your dating process were, from beginning to end, guided by an intelligent system?

If multisensory experiences become more accurate and powerful over time (see Box 5.1), what are the potential consequences for humans? Who should have the power of designing scalable (for the many) multisensory experiences, and what sort of guidelines or codes of conduct should people have when designing them? What are the implications of multisensory experiences at the micro level (individuals), meso level (social institutions), and macro level (global perspective)? The answers to these questions are not black and white, but there are multiple thought guidelines that we may follow when considering designing experiences.



L AWS O F M U LTI S E N S O RY E XPE R I E N C E S

| 77

Table 5.1 takes each component of our definition of multisensory experiences and considers the consequences for the design rationale/reason (why), the impression (what), the specific event (when), the sensory elements (how), the someone who crafts them (who), and the receiver (whom).

Table 5.1  Components of multisensory experiences and related questions and considerations.

Background (Why): Why do we want to design a given experience? What is the reason and rationale for such an experience?

Impressions (What): What impressions do we want to create? How do we know what is real, mixed, or VR in light of technological advances? What reality will we take as a reference point for what is real?

Events (When): What events should we design the experiences for or not? How do we draw the line between common and important life events?

Sensory elements (How): What sensory elements should we select and why? Are there trade-offs? Should we personalise them? Should knowledge about the senses and multisensory technology be accessible to everyone?

Someone (Who): Who crafts the multisensory experience? Should certain humans, all humans, and/or machines be allowed to be the 'someone'? Do we give equal access to citizens, industries, politicians?

Receiver (Whom): For whom are we designing? Are we designing for friends, colleagues, communities, whole countries, vulnerable populations? Should we be free to decide for 'whom' we design? Should machines be considered a 'whom', given their increasingly human-like intelligence and sensing capabilities?

Box 5.1 The accelerating pace of change and exponential growth in computing power

The growth of computing capacity is not linear but exponential, a phenomenon known as Moore's Law after Intel cofounder Gordon Moore, who described it in 1965. Simply put, this law observes that computing power increases while costs fall. More recently, the growth in computing power has been put in the context of Ray Kurzweil's work 'The Singularity is Near'3, in which the accelerating pace of change and exponential growth in computing power lead to the Singularity in 2045. The singularity is that point in time when the advances in technology, particularly in artificial intelligence, will lead to machines that are smarter than human beings. Multisensory experiences are situated at the right end of this technology growth curve, enabled not only by technological advances but also by a growing understanding of how our brain and senses work. While machines may surpass human brainpower in the not-so-distant future, their ability to go beyond performance measures and towards experience measures is still uncertain.
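As a back-of-the-envelope illustration of what 'exponential' means here, consider a toy model in which capacity doubles at a fixed interval. The two-year doubling period is an assumption chosen purely for illustration (Moore's original observation concerned transistor counts, and the exact period has varied over time):

```python
# Toy model of exponential growth in computing capacity, assuming a fixed
# doubling period (two years, chosen here purely for illustration).
def capacity(years, doubling_period=2.0, baseline=1.0):
    """Relative capacity after `years`, given a fixed doubling period."""
    return baseline * 2 ** (years / doubling_period)

print(capacity(10))  # 32.0 -- five doublings in a decade
print(capacity(80))  # about 1.1e12 -- roughly a trillion-fold over 1965-2045
```

The striking property of such a curve is that most of the total growth happens at the very end, which is why exponential change can feel slow for decades and then abrupt.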


5.3.1  The First Law

Multisensory experiences should be used for good and must not harm others.

This law aims to guide the thinking process related to the question: what impressions and events do we want to design for, and why? The answer should always be that the reasons, events, and impressions must not cause any harm to the receiver, nor to anyone else. Multisensory experiences should be used for good. For example, if you want to enjoy a sweet drink without using sugar, you may want to design the impression of 'sweetness' by using red light to season the drink (see Wine Tasting Experience in Section 3.1.1). This does not necessarily harm anyone; instead, it can benefit you. However, if you are the manufacturer, you may also want people to consume more of your drink, so you might think of using another light hue (say, green) to reduce the sweetness impression and, potentially, increase the consumption of sweet drinks. In this case, you may actually harm people by not considering the growing health concerns linked to sugar consumption.5

We have talked about how impressions may or may not harm the receiver or others. However, we also need to think about the events. Would it be ethical to design the experience of having a sweet drink when the receiver has diabetes? Would it be right to design the experience of having a baby or fighting in a war? These questions do not have black and white answers, but the general rule here is to aim to do good and avoid harming others, through both the impressions and the events for which you want to design.

5.3.2  The Second Law

Receivers of a multisensory experience must be treated fairly.

This law aims to consider for whom we are designing, i.e. should we design differently for different receivers? We must identify the receiver and their particular characteristics, e.g. is the receiver a friend? The target of a business? The citizen of a country? A child or an elder? What if your receiver has, for example, a mental or physical health disability? Each receiver has his/her/their own unique characteristics, and it is our responsibility to treat them fairly when we are considering the impressions that we want to form through an arrangement of sensory elements. We want to consider the differences between adults and children when designing experiences for them. However, we also want to be careful when creating intelligent systems that may capture some of our human biases related to different receivers.




Consider the following example. A group of friends in Western Europe develop a multisensory intelligent system that uses coloured lights and music to create the impression of satiety and thus reduce both food consumption and waste. The specifications of the sensory elements will likely differ depending on whether your receivers are children or adults, and this is desirable because the two groups react differently to food. So far, so good. But now consider that this intelligent system was developed by, and based on the behaviour of, people from western, educated, industrialized, rich, and democratic countries.6 If this system were used in a small community in Borneo, would it evoke the same impression? Humans and intelligent systems have biases,7 which we need to consider when designing experiences. Not only should we treat receivers fairly by balancing their differences and giving them all the same opportunities, but we must also empower them by giving them a voice in multisensory experiences. In other words, receivers do not just passively receive; they may adjust experiences to their own needs.

5.3.3  The Third Law

The someone and the sensory elements must be known.

This final law aims to address two questions: first, who is crafting the multisensory experience, and second, what sensory elements do we select, and why? With this law we call for transparency in terms of who designs, what knowledge guides the design, and what sensory elements are chosen to craft an impression. We understand that not all information may be provided up front to the receiver, although the receiver must have easy access to such information if they want it. Remember the Meta Cookie from Section 3.1.5? In the same vein, perhaps you go to a restaurant, receive a multisensory VR headset from the hostess, and start eating sushi in mixed reality. You believe you are trying different types of sushi. However, you are actually given several pieces of sushi that are exactly the same; only their appearance and smell are modified digitally. This way, you can experience different flavours in mixed reality. In this example, it is likely that you know in advance what the sensory elements are (through the menu), and if not, any surprises will be brought to your attention (by the chef) once the experience is over. Perhaps the chef did not use concepts of multisensory experiences to design the dining experience but just intuition. Regardless, there is a growing body of research on how different senses contribute to our mixed reality dining experience. We believe that this knowledge should be available to everyone.

In the Meta Cookie example, we do not doubt that the chef is, or has the right to be, the who that crafts sensory elements for the dining impression. But what if an organization uses mixed reality food aid experiences to increase satiety in a country at war with limited food resources? Would you agree to this, considering that, more than the impression of satiety, people in this example actually need to be sated and nourished? In this scenario, different humans are the someone and the receiver of a multisensory experience. In a not-so-distant future, we can expect that intelligent systems may also be the someone. How should we think about this? Today, humans design multisensory experiences, but in the future we can expect that machines, enabled by advances in AI, will be equally capable of doing so. Perhaps they may even do it better than humans, through better sensing and processing, as well as by crafting sensory elements in a variety of events to deliver specific impressions. We want both humans and machines to consider our laws when designing multisensory experiences. Under the third law in particular, we believe that the someone (who designs) and the how (the sensory elements used) must be known and subject to public debate.

The future is unwritten, so we now have the chance to write and shape the future of multisensory experiences. In a way, our book is a powerful and personal story about our passion for, and viewpoint on, multisensory experiences. We aimed to enable the reader to become the 'someone' in multisensory experiences and to think critically and responsibly about them. In our three laws, we set out how we want this someone to think and act when designing multisensory experiences. It is not just about doing, but also about being aware of the implications of doing.

How do you think about the role of the senses in your everyday life experiences now, after reading this book? Can you re-imagine some of your experiences in light of how we can design the physical, digital, and mixed reality multisensory worlds in which we live? You now have the opportunity to create new realities through multisensory experiences.

This world is but a canvas to our imagination. – H. D. Thoreau

APPENDIX

Experience Cards Background:

Impression:


Sensory Cards Event:

Sensory elements:

Concepts:

⬜ Temporal ⬜ Spatial

⬜ Semantic

⬜ Correspondences

⬜ Sensory dominance

⬜ Sensory overload



Technology Cards Enabling Technology:

Envisioned Technology:


Exploration Cards

Vision

Audition

Taste

Smell

Touch

Reality–Virtuality Continuum:

Real environment

Mixed reality:

Augmented environment

Augmented virtuality

Virtual environment

REFERENCES

CHAPTER 1

(1) Spence, C. (2017). Gastrophysics: The new science of eating. Penguin.

(2) Michel, C., Velasco, C., Gatti, E., & Spence, C. (2014). A taste of Kandinsky: Enhancing expectations and experience through the use of art-inspired food presentation. Flavour, 3, 7. https://doi.org/10.1186/2044-7248-3-7.

(3) Vi, C. T., Marzo, A., Ablart, D., Memoli, G., Subramanian, S., Drinkwater, B., & Obrist, M. (2017). TastyFloats: A contactless food delivery system. In S. Subramanian, & J. Steimle (Eds.), Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces (ISS '17) (pp. 161–170). ACM. https://doi.org/10.1145/3132272.3134123.

(4) Velasco, C. & Spence, C. (Eds.) (2019). Multisensory packaging: Designing new product experiences. Palgrave MacMillan.

(5) For information about the prevalence of synaesthesia, see: Simner, J., Mulvenna, C., Sagiv, N., Tsakanikos, E., Witherby, S. A., Fraser, C., Scott, K., & Ward, J. (2006). Synaesthesia: The prevalence of atypical cross-modal experiences. Perception, 35(8), 1024–1033. https://doi.org/10.1068/p5469. For a recent reflection on synaesthesia, see: Ward, J. (2019). Synaesthesia: A distinct entity that is an emergent feature of adaptive neurocognitive differences. Philosophical Transactions B: Biological Sciences, 374(1787), 20180351. https://doi.org/10.1098/rstb.2018.0351.

(6) Ablart, D., Velasco, C., Vi, C. T., Gatti, E., & Obrist, M. (2017). The how and why behind a multisensory art display. Interactions, 24(6), 38–43. https://doi.org/10.1145/3137091.

(7) For people's descriptions of mid-air tactile experiences, see: Obrist, M., Seah, S. A., & Subramanian, S. (2013). Talking about tactile experiences. In R. F. Bødker, S. Brewster, P. Baudisch, M. Beaudouin-Lafon, & W. E. Mackay (Eds.), Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13) (pp. 1659–1668). ACM. https://doi.org/10.1145/2470654.2466220.

(8) Gilbert, A. N. (2008). What the nose knows: The science of scent in everyday life. Crown Publishers.

(9) Velasco, C., Tu, Y., & Obrist, M. (2018). Towards multisensory storytelling with taste and flavor. In A. Nijholt, C. Velasco, M. Obrist, K. Okajima, & C. Spence (Eds.), Proceedings of the 3rd International Workshop on Multisensory Approaches to Human-Food Interaction (MHFI '18) (Article 2, 7 pages). ACM. https://doi.org/10.1145/3279954.3279956.

CHAPTER 2

(1) For an introduction to different perspectives on experiences, see:

• Churchland, P. M. (2013). Matter and consciousness. MIT Press.
• Dennett, D. C. (1993). Consciousness explained. Penguin.
• Dewey, J. (2005). Art as Experience. The Berkley Publishing Group.
• Graziano, M. S. A. (2019). Rethinking consciousness: A scientific theory of subjective experience. W. W. Norton & Company.
• Hoffman, D. (2019). The case against reality: Why evolution hid the truth from our eyes. W. W. Norton & Company.
• Koch, C. (2019). The feeling of life itself: Why consciousness is widespread but can't be computed. The MIT Press.
• Linsenmayer, M. (Host). (2019, June 17). The hard problem of consciousness (Episode 218: Chalmers, et al. Part one). [Audio podcast transcript]. In The partially examined life.
• McCarthy, J. & Wright, P. (2004). Technology as experience. MIT Press.
• Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435–450. https://doi.org/10.2307/2183914.
• Stanford Encyclopedia of Philosophy (2017, December 18). Qualia.
• Smith, B. C. (2013). The nature of sensory experience: The case of taste and tasting. Phenomenology and Mind, 4, 212–227. https://doi.org/10.13128/Phe_Mi-19603.

(2) Marks, L. E. (1978). The unity of the senses: Interrelations among the modalities. Academic Press.

(3) Press, C., Kok, P., & Yon, D. (2019). The perceptual prediction paradox. Trends in Cognitive Sciences, 24(1), 13–24. https://doi.org/10.31234/osf.io/hdsmz. Lange, F. P., Heilbron, M., & Kok, P. (2018). How do expectations shape perception? Trends in Cognitive Sciences, 22(9), 764–779. https://doi.org/10.1016/j.tics.2018.06.002. Seth, A. (2017, July 18). Your brain hallucinates your conscious reality [Video]. TED Conferences.


(4) This example is based on Teufel, C., Dakin, S. C., & Fletcher, P. C. (2018). Prior object-knowledge sharpens properties of early visual feature-detectors. Scientific Reports, 8, 10853. https://doi.org/10.1038/s41598-018-28845-5.

(5) Hand, E. (2016, June 23). Maverick scientist thinks he has discovered a magnetic sixth sense in humans.

(6) Spence, C. (2017). Gastrophysics: The new science of eating. Penguin UK.

(7) Spence, C. (2016). Oral referral: On the mislocalization of odours to the mouth. Food Quality and Preference, 50, 117–128. https://doi.org/10.1016/j.foodqual.2016.02.006.

(8) For a general introduction to sensation and perception, see: Goldstein, E. B. & Brockmole, J. (2016). Sensation and perception (10th ed.). Cengage Education.

(9) For an overview of the current state of knowledge on multisensory processes, see:
• Sathian, K. & Ramachandran, V. S. (2019). Multisensory perception: From laboratory to clinic. Academic Press.
• Velasco, C. & Spence, C. (2019). The multisensory analysis of product packaging framework. In C. Velasco & C. Spence (Eds.), Multisensory packaging: Designing new product experiences (pp. 191–223). Palgrave MacMillan.

(10) For further analysis of the effects of congruence versus incongruence, see: Velasco, C., Michel, C., Youssef, J., Gamez, X., Cheok, A. D., & Spence, C. (2016). Colour-taste correspondences: Design food experiences to meet expectations or surprise. International Journal of Food Design, 1, 83–102. https://doi.org/10.1386/ijfd.1.2.83_1.

CHAPTER 3

(1) Vi, C. T., Marzo, A., Ablart, D., Memoli, G., Subramanian, S., Drinkwater, B., & Obrist, M. (2017). TastyFloats: A contactless food delivery system. In S. Subramanian, & J. Steimle (Eds.), Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces (ISS '17) (pp. 161–170). ACM. https://doi.org/10.1145/3132272.3134123. See also Marzo, A., Seah, S. A., Drinkwater, B. W., Sahoo, D. R., Long, B., & Subramanian, S. (2015). Holographic acoustic elements for manipulation of levitated objects. Nature Communications, 6, 8661. https://doi.org/10.1038/ncomms9661.

(2) For detailed information about the reality–virtuality continuum, see:
• Abowd, G. D., & Mynatt, E. D. (2000). Charting past, present, and future research in ubiquitous computing. ACM Transactions on Computer-Human Interaction (TOCHI), 7(1), 29–58. https://doi.org/10.1145/344949.344988.
• Milgram, P., Takemura, H., Utsumi, A., & Kishino, F. (1995). Augmented reality: A class of displays on the reality-virtuality continuum. Telemanipulator and Telepresence Technologies, 2351, 282–292. https://doi.org/10.1117/12.197321.

• Raisamo, R., Rakkolainen, I., Majaranta, P., Salminen, K., Rantala, J., & Farooq, A. (2019). Human augmentation: Past, present and future. International Journal of Human-Computer Studies, 131, 131–143. https://doi.org/10.1016/j.ijhcs.2019.05.008.

(3) Spence, C., Velasco, C., & Knoeferle, K. (2014). A large sample study on the influence of the multisensory environment on the wine drinking experience. Flavour, 3, 8. https://doi.org/10.1186/2044-7248-3-8.

(4) Tao, Y., Do, Y., Yang, H., Lee, Y-C., Wang, G., Mondoa, C., Cui, J., Wang, W., & Yao, L. (2019). Morphlour: Personalized flour-based morphing food induced by dehydration or hydration method. In F. Guimbretière, M. Bernstein, & K. Reinecke (Eds.), Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST '19) (pp. 329–340). ACM. https://doi.org/10.1145/3332165.3347949.

(5) Narumi, T., Nishizaka, S., Kajinami, T., Tanikawa, T., & Hirose, M. (2011). Augmented reality flavors: Gustatory display based on edible marker and crossmodal interaction. In D. Tan, G. Fitzpatrick, C. Gutwin, B. Begole, & W. A. Kellogg (Eds.), Proceedings of the Conference on Human Factors in Computing Systems (CHI '11) (pp. 93–102). ACM. https://doi.org/10.1145/1978942.1978957.

(6) Ranasinghe, N., Jain, P., Tram, N. T. N., Koh, K. C. R., Tolley, D., Karwita, S., Lien-Ya, L., Liangkun, Y., Shamaiah, K., Tung, C. E. W., Yen, C. C., & Do, E. Y.-L. (2018). Season traveller: Multisensory narration for enhancing the virtual reality experience. In R. Mandryk, M. Hancock, M. Perry & A. Cox (Eds.), Proceedings of the Conference on Human Factors in Computing Systems (CHI '18) (Paper 577, 13 pages). ACM. https://doi.org/10.1145/3173574.3174151.

(7) Wirtz, J., Patterson, P. G., Kunz, W. H., Gruber, T., Lu, V. N., Paluch, S., & Martins, A. (2018). Brave new world: Service robots in the frontline. Journal of Service Management, 29(5), 907–931. https://doi.org/10.1108/JOSM-04-2018-0119.
CHAPTER 4

(1) Sasaki, T., Saraiji, M. H. D., Fernando, C. L., Minamizawa, K., & Inami, M. (2017, July). MetaLimbs: Multiple arms interaction metamorphism. In ACM SIGGRAPH 2017 Emerging Technologies (Article 16, 2 pages). ACM. https://doi.org/10.1145/3084822.3084837.

(2) Svanaes, D., & Solheim, M. (2016). Wag your tail and flap your ears: The kinesthetic user experience of extending your body. In J. Kaye, A. Druin, C. Lampe, D. Morris, & J. P. Hourcade (Eds.), Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '16) (pp. 3778–3779). ACM. https://doi.org/10.1145/2851581.2890268.

(3) Shedroff, N., & Noessel, C. (2012). Make it so: Interaction design lessons from science fiction. Rosenfeld Media.

(4) Hamilton-Fletcher, G., Wright, T. D., & Ward, J. (2016). Cross-modal correspondences enhance performance on a colour-to-sound sensory substitution device. Multisensory Research, 29(4–5), 337–363. https://doi.org/10.1163/22134808-00002519.


(5) Denis, G., Alary, D., Pasco, X., Pisot, N., Texier, D., & Toulza, S. (2019). From new space to big space: How commercial space dream is becoming a reality. Acta Astronautica, 166, 431–443. https://doi.org/10.1016/j.actaastro.2019.08.031.

(6) On the following website, you can see a description of NASA's perspectives on space food: NASA (2016, July 18). Space food. National Aeronautics and Space Administration. For more on the development of multisensory food experiences for space travel, see: Obrist, M., Tu, Y., Yao, L., & Velasco, C. (2019). Space food experiences: Designing passenger's eating experiences for future space travel scenarios. Frontiers in Computer Science, 1, 3. https://doi.org/10.3389/fcomp.2019.00003.

(7) Stepanova, E. R., Quesnel, D., & Riecke, B. E. (2019). Space—A virtual frontier: How to design and evaluate a virtual reality experience of the overview effect. Frontiers in Digital Humanities, 6, 7. https://doi.org/10.3389/fdigh.2019.00007.

(8) Kelly, S. (2017). Endurance. Vintage Books.

(9) Kerwin, J., & Seddon, R. (2002). Eating in space—from an astronaut's perspective. Nutrition, 18, 921–925. https://doi.org/10.1016/s0899-9007(02)00935-8.

(10) Obrist, M., Tu, Y., Yao, L., & Velasco, C. (2019). Space food experiences: Designing passenger's eating experiences for future space travel scenarios. Frontiers in Computer Science, 1, 3. https://doi.org/10.3389/fcomp.2019.00003.

(11) Hall, L. (2013, May 23). 3D printing: Food in space. National Aeronautics and Space Administration. Hiller, J., & Lipson, H. (2009). Design and analysis of digital materials for physical 3D voxel printing. Rapid Prototyping Journal, 15(2), 137–149. https://doi.org/10.1108/13552540910943441.

(12) Rao, A. (2018, October 11). Our experience of reality is a bunch of hallucinations we collectively agree on. VICE.

(13) Doets, E. L., & Kremer, S. (2016). The silver sensory experience–A review of senior consumers' food perception, liking and intake. Food Quality and Preference, 48, 316–332. https://doi.org/10.1016/j.foodqual.2015.08.010.

CHAPTER 5

(1) For examples of digital taste and smell interfaces, see: Ranasinghe, N., Tolley, D., Nguyen, T. N. T., Yan, L., Chew, B., & Do, E. Y. L. (2019). Augmented flavours: Modulation of flavour experiences through electric taste augmentation. Food Research International, 117, 60–68. https://doi.org/10.1016/j.foodres.2018.05.030. Ranasinghe, N., Nakatsu, R., Nii, H., & Gopalakrishnakone, P. (2012). Tongue mounted interface for digitally actuating the sense of taste. In 2012 16th International Symposium on Wearable Computers (pp. 80–87). IEEE. https://doi.org/10.1109/ISWC.2012.16.

(2) Mahmut, M. K., & Croy, I. (2019). The role of body odors and olfactory ability in the initiation, maintenance and breakdown of romantic relationships–A review. Physiology & Behavior, 207, 179–184. https://doi.org/10.1016/j.physbeh.2019.05.003.

(3) Kurzweil, R. (2006). The singularity is near: When humans transcend biology. Penguin Books.

(4) Asimov, I. (1950). I, Robot. Bantam Dell.

(5) Lustig, R. H. (2012). The toxic truth about sugar. Nature, 482, 27–29. https://doi.org/10.1038/482027a.

(6) Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33(2–3), 61–83. https://doi.org/10.1017/S0140525X0999152X.

(7) Kallus, N. & Zhou, A. (2018). Residual unfairness in fair machine learning from prejudiced data. Proceedings of the International Conference on Machine Learning (ICML 2018), 80, 2439–2448. arXiv: https://arxiv.org/abs/1806.02887. Ghahramani, Z. (2015). Probabilistic machine learning and artificial intelligence. Nature, 521, 452–459. https://doi.org/10.1038/nature14541.