Hands-On Augmented Reality Development with Meta Spark Studio: A Beginner's Guide
ISBN-13 (pbk): 9781484294666; ISBN-13 (electronic): 9781484294673; ISBN-10: 1484294661

Explore Meta Spark Studio, a program used for augmented reality (AR) effect creation and deployment across multiple social platforms.


English, 236 pages



Table of Contents
About the Author
About the Technical Reviewer
Introduction
Chapter 1: Introduction to Augmented Reality
What Is AR?
The Beginnings of AR
AR vs. VR
AR and the Metaverse
How Does AR Work?
Hardware
Software
Inputs
AR Creation Tools
ARKit and RealityKit
ARCore
Unity
Unreal Engine
Lens Studio
Effect House
Meta Spark Studio
Summary
Chapter 2: Getting Started with Meta Spark Studio
Installation
Introduction to the Meta Spark Studio Interface
Scene Panel
Assets Panel
Viewport
Inspector
Toolbar
Menu Bar
Simulator and Video
Exploring the Meta Spark Studio AR Library
Accessing the AR Library
3D Objects
Music and Sound
Patch Assets
Textures
Blocks
Script Packages
Color LUTs
Summary
Chapter 3: Introduction to the Patch Editor
Understanding Patches
Animation
Audio
Body Landmarks
Device
Face Landmarks
Interaction
Logic
Math
Shaders
Time
User Interface
Utility
Summary
Chapter 4: Creating Your First Effect
Planning Out a Project
Creating a New Effect File
Background Segmentation
Texture Extraction
Applying Textures to Materials
Adding Objects
Organizing Assets and Objects
Default Object Visualization
Adjusting Object Size
Adding Existing Materials to Objects
Creating New Materials for Objects
Default Material Visualization
Importing Textures
Mapping Visual Assets to a Face
Adding a Face Tracker
Working with Face Meshes
Adding Interactivity
Adding and Managing Planes
Positioning Planes
Generating and Connecting Patches
Color Grading
Finding LUTs
Applying LUTs
Summary
Chapter 5: Testing Effects
Experience Types
Sending Effects to a Device
Testing Effects on Instagram
Testing Effects on Facebook
Refreshing Effect Tests
Testing vs. Publishing
Summary
Chapter 6: Customizing Projects Through Asset Replacement
Saving As a New Effect
Replacing Texture Assets
Swapping Color LUTs
Replacing Textures for the Face
Understanding Face Mesh Mapping
Face Reference Templates
Creating Custom Textures for the Face
Using External Resources
Image Resources
Image Usage Guidelines
Summary
Chapter 7: Creating a Target Tracking Effect
What Is Target Tracking?
Selecting an Ideal Target
Target Quality
Flat Targets
Easily Viewable Targets
Planning Out a Target Tracker Project
Setting Up a Target Tracking Effect
Adding a Target Tracker
Particle Systems
Adding a Particle System
Positioning a Particle System
Customizing a Particle System
Working with 3D Assets
Finding 3D Assets
Customizing 3D Assets
Target Markers
Creating a Target Marker
Target Marker Interactivity
Effect Instructions
Previewing Target Tracking Effects in the Simulator
Testing Target Tracking Effects
Preparing the Test in Meta Spark Studio
Preparing the Target
Experiencing the Target Tracking Test
Summary
Chapter 8: Creating an Augmented Reality Game
Planning Out a Game Project
Game Project Setup
Simulator Touch Settings
Simulator Cameras
Creating the Game Environment
Creating a Playable Character
Adding Animated Textures
Tracking User Movement
Creating an Objective
Initiating the Game
Changing the Objective’s Position
Creating an Enemy Character
Changing the Enemy’s Position
Keeping Score
Creating User Interface Elements
Adding Text
Creating a Counter
Unpacking Position Values
Calculating Success
Ending the Game
Creating a Game Over Screen
Calculating Game’s End
Restarting the Game
Accurate Scoring
Adding Multiple Instructions
Summary
Chapter 9: Publishing Effects
Preparation Before Publishing
Naming Your Effect
Creating a Demo Video
Creating an Icon
Project File Size
Project Capabilities
Policies, Standards, and Guidelines
The Publishing Process
Summary
Chapter 10: Conclusion
Finding Inspiration
Learn from Seasoned Developers
Inspiration from Various Platforms
Attend Events
Leverage Your Niche
Conduct User Research
Giving Back to the Community
Final Words
Index


Hands-On Augmented Reality Development with Meta Spark Studio A Beginner’s Guide

Jaleh Afshar

Hands-On Augmented Reality Development with Meta Spark Studio: A Beginner's Guide
Jaleh Afshar
Menlo Park, CA, USA
ISBN-13 (pbk): 978-1-4842-9466-6
ISBN-13 (electronic): 978-1-4842-9467-3
https://doi.org/10.1007/978-1-4842-9467-3

Copyright © 2023 by Jaleh Afshar

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

Trademarked names, logos, and images may appear in this book. Rather than use a trademark symbol with every occurrence of a trademarked name, logo, or image, we use the names, logos, and images only in an editorial fashion and to the benefit of the trademark owner, with no intention of infringement of the trademark. The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

While the advice and information in this book are believed to be true and accurate at the date of publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.

Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Mark Powers
Development Editor: Spandana Chatterjee
Editorial Assistant: Spandana Chatterjee
Cover designed by eStudioCalamar
Cover image by Shubham Dhage on Unsplash (www.unsplash.com)

Distributed to the book trade worldwide by Springer Science+Business Media New York, 1 New York Plaza, Suite 4600, New York, NY 10004-1562, USA. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected], or visit www.springeronline.com. Apress Media, LLC is a California LLC and the sole member (owner) is Springer Science + Business Media Finance Inc (SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation.

For information on translations, please e-mail [email protected]; for reprint, paperback, or audio rights, please e-mail [email protected]. Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook versions and licenses are also available for most titles. For more information, reference our Print and eBook Bulk Sales web page at http://www.apress.com/bulk-sales.

Any source code or other supplementary material referenced by the author in this book is available to readers on GitHub (https://github.com/Apress). For more detailed information, please visit http://www.apress.com/source-code.

Printed on acid-free paper

To Leonardo and Kiwi, for all your support and encouragement.


About the Author

Jaleh Afshar is a designer who has built digital experiences across various industries and platforms, including mobile, web, wearables, voice, and emerging technologies. Currently a Product Design Director at Meta, she has taught courses, led workshops, and published on augmented reality.


About the Technical Reviewer

Simon Jackson is a long-time software engineer and architect with many years of Unity game development experience, as well as an author of several Unity game development titles. He loves to both create Unity projects and lend a hand to help educate others, whether it's via a blog, vlog, user group, or major speaking event. His primary focus at the moment is with the Reality Toolkit project, which is aimed at building a cross-platform mixed reality framework to enable both VR and AR developers to build efficient solutions in Unity and then build/distribute them to as many platforms as possible.


Introduction

Augmented reality has the potential to change the world as we know it. Rather than locking oneself into a solely virtual environment, AR unlocks a new layer of experiences within the real world, blending transformational digital flavor with the familiarity of physical matter. This book aims to give you a taste of this technology through step-by-step learning. Zero prior coding or design experience is required to complete the exercises in this book. By the end of the last chapter, you'll have completed a few augmented reality projects that can be shared privately or published publicly. These projects are also structured for ease of customization, so I invite you to bring your creative vision and tailor these projects to your personal taste. I'm honored to join you in this AR learning adventure. Let's begin!


CHAPTER 1

Introduction to Augmented Reality

Congratulations on embarking on an augmented reality (AR) journey! I look forward to accompanying you in developing your very own AR creations. We'll start with a short introduction to the fundamentals of this technology and then dive into Meta Spark Studio for hands-on learning. To complete the projects in this book, absolutely no prior experience with AR is required. Whether you're a beginner who is just starting to explore this topic or a seasoned technologist deeply familiar with other forms of interactive experiences, this book is for you.

What Is AR?

AR describes technology that enhances real-world surroundings with digital content. AR provides an immersive, multisensory experience by adding computer-generated elements like sound, graphics, and interactivity to what you experience in the physical world. AR has the potential to transform the way we interact with the world around us, and it's already starting to have an impact on a variety of industries.

Familiar examples range from entertainment, such as playing games that respond to your facial expressions or having fun with friends by wearing virtual masks in a video call, to education and training, such as assisting you in learning a new skill by adding in contextual virtual instructions as you follow steps in a real-life tutorial. This technology has the power to bring convenience to common experiences, such as helping you navigate an unfamiliar environment by overlaying a contextual map or highlighting information on notable landmarks that can be easily accessed by tourists and locals alike. It can also bring greater accessibility and safety via visual, auditory, and tactile response in reaction to real-world environmental changes. These experiences can be achieved with devices like smartphones and tablets, or with dedicated hardware such as eyewear and headsets.

The Beginnings of AR

Although AR may feel futuristic, there have been sophisticated explorations of this concept dating back to the 1960s. One of the earliest examples of augmented reality appears in the "head-mounted three-dimensional display" invention by Ivan E. Sutherland, an American computer scientist and computer graphics pioneer, and his collaborators Robert "Bob" Sproull, Quintin Foster, Ted Lee, Danny Cohen, and Stewart Ogden. Leveraging an ultrasonic head-position sensor originally designed by Charles Seitz and Stylianos Pezaris at the MIT Lincoln Laboratory, this hardware is an early example of blending real-world objects with virtual imagery. The eyepieces which the headset wearer would look through utilized a set of customized mirrors, allowing simultaneous viewing of both imagery from a cathode ray tube and the view of the room the user was in. The imagery displayed could be positioned to coincide with real-world objects and simulate some level of interaction, for instance, intelligently positioning visuals to appear as if they were coinciding with a map on the wall or the top of a desk.


AR vs. VR

In the last few years, both AR and VR (virtual reality) have entered mainstream consciousness. While these technologies are not necessarily new, advances in computing power and hardware development mean adoption of AR and VR is becoming commonplace. Both of these technologies help users immerse themselves, to varying degrees, within digital environments. As a result, these two concepts are often used interchangeably, but there is a key difference between the two. The primary differentiator is that VR replaces a user's real-world surroundings with a completely virtual world (typically encapsulated within the screens and speakers of a VR headset), while AR blends virtual objects into the user's real-world environment. While the experience of VR and AR can feel quite different, many devices are blurring the lines, seamlessly allowing users to move between a fully virtual world and across to an augmented reality experience, and back even to full reality via passthrough technology, where externally placed cameras on a VR headset can render the real world outside the walls of the opaque headset into the display inside. This seamless merging of virtual objects co-mingling with a physical environment, or vice versa (real-life objects in a virtual world), has been described as mixed reality (MR). The totality of the spectrum from AR to VR is described as extended reality (XR).

AR and the Metaverse

You may have heard the term "metaverse" before, perhaps in the context of AR and futuristic tech. First coined by Neal Stephenson in his science fiction novel Snow Crash, this term originally described a hyper-realistic, fully immersive virtual world that users would access via specialized VR-esque goggles. In today's vernacular, the term "metaverse" now more commonly refers to any set of persistent, interconnected digital spaces that are focused on social interaction, allowing people to connect with each other, generally engaging one another through the use of avatars. These virtual environments can be accessed through a variety of devices, such as smartphones, gaming consoles, and VR headsets. While we may not be living in the metaverse yet, augmented reality is starting to become part of the everyday lives of many. As adoption of AR grows, the way this technology can shape the future of how people interact with each other can absolutely play a part in the evolution of metaverse experiences.

How Does AR Work?

If we are to simplify the mechanisms by which augmented reality technology functions down to its core pieces, there are three fundamental parts required in creating an AR experience.

Hardware

A key part of an AR experience is hardware. Hardware in an AR context refers to the physical devices one uses to access an AR experience. For most AR experiences, this at the very least will require the following hardware devices:

• A computer, such as a laptop or smartphone. This includes the physical components used to construct these devices, such as microchips or the hard drive.
• A camera, such as a webcam or smartphone camera.
• A display, such as a monitor or smartphone's touchscreen.

It's important to note that not all augmented reality effects rely purely on visual data—augmenting reality can include audio and other senses as well. Hardware to support additional senses could be as familiar as a microphone or speaker, to more uncommon devices such as odor-releasing devices with scents that sync to your personalized AR experience.

In addition to more traditional hardware combinations like a camera-enabled smartphone or computer with a webcam, there are specialized wearable hardware devices that support AR experiences. Examples include the following:

• Magic Leap 2—This enterprise-focused AR device has been marketed to industries such as healthcare and manufacturing, enabling wearers to utilize the device as part of surgical navigation, patient diagnostics, or offsite training.
• HoloLens 2—Created by Microsoft, this headset utilizes hand tracking systems powering intuitive interaction with augmented reality holograms, without the need for controllers.
• Lenovo ThinkReality A3—Designed for compatibility with selected smartphones and workstations as a plug-in peripheral, these enterprise smart glasses are optimized for business use cases such as functioning as a virtual monitor or schematics display.
• Meta Quest Pro—A premium-tier successor to the VR-focused Meta Quest 2 headset, the Meta Quest Pro utilizes full-color passthrough technology enabling high-detail AR experiences alongside its VR capabilities.

In this book, we will keep it simple and focus on making effects optimized for smartphones.

Software

Software is the digital program that tells the hardware how to work and react, for instance, what to display on the screen or how to score a game. Software can also tell other software what to do. For instance, the software powering the operating system on your smartphone will then enable additional software to be installed and used.


In this book, we will use specific software, Meta Spark Studio, to create our AR effects. Your friends, family, and colleagues who you might choose to share your AR creations with will also use different software apps to access your creations, such as Facebook and Instagram.

Inputs

Translating the real world into digital means there must be a mechanism by which to capture and process relevant pieces of what a user is doing in their environment. AR works by processing an input—those elements and interactions in the real world—via hardware devices and software. This processing will output an augmented effect. Inputs inform the "reality" in "augmented reality." For example, an input could be the video camera capture of a selfie video, or the action of tapping on the screen of a smartphone, or the live feed of an environment and the associated location data. The input can be further refined to something as specific as understanding when a user is blinking or smiling within that selfie video. Inputs are captured by input devices, which are hardware components specifically designed to pick up these interactions. The input device will then provide these signals and data to be processed by the computer and relevant software. Keyboards, touchscreens, controllers, microphones, and webcams are all examples of input devices.
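To make the input-processing-output flow concrete, here is a minimal sketch in TypeScript. Every type and function name in it is hypothetical, invented purely for illustration rather than taken from any real AR framework:

    // Hypothetical shapes for the core pieces of an AR pipeline.
    type ARInput = {
      cameraFrame: ImageData; // what the camera currently sees
      screenTaps: { x: number; y: number }[]; // touch interactions this frame
    };

    type AROutput = {
      overlays: string[]; // virtual elements drawn over the camera feed
    };

    // The "software" piece: turn real-world input into an augmented output.
    function processFrame(input: ARInput): AROutput {
      const overlays: string[] = [];
      // React to a real-world interaction: a tap spawns a virtual object.
      for (const tap of input.screenTaps) {
        overlays.push(`sparkle at (${tap.x}, ${tap.y})`);
      }
      return { overlays };
    }

An AR runtime repeats this cycle many times per second: capture input from the hardware, process it in software, and render the augmented output to the display.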

AR Creation Tools

Now that we've learned a bit about how software powers the AR experience, we'll go over different software options for AR creation. These tools help creators like you build interactive experiences through various mechanisms, whether through coding or by using drag-and-drop style visual interfaces.


ARKit and RealityKit

ARKit and RealityKit are augmented reality frameworks developed by Apple. Creations built with these frameworks are exclusive to iOS devices, utilizing specific functionality that can only be accessed via Apple's proprietary hardware and software.

ARCore

ARCore is Google's official software development kit for augmented reality. First released in 2018, this technology is powered by three fundamental mechanisms: motion tracking, environmental awareness (such as detecting a surface's location and if it is positioned horizontally, vertically, or angled), and estimating the lighting conditions of the real-world environment.

Unity

Unity creates cross-platform development products, which means their tools allow developers to build apps for multiple types of hardware and operating systems in a streamlined way. Unity supports development for a variety of AR handheld and wearable devices, including Android, iOS, Magic Leap, and HoloLens.

Unreal Engine

Similar to Unity, Unreal Engine's AR framework allows developers to have a single, unified development path for multiple platforms, thus enabling a more efficient development process. Unreal Engine currently supports Android and iOS.


Lens Studio

Lens Studio is a tool from Snap Inc., the company that is probably best known for creating the instant messaging app Snapchat. With Lens Studio, creators can build AR effects (or "Lenses" as they are called in this software) specifically for use by Snapchatters.

Effect House

Effect House is an AR effect-making tool from TikTok. This tool specifically enables building of AR effects to be used by creators of short-form videos on the TikTok platform.

Meta Spark Studio

Formerly named Spark AR Studio, Meta Spark Studio is a free AR creation tool from Meta. This tool specifically enables creators to build AR effects to be used on Instagram, Facebook, and Messenger. This is a beginner-friendly tool that does not require programming to develop an AR effect. We'll be using this software for all of our projects in this book.

Summary

In this introductory chapter, we learned the fundamentals of augmented reality, from its early beginnings to how it works and the various tools used to create AR experiences. In the next chapter, we'll dive into AR creation software and explore Meta Spark Studio's interface and capabilities.


CHAPTER 2

Getting Started with Meta Spark Studio Now that we have an idea of some tools used for AR, let’s get started installing our first tool. The latest version of Meta Spark Studio is v156.0 at the time of the writing of this book. If you have an older version of the software, you will need to update it prior to starting the projects.

I nstallation To install Meta Spark Studio, you must download it directly from the Meta Spark official website (Figure 2-1).

© Jaleh Afshar 2023 J. Afshar, Hands-On Augmented Reality Development with Meta Spark Studio, https://doi.org/10.1007/978-1-4842-9467-3_2

9

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-1.  Meta Spark Studio website Click on the Get Started button to sign in and proceed with downloading. Accessing this software requires a Facebook account, as all effects created within Meta Spark Studio are distributed on either Facebook, Instagram, or Messenger. Once downloaded, double-click the installer file on your computer to proceed with installing the software. Meta Spark Studio is available for both Windows and Mac. In addition, the companion app Meta Spark Player is available on the Apple App Store and Google Play Store. This companion app is not required to complete the projects in this book; however, it is a handy tool for quickly previewing what your in-progress AR projects look like on a mobile device.

10

Chapter 2

Getting Started with Meta Spark Studio

Introduction to the Meta Spark Studio Interface We’ll kick things off with a brief overview of the main components you’ll be frequently using in our upcoming projects. To follow along, open Meta Spark Studio and select Blank Project under Create New in the Welcome Screen (Figure 2-2). This is the option at the top with the large “+” icon.

Figure 2-2.  Meta Spark Studio Welcome Screen Now that you’ve created a new project file, let’s go through the various sections of the interface together.

11

Chapter 2

Getting Started with Meta Spark Studio

Scene Panel In the upper left of your screen is the Scene panel (Figure 2-3). This section contains all of the objects that will ultimately be directly rendered in your effect. Fundamental elements of your AR effect, such as the device and camera, will be present in the Scene panel by default. As we create projects together, you’ll add additional objects to the scene.

Figure 2-3.  The default Scene panel

Assets Panel In the lower left of your screen is the Assets panel (Figure 2-4). Assets in this context refer to any digital element like a photo, illustration, 3D model, GIF, audio clip, or fonts that you might want to incorporate into your AR project.

12

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-4.  The default Assets panel You can import your own existing assets from your computer into the Assets panel, or you can bring in assets from outside sources. For example, you can find assets to use in the Meta Spark Library and Sketchfab, which we’ll explore later in this chapter. An important tip to note is that for an asset to show up in your effect, it must be associated with or applied to an object in your Scene panel. We’ll learn how to do this in our first project.

13

Chapter 2

Getting Started with Meta Spark Studio

Viewport In the center of the screen is the Viewport (Figure 2-5), which functions as the working canvas for your AR effect.

Figure 2-5.  The Viewport As you select objects in your Scene panel or Object panel, they are highlighted in the Viewport to allow for direct manipulation.

Inspector When you have something selected in the Assets or Scene panel, on the right side of the screen, you will see the Inspector. This section of the interface is where you’ll configure settings and adjustments for what you have selected, as shown in Figure 2-6. Here, you’ll be able to alter parameters like size, color, positioning, and more. 14

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-6.  The Inspector options when “Camera” is selected in the Scene panel

Toolbar On the furthest left side of the Meta Spark Studio interface, you’ll find a column of icons that make up the Toolbar (Figure 2-7).

15

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-7.  The Toolbar These icons allow you to quickly access functions such as workspace layout options, video options and playback options for the Simulator, the Library, publishing tools, and documentation.

Menu Bar On the topmost portion of the Meta Spark Studio interface, you’ll see the text-based navigation links to core actions. This is called the Menu Bar (Figure 2-8).

16

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-8.  The Menu Bar Click on the File option to find creation, saving, import, and logout options. Edit contains rudimentary actions such as undo/redo and copy/ paste. The View drop-down encompasses all features related to what is visible within the interface, such as hiding or showing panels and zooming in/out. Add contains quick links for adding additional scene objects or assets. Project will surface configurations related to the overarching AR project file as a whole. Window manages how the overall interface displays. Help leads to documentation.

Note  Make sure to save in regular intervals. You can always find the Save option within the File drop-down in the Menu Bar; however, I recommend getting into the habit of using the Save shortcut (Command + S on Mac and Control + S on PC) frequently when you are working on your project.

Simulator and Video This Simulator (Figure 2-9) represents the camera view a user will experience when using your AR effect. By default, the Simulator will be docked in the top right of your viewport. The Simulator can be flipped from front-facing “selfie” view to rear-facing “environment” view by clicking the second icon underneath the Simulator (“Switch to front camera”/”Switch to back camera”).

17

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-9.  The Simulator appears docked in the top-right corner of the Viewport You can customize the camera feed that displays in the Simulator. Access the Video Library (Figure 2-10) by clicking the first icon underneath the Simulator, entitled “Show Video Library”. This menu can also be accessed through the Video icon in the Toolbar.

18

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-10.  Use the Camera options to customize what appears in your Simulator Options include using your hardware webcam, a selection of pre-­ rendered videos for selfie and full-body view, and virtual environments to simulate using the back camera in the real world. Additional virtual cameras you may have installed on your computer will also appear in this list. As you begin to create your first projects, these built-in videos are a helpful way to ensure your effects work well for people across a variety of appearances. The Simulator also conveniently allows you to preview your effect without sending it to your phone or plugging in a test device.

19

Chapter 2

Getting Started with Meta Spark Studio

Note To save processing power, you can pause your Camera using the Pause icon in the Toolbar. Try using this Pause feature if you don’t actively need a visual preview at the moment. However, make sure to unpause the Camera once you do have changes to preview. Changing assets or properties in the Inspector won’t turn the Camera back on automatically.

Exploring the Meta Spark Studio AR Library All of the projects in this book can be completed by using the custom pre-­ created images and code snippets downloadable on the Apress GitHub. However, within Meta Spark Studio, there is also a massive free resource library at your disposal. Browsing this AR Library is a great way to get inspiration when embarking on a new project and can also be helpful in enhancing your AR effects to be customized beyond the provided material. As you begin your AR journey, you’ll learn hands-on how to leverage these assets within projects in upcoming chapters. In the meantime, we’ll get familiar with the types of files you can find in this repository.

Accessing the AR Library Open the AR Library by clicking the AR Library (Figure 2-11) folder icon found in the Toolbar. You can alternately access the AR Library through the Assets panel by clicking the plus icon and selecting Search AR Library in the menu that appears.

20

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-11.  The Meta Spark Studio Library Once the AR Library is opened, you’ll see a list of categories to the left, each containing a variety of files within. You can also easily search in the top-left search box of the AR Library if you already have a particular item in mind. What sorts of things can you find in the AR Library? Let’s take a look!

3D Objects The AR Library contains 3D assets made by both Meta itself and a third-­ party platform, Sketchfab. These 3D assets can be used as is or can be further customized. When clicked, each 3D asset will display the original creator or source of the file, the licensing agreement, and some technical specifications, as shown in Figure 2-12. These specifications include the size of the file and geometric complexity of the object’s model.

21

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-12.  3D object detail page To add a 3D object into your project’s Assets panel, click the Import Free button. To import assets created by Sketchfab, you will need to create a free account at http://sketchfab.com, and log in when prompted after clicking the Import Free button. We’ll use 3D objects in a later project within this book.

Music and Sound The Music and Sound section (Figure 2-13) contains hundreds of free-to-­ use audio clips, conveniently separated into three collections: •

22

Music—This section contains melodic or instrumental audio clips, great for setting the overall tone of your AR creation. These sound effects can also be used to reinforce celebratory experiences, such as playing after an AR user completes a particular action.

Chapter 2

Getting Started with Meta Spark Studio



Sound effects—These one-off audio clips can be used to draw attention to specific actions or elements in your AR creation. Examples include a bubble popping or a camera shutter.



Ambient sounds—These atmospheric audio clips can bring a sense of immersion into your AR creation, simulating background noise or the sound you may hear in a particular environment, such as rain, cars passing by, or windshield wipers. Generally, these clips should be looped to create an ongoing, extended sound.

Preview any clip by clicking the play icon.

Figure 2-13.  Music and Sound page

23

Chapter 2

Getting Started with Meta Spark Studio

Additional information such as licensing and length is also shown for each clip. To add any of these audio files to your Assets panel, click the Import Free button.

Patch Assets Patches are a core way to add interactivity and sophisticated functions into your AR effects. In the Patch Assets section of the library (Figure 2-14), you’ll find patches created by Meta Spark and AR creators.

Figure 2-14.  Patch Assets page

24

Chapter 2

Getting Started with Meta Spark Studio

We’ll deep dive into patch details in an upcoming chapter, but for now, as a brief overview, many of the patches in the AR Library can be categorized under the following collections: •

Audio—Enables alterations to how music and sounds are processed



Animation—Adds and adjusts animation properties applied to objects



Utility—Makes your workflow more efficient through these consolidated patches designed to streamline common patch flows



Shaders—Changes how materials render through these visual alterations

Each patch detail page will display authorship, licensing, file size, a description of the patch’s capabilities, the inputs and outputs necessary for the patch to function, and, in many cases, a preview screenshot. To add any of these patches into your project, click the Import Free button.

Textures The textures appearing in the AR Library’s Textures section (Figure 2-15) are specifically used to add realistic lighting and color elements to objects.

25

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-15.  Textures page Each texture will have a visual preview, licensing, and file size. Import textures using the Import Free button.

Blocks Blocks are entire sections of a Meta Spark Studio project, saved as a file that can be easily imported as an asset and added to your project’s scene. Information about blocks can be viewed on the detail page (Figure 2-16), such as licensing, file size, functionality, and if the author is Meta Spark or a third party. Import blocks using the Import Free button.

26

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-16.  Blocks detail page Using blocks can be a time saver when you are looking to replicate a set of assets, objects, patches, and settings as one combined unit.

Script Packages Within the AR Library, you’ll find collections of code snippets, called script packages, as shown in Figure 2-17. These are used to add additional functionality to the code you may use in your project.

27

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-17.  Script detail page Information about script packages, such as the licensing, code repository, and author can be viewed on the detail page. Import using the Import Free button.

Color LUTs Color LUTs are pre-made sets of color values, which are represented as a table of RGB values. Each LUT’s detail page (Figure 2-18) shows a before and after image, previewing the stylistic changes from the particular set of color alterations. Import a LUT using the Import Free button.

28

Chapter 2

Getting Started with Meta Spark Studio

Figure 2-18.  Color LUTs detail page These LUTs are used in color grading, that is, altering the contrast, saturation, and tone of colors for aesthetic purposes.

Summary We’ve now explored the core elements within Meta Spark Studio’s interface and familiarized ourselves with how each part of the software can assist us in the AR creation process. Next, we’ll dive deep into patches—a powerful feature within this software that will enable us to develop AR effects with rich interactivity and immersive functionality.


CHAPTER 3

Introduction to the Patch Editor

While many elements of functionality and customization are built directly into Meta Spark Studio's object properties, patches enable you to truly unlock the sophistication of this software. By using patches, you can add features such as facial expression detection, voice changing, scoring a game, and more.

Figure 3-1.  The Patch Editor displaying a string of patches


Understanding Patches

Patches are bundles of data and functions, abstracted as simplified rectangular visuals. Each patch controls a specific set of unique information, and patches can be connected to each other to pass and receive that information. To view the Patch Editor (Figure 3-2), navigate to View within the Menu Bar in Meta Spark Studio, then click Show Patch Editor.

Figure 3-2.  Default Patch Editor showing underneath the Viewport

To browse all available patches, simply double-click anywhere in the Patch Editor grid. This action will bring up the patch browsing menu.
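Patches can also exchange values with scripts. As a hedged illustration, the sketch below assumes you have added a script asset and defined a "To Script" boolean variable named isSmiling on it; both the variable and its name are assumptions made for this example:

    const Patches = require('Patches');
    const Diagnostics = require('Diagnostics');

    (async function () {
      // Receive a boolean signal that a patch graph sends to the script.
      const isSmiling = await Patches.outputs.getBoolean('isSmiling');
      // Watch the live value in the console while testing the effect.
      Diagnostics.watch('User is smiling: ', isSmiling);
    })();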


Animation

Animation patches (Figure 3-3) initiate and control animated features, for example, controlling the static frames that play to create an animated sequence or ensuring that looped animations play continuously.

Figure 3-3.  Animation patch category with default Animation patch selected

There are multiple patches within this patch category, such as Keyframe and Transition. These can all be viewed within the patch browsing menu.

Audio

Audio patches enable analysis and manipulation of audio. Use these patches to distort voices, create echoes, remove background noise, and more.


Body Landmarks

These patches are conveniently pre-programmed to detect and follow landmarks of the body, such as the position of the head, neck, and limbs.

Device

Device patches capture and output helpful information from the AR effect user's device, such as movement and rotation. They can also surface information such as the user's device language and region.

Face Landmarks

For further granularity beyond the body landmark patches, face landmark patches allow for capturing the position in 3D space of highly specific elements of a face, such as the cheek, chin, eyeball, eyebrow, eyelid, forehead, mouth, and nose. These landmarks can be further refined to capture the position of parts as specific as the right cheekbone or the left iris.

Interaction

Interaction patches enable refined gesture detection, allowing AR creators to build effects that react to how users move or express emotion through their facial expressions. Movements like blinking, head shakes, and nodding are covered under these patches. Detectable gestures include making a happy face, kissing face, surprised face, or simply smiling. Touch interactions with the screen are also categorized under interaction patches. These include detecting screen taps, pinches, panning, and rotation.
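As a point of comparison, touch interactions can also be handled in script through the TouchGestures module. A minimal sketch, assuming the Touch Gestures capability is enabled and a plane named bubble exists:

const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

(async function () {
  const bubble = await Scene.root.findFirst('bubble');
  TouchGestures.onTap().subscribe(() => {
    // Toggle visibility each time the user taps anywhere on the screen
    bubble.hidden = !bubble.hidden.pinLastValue();
  });
})();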


Logic

These patches are used to define what needs to happen first for a resulting change to occur. Examples include the “and” patch, which is generally used to ensure that two conditions are met simultaneously—for example, detecting that a user is smiling at the same time a particular image is being displayed. We’ll use these in upcoming projects to better contextualize the functionality of these patches.
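The same “and” relationship can be sketched in script with the Reactive module. The conditions below are illustrative; the point is that both boolean signals must be true at once for the combined signal to be true:

const Reactive = require('Reactive');
const Time = require('Time');
const Diagnostics = require('Diagnostics');

// Two conditions that must hold simultaneously:
const pastIntro = Time.ms.gt(2000);   // effect has been running over 2 seconds
const beforeEnd = Time.ms.lt(10000);  // ...but for less than 10 seconds

// True only while both are true, mirroring the "and" patch
Diagnostics.watch('in main window', Reactive.and(pastIntro, beforeEnd));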

Math

Use math patches to run mathematical operations on the values that are being passed into the patch. Similar to logic patches, the functionality of these is best contextualized in the flow of a project, so we’ll revisit these later in our hands-on project section.

Shaders

Shader patches are designed to alter the appearance of materials in an effect, creating a wide range of visual changes. Examples include adding gradients and blending colors or textures together. This category also includes patches that capture visual information, such as sampling portions of an existing texture or outputting RGB color values.

Time

Time patches simply track and output time measurements, such as the passage of time since the effect started, or the time that has passed between frames within the effect.


User Interface

Depending on the type of AR effect you are creating, you may need to include a user interface as a way for users to directly communicate a particular choice, intent, or preference. The user interface patches include a picker, where users can select from options you create, and a slider, where users can change the value or intensity of a particular element in your effect.

Utility

The utility patches section covers a wide breadth of fundamental functions necessary to power many effects. Examples range from the body select and face select patches, which enable selecting a specific body or face to track, to the camera distance patch, which outputs the distance between the device camera and a particular object.

Summary

Throughout this chapter, we’ve learned the details of each category of patches, the building blocks of sophisticated AR effect creation in Meta Spark Studio. In the upcoming chapter, we’ll put all of the background knowledge we’ve gained into practice by developing our very first AR effect!


CHAPTER 4

Creating Your First Effect

Now that you’ve familiarized yourself with Meta Spark Studio’s interface and capabilities, let’s get hands-on with creating a complete AR project—also called an “effect”—from scratch!

Planning Out a Project

Before diving directly into Meta Spark Studio, consider outlining the concept of what you hope to create in a sketch or a brief written summary. This can serve as a reference for yourself, or the other designers or developers you collaborate with. This concept sketch (Figure 4-1) or note can also help deconstruct the various parts of the effect to tackle step by step and additionally highlight any elements that might be missing.




Figure 4-1.  Concept sketch example

Using the concept sketch shown here as an example, we can see that the theme of the project is to create a vintage, comic book–styled effect. To break things down more granularly, what we’ll need to learn to achieve this result is how to:

•  Create a custom background—A dot-based halftone pattern will be overlaid onto the background behind our user’s face and body.

•  Add illustrated details to the face—Comic book–like artistic lines and shading will be mapped to the user’s face and facial movements.

•  Change the color of the entire video—The effect will change the full color video feed of the front-facing camera to a grayscale, newspaper-like color grading.

•  Respond to facial expressions—When the user’s mouth is open, a comic panel speech bubble will appear.

We’ll address how to achieve each one of these core elements of the effect one by one in the upcoming sections.

Creating a New Effect File

We’ll use a fresh new file for this project. Start by opening Meta Spark Studio and selecting Blank Project under Create New in the Welcome Screen. Alternatively, if you already have an existing project file open, you can always select New Effect under File in the top Menu Bar as shown in Figure 4-2.

Figure 4-2.  New Effect option in the Menu Bar

Make sure to save this new file, naming it First_Project.arproj. Continue to save at regular intervals to ensure you do not lose any of your hard work.

To ensure your project previews reflect the figures in this and future projects, click the three-line icon on the bottom right of the Simulator. In the menu that appears, navigate to iOS devices and select iPhone 13 Pro. The source files used in this chapter’s project are available on GitHub via this book’s product page, located at www.apress.com.


Background Segmentation

Let’s tackle the first element of our effect—a customized background that appears behind our user’s face and body.

Texture Extraction

To start, let’s take a look at our Scene panel in the top left of the interface. You’ll see that under the Device object, there is another object entitled Camera. We want to extract the different bits of information that the Camera is picking up so we can apply the comic book–like halftone pattern to the background of the video feed only. To do this, click once on the Camera object to select it, as shown in Figure 4-3.

Figure 4-3.  Camera selected in the Scene panel

By default, the Simulator displays a video of a user using the front-facing camera, representing this default Camera input. If you have selected a different video option in the Video Library, make sure you have selected one that shows a face to best preview this project.


With the Camera object row selected in the Scene panel, head over to the Inspector on the right side of the interface. In the Inspector (Figure 4-4), click the “+” icon on the Texture Extraction row. This will extract the Camera’s texture. In Meta Spark Studio, texture can refer to a wide variety of imagery, including patterns, artwork, or other visuals. However, in this case, the texture we achieve through extraction is the entirety of the visual video input coming through the Camera.

Figure 4-4.  Texture Extraction row in the Inspector

After clicking the “+” on the Texture Extraction row of the Camera, we now see a new Textures folder appear in our Assets panel (Figure 4-5) in the bottom left of the interface, with the newly created cameraTexture0 appearing within it.


Figure 4-5.  Extracted Camera texture in the Assets panel

In our project, we want to apply our background pattern only to the parts of the Camera that show what is behind the user, not the entire Camera texture (Figure 4-6). To separate the person from the background, we will need to use a built-in Meta Spark Studio feature called Segmentation. This feature allows for automatic separation, or masking, of common elements, such as a user’s entire body or a user’s hair. This technique allows us to achieve our desired result.


Figure 4-6.  Example of an incorrectly applied background that covers the entire Camera (left) and a correctly applied background that only covers the area behind the user (right)

With the Camera object still selected in the Scene panel, take a look at the Inspector panel on the right. In the Inspector, click the “+” icon on the Segmentation row. Select Person in the flyout menu that appears (Figure 4-7).


Figure 4-7.  Segmentation options in the Inspector

We will now see that segmentationMaskTexture0 appears in our Assets panel, as shown in Figure 4-8.

Figure 4-8.  Segmentation mask appearing in the Assets panel

This newly added segmentation mask will enable us to separate our user, the detected “person” on camera, from the background.


Applying Textures to Materials

While we see our extracted textures in the Assets panel, we do not yet see any change to the effect preview in the Simulator. This is because the Assets panel is simply a storage container for assets. For an asset to be visualized to the user of an effect, the asset must eventually be applied to an object in the Scene panel.

For textures in particular, they must be applied to a material asset first, and the material subsequently applied to an object. Materials are assets that specify how the surface of an object will appear, including the lighting, transparency, and also which texture, if any, appears on the object. Let’s create our first material by clicking the “+” icon in the bottom right of the Assets panel and selecting Material (Figure 4-9).


Figure 4-9.  Material option in the “+” in the Assets panel

This creates a Materials folder and material0 asset in the Assets panel. Assets and objects created within Meta Spark Studio will have a naming convention of [type][number] by default. We’ll learn how to rename these shortly.

With the material0 asset selected in the Assets panel, head back over to the Inspector and find the Texture row, which appears within the Shader Properties section. Click the drop-down arrow in the Texture row (Figure 4-10). This allows you to select from any existing textures in the Assets panel. Select cameraTexture0 to apply it to the material.


Figure 4-10.  Selecting the camera’s texture in the Inspector for the material0 asset

As we learned earlier in this chapter, this camera texture will pull in the entire video feed. However, for our effect, we need to make sure the person detected by the camera will stand out against the background, allowing us to apply artwork to the background only, without covering the person. To achieve this result, we will use the Alpha channel control—the parameters that define the transparency of the material.

While material0 is still selected in the Assets panel, turn on alpha by clicking the check box in the Alpha row within the Inspector. Upon clicking the check box, additional rows appear, including another texture selector.


Using the drop-down arrow menu, select the segmentationMaskTexture0 for the alpha’s texture (Figure 4-11).

Figure 4-11.  Alpha enabled with segmentation mask texture selected

Now our material will only display what we want, which is the person’s face and body. The background behind the person will be transparent.

Adding Objects

Now that we have a textured material ready, we can apply it to an object. To add objects, click on the “+” icon in the bottom right of the Scene panel. There are several objects that you will see in the flyout menu. I recommend reading through the object descriptions to get familiar with the possibilities available. For now, select Canvas (Figure 4-12) and click the Insert button to add it to your scene.


Figure 4-12.  Canvas object in the “+” menu of the Scene panel

A canvas is a container that is automatically sized and scaled to the size of the device screen, and it can hold multiple 2D objects inside of it. A canvas itself cannot have a material applied to it as it’s simply a container. However, the objects it holds can have materials applied to them.

Open the “+” menu and select Rectangle, then click Insert to add it to your scene. Rectangles are a type of 2D object that can display a material. We will need two rectangles in total, so you can either add a second rectangle through the menu or right-click the first rectangle in the Scene panel and select the Duplicate option. Ensure that both rectangles are nested within canvas0, as shown in Figure 4-13.


Figure 4-13.  Scene panel with correctly nested rectangles

You can check this by clicking the caret, or arrow, to the left of the canvas row and ensuring that both rectangles hide together or show together based on the direction of the caret.

Organizing Assets and Objects

It’s helpful to get in the habit of renaming assets and objects to allow for easier management of your project. As you add more elements to projects, having intuitive names can streamline your workflow significantly and prevent any confusion.

Let’s try this on our newly created rectangles. Double-click on the first rectangle row, the one that appears directly under the canvas row. Double-clicking puts the object name into edit mode, and you will see all of the text highlighted. Give the rectangle a new name of background by typing while the text is highlighted. Press return or click anywhere outside the editable text box to commit the name change. Repeating the same steps, rename the second rectangle to person.


Assets can also be renamed in the same way. In the Assets panel, double-click on the material0 asset and rename it to person_mat (Figure 4-14). We’ll be adding more materials shortly, so this will make our material selection process much smoother.

Figure 4-14.  Renamed rectangles and material

It’s important to note that the particular order of objects in the Scene panel has implications for how they will appear to the user. Items lower in the hierarchy are closer to the user’s view, and items higher are further away—to put it another way, the hierarchy determines the order in which objects are drawn, with each following item being drawn on top of the previous and so on. For example, in our rectangle stack, the rectangle appearing in the bottommost row is what will appear closest to the effect viewer. This is why we are placing the rectangle that will contain the visualization of the person at the bottom—this ensures the cutout of the person will appear in front of the background pattern. If the ordering was reversed, the background would entirely cover the person.

Default Object Visualization

You may have noticed a checkerboard pattern appearing at the top left of the Simulator (Figure 4-15), representing the rectangle object.


Figure 4-15.  Default checkerboard pattern appearing on rectangle in the Simulator

Objects that require a material applied to them (such as rectangles) but don’t yet have a material added appear with a checkerboard pattern to clearly indicate the missing material.

Adjusting Object Size

Let’s begin by expanding the size of both our rectangles to fill the entire canvas.


Select the first rectangle, then hold Shift while clicking the second rectangle to select both rectangles. With both rectangle rows in the Scene panel now appearing blue, head over to the Inspector on the right side of the interface. By default, rectangles are set to a fixed size of 100 × 100 pixels. The pixel size of screens can change depending on the device a user is using, and we want to fill the entire width and height of the device’s screen automatically, no matter the device. To achieve this, we must first change the drop-down menus in the Inspector for both Width and Height from Fixed to Relative, as shown in Figure 4-16. Relative mode changes the measurement unit from pixels to percentages.

Figure 4-16.  Changing Width and Height drop-down menus to Relative for both rectangles

Since we want to fill the entire screen, we also need to change our percentages to 100%. To do this, click inside the Width input box and change the value to 100. Then, click inside the Height input box and change the value to 100 as shown in Figure 4-17. Ensure that both drop-downs remain Relative.

Figure 4-17.  Width and Height values changed to 100

Our Simulator now appears to be completely covered by the rectangles, indicated by the checkerboard pattern in Figure 4-18.


Figure 4-18.  Rectangles covering the Simulator

This indicates that the rectangles are now sized to fill the entirety of their parent container: the canvas. The canvas, as we learned earlier, is a container automatically sized to fit the user’s device. This means our rectangles will automatically fill 100% of a user’s device screen when they are using our AR effect.

Adding Existing Materials to Objects

In this section, we will add the material we created earlier, person_mat, to an object.


First, select only the person rectangle in the Scene panel. In the Inspector on the right side of the interface, click the “+” icon in the Materials row. Select person_mat in the drop-down menu (Figure 4-19), which will link the selected material to the rectangle.

Figure 4-19.  Selecting a material in the Inspector

Now, the material representing only the face and body of our user is applied to our person rectangle. In case you have paused your Simulator, ensure it is actively playing to preview this change, as shown in Figure 4-20.


Figure 4-20.  Simulator showing a material applied to the person rectangle object only

The placeholder checkerboard pattern of the person rectangle has been replaced with our person_mat material, containing the camera texture we created earlier. Our background rectangle, in contrast, still displays the default checkerboard and appears behind the person rectangle.

Creating New Materials for Objects

Our person rectangle is now comfortably filled with the person_mat. However, we now need to add a material to our background object, since we still see the default checkerboard pattern appearing on it. This time, rather than creating a material from the Assets panel menu, we can create one through the Inspector. With the background rectangle object selected in the Scene panel, head back over to the Inspector and click the “+” in the Material row. This time, select Create New Material. Instantly, a new material appears in the Assets panel. Double-click this material and rename it to background_mat.

Default Material Visualization

Since we’ve now applied a material to the background rectangle, the default checkerboard pattern (indicating a missing material) should now be gone. Instead, we should now see a light gray color filling the rectangle (Figure 4-21).


Figure 4-21.  Simulator showing the background rectangle with a textureless material applied

This coloration indicates the default state of a material that does not have a texture applied to it yet.

Importing Textures

Let’s add a texture to replace the default gray on our background rectangle. With the background_mat selected in the Assets panel, head over to the Texture row in the Inspector, and click Choose File. Select the background.png file (Figure 4-22), downloaded from this book’s Apress GitHub page.


Figure 4-22.  Simulator and Inspector displaying custom background texture

This image file will appear in the Assets panel Textures folder and immediately apply to our effect. The Simulator now displays this custom image texture behind the cutout of our user.

Mapping Visual Assets to a Face

Meta Spark Studio conveniently has built-in functionality to detect faces and facial expressions. This means you will not have to manually program or train the software to recognize if the subject on-camera is a person.


We’ll use this technology to add our customized artwork automatically on top of any face that is detected by the software.

Adding a Face Tracker

In the Scene panel, click the “+” icon in the bottom right. Find the Face Tracker (Figure 4-23) in the menu and click Insert.

Figure 4-23.  Face Tracker object in the “+” menu of the Scene panel

Face trackers are fundamental objects that allow for finding and following a face within an effect. Adding a face tracker also comes with a bonus feature—it will add instruction text when your effect is being used, informing your effect user to “find a face.” This way, your effect users will know how to position their camera toward a face.


Working with Face Meshes

Similar to the canvas, the face tracker alone cannot have materials added to it. For that, we need to add a face mesh object. Face meshes are three-dimensional and are modeled to fit the anatomy of a face.

To add a face mesh, in your Scene panel, click the “+” icon in the bottom right. Find the Face Mesh object in the menu and click Insert. The face mesh will automatically nest within the face tracker (Figure 4-24) and, like other material-less objects, will appear with a checkerboard pattern in our Simulator when a face is detected. The default name for this object will be faceMesh0.

Figure 4-24.  Face tracker object with nested face mesh displaying in the Scene panel


We’ll need to create a new material for this new object. Select faceMesh0 in the Scene panel and head over to the Inspector. Click the “+” icon in the Inspector’s Material row. Select Create New Material. Rename the newly created material in your Assets panel to face_mat.

We now need to replace the gray untextured default of this material with our custom texture. First, we want to ensure this new material renders properly in our Simulator for quick previewing. Select the face_mat in the Assets panel and head over to the Inspector. Near the bottom of the Inspector, expand the Advanced Render Options section. Ensure that the check boxes for Use Depth Test and Write to Depth Buffer are not checked (Figure 4-25). This prevents objects from being inadvertently obscured when rendering and will ensure that our face mesh appears properly in the Simulator.

Figure 4-25.  Advanced Render Options unchecked for face_mat in the Inspector

With the face_mat still selected in the Assets panel, click the Choose File option in the Texture row in the Inspector. Select the face_lines.png file, which immediately adds it to our Assets panel and simultaneously applies it to our effect (Figure 4-26).


Figure 4-26.  Simulator and Inspector displaying custom texture on face

Ensure your Simulator is playing to preview how the face mesh follows the movement of the user.

Adding Interactivity

In Chapter 3, we were introduced to the power of patches—the data and logic bundles that enable interactivity in an effect. We’ll use this feature to show and hide an image asset based on our user’s facial expression.


Adding and Managing Planes

Planes are 2D objects similar to rectangles in that they are flat with four sides; however, planes can be placed in 3D space. This allows us to create an effect where our comic book–style speech bubble can appear to float in front of the user’s face.

Navigate to the Scene panel’s “+” menu and add a Plane object. Rename the plane to bubble for easy reference. We also need our speech bubble to stay tracked to our user’s general head position, so make sure the plane is nested within the face tracker folder (Figure 4-27). To do this in the Scene panel, click and drag the bubble plane row on top of the face tracker row until the face tracker row is outlined in blue, then release the plane.

Figure 4-27.  Plane properly nested within the face tracker container

You can double-check that the bubble plane is properly nested by clicking the caret, or arrow, to the left of the face tracker row and ensuring that both the face mesh and plane hide together or show together based on the direction of the caret.


Positioning Planes

By default, you can see the plane has been placed in the center of the face, represented by a checkerboard square. This indicates we need to add a material. Select the bubble plane, then add a material to it by selecting Create New Material in the Inspector. Rename this newly created material to bubble_mat. With the newly created bubble_mat selected in the Assets panel, select Choose File in the Inspector’s Texture row, and select the bubble.png file.

While our bubble could remain in this default position, it may make more sense aesthetically to position it a bit higher up on the screen, so it does not cover the user’s face, and slightly further in front of the user to create a sense of depth. To make this positioning process easier, use the Toolbar in the far left of the interface to pause your Simulator when the person in the Simulator is facing the front of the screen. Once your Simulator is paused, select the bubble plane in the Scene panel and take a look at your viewport. You will see three arrows (Figure 4-28), each representing a position axis of the plane.


Figure 4-28.  Bubble plane selected in viewport

The blue arrow represents the z axis. With the Simulator paused, click and drag the blue arrow slowly outward to position the bubble in 3D space slightly further in front of our user. It’s important not to pull this arrow too far outward, as the bubble could appear so far in front of the user’s face that it no longer appears on screen when the user is trying the effect.

The green arrow represents the y axis. Since we don’t want the bubble to completely obscure our user’s face, we will need to move it slightly from its default position. Click and drag the green arrow upward to position the bubble above the user’s face.


The red arrow represents the x axis. Click and drag the red arrow to position the bubble slightly off to the side of the user’s face, reminiscent of a comic book speech bubble placement (Figure 4-29).

Figure 4-29.  Bubble plane with adjusted positioning

Alternatively, the position of an object can also be controlled through the Inspector menu. With the bubble plane selected in the Scene panel, you can manually input position values in the Inspector’s Position row (Figure 4-30). The recommended values are shown in Table 4-1.


Figure 4-30.  Bubble plane with manually input positioning

Table 4-1.  Bubble plane position values

x axis  y axis  z axis
0.08    0.10    0.03

You can always toggle between playing and pausing the Simulator to see where the bubble is positioned as the user moves their head. Remember to always pause the Simulator using the Toolbar before changing the position values of the bubble in the Inspector, or when using the arrows in the Viewport.
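If you prefer working in code, the same Table 4-1 values can be set from a script instead of the Inspector. A minimal sketch, assuming the plane kept its bubble name:

const Scene = require('Scene');

(async function () {
  const bubble = await Scene.root.findFirst('bubble');
  bubble.transform.x = 0.08; // off to the side of the face
  bubble.transform.y = 0.10; // above the face
  bubble.transform.z = 0.03; // slightly toward the camera
})();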


Generating and Connecting Patches

With the bubble plane selected, take a look at the Visible row in the Inspector. Try checking and unchecking the box to toggle visibility of the object. This is similar to the result we want in our final effect, but we need visibility to be tied to the user opening their mouth, not us manually clicking a box. To activate this visibility property automatically, click the arrow directly to the left of the Visible row. This opens the Patch Editor and creates a patch representing the visibility toggle. The Visible row now turns yellow and cannot be manually clicked anymore, as the property is now controlled solely via the patch.

Next, let’s generate patches to detect our user’s facial expression. To do this, select the faceTracker0 in the Scene panel. In the Inspector menu, there will be a row entitled Interactions, which, upon clicking Create, displays several built-in patch options for detectable gestures as shown in Figure 4-31.


Figure 4-31.  Built-in options for interaction patches

Since we want to show a speech bubble when the user’s mouth is open, select the Mouth Open patch. This will drop a string of four patches into the Patch Editor. From finding and selecting a face, to following its movement, to detecting if the mouth on the face is open, this bundle of patches works together to give us the information our bubble’s patch needs to determine its visibility.

Now, we need to connect the end of this patch string into the bubble patch. Drag the yellow bubble patch to the right of the string of interaction patches. The last patch in the string, entitled Mouth Open, has two outputs. The first output sends binary information about the open state, essentially a yes or no answer to “is the mouth open?” Drag the arrow representing this output into the input on the left of the bubble visibility patch as shown in Figure 4-32.

Figure 4-32.  Connected string of patches powering the visibility of the bubble plane

If your Simulator is paused, ensure it is playing again to see this interaction come to life! The bubble will now become visible only when the user opens their mouth.
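For comparison, the same mouth-open-to-visibility behavior can be expressed entirely in script instead of patches. A minimal sketch, assuming face tracking is available in the project and the plane is named bubble; the 0.2 openness threshold is illustrative:

const Scene = require('Scene');
const FaceTracking = require('FaceTracking');

(async function () {
  const bubble = await Scene.root.findFirst('bubble');

  // openness is a 0-to-1 signal; gt() turns it into a boolean signal
  const mouthOpen = FaceTracking.face(0).mouth.openness.gt(0.2);

  // Hide the bubble whenever the mouth is *not* open
  bubble.hidden = mouthOpen.not();
})();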

Color Grading

When we explored the Meta Spark Studio AR Library in Chapter 2, we briefly encountered the pre-made color alteration options available within the AR Library’s Color LUTs section. In this effect, we intend to adjust the color values to look more like a vintage newspaper, so let’s use a LUT to achieve this.

Finding LUTs

Open the AR Library, which is found in the Toolbar on the furthest left side of the interface, and navigate to the Color LUTs section, then browse through the options to find a grayscale LUT, such as Blown Out BW. Click on the LUT to view its detail page, then click Import Free to insert it into your project. Close the AR Library window.


Applying LUTs

Like other imported assets we’ve encountered thus far, the LUT does not automatically apply to your effect upon import. To add the LUT, find it within the Color LUTs folder that now appears in your Assets panel. Select the LUT row, Blown_Out_BW, within the folder and right-click on it to bring up the flyout menu. Navigate to Actions and select Apply to Camera (Figure 4-33).

Figure 4-33.  Applying a Color LUT to the Camera

The color change will instantly appear in the Simulator, and you will also see a set of patches associated with the LUT added to the Patch Editor.

Summary

Congratulations—you’ve created your very first AR effect! This included initial file setup, learning to work with textures, creating materials, adding objects, and importing assets. You combined these elements of an AR effect file with interactive patches and a Color LUT to create a cohesive augmented reality experience. Make sure you’ve saved your project, as you’ve made excellent progress in your learning journey. Next, we’ll learn how to try this newly created effect on a real, live device. See you in the next chapter!


CHAPTER 5

Testing Effects

While the Simulator is a convenient way to rapidly preview in-progress effects, it is simply an approximation of how your project will look and behave. For a more accurate depiction, it’s important to properly test how your augmented reality creations feel on the actual device a user would experience them on. Using our newly created First_Project.arproj from Chapter 4, let’s try testing an effect directly on a mobile device.

Experience Types

In the left side of the Meta Spark Studio interface, find the Test on Device icon in the bottom half of the Toolbar. This option brings up a flyout menu with various options for testing (Figure 5-1).


Figure 5-1.  Test on Device flyout menu

The Mobile Meta Spark Player option allows for previewing effects using the downloadable Meta Spark Player app, which runs on iOS and Android. The Meta Spark Desktop Player is a downloadable Mac and Windows app that can open multiple effect preview windows on one monitor, whereas a mobile device can only preview one effect at a time. This is useful for previewing effects that you might be creating for group video calls.

In our case, we want to prioritize testing our effect in the most realistic way possible—directly on the same apps that actual users have access to, such as Instagram or Facebook. However, we don’t see those options showing up yet in this menu.


To enable those options to appear, select the Add Experience button to begin specifying which app or product you’d like to try your AR effect with. This opens a window (Figure 5-2) to fully manage experience types and associated properties.

Figure 5-2.  Experience Properties window

In this interface, select + Add Experience in the bottom right. The experience options available are as follows:

•  Sharing Effect—Used on both Instagram and Facebook, this experience type covers the broad range of AR use cases across social media and can be used in features such as posts, messages, the ephemeral Stories format, and short-form video Reels.

•  Ads—Used on Instagram, advertisement AR effects are commonly used to allow prospective customers to try on a product, for instance, eyeglasses or makeup.

•  Video Calling—Used on Messenger and Instagram, these effects are for usage while in a live video calling interface.

Select the first option, Sharing Effect, then click Insert in the bottom right. This will populate the window with preselected defaults for each social platform. Click Done to finalize changes.

Sending Effects to a Device

Navigate back to Test on Device in the Toolbar. Now, the flyout menu will display the social platforms that were selected for your effect in the Experience Properties window (Figure 5-3).

Figure 5-3.  Social platform options for testing effects

It’s helpful to test on each platform to ensure the effect is working as intended.


Testing Effects on Instagram

To test your project on Instagram, first, ensure you have the latest version of the Instagram app installed on your iOS device via the App Store, or on your Android device via Google Play. Next, click the Send button on the Instagram row. A circular icon will appear to show the effect’s uploading progress, as shown in Figure 5-4.

Figure 5-4.  Sending an effect to Instagram

Once the effect has finished uploading, you will be shown a few options in the Test on Instagram window (Figure 5-5).


Figure 5-5.  Instagram testing window

First, the effect has been sent as a notification to the Instagram account associated with your Meta Spark Studio account. In addition, a QR code is now scannable to open your effect directly. Finally, a test link has also been generated, which is a shareable URL allowing up to 50 users to try this effect over a 24-hour window. Using any of these methods, you can try your effect live in the Instagram app.


After sending an effect through the testing process, you can always access the test QR code or URL through the Test on Device menu (Figure 5-6).

Figure 5-6.  Test on Device menu after sending a test to Instagram

There is now a link under Instagram entitled Test Effect, which allows you to reopen the Instagram testing window anytime.

Testing Effects on Facebook

To test your project on Facebook, make sure you have the latest version of the Facebook app installed on your iOS device via the App Store, or on your Android device via Google Play. Next, click the Send button on the Facebook row. A circular icon will appear to show the effect’s uploading progress as shown in Figure 5-7.


Figure 5-7.  Sending an effect to Facebook

Once the effect has finished uploading, you will be shown a few options in the Test on Facebook window (Figure 5-8).


Figure 5-8.  Facebook testing window

First, the effect has been sent as a notification to the Facebook account associated with your Meta Spark Studio account. In addition, a QR code is now scannable to open your effect directly. Finally, a test link has also been generated, which is a shareable URL allowing up to 50 users to try this effect over a 24-hour window. Using any of these methods, you can try your effect live in the Facebook app.


After sending an effect through the testing process, you can always access the test QR code or URL in the Test on Device menu (Figure 5-9).

Figure 5-9.  Test on Device menu after sending a test to Facebook

There is now a link under Facebook entitled Test Effect, which allows you to reopen the Facebook testing window anytime.

Refreshing Effect Tests

When testing your effects, keep in mind that the test version is simply a snapshot in time from when you sent the test. The test effect that is viewed through the notification, QR code, or URL you generated does not automatically update when you change something in your project. To make sure your test has the latest changes, open the Test on Device menu and click the Refresh button next to the platform you want to re-test on. This will refresh the test, generating a new notification, QR code, and URL containing the latest version of your effect.


Testing vs. Publishing

It’s important to note that while you can share test links with friends, family, and colleagues, a test URL is not the same as fully publishing an effect. Publishing has much stricter requirements but also comes with the benefit of being able to release your effect to a wider audience. We will learn more about publishing effects in a later chapter.

Summary

Experiencing how your project functions on a real device is an important part of the AR effect development process. In this chapter, we learned how to set up a test for our effect and how to send and refresh our effect for multiple available apps. Refer back to this chapter as you progress through the following chapters to ensure testing is part of your workflow. Next, we’ll learn how to customize our recently tested effect to make it more unique.


CHAPTER 6

Customizing Projects Through Asset Replacement

By customizing the visual assets within your first (and subsequent) projects in this book, you can give your version of the effect a unique personality that best represents your own creativity. We’ll learn how to change texture and Color LUT assets in a way that can reimagine the feel of an effect entirely while keeping the same core objects and logic in the scene.

Saving As a New Effect

Start by launching Meta Spark Studio and opening First_Project.arproj. We will be making significant changes, so rather than saving over our existing file, let’s make a new one. Navigate to File in the top Menu Bar, and click Save As as shown in Figure 6-1.


Figure 6-1.  Save As option in the Menu Bar

Rename this new file to First_Project_Customized.arproj in the window that appears. We will solely be using this newly created First_Project_Customized.arproj file in this chapter. The source files used in this chapter’s project are available on GitHub via this book’s product page, located at www.apress.com.

Replacing Texture Assets

Previously, we had designed this effect to look like a vintage comic book. Now, we’ll be creating a cheerful sunny day effect. Rather than having to rebuild the entire file from scratch, though, we can utilize the existing structure and simply change which images we are pulling in for textures. Let’s revisit the textures we had previously added in the Assets panel.


We’ll start by swapping out our empty speech bubble with one that has some content added inside of it. Within the Assets panel, select the bubble texture row appearing in the Textures folder. Right-click on the row. In the flyout menu that appears, click Replace (Figure 6-2).

Figure 6-2.  Option to Replace when right-clicking an asset

This Replace functionality allows us to swap this image asset for another one without having to alter any other properties of the texture.


In the file picker window that appears, select the bubble_hello_world.png image file, downloaded from this book’s Apress GitHub page. This newly selected bubble_hello_world.png texture will automatically replace the bubble.png image we had chosen in Chapter 4. Replacing an image through this method does not reset any other properties of the texture or change anything about the material that uses this texture. We are simply changing the image file associated with this texture, nothing more. Now, when our Simulator plays, a bubble with text inside appears when the user opens their mouth (Figure 6-3).

Figure 6-3.  Simulator showing the image replaced for the bubble texture


Next, in the Assets panel, select the background texture row appearing in the Textures folder. Right-click on the row, then, in the flyout menu that appears, click Replace. In the window that appears, select the background_sky.png image file. This will swap our halftone pattern with this new image (Figure 6-4).

Figure 6-4.  Simulator showing the image replaced for the background texture

This new file is a light blue image with clouds—but our Simulator is still showing gray! That’s because we still have our original Color LUT applied, which we need to update next.


Swapping Color LUTs

Our effect now uses a colorful background image, but it is not yet displaying properly due to our previously applied Color LUT. To swap out our LUT, open the AR Library using the left-side Toolbar and navigate to the Color LUTs section. Browse through the options to find a colorful LUT, such as Pola. Click on the Pola LUT to view its detail page, then click Import Free to insert it into this project. Close the AR Library window. To update the LUT applied to our effect, find Pola within the Color LUTs folder in the Assets panel. Right-click on Pola to bring up the flyout menu. Navigate to Actions and select Apply to Camera (Figure 6-5).


Figure 6-5.  Applying a newly added Color LUT to the Camera

The Patch Editor will appear, showing that the Pola patch is now added into the string of Color LUT patches. However, you can see that the Blown_Out_BW patch from before is still present in the Patch Editor.

For file cleanliness, head to your Assets panel and select the Blown_Out_BW row, then right-click on it and select Delete. In the window that appears, asking if you’d like to delete those files from the project folder as well, click Delete again.


Replacing Textures for the Face

Now that our effect has been customized to look like a bright summer scene, we’ll swap out our face mesh imagery from a comic book line art style to a pair of happy suns on the user’s cheeks. Let’s revisit the texture we had previously added in the Assets panel. Within the Textures folder in the Assets panel, select the face_lines texture row. Right-click on the row. In the flyout menu that appears, click Replace. In the file picker window that appears, select the face_suns.png image file. This newly selected face_suns.png texture will automatically replace the face_lines.png image we had chosen in Chapter 4 (Figure 6-6).

Figure 6-6.  Simulator displaying newly replaced texture for the face tracker material


Now, when our Simulator plays, a sun will appear on each cheek of the user. Unlike the bubble replacement, which replaced one bubble with another bubble, our face texture completely changed from lines to suns. To bring added clarity to our file, within the Assets panel, double-click the face_lines row in the Textures folder and rename it to face_suns.

Understanding Face Mesh Mapping

You might wonder—how did the software know that we wanted each sun placed on the user’s cheek? This is accomplished through a cascading set of logic and features:

•  The Face Tracker has built-in logic to automatically detect faces in the camera.

•  The Face Mesh is a pre-made 3D object, modeled to the shape of a human face.

•  The material, and therefore texture, applied to the Face Mesh will automatically wrap to fit the 3D Face Mesh.

While the method of replacing face mesh textures closely follows the method of swapping non-face mesh texture imagery, the creation of face mesh textures follows a very specific template in order to be properly mapped to the Face Mesh’s 3D model.

Face Reference Templates

To create a 2D face mesh texture file that maps accurately to the Face Mesh, you will need to utilize a template (Figure 6-7) that shows the positioning of facial features, such as face_template.png that is included in this chapter’s project files on the Apress GitHub.


Figure 6-7.  face_template.png file

This template file contains an image representing the flattened topology of a human face. You can use this file to help estimate positioning of 2D imagery relative to landmarks on a face.

Creating Custom Textures for the Face

Using an image editor program of your choice, you can create various texture files by positioning pictures, text, photos, emoji, and more on this template. Simply add in these additional assets and resize them to fit onto the face shown in the template file, as shown in the example in Figure 6-8.


Figure 6-8.  Example of various assets added on top of the face_template.png texture file

It’s critical to note that when saving a customized face mesh texture, you must export your file with the face_template.png layer hidden (Figure 6-9). This ensures the illustration of the template itself does not get saved as part of your exported texture file.

Figure 6-9.  Improperly exported texture (left) and properly exported texture (right)

If the template itself gets visibly exported in the same file as the custom texture, then the image of the template, such as the gray shadows and face landmark outlines, will appear on the face mesh (Figure 6-10).


Figure 6-10.  Example of an improperly exported texture file appearing on the face mesh

Now that we’ve learned how to properly export customized textures for the face and how to replace textures in our Assets panel, let’s learn where we can find image resources to use in custom-created textures.

Using External Resources

Creating your own images from scratch is an excellent option for a truly unique AR effect result. You can do this through photography, digital art, scans of traditional art, typography, and more.


However, if you are seeking existing images for your project, there are several resources that can help you in your search.

Image Resources

National Gallery of Art: www.nga.gov/open-access-images.html
The National Gallery of Art has over 50,000 images representing public domain works of art within their permanent collection.

Library of Congress: www.loc.gov/free-to-use/
Within the Library of Congress’ free-to-use collection, you can find paintings, editorial images, posters, photographic arts, diagrams, newspaper clippings, and more.

NOAA: https://photolib.noaa.gov/
Containing over 80,000 images, NOAA’s photo library spans visuals related to ocean and atmospheric sciences, such as marine life, weather phenomena, and watercraft.

NASA: https://images.nasa.gov/
This repository shared by NASA includes captures of planets, astronauts, space shuttles, and telescopes, along with computer-simulated images of phenomena such as black holes.

Image Usage Guidelines

Ensure that you familiarize yourself with the usage guidelines for any image resource. From a legal perspective, it is very important to respect the requirements of an image rights holder. In most cases, such as the resources shared in this chapter, usage guidelines are found directly on the website and clearly differentiate the rules of commercial use from personal or educational use.


Summary

Leveraging step-by-step project tutorials, such as the ones in this book, is an excellent way to get started building AR effects. Through the customization techniques in this chapter, we’ve learned how to bring a greater sense of uniqueness to your effects while utilizing the core templates and logic built into these tutorial projects. This includes changing the color palette and swapping the imagery used for textures. In the next chapter, we’ll dive into a completely new type of effect, extending our knowledge from face-based effects to using the real-world environment.


CHAPTER 7

Creating a Target Tracking Effect

Now that we have some experience with developing an effect, let’s extend our skills to bring an additional dimension into images found in the real world.

What Is Target Tracking

Target tracking is the ability for an effect to detect and recognize (a.k.a. “track”) a 2D image (the “target”) that the camera is seeing in the real world (Figure 7-1).


Figure 7-1.  Target tracking effect when looking for a target (left) and after detecting a target (right)

Targets can present themselves in the real world in a variety of ways. For instance, the target could be a logo, a poster, the artwork on a vinyl record jacket, the cover of a book, a business card, or even a screenprint appearing on a T-shirt.

Selecting an Ideal Target

A target is a specifically chosen image that, when detected by the device camera, initiates the target tracking effect. While the target can be presented in a variety of ways in the real world, there are some characteristics that make for an ideal target.

Target Quality

To function properly, targets must be high contrast (Figure 7-2), ensuring that the edges of the subject matter in the image are clearly visible. This means limiting the use of gradients or soft fade outs. Rather than an image with elements that blend together, it’s better to have an image where the shapes are clearly distinct.

Figure 7-2.  Low-contrast target (left) and high-contrast target (right)

Opt for clear image details that can be easily delineated from the background. Complex or asymmetrical images work well, as the variety of details in the visual will be picked up much more easily by the tracker. Very importantly, targets must also be high resolution. Blurry, low-resolution images will be difficult for the tracker to detect.

Both grayscale and full color targets are acceptable to use. In fact, a great way to test the overall contrast of a target image is to temporarily set it to grayscale. If in grayscale the entire image turns to one similar shade of gray, it is too low contrast to be an effective target.
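One rough way to automate this grayscale test is to convert pixels to luminance and check how widely the values spread. This is a conceptual JavaScript sketch for checking candidate artwork, not part of Meta Spark Studio:

// pixels: array of {r, g, b} values in the 0-255 range
function contrastScore(pixels) {
  // Standard luminance weighting for converting RGB to grayscale
  const luma = pixels.map((p) => 0.299 * p.r + 0.587 * p.g + 0.114 * p.b);
  return (Math.max(...luma) - Math.min(...luma)) / 255;
}

// Near 0 = flat gray (poor target); closer to 1 = high contrast (good target)
console.log(contrastScore([{ r: 20, g: 20, b: 20 }, { r: 240, g: 235, b: 230 }]));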

Flat Targets

It’s critical to ensure that the target in the real world can be viewed in a flat format. For example, if the target is screen printed on a garment, it will need to be placed on a part of the clothing that the wearer can easily smooth out when the camera is pointed at it. A distorted target image in the real world, such as on crumpled paper or a wrinkled shirt, will not be trackable.


Easily Viewable Targets

For the camera to pick up a target, the image should fill a large portion of the device screen without needing to zoom in. Because of this, images that are printed very small, such as a postage stamp, will be difficult for a camera to pick up as a target. Conversely, images that are printed very large but are placed very far away, such as a billboard, are also difficult to use as targets.

Planning Out a Target Tracker Project

As mentioned in Chapter 4, it’s helpful to write or sketch (Figure 7-3) the concept of what we hope to achieve before starting to build this effect.

Figure 7-3.  Concept sketch of a target tracking project

Using the concept sketch shown here as an example, we can see that the theme of the project is to create an effect that adds virtual elements around the image target. To break things down more granularly, what we’ll need to learn to achieve this result is how to:

•  Recognize our image target—Our effect should only display when the pink AR heart image is detected.

•  Generate animated particles—Yellow stars will spawn out of the detected image.

•  Display 3D shapes—Static red hearts will also appear around the detected image.

We’ll address how to achieve each one of these effect elements in the upcoming sections.

Setting Up a Target Tracking Effect

We’ll create a new file for this project. Start by opening Meta Spark Studio and select Blank Project, which is at the top of the Create New section in the Welcome Screen. Alternatively, if you already have an existing project file open, you can always select New Effect under File in the top Menu Bar. Make sure to save this new file, naming it Target_Tracking_Project.arproj. As always, continue to save at regular intervals to ensure you do not lose any of your hard work.

Meta Spark Studio conveniently has built-in functionality to detect visuals in the real world that resemble your chosen target. This means you will not have to manually program or train the software to recognize your target on camera. We can simply add our chosen image using the techniques indicated in the next section, and Meta Spark Studio will automatically understand how to utilize the image for our desired effect. The source files used in this chapter’s project are available on GitHub via this book’s product page, located at www.apress.com.

Adding a Target Tracker

Let’s tackle the first element of our effect—recognizing an image target. To achieve this, we need to add a new type of object.


In the Scene panel, click the “+” and find the Target Tracker object (Figure 7-4). Click the Insert button to add it to your scene.

Figure 7-4.  Target tracker object in the “+” menu of the Scene panel

Target trackers are the specific object type that enables Meta Spark Studio to respond when the camera is pointed at a particular image. With the targetTracker0 object selected in your Scene panel, head over to the Inspector panel on the right. Within the Properties section, locate the Texture row (Figure 7-5).


Figure 7-5.  Texture selector in a target tracker’s Inspector properties

Let’s update the texture of the target tracker to one of the project files you downloaded from this book’s Apress GitHub page. Click Choose File in the Inspector, and select the heart.png file.

Note  Despite the use of the same descriptor “texture,” the way a texture is interpreted for a target tracker object is different from how a texture for a material is handled. A target tracker’s texture is the image that the target tracker will recognize. By contrast, a material’s texture is the representation of that material’s appearance and displays visually on the object that the material is associated with.

Now we have our effect set up so it can automatically detect when the camera is pointing at our heart.png image. You will see that the Simulator has updated the camera view to represent a rear-facing camera, as shown in Figure 7-6.


Figure 7-6.  Meta Spark Studio interface with a textured target tracker in Simulator

However, based on our original concept, our aim is to add more visual interest and actually show the user something in motion when the artwork is detected. For that, we will need to add a few new object types.

Particle Systems

In our original concept, we envisioned animated stars flowing out of the artwork. While we could manually animate dozens of stars in Meta Spark Studio, the good news is that there is a much more efficient way to generate the same type of result.


Adding a Particle System

In the Scene panel, click the “+” and find the Particle System object (Figure 7-7). Click the Insert button to add it to your scene.

Figure 7-7.  Particle system object in the “+” menu of the Scene panel

Particle systems enable automatic creation (a.k.a. “birthing”) of a large number of shapes (a.k.a. “particles”). This technique is helpful for creating visualizations of weather, such as snow or rain, whimsical visuals such as confetti, or, in our case, a large number of stars. By default, the newly added particle system will appear in the Scene panel as emitter0. Double-click to rename this object to stars.

We only want this stars particle emitter to display when the target is tracked. To achieve this, in the Scene panel, click and drag the particle system row on top of the target tracker row until the targetTracker0 row is outlined in blue, then release the particle emitter (Figure 7-8).

Figure 7-8.  Particle system properly nested within the target tracker in the Scene panel

With the stars object selected in the Scene panel, let’s take a look at our Viewport in the middle of the Meta Spark Studio interface.

Positioning a Particle System

While our stars object can remain in this default position, it may make more sense aesthetically to position the stars a bit above the target. This allows the particles to emit without covering the text on our artwork. To make this positioning process easier, use the Toolbar on the far left of the interface to pause your Simulator. With the Simulator paused, click and drag the green arrow upward to position the particle system above the text on the heart, as shown in Figure 7-9.


Figure 7-9.  Particle system repositioned slightly above the target artwork

Alternatively, the position of a particle system can be controlled through the Inspector. With stars selected in the Scene panel, you can manually input position values in the Inspector’s Position row. The recommended values are shown in Table 7-1.

Table 7-1.  Stars particle system position values

            x axis    y axis    z axis
Position    0         0.10      0

Now that our stars particle system is positioned in an ideal place, let’s customize the look and feel of the particles that it emits.


Customizing a Particle System

From afar, the collection of particles may look like a bundle of gray squares. However, when you zoom in, you’ll notice that the particles are in fact covered with a checkerboard pattern (Figure 7-10) reminiscent of a default rectangle object.

Figure 7-10.  Zoomed-in view of default particles

This familiar pattern appears because each particle in the emitter is in fact a 2D object itself. Therefore, the particles display the same pattern as other material-less objects. In case you have paused your Simulator, ensure it is actively playing to preview the upcoming changes; use the Toolbar on the far left of the Meta Spark Studio interface to unpause it if needed.

With the stars object selected in the Scene panel, head over to the right side of your interface to take a look at the Inspector. Under the Emitter section of the particle system’s Inspector, find the Type section. The Type options control the area where particles are initially emitted. By default, Meta Spark Studio chooses a Point type, which is why our particles appear as if they are coming out from one singular dot on the screen. A Point type is a great choice if you have one small, specific area you’d like your particles to appear from, such as the tip of a magic wand.


However, in our case, our heart artwork covers a wide area, so let’s change the Type to Line. This allows the particles to emit over a designated linear section. We’ll also extend the length of the line by changing the value in the Length input field within the Emitter section to 0.3, as shown in Figure 7-11.

Figure 7-11.  Updated values for Emitter options in the Inspector

Our original effect concept was to display yellow stars, not checkerboard squares, so let’s customize our individual particle visualization next. With stars still selected in the Scene panel, navigate to the Inspector and scroll all the way down. There will be a collapsed section entitled Shape. Click the caret, or arrow, to the left of the Shape row to expand the section. Meta Spark Studio provides multiple built-in shape options for particles. In our case, we’ll select Star in the drop-down. This immediately updates the particle appearance in our Viewport (Figure 7-12).


Figure 7-12.  Star-shaped particles in Viewport

Now that we have the correct shape, the last step for our particles is customizing the color. Directly underneath the Shape section in the Inspector is the Materials section. Click the “+” icon on the Materials row, which immediately adds a new material0 into the Assets panel. Rename this material to stars_mat (Figure 7-13).

Figure 7-13.  Particle material stars_mat in the Assets panel


Since we’ve applied this stars_mat material to the stars particle system, the default checkerboard pattern (indicating a missing material) is now gone. Instead, we see a light gray color filling the particles, indicating a missing texture (Figure 7-14).

Figure 7-14.  Textureless particles appearing with gray material in Viewport

Previously when applying textures, we’ve used externally created imagery to overlay onto the material. For this effect, we simply want to color these stars yellow. Select stars_mat in the Assets panel and head over to the Inspector. In the very first drop-down, entitled Shader Type, select Flat. Under the Diffuse section of Shader Properties, you’ll see a row entitled Color. Click on the gray rectangle next to Color to view the color picker. In the color picker, select a yellow color of your choice (Figure 7-15).


Figure 7-15.  stars_mat Shader Type and Color changes previewed in the Viewport

The star particles will now appear in yellow in the Viewport. Ensure your Simulator is actively playing to preview your results.
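If you later move into script-driven effects, the same emitter settings can also be adjusted in code. The listing below is only a minimal sketch, not a step in this project: it assumes the Scripting capability is enabled, keeps our stars naming, and uses the birthrate and scale properties as I understand them from the Meta Spark scripting reference, so verify the property names against the current documentation.

const Scene = require('Scene');
const Reactive = require('Reactive');

(async function () {
  // Locate the emitter by the name we gave it in the Scene panel.
  const stars = await Scene.root.findFirst('stars');

  // Assumed ParticleSystem properties: birthrate is particles emitted
  // per second, and scale is the size of each particle.
  stars.birthrate = Reactive.val(40);
  stars.scale = Reactive.val(0.02);
})();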

Working with 3D Assets

The next step of our project is adding hearts that appear alongside our animated stars. We will not have to personally create 3D models to complete this part of the effect, as Meta Spark Studio conveniently makes these shapes available for us in the AR Library.


Finding 3D Assets

Open the AR Library, which is found in the Toolbar on the far left side of the interface, and navigate to the 3D Objects section. Click on the 3D Shapes option to browse the base shapes available. In this screen, find the Heart Primitive object (Figure 7-16).

Figure 7-16.  Heart Primitive highlighted in the 3D Shapes section of the AR Library

Click Heart Primitive to view its detail page, then click Import Free to insert it into your project. Close the AR Library window.

Customizing 3D Assets

A folder entitled Heart has now been added into our Assets panel. As we’ve learned before, however, for something to appear in our actual effect, it must be connected to an object in our Scene panel. With the entire Heart folder selected in the Assets panel, click and drag it on top of the targetTracker0 object in the Scene panel until the target tracker is outlined in blue, then release the Heart folder to nest it within the target tracker (Figure 7-17).

Figure 7-17.  Heart folder properly nested within the target tracker container

We now see the contents of the Heart folder appearing in our Viewport. With the Heart folder selected in the Scene panel, click and drag the red arrow to the right so the Heart does not obscure our artwork. The recommended values are shown in the Position row of Table 7-2.

To bring in a bit more visual variety, we can also change the rotation of the 3D heart. This can be input in the Rotation fields under the Transformation section of the Inspector. The recommended values are shown in the Rotation row of Table 7-2.


Table 7-2.  Heart object values

            x axis    y axis    z axis
Position    0.14      0         0
Rotation    0         0         -15

To bring some visual balance to the effect, let’s add a second heart. In the Scene panel, right-click the Heart folder and select Duplicate. For clarity, rename this second folder to Heart2. Let’s also move Heart2 so it doesn’t overlap the position of our original Heart 3D object. Click and drag the red arrow in the Viewport to the left. The recommended values are shown in the Position row of Table 7-3.

Table 7-3.  Heart2 object values

            x axis    y axis    z axis
Position    -0.14     0         0
Rotation    0         0         15

Heart2 currently shares the rotation of our original Heart, so let’s rotate it in the opposite direction for visual variety. With Heart2 selected, change the values in the Rotation fields in the Inspector. The recommended values are shown in the Rotation row of Table 7-3.

Both of our 3D heart objects appear with the default gray of a textureless material. To customize this, we’ll need to explore our original Heart asset. Head down to the Assets panel and expand the Heart folder. Within the folder, you’ll see a row entitled Heart_mat. This is a material created by default for models imported from the 3D Shapes category in the AR Library.


Select the Heart_mat material in the Assets panel, and head over to the Inspector. In the very first drop-down entitled Shader Type, select Flat. Under the Diffuse section of Shader Properties, you’ll see a row entitled Color. Click on the gray rectangle next to Color to view the color picker. In the color picker, select a red color of your choice (Figure 7-18).

Figure 7-18.  Heart_mat Shader Type and Color changes previewed in the Viewport

The 3D hearts will now appear in red in the Viewport. Ensure your Simulator is actively playing to preview your results.


Target Markers

Since we are the creators of this effect, we know that the target is an image of a heart with the letters “AR” emblazoned on it. But what if this effect was being used by somebody else? How would that user know to look for this particular image to trigger the effect? To solve this problem, we will need to add a target marker—a preview image that indicates to the user what target to look for.

Creating a Target Marker

Head to the Scene panel and open the “+” menu. Select Rectangle, then click Insert to add it to your scene. This will add a rectangle nested within a canvas container. Double-click the rectangle and rename it to marker (Figure 7-19).


Figure 7-19.  Marker rectangle added to the Scene panel

With the marker rectangle selected, head over to the Inspector to adjust size and position values. First, we want to ensure the marker is positioned around the center of the screen. The recommended values for the Position input fields are shown in Table 7-4.

Table 7-4.  Marker rectangle position values

            x axis    y axis
Position    95        300


Next, we need to ensure the rectangle itself can comfortably fit a preview of our tracker image. The recommended values for the Width and Height input fields are shown in Table 7-5.

Table 7-5.  Marker rectangle width and height values

Width     200    Fixed
Height    170    Fixed

Finally, we need to ensure that the placement of our rectangle is pinned properly to the edges of our device screen. Ensure that only the left and top sections in the Pinning row are highlighted in blue, as shown in Figure 7-20.

Figure 7-20.  Inspector displaying updated values for position, width, height, and pinning

Our marker rectangle object is still displaying the default checkerboard pattern (Figure 7-21), so let’s add a material to remediate this.


Figure 7-21.  Marker rectangle showing the default checkerboard pattern

In the Inspector, click the “+” on the Materials row and select Create New Material. In the Assets panel, rename this newly created material to marker_mat. The checkerboard is replaced with a flat gray color, indicating this material has no texture specified. With marker_mat selected, head to the Inspector. Click on the drop-down on the Texture row and select the existing heart texture. Our rectangle will now display the same texture as our tracker image (Figure 7-22).


Figure 7-22.  Marker rectangle utilizing the same texture as our tracker image

So as not to completely obscure the user’s view of the real world, we will need to make our marker rectangle semi-transparent. With marker_mat still selected in the Assets panel, head to the Inspector. In the Render Options section, slide the Opacity slider down to 50% (Figure 7-23).


Figure 7-23.  Opacity slider in Render Options

Now, the effect user will be able to see through the marker rectangle, allowing them to more easily scan the real world for the tracker.

Target Marker Interactivity

Once the user finds the target, the target marker should disappear so it doesn’t clash with the effect of stars and 3D hearts. To accomplish this, we’ll use patches to make the marker disappear when the target is found.

In the Scene panel, select the marker rectangle. Next, head to the Inspector and find the Visible row. Click the arrow directly to the left of the Visible row (Figure 7-24).


Figure 7-24.  Visibility property activated through the Inspector

This opens the Patch Editor and creates a patch (Figure 7-25) representing the visibility toggle. The Visible row now turns yellow and can no longer be clicked manually, as the property is now controlled solely via the patch.


Figure 7-25.  Visibility patch in the Patch Editor

Next, head back to the Scene panel and select our target tracker, targetTracker0. Click and drag the target tracker into the Patch Editor. When the background of the Patch Editor turns light gray, release targetTracker0. This will drop in a string of patches (Figure 7-26) representing the logic powering the target tracker.

Figure 7-26.  Target tracker patches in the Patch Editor

While we could connect this string of target tracker patches directly to the marker patch, it would not give us the desired outcome. Doing so would make the marker visible only when the target is visible. Instead, we want the opposite result—the marker appearing only when the target is not visible. To achieve this, we will need to add another patch.


Double-click anywhere on the Patch Editor background to browse all available patches. In the flyout menu that appears, find the Not patch (Figure 7-27). It is located within the Logic patch category, or you can simply type “not” in the search field to instantly locate it.

Figure 7-27.  Not patch in the patch browsing menu

The Not patch outputs the reverse of the Boolean input that is fed into it. Click Add Patch to insert it into the Patch Editor. Position the Not patch after the last of the target tracker patches and before the marker patch; click and drag patches to reposition them. Connect the first output arrow of the targetTracker0 patch to the input of the Not patch, then connect the output of the Not patch to the input of the marker patch, as shown in Figure 7-28.


Figure 7-28.  Connected string of patches

This patch string now ensures that the marker will only be visible when the target is not visible. This accomplishes our intent of having a semi-transparent marker preview that disappears once the user finds the target image in real life.
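For reference, the same show/hide behavior can be expressed in a script. This is a hedged sketch rather than the patch-based method we just built: it assumes the Scripting capability is enabled and that a Boolean named targetFound, a name invented for this example, has been sent from the Patch Editor to scripting through the patch bridge.

const Scene = require('Scene');
const Patches = require('Patches');

(async function () {
  const marker = await Scene.root.findFirst('marker');

  // 'targetFound' is a hypothetical Boolean exported from the Patch Editor.
  const targetFound = await Patches.outputs.getBoolean('targetFound');

  // The scripted Not patch: hide the marker exactly while the target IS
  // visible, so it shows only while the user is still searching.
  marker.hidden = targetFound;
})();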

Effect Instructions

While the semi-transparent marker is quite helpful for signaling directions to the user, there is also a way to literally spell out instructions. Double-click anywhere on the Patch Editor background to browse all available patches. In the flyout menu that appears, find the Conditional Instruction patch. It is located within the Utility patch category, or you can simply type “conditional instruction” in the search field to instantly locate it.

This patch displays text in your effect based on a specified condition, that is, a set of criteria that must be met. In our case, the criteria for showing the instructions are the same as for showing the marker: we want to show both if the user has not yet found the target. To accomplish this, connect the output arrow of the Not patch to the first input of the Conditional Instruction patch, as shown in Figure 7-29. Now, the Not patch is connected to both the marker patch and the Conditional Instruction patch.


Figure 7-29.  Not patch connected to both the marker and Conditional Instruction patches

Next, we need to specify the text for the Conditional Instruction patch to display. Head over to the Scene panel and select the Device row. In the Inspector, look for the section entitled Instructions. Click the “+” to the right of Instructions. In the flyout menu that appears, find the Find the image instruction (Figure 7-30). It is located within the Target Tracker category, or you can simply type “find the image” in the search field to instantly locate it.

Figure 7-30.  Find the image in the Instructions menu


Click Insert to add the instruction. Upon inserting the instruction, it may appear as if nothing has changed. That is because we still need to associate this text with our instruction patch. Locate the Conditional Instruction patch in the Patch Editor, and open the drop-down entitled Option. Select Find the Image in the drop-down. Now, the text “Find the image” will appear when the effect is used on a real device, until the user finds the target.
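A script-based equivalent of this step uses the Instruction module. Again a sketch, not this project’s method: the token “find_the_image” is my assumption based on the instruction’s label, the targetFound Boolean is the same hypothetical patch bridge variable as in the earlier sketch, and the Instructions capability must be present in the project.

const Patches = require('Patches');
const Reactive = require('Reactive');
const Instruction = require('Instruction');

(async function () {
  const targetFound = await Patches.outputs.getBoolean('targetFound');

  // Show the built-in instruction whenever the target is not yet found,
  // mirroring the Not + Conditional Instruction patch pair.
  Instruction.bind(Reactive.not(targetFound), 'find_the_image');
})();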

Previewing Target Tracking Effects in the Simulator

Throughout our effect creation process, you may have noticed that the Simulator in this project appears differently than in our past projects. In the real world, your user might need to walk around to find where the target actually appears, and once the user finds the target, they might move their phone again, thus losing the target. It is difficult to preview that entire user experience in a completely virtual environment such as the Simulator.

Because of this, for target tracking effects, the Simulator is simply used to preview the way the effect would appear if it were successfully initiated—that is, the user has found the tracker image and is keeping their device still. You can still move around the effect in the Simulator using the W, A, S, and D keys. If you’re familiar with gaming on your computer, this movement might come naturally to you! Of course, to truly experience what it’s like to use this effect, we should take it through a real device test.


Testing Target Tracking Effects

As we learned in Chapter 5, it’s crucial to test your AR effects on an actual mobile device. Solely trusting the Simulator may lead to cases where the preview of an effect differs significantly from the real experience of using the effect. This is even more apparent with target tracking effects, where the Simulator is, at best, a rough approximation of what it would feel like to use a phone to find an image target.

Preparing the Test in Meta Spark Studio

On the left side of the Meta Spark Studio interface, click the Test on Device icon in the bottom half of the Toolbar. Select the Add Experience button in the flyout, which will open a properties window (Figure 7-31).

Figure 7-31.  Experience Properties window


In the properties window that appears, select + Add Experience in the bottom right. Select the first option, Sharing Effect, then click Insert in the bottom right. This will populate the window with preselected defaults for each social platform. Click Done to finalize changes.

Navigate back to Test on Device in the Toolbar. Now, in the flyout menu, click Send for the apps you’d like to test on—Instagram, Facebook, or both.

Preparing the Target

While Meta Spark Studio is sending the effect to your phone, navigate to the folder on your computer where you’ve saved the Apress GitHub files for this book. Open the heart.png file so it is visible on screen. You can do this by simply navigating to the file in your operating system’s file browser or opening the file in an Internet browser. Make sure you have a clear view of the file on your computer and that the area surrounding the pink heart is a white background, as shown in Figure 7-32. Resize the file preview window if needed so the image shows clearly.


Figure 7-32.  heart.png file preview in a Mac Finder window, visible in the real world

Alternatively, if you have a printer handy, you can print the heart.png file onto a sheet of paper. We simply need a real-world representation of the target image for the camera to point at, whether that is shown on a screen or on a printed page.

Experiencing the Target Tracking Test

When the test effect has finished sending to your phone’s Instagram or Facebook app, open the effect test through the mobile app. To test the marker and instructions, point your camera at anything other than the target image. You will see the semi-transparent marker image with “Find the image” showing as an instruction (Figure 7-33).


Figure 7-33.  Testing the target marker and instruction text on a mobile device

To test the target tracking, point your camera at the heart image (Figure 7-34), whether on your computer screen or on paper. If the image is on your computer screen, make sure the brightness is high enough for the image to be clearly visible and that the heart appears on a white background. If the image is printed, make sure you are in a well-lit area.


Figure 7-34.  Testing by pointing a mobile device at the heart.png image target on the computer screen

Voila! You should see the effect appear on screen as soon as the image is detected. Great job—you’ve completed your first target tracking AR effect! Make sure to save your project file so you don’t lose any of your work.


Summary

We’ve now learned how to blend imagery found in the real world with delightful, digitally crafted elements. This was achieved through the use of target tracking, particle systems, and 3D assets imported from the AR Library. We’ve also revisited the value of testing an effect on a device to understand the full user experience. In the next chapter, we’ll apply the techniques we’ve learned over the last few projects to create an AR-based game.


CHAPTER 8

Creating an Augmented Reality Game

Now that you’ve developed a few AR effects, you’ve accumulated a set of foundational techniques that can be combined to make even more sophisticated projects. Previously, we’ve explored using lightweight interactivity in our effects, such as an effect responding to the user opening their mouth. In this project, we will constantly use our user’s actions to power the effect, thus making a fully interactive game.

Planning Out a Game Project

As first mentioned in Chapter 4 and leveraged in subsequent projects, it’s helpful to outline the concept (Figure 8-1) of what we hope to achieve with this effect before diving into the software. This is especially helpful in a game project, which can become quite complex to develop.


Figure 8-1.  Concept sketch of a game project

Referencing the concept sketch shown here as an example, we can see that the theme of the project is to create an effect that uses the user’s head movement to control a bunny character. The user of the effect is placed in a meadow with an innocently blinking bunny. The bunny collects food to score points, which are counted on a scoreboard. The bunny also has an enemy, a wolf, that it is trying to avoid. If the bunny directly encounters the wolf, the game ends. To break things down more granularly, what we’ll need to learn to achieve this result is how to

•	Create a custom background—A meadow image will be overlaid onto the background behind our user’s face and body.

•	Add a player character—A bunny with animated blinking responds to the user’s head position by moving across the screen.

•	Display objects to collect—Carrots float down the screen for the bunny character to collect.

•	Calculate the score—Every time the bunny collects food, the score number should go up.

•	Add an enemy—A wolf floats down the screen for the bunny character to avoid.

•	Create a game over screen—Finish the game if the character touches the enemy, and display the final score.

We’ll address how to achieve each one of these effect elements in the upcoming sections.

Game Project Setup

We’ll use a fresh new file for this project. Start by opening Meta Spark Studio and select Blank Project under Create New in the Welcome Screen. Alternatively, if you already have an existing project file open, you can always select New Effect under File in the top Menu Bar, as shown in Figure 8-2.

Figure 8-2.  New Effect option in the Menu Bar

Make sure to save this new file, naming it Game_Project.arproj. Continue to save at regular intervals to ensure you do not lose any of your hard work.

To ensure your project previews reflect the figures in this project, click on the three-line icon on the bottom right of the Simulator. In the menu that appears, navigate to iOS devices and select iPhone 13 Pro.


It’s also important to ensure you are viewing the front camera while previewing the effect. Check the text at the bottom center of the Viewport, which should read “Camera: Front”. Additionally, hover over the second icon at the bottom of the Simulator to confirm the camera in use. If you are viewing the correct camera, the hover message will say “Switch to Back Camera”. If the hover message reads “Switch to Front Camera”, click the icon once to switch cameras.

The source files used in this chapter’s project are available on GitHub via this book’s product page, located at www.apress.com.

Simulator Touch Settings

To ensure your Simulator is set up properly to test interactions in this game, click on the three-line icon on the bottom right of the Simulator. In the menu that appears, make sure that the Simulate Touch option is selected (Figure 8-3).


Figure 8-3.  Simulate Touch option in the Simulator menu

With this option selected, clicking with the cursor on the Simulator video will be interpreted by the software as a user tapping on their phone screen. We will use this action in our game, so this setting allows us to quickly preview it in the Simulator.

Simulator Cameras

For this AR effect, specific movements made by the user on camera will directly affect what happens within the game. For this reason, I recommend setting the Simulator camera to a webcam. Access the Video Library by clicking the first icon underneath the Simulator, entitled “Show Video Library”. This menu can also be accessed through the Video icon in the Toolbar. The options showing at the top as gray buttons are the available Cameras on your device (Figure 8-4). Click on one of the gray buttons to display that camera in the Simulator.

Figure 8-4.  Webcam button showing in Simulator camera options

This allows you, as the game’s creator, to quickly make a movement to test the response within the game, rather than waiting for the person in the Video Library clip to make the movement during its pre-recorded loop. I, as the author, will be using my webcam for this project rather than the Video Library, as I’ve done in previous chapters.


Creating the Game Environment

Our game takes place in a sunny meadow, so let’s begin by creating this aspect of the project to set the mood. We’ll be using the background segmentation technique to establish our game environment.

With the Camera object row selected in the Scene panel, head over to the Inspector on the right side of the interface (Figure 8-5). In the Inspector, click the “+” icon on the Texture Extraction row to extract the Camera’s texture.

Figure 8-5.  Texture Extraction row in the Inspector


After clicking the “+” on the Texture Extraction row of the Camera, we now see a new Textures folder appear in our Assets panel in the bottom left of the interface, with the newly created cameraTexture0 appearing within it.

With the Camera object still selected in the Scene panel, take a look at the Inspector panel on the right. In the Inspector, click the “+” icon on the Segmentation row (Figure 8-6). Select Person in the flyout menu that appears.


Figure 8-6.  Segmentation options in the Inspector

We will now see that segmentationMaskTexture0 appears in our Assets panel. Now that we’ve created two new textures, let’s create the objects that these textures will ultimately map to.

Head to the Scene panel and click on the “+” icon in the bottom right of the Scene panel. In the flyout menu that appears, select Canvas and click the Insert button to add it to your scene. Rename this newly added canvas to background_canvas.

Chapter 8

Creating an Augmented Reality Game

With background_canvas selected, click on the “+” icon in the bottom right of the Scene panel. In the flyout menu that appears, select Rectangle and click the Insert button to add it to your scene. Rename this newly added rectangle to background.

Right-click on the background rectangle and select Duplicate. Rename the duplicated rectangle to person (Figure 8-7).

Figure 8-7.  Rectangles nested within the canvas

Next, we’ll expand the size of both our rectangles to fill the entire canvas. Select the background rectangle, then hold Shift while clicking the person rectangle to select both rectangles. With both rectangle rows in the Scene panel now appearing blue, head over to the Inspector on the right side of the interface.


Change the drop-downs in the Inspector for both Width and Height from Fixed to Relative. Next, click inside the Width input box and change the value to 100. Then, click inside the Height input box and change the value to 100.

Next, select only the person rectangle in the Scene panel. In the Inspector on the right side of the interface, click the “+” icon in the Materials row and create a new material. In the Assets panel, rename this material to person_mat.

With person_mat selected, head back over to the Inspector. First, click the drop-down arrow in the Texture row and select cameraTexture0. Next, turn on alpha by clicking the check box in the Alpha row within the Inspector. Upon clicking the check box, additional rows appear, including another texture selector. Using the drop-down, select segmentationMaskTexture0 for the alpha’s texture. Our effect user will now be able to see themselves in front of the background, as shown in Figure 8-8.


Figure 8-8.  Effect user in Simulator appearing in front of the background

To complete our game environment, select only the background rectangle in the Scene panel. In the Inspector on the right side of the interface, click the “+” icon in the Materials row and create a new material. In the Assets panel, rename this material to background_mat.

With background_mat selected, head back over to the Inspector. In the Texture row, click Choose File. Select the background_meadow.png file (Figure 8-9), downloaded from this book’s Apress GitHub page.


Figure 8-9.  Image texture applied to the background material behind the effect user

Our effect user now appears in front of a sunny meadow, completing the game environment.


Note  Items lower in the Scene hierarchy are closer to the user’s view, and items higher are further away. For example, when a user is viewing this game, the background rectangle will always appear behind the person rectangle. This concept will be very important in this project, as we will be adding multiple canvas stacks. To ensure the proper visibility of each object, we’ll need to be very intentional about where they are placed relative to each other.

Since we’ll be adding many objects during the course of this project, click the arrow next to background_canvas in the Scene panel. This will collapse the group of objects and keep our file looking cleaner. You can always click the arrow again to reveal what is nested within this object row.

Creating a Playable Character

Our next step is to add the animated character controlled and played by our AR effect user.

In the Scene panel, click the “+” icon in the bottom right. Find the Face Tracker in the menu and click Insert. Rename faceTracker0 to player_face_tracker.

Click the “+” icon in the bottom right and insert a Canvas. Rename it to core. This core canvas will be the container that holds the main objects for our gameplay. Drag and drop core on top of player_face_tracker to nest the canvas within the face tracker.

With the core canvas selected in the Scene panel, click the “+” icon in the bottom right and insert a Plane. This plane should be nested within the core canvas container. Rename the plane to character (Figure 8-10).


Figure 8-10.  Nested objects within the face tracker

This set of objects will power our player character—the rabbit who will react to the AR effect user’s movements.

Adding Animated Textures

Select the character plane and head over to the Inspector. Click the “+” icon in the Materials row, creating a new material in the Assets panel entitled material0. Rename this material to character_mat.

With character_mat selected, near the bottom of the Inspector, expand the Advanced Render Options section. Ensure that the check box for Use Depth Test is not checked. At the top of the Inspector, change the shader type to Flat.

With character_mat still selected, navigate near the top of the Inspector and click the drop-down arrow on the Texture row. Select New Animation Sequence (Figure 8-11).

Figure 8-11.  Animation sequence option in the texture drop-down menu

Selecting this option instantly replaces the Inspector contents with the animation sequence options. Click Choose File and a file picker window will appear. In the file picker, select all ten of the rabbit PNG images, named rabbit1.png, rabbit2.png, and so on. Depending on your operating system, you can select all of them by dragging the mouse across the files or by shift-clicking each file until all ten are selected. Once all ten rabbit PNGs are selected, click Open to finish creating the animation sequence.

The animating rabbit will now appear in the middle of the Simulator, rapidly blinking (Figure 8-12). If your Simulator is paused, ensure it is playing again to see this animation come to life.


Figure 8-12.  Blinking rabbit animation sequence

The rabbit is blinking quite quickly, so to slow down the pace, we can adjust the FPS (frames per second) setting. In the Inspector, change the default FPS from 25 to 6 for a more comfortable blink speed.

The size of our rabbit character is a bit large compared to the size of the screen, so let’s size it down. Head back up to the Scene panel and select the character plane. Then, in the Inspector, change the values in every one of the Scale fields—X, Y, and Z—to 800. This will proportionally reduce our character to about 50% of its original size.


Tracking User Movement

Now that we have a character on screen, let’s map its movement to that of the effect user. We’ll be using the Patch Editor extensively in this section. To show the Patch Editor on screen, navigate to View within the Menu Bar in Meta Spark Studio, then click Show Patch Editor. To browse all available patches, simply double-click anywhere in the Patch Editor grid. Add patches from the patch browser by clicking Add Patch.

Note  In complex projects, you may end up with more patches than can be seen at a glance within the default Patch Editor area. You can click and drag the top of the Patch Editor upward to expand the area. Alternatively, you can pop the Patch Editor out into its own window using the Undock icon near the top right side of the area.

With the Patch Editor open, head to the Scene panel and select player_face_tracker. Drag and drop this face tracker onto the Patch Editor grid. This will create a string of patches representing how the software finds and selects a face, following the face’s movement. Since our character’s position will be based on the rotation of the user’s head, add the Head Rotation patch and connect the first output of the player_face_tracker patch to the input of the Head Rotation patch (Figure 8-13).


Figure 8-13.  Face tracking and head rotation patch string

Next, add in a Pulse patch, and connect the second output (“Right Turn State”) of the Head Rotation patch to the Pulse patch’s input (Figure 8-14). A pulse signal will now be sent every time the user turns their head right. Since our character will start in the furthest left position it can possibly occupy, when the user turns right, our character will always move away from this starting position.

Figure 8-14.  Pulse patch connected to the Head Rotation patch’s Right Turn State

We still don’t see any movement happening on screen. This is because our Pulse patch needs to set a position transformation animation into motion. To accomplish this, add an Animation patch, and attach the first output of the Pulse (“Turned On”) to the first input of the Animation patch (“Play”) and the second output of the Pulse (“Turned Off”) to the second input of the Animation patch (“Reverse”), as shown in Figure 8-15.


Figure 8-15.  Animation patch connected to the Pulse patch’s outputs

This means that when the user turns their head right, an animation will be set in motion, while if the user is not turning their head right, that animation will be reversed. We have yet to clarify just what is being animated, though. To do this, add a Transition patch and connect the first output of the Animation patch to the first input of the Transition patch.

Our character is still positioned in the middle of the screen, so let’s place it in the location where we plan to start our game. Head back up into the Scene panel. Select character, and drag the arrows appearing in the Viewport so the character appears in the lower left side of the screen (Figure 8-16). Alternatively, in the Inspector, change the values in the Position row for both X and Y to -160.


Figure 8-16.  Rabbit character appearing in the lower left of the screen

Now let’s revisit our Transition patch in the Patch Editor. In the Start row within the patch, change the values for both X and Y to -160. This means the start of the transition will be exactly the same position where we currently see our rabbit. In the End row, input 160 as the value for the X field, and input -160 again for the Y value, as shown in Figure 8-17. This means the transition ends at an X position of 160 while the Y value stays the same. This way, our character doesn’t bounce up and down the screen but moves across it horizontally.


Figure 8-17.  Transition patch with updated values

We now need to feed these position values into a patch connected to our rabbit character. Head back to the Scene panel and select the character plane. In the Inspector, click the small arrow to the left of the Position row to drop in a patch. Drag this yellow patch to the end of our patch string and connect the Transition patch output to the Position patch input (Figure 8-18).

Figure 8-18.  Transition patch connected to the character’s Position patch

Now, our effect user moving their head will also move the animated rabbit character, as shown in Figure 8-19.


Figure 8-19.  Rabbit character moving along with head movement

In case you have paused your Simulator, ensure it is actively playing to preview this change.
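As an aside, a script could drive similar movement with a continuous mapping from head yaw to X position instead of the pulse-driven animation we just built. The listing below is an approximation under several assumptions: the Scripting and Face Tracking capabilities are enabled, the plane keeps the name character, and the 0.5-radian turning range is a guess you would tune by hand.

const Scene = require('Scene');
const FaceTracking = require('FaceTracking');

(async function () {
  const character = await Scene.root.findFirst('character');

  // Yaw of the first tracked face, in radians: negative for a turn one
  // way, positive for the other.
  const yaw = FaceTracking.face(0).cameraTransform.rotationY;

  // Map a clamped +/- 0.5 radian head turn onto the -160..160 X range we
  // used in the Transition patch, sweeping the rabbit across the screen.
  character.transform.x = yaw
    .clamp(-0.5, 0.5)
    .fromRange(-0.5, 0.5)
    .toRange(-160, 160);
})();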

Creating an Objective

Our rabbit character now appears on screen and responds to the user’s head movement. Our next step is to create an objective for our character—eating carrots!

In the Scene panel, select the core canvas. Click the “+” icon in the bottom right of the Scene panel and add a Plane. Rename this plane to objective. In the Inspector, create a New Material for the objective plane, and rename this material to objective_mat. Select objective_mat and head to the Inspector’s Texture row. Click Choose File and select the carrot.png file (Figure 8-20).

Figure 8-20.  Objective plane displaying a carrot

With objective_mat still selected, near the bottom of the Inspector, expand the Advanced Render Options section. Ensure that the check box for Use Depth Test is not checked. At the top of the Inspector, change the shader type to Flat.

The size of our carrot is a bit large compared to the size of the rabbit character, so let’s size the objective plane down a bit. Head back up to the Scene panel and select the objective plane. Then, in the Inspector, change the values in every one of the Scale fields—X, Y, and Z—to 800, as shown in Figure 8-21.

Figure 8-21.  Resized objective plane

We also want to position the objective off-screen at the start of the game, giving the user some time to react to the carrot coming into view. In the Position row for the objective plane, input 200 as the value for the X field, and input 500 for the Y value.


Initiating the Game

The movement of our rabbit character is controlled by the user's head position, so it is tied directly to the face tracker’s patches. Our objective, on the other hand, will be controlled mostly through positioning logic that we’ll set up manually, rather than basing it off the user’s movement. There is, however, one user movement that will affect our objective—a screen tap. To allow the user time to prepare for the game, we will not begin gameplay until the user taps the screen, thus initiating the game and causing the objectives to begin moving.

Open the Patch Editor and add the Screen Tap patch. Add a Switch patch, and connect the Screen Tap’s first output to the Switch’s first input (Figure 8-22).

Figure 8-22.  Switch patch string initiated by a Screen Tap

This Switch patch can now enable our objective to begin its movement. For the logic we’ll be adding in later sections, we’ll need to capture the value of this string of patches. To do that, add a Value patch and connect the Switch’s output to the Value’s input. By default, Value patches capture Numbers. Our switch can only pass through a binary value, so we’ll need to change our Value patch’s type.


Click on the Value patch to select it, and then click on the blue drop-down menu directly underneath it. Select Boolean as the new type.
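For comparison, the same tap-to-start gate could be written in a script and forwarded into the Patch Editor. A rough sketch, assuming the Touch Gestures capability is enabled and that a Boolean “From Script” variable named gameStarted, a name invented here, has been added on the patch bridge:

const TouchGestures = require('TouchGestures');
const Patches = require('Patches');
const Reactive = require('Reactive');

(async function () {
  // Plays the role of the Switch + Boolean Value patches.
  let started = false;

  TouchGestures.onTap().subscribe(() => {
    if (!started) {
      started = true;
      // Forward the flag into the Patch Editor, where it can drive
      // the Loop Animation patches added in the next section.
      Patches.inputs.setBoolean('gameStarted', Reactive.val(true));
    }
  });
})();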

Changing the Objective’s Position

Add a Loop Animation patch, and connect the Value’s output to the Loop Animation input, as shown in Figure 8-23. The Loop Animation is similar to the Animation patch we used earlier; however, the Loop Animation plays its animation repeatedly. This is helpful in our case because we want the carrot objective to fall across the screen multiple times, allowing our user many tries to catch it.

Figure 8-23.  Loop Animation patch connected to the patch string

Add a Transition patch and connect the first output of the Loop Animation patch to the first input of the Transition patch. In the Start row within the Transition patch, change the value of X to 200 and the value of Y to 500. This means the start of the transition will be exactly the same position where we currently see our carrot in the Viewport. In the End row, input -200 as the value for the X field, and input -500 for the Y value. This way, our carrot will move diagonally across the screen during the animation, starting in the top right and ending in the bottom left.

We now need to feed these position values into a patch connected to our carrot objective. Head back to the Scene panel and select the objective plane. In the Inspector, click the small arrow to the left of the Position row to drop in a patch. Drag this yellow patch to the end of our patch string, and connect the Transition patch output to the objective’s Position patch input (Figure 8-24).

Figure 8-24.  Transition patch connected to the objective’s Position patch

Now, a click on the Simulator screen will initiate the game, causing the objective to animate down and diagonally across the screen (Figure 8-25). A user trying this effect on a mobile device would likewise need to tap their device screen to begin the effect.


Figure 8-25.  Carrot objective moving downward and diagonally

In case you have paused your Simulator, ensure it is actively playing to preview this change, and that it is set to Simulate Touch. Additionally, make sure to periodically Restart the Simulator using the refresh-style icon in the left-side Toolbar.
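The Loop Animation and Transition pair maps closely onto the scripting Animation module. In case you ever need similar motion in code, here is a minimal sketch that assumes the objective plane keeps its name and that canvas positions can be assigned directly to transform.x and transform.y, as they are fed to the Position patch:

const Scene = require('Scene');
const Animation = require('Animation');

(async function () {
  const objective = await Scene.root.findFirst('objective');

  // A one-second cycle that repeats forever, like the Loop Animation patch.
  const driver = Animation.timeDriver({
    durationMilliseconds: 1000,
    loopCount: Infinity,
  });

  // Linear samplers reproduce the Transition patch's Start and End values.
  objective.transform.x = Animation.animate(
    driver, Animation.samplers.linear(200, -200));
  objective.transform.y = Animation.animate(
    driver, Animation.samplers.linear(500, -500));

  driver.start();
})();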

Creating an Enemy Character

The last of our game’s core objects is our enemy character: the wolf.


In the Scene panel, select the core canvas. Click the “+” icon in the bottom right of the Scene panel and add a Plane. Rename this plane to enemy (Figure 8-26).

Figure 8-26.  Scene panel containing the enemy object

In the Inspector, create a New Material for the enemy plane, and rename this material to enemy_mat. Select enemy_mat and head to the Inspector’s Texture row. Click Choose File and select the wolf.png file (Figure 8-27).


Figure 8-27.  Enemy plane displaying a wolf

With enemy_mat still selected, near the bottom of the Inspector, expand the Advanced Render Options section. Ensure that the check box for Use Depth Test is not checked. At the top of the Inspector, change the shader type to Flat.

The size of our wolf is a bit large compared to the size of the rabbit character, so let’s size the enemy plane down a bit. Head back up to the Scene panel and select the enemy plane. Then, in the Inspector, change the values in every one of the Scale fields—X, Y, and Z—to 800.


Changing the Enemy’s Position

Open the Patch Editor and add a Loop Animation patch. Connect the output of the Value patch we added earlier to this new Loop Animation input. Add a new Transition patch and connect the new Loop Animation output to this new Transition patch input. It may be helpful to rearrange the patches so they neatly stack on top of one another, as shown in Figure 8-28.

Figure 8-28.  Newly added patches highlighted in the Patch Editor

In the Start row within the new Transition patch, change the value of X to 0 and the value of Y to 800. In the End row, input 200 as the value for the X field, and input -800 for the Y value.

Head back to the Scene panel and select the enemy plane. In the Inspector, click the small arrow to the left of the Position row to drop in a patch. Drag this yellow patch to the end of our latest patch string, and connect the latest Transition patch output to the enemy’s Position patch input (Figure 8-29).


Figure 8-29.  Patches changing the enemy’s position highlighted in the Patch Editor

Restart the Simulator using the refresh-style icon in the left-side Toolbar, then press play in the Toolbar and tap the Simulator to begin the effect. Upon previewing the effect in the Simulator, you may have noticed that both the carrot and the wolf enter the screen at the same time. To better separate the objects, we want the enemy to enter the Viewport a bit more slowly than the objective. To do this, change the Duration field in the latest Loop Animation patch to 3. Leave the first Loop Animation patch’s duration as 1, as shown in Figure 8-30.


Figure 8-30.  Adjusted durations for Loop Animations

Since we’ll be adding more objects in the next section, click the arrow next to player_face_tracker in the Scene panel. This will collapse the group of objects and keep our file looking cleaner. You can always click the arrow again to reveal what is nested within this object row.


Keeping Score

An important aspect of games is measuring how well the player has accomplished the objective of the game. To do this, most games display a score. For us to add this feature to our game, we’ll need to first create an interface to display the score and then tell our interface how to measure the player’s accomplishments.

Creating User Interface Elements

In the Scene panel, click the “+” icon in the bottom right of the Scene panel and add a Canvas. Rename this canvas to scoreboard. With the scoreboard canvas selected, add a Rectangle. Rename this rectangle to scoreboard_container.

With scoreboard_container selected, head over to the Inspector. Update the Width value for this rectangle to 200 and the Height value to 90. Set the Position X value to 100 and the Position Y value to 20 to update the size and position (Figure 8-31).


Figure 8-31.  Simulator showing rectangle with updated size and position values

While still in the Inspector, create a new material for this rectangle. Rename the material to scoreboard_container_mat. Select scoreboard_container_mat in the Assets panel. Head to the Inspector and click on the gray rectangle next to Color. In the color picker that appears, select white. Next, in the Render Options section of the Inspector, set the Opacity to 50 (Figure 8-32).


Figure 8-32.  Simulator showing updated material settings for the scoreboard_container_mat

Near the bottom of the Inspector, expand the Advanced Render Options section. Ensure that the check box for Use Depth Test is not checked. The Simulator will now display a semi-transparent rectangle near the top of the screen. This will serve as the background container for our score text, making the letters and numbers easier to read.


Adding Text

In the Scene panel, select scoreboard, click the “+” icon in the bottom right of the Scene panel, and add 2D Text. Rename this text to description.

With description selected in the Scene panel, head to the Inspector. Update the Width value for this text to 200. Change the Position X value to 100. In the Typography section, next to Text, update the placeholder text to “Carrots eaten:” and select Facebook Sans Bold in the Font drop-down menu. In the Font Size field, change the value to 30, as shown in Figure 8-33.


Figure 8-33.  Inspector showing updated text settings for the description


We have one last user interface element to add—the score number itself. In the Scene panel, select scoreboard, click the “+” icon in the bottom right of the Scene panel, and add 2D Text. Rename this text to score.

With score selected, head over to the Inspector. Set the Position X value to 155 and the Position Y value to 30. Change the font to Facebook Hand Bold in the Font drop-down menu. Finally, in the Typography section, next to Text, update the placeholder text to 0 (Figure 8-34).

Figure 8-34.  Simulator showing the user interface for our game


Creating a Counter

We’ve now set up our interface elements that will display the score counter. To set up the logic for measuring the score, we’ll need to open the Patch Editor. The way we will measure whether the player has accomplished an objective is by checking if the playable character’s position overlaps with the objective’s position—the rabbit overlaps the carrot.

Unpacking Position Values

Let’s look at the very first string of patches we created, which determines the position of our character. This patch string starts with the green Face Finder patch and ends with the Position patch for the character’s 3D position. Add an Unpack patch underneath this yellow Position patch. Unpack patches extract the X, Y, and Z numbers from the patch feeding into them. We only need the X and Y values for our character, so click on the Unpack patch to select it, and then click on the blue drop-down menu directly underneath it. Select Vector 2 as the new type.

Copy and paste this Unpack patch two more times, and place each copy underneath one of the other yellow Position patches in the Patch Editor. Connect the output of each Transition patch (located directly to the left of each Unpack patch) to the input of its Unpack patch (Figure 8-35).


Figure 8-35.  Unpack patches connected to each of their associated Transition patches

Calculating Success

Next, we’ll need to check whether the X and Y values of the character’s position equal those of the objective’s position. To do this, add two Equals patches. Connect the X output of the character’s Unpack patch to the first input of the first Equals patch. Connect the X output of the objective’s Unpack patch to the second input of the first Equals patch.


Repeat this for the Y values. Connect the Y output of the character’s Unpack patch to the first input of the second Equals patch. Connect the Y output of the objective’s Unpack patch to the second input of the second Equals patch.

By default, the Tolerance setting for the Equals patch is 0, meaning that if the values aren’t exactly the same, they do not count as equal. In our case, it may be difficult for the player to catch the objective when it overlaps exactly with the character, so let’s change the Tolerance to 50 to give a bit of leeway (Figure 8-36).

Figure 8-36.  Equals patches connected to Unpack patches

We need both the X and Y values to be “equal” (within our tolerance of 50) to count as a successful score point. To do this, add an And patch, and connect the output of the first Equals patch to the first And input and the output of the second Equals patch to the second And input. Double-click the And patch title and rename it to Objective Achieved.
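For comparison, here is how the same tolerance check and And logic might look as a script, sketched under the same assumptions as before; the tolerance constant is a placeholder you would tune to whatever units your positions use.

import Scene from 'Scene';

(async function () {
  const character = (await Scene.root.findFirst('character'))!;
  const objective = (await Scene.root.findFirst('objective'))!;

  // Placeholder tolerance; match it to your coordinate space
  const TOLERANCE = 50;

  // |x1 - x2| <= tolerance: the equivalent of an Equals patch
  // with a nonzero Tolerance value
  const xClose = character.transform.x.sub(objective.transform.x).abs().le(TOLERANCE);
  const yClose = character.transform.y.sub(objective.transform.y).abs().le(TOLERANCE);

  // The And patch equivalent: true only when both axes are close
  const objectiveAchieved = xClose.and(yClose);
})();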

Every time the Objective Achieved patch is true, we want to send a signal to our scoreboard text to increase the number by 1. To capture this signal, add a Pulse patch and connect the output of the Objective Achieved patch to the input of the Pulse (Figure 8-37).

Figure 8-37.  Objective Achieved patch connected to the Pulse patch

There is a patch specifically designed to count pulses like the one we’ve just added. Add a Counter patch and connect the first output of the Pulse patch to the Counter’s first input. Change the Maximum Count value on the Counter to 100. Add a Value patch and connect the Counter’s output to the Value patch’s input. Next, since we want this value to be represented as text, add a To String patch and connect the Value patch’s output to its input. Finally, we need a patch representing where this string should display. Head to your Scene panel and select the score text object, then head to the Inspector and click the small arrow to the left of the Text row. Head back to the Patch Editor and connect the output of the To String patch to the input of the score Text patch, as shown in Figure 8-38.
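The same counting-and-display chain can be sketched in script form. The sketch below assumes the Objective Achieved boolean has been exported to scripting as a patch output named objectiveAchieved (patch-to-script bridging is configured on the script asset in the Inspector, which this chapter doesn’t cover) and that the 2D Text object keeps the name score.

import Scene from 'Scene';
import Patches from 'Patches';

(async function () {
  // 'score' is the 2D Text object created earlier in this chapter
  const scoreText = (await Scene.root.findFirst('score')) as any;
  const achieved = await Patches.outputs.getBoolean('objectiveAchieved');

  let count = 0; // plays the role of the Counter patch's stored value

  // monitor() fires whenever the boolean changes; reacting only to
  // true values mimics the Pulse patch's "Turned On" output
  achieved.monitor().subscribe((event: { newValue: boolean }) => {
    if (event.newValue) {
      count += 1;
      scoreText.text = count.toString(); // To String plus Text, in one line
    }
  });
})();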

Figure 8-38.  Patch string enabling counting and display of the score

Restart the Simulator by clicking the Restart button (the refresh icon) in the left-side Toolbar, then press Play in the Toolbar and tap the Simulator to begin the effect.

Note  When testing a game, you may choose to lower the difficulty so it’s faster to check if the functionality is working as intended. In this game, you can change the Tolerance level on the Equals patches to a higher number, such as 75, for easier testing. Remember to change this number back before publishing the effect.

Now, when the rabbit character moves and catches a carrot, the scoreboard displays the number of carrots eaten.

Ending the Game

Our game is now set up to count the number of times a user has completed the objective. We now need to create a challenge for our player by introducing our enemy character’s ability to end the game.

Creating a Game Over Screen

First, let’s head to our Scene panel. Since we’ll be adding more objects in this section, click the arrow next to scoreboard in the Scene panel. This will collapse the group of objects and keep our file looking cleaner. You can always click the arrow again to reveal what is nested within this object row. Click the “+” icon in the bottom right of the Scene panel and insert a Canvas. Rename it to game_over. We want the scoreboard to still show when the game_over screen is visible, so drag the game_over canvas so it appears above the scoreboard and below the player_face_tracker in the Scene panel. With the game_over canvas still selected in the Scene panel, click the “+” icon in the bottom right and insert a Rectangle. This rectangle should be nested within the game_over canvas container. Rename the rectangle to game_over_art (Figure 8-39).

Figure 8-39.  Game over objects in the Scene panel

Select the game_over_art rectangle and head to the Inspector. Change the drop-down menus for both Width and Height from Fixed to Relative, then set both the Width and Height values to 100. With the game_over_art rectangle still selected in the Scene panel, create a new material for it in the Inspector. Rename this material to game_over_art_mat and choose the game_over.png art as this material’s texture (Figure 8-40). At the top of the Inspector, change the shader type for game_over_art_mat to Flat. Near the bottom of the Inspector, expand the Advanced Render Options section and ensure that the Use Depth Test check box is not checked.

Figure 8-40.  Game over screen in the Simulator

This will be what the user sees when the character’s position overlaps with the enemy’s position.

Calculating Game’s End

In the Patch Editor, add two more Equals patches, each with a Tolerance of 50. Name the first Equals patch Enemy X and the second Enemy Y. Similar to what we did for the objective, connect the X output of the character’s Unpack patch to the first input of the Enemy X patch, and connect the X output of the enemy’s Unpack patch to the second input of the Enemy X patch. Repeat this for the Y values: connect the Y output of the character’s Unpack patch to the first input of the Enemy Y patch, and connect the Y output of the enemy’s Unpack patch to the second input of the Enemy Y patch, as shown in Figure 8-41.

Figure 8-41.  Highlighted enemy Equals patches connected to Unpack patches

We need both the X and Y values to be “equal” (again within a tolerance of 50) for the enemy to end the game. To do this, add an And patch, and connect the output of the Enemy X patch to the first And input and the output of the Enemy Y patch to the second And input. Double-click the And patch title and rename it to Game Ends.

When the Game Ends patch is true, we want to send a signal telling us it’s time to display our game over screen. To capture this signal, add a Pulse patch and connect the output of the Game Ends patch to the input of the Pulse. Add a Switch patch and connect the first output of the Pulse patch to the second input of the Switch. Head to your Scene panel and select the game_over_art object. Head to the Inspector and click the small arrow to the left of the Visible row to drop in this rectangle’s visibility patch. Go back to the Patch Editor and connect the output of the Switch patch to the input of the game_over_art’s Visible patch (Figure 8-42).

Figure 8-42.  Patch string enabling the game over visuals to be displayed

Now, when the main character encounters the enemy, the game over artwork is displayed. However, you might notice that we cannot exit the game at this point, so the user is stuck on this screen. Additionally, the score counter still increases if the character behind the game over artwork is eating carrots! Let’s fix these two issues to complete our AR game.
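As a script-side sketch of the same show-the-screen behavior (again assuming a boolean exported to scripting as a patch output, here named gameEnds, plus the game_over_art rectangle from this section):

import Scene from 'Scene';
import Patches from 'Patches';

(async function () {
  const gameOverArt = (await Scene.root.findFirst('game_over_art'))!;
  const gameEnds = await Patches.outputs.getBoolean('gameEnds');

  gameOverArt.hidden = true; // start hidden, like the Switch's off state

  // Latch the artwork on when the enemy catches the character,
  // mirroring the Pulse driving the Switch's "Turn On" input
  gameEnds.monitor().subscribe((event: { newValue: boolean }) => {
    if (event.newValue) {
      gameOverArt.hidden = false;
    }
  });
})();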

Restarting the Game

In the Patch Editor, find the patches attached to the initial Screen Tap action (Figure 8-43). If it’s difficult to locate the patch, you can click the Search Patch link in the bottom left of the Patch Editor and type in “screen tap”.

Figure 8-43.  Searching for the Screen Tap patch

To the right of the Screen Tap patch, you will see the first Value patch that we created. We need subsequent screen taps to continue gameplay. To do this, add another Value patch and rename it Reset Game. Change the type of the patch to Boolean. Connect the first Value patch’s output to the Reset Game patch’s input. Add a Pulse patch and rename it Start Play. Connect the output of Reset Game to the input of Start Play, as shown in Figure 8-44.

Figure 8-44.  Reset Game and Start Play patches highlighted in the Patch Editor

Next, find the yellow game_over_art Visible patch. There will be a Switch patch directly connected to the left of it. Rename this Switch patch to End State. Connect the second output of the Start Play patch to the third input (titled “Turn Off”) of the End State Switch. Now, when testing the game in the Simulator, you will be able to exit the game over screen by tapping. Tapping again restarts the gameplay; however, we still need to fix our score counter.
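In script form, the reset could be a simple tap handler, as sketched below. This assumes the project’s Touch Gestures capability is enabled and reuses the game_over_art name.

import Scene from 'Scene';
import TouchGestures from 'TouchGestures';

(async function () {
  const gameOverArt = (await Scene.root.findFirst('game_over_art'))!;

  // Each screen tap hides the game over screen again, standing in
  // for Start Play driving the End State Switch's "Turn Off" input
  TouchGestures.onTap().subscribe(() => {
    gameOverArt.hidden = true;
  });
})();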

Accurate Scoring

There are two more tasks left to finish when it comes to our scoring issue. First, we need the score to reset to zero when the user starts a new game. Second, we need to stop counting score increases while the game over screen is present. To address the first task, connect the first output of the Start Play patch to the third input of the Counter patch. This resets the score to 0 when a new game starts (Figure 8-45).

Figure 8-45.  Score counter showing 0 after a new game is initiated

Now, after exiting the game over screen by tapping once, tapping again will restart the gameplay and reset the score counter to 0. To ensure the score does not increase during game over, find the End State patch. Add a Not patch and connect the End State patch’s output to the Not patch’s input. Add an And patch and rename it End Check. Connect the output of the Not patch to the first input of the End Check patch (Figure 8-46).

Figure 8-46.  End State, Not, and End Check patch string

Connect the output of the Reset Game patch to the second input of the End Check patch. Make sure that the Reset Game patch is also still connected to the Start Play patch. We’ll need to adjust a patch string we previously created to insert some new logic. Find the Objective Achieved patch in the Patch Editor. Click the line connecting the Objective Achieved patch to the Pulse patch, and delete it by pressing Delete on your keyboard (Figure 8-47).

Figure 8-47.  Connected line removed between the Objective Achieved and Pulse patches

Next, add an And patch, then connect the output of Objective Achieved to the first input of the And patch. Connect the output of the new And patch to the Pulse patch’s input, which is now missing a connection. Finally, connect the output of the End Check patch to the second input of this new And patch. Now, on the game over screen, the score counter is paused and will display the user’s correct final score.
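Pulling the gating together in script form gives a compact picture of the final scoring logic. This sketch assumes the two booleans are exported to scripting as patch outputs named objectiveAchieved and endState, and that the 2D Text object is still named score.

import Scene from 'Scene';
import Patches from 'Patches';

(async function () {
  const scoreText = (await Scene.root.findFirst('score')) as any;
  const achieved = await Patches.outputs.getBoolean('objectiveAchieved');
  const endState = await Patches.outputs.getBoolean('endState');

  let count = 0;

  // Not + And: a carrot only counts while the game over screen is off
  const countedHit = achieved.and(endState.not());

  countedHit.monitor().subscribe((event: { newValue: boolean }) => {
    if (event.newValue) {
      count += 1;
      scoreText.text = count.toString();
    }
  });
})();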

Adding Multiple Instructions

As the game’s designers, we know exactly how this game works. However, our users might not instantly understand how to interact with it. For an improved experience, adding instructions can bring clarity to our users. Head to the Patch Editor and add a Runtime patch, which tracks the time since the effect started. Add a Less Than patch, and connect the Runtime output to the first input of the Less Than patch. Change the second input of the Less Than patch to 3, signifying three seconds. Next, add a Conditional Instruction patch and connect the Less Than output to the first Conditional Instruction input (Figure 8-48).

Figure 8-48.  Patch string for conditional instructions

Head over to the Scene panel and select the Device row. In the Inspector, look for the section titled Instructions and click the “+” to its right. In the flyout menu that appears, find the Tap to Play instruction (you can simply type “Tap to Play” in the search field to locate it instantly) and click Insert to add it. Repeat this to insert the Turn Your Head instruction as well. Back in the Patch Editor, find the Conditional Instruction patch, and in its Option drop-down, select Tap to Play.

Next, find the Switch patch directly connected to the Screen Tap patch. Add a Pulse patch and connect the output of the Switch patch to this newly added Pulse patch; do not remove the previous connections between the Switch patch and other patches. Add a Timed Instruction patch and connect the newly added Pulse patch to its first input. In the Timed Instruction patch’s Option drop-down, select Turn Your Head, and change the duration field to 3, as shown in Figure 8-49.
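Scripting offers a similar mechanism through the Instruction module, sketched below. Instruction tokens are predefined by the platform, and 'tap_to_play' here is an assumed token name; check the Device’s Instructions list or the Meta Spark documentation for the exact token before relying on it.

import Time from 'Time';
import Instruction from 'Instruction';

// Show the instruction only during the first three seconds of the
// effect; Time.ms.lt(3000) plays the role of Runtime + Less Than.
// 'tap_to_play' is an assumed token name; verify it in your project.
Instruction.bind(Time.ms.lt(3000), 'tap_to_play');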

Figure 8-49.  Patch string for the Turn Your Head instruction

It will now be clearer to the user that they need to tap the screen to begin the game and turn their head to interact with the main character. Huge congratulations: you’ve finished developing your very first AR game! This was no easy feat, and it leveraged most of the skills you’ve learned so far in this book plus a few new techniques unique to this chapter. This game project file can now serve as a template for you to build on in the future. Feel free to customize it by swapping assets or remixing decisions made throughout the development process.

Summary

Combining the techniques we’ve learned across our past projects, we’ve created our first AR game. This includes using a face tracker to control our main character, using animations to bring an additional dimension of movement to characters and objects, and using patch logic to keep score. In the next chapter, we’ll learn how to prepare your Meta Spark Studio projects for publishing and review the steps of the submission process.


CHAPTER 9

Publishing Effects

We’ve previously learned how to test effects on a device and share test links and QR codes. However, effect tests are only available for up to 50 users every 24 hours, and test links must be manually re-shared every time you change an aspect of the effect. The best way to release your AR effect to a wider audience is to officially publish it. Published effects are discoverable in the effect galleries on Meta’s social platforms, such as the Facebook camera effects section and Instagram’s Stories and Reels creation flows, and they can be enjoyed by an unlimited number of users.

Making sure your AR effect works both in the Meta Spark Studio software and in an on-device test is the first step to ensuring it is ready to publish. However, additional criteria will need to be met before an effect is ready for wide release.

Preparation Before Publishing

Before submitting an effect, make sure to prepare the following elements to ensure a smooth publishing process:

•  Effect name

•  Effect demo video

•  Effect icon

•  Finished project file

Naming Your Effect

Every published effect is required to have a name. Effect names can be up to 20 characters long and may include spaces. The name will appear on content using the AR effect, such as Facebook or Instagram Reels and Stories, and in the effect browsing galleries.

Creating a Demo Video

Another important element to prepare before submitting an effect for publishing is a demo video. The demo video is a short clip that shows your potential users what to expect if they were to try your effect. There are multiple ways of creating a demo video. The most natural method may be to send the effect test to your mobile device and record while testing your effect by pressing and holding the built-in record button, as shown in Figure 9-1.

Figure 9-1.  Record button in the bottom middle of the screen when testing an effect in Instagram

Another way to create a demo video is within the publishing interface itself. On the far left side of the Meta Spark Studio interface, find the Publish icon in the Toolbar; it resembles a rectangle with an upward-facing arrow. In the window that appears, click the Record Video icon. This opens the built-in video recording interface (Figure 9-2).

Figure 9-2.  Recording option within the Meta Spark Studio publishing interface

Regardless of which method you use to create your demo video, ensure that what you capture shows the full capability of your effect.

Creating an Icon

Published effects also need an icon representing the essence of the effect. The icon is generally shown in a small container and appears in various places around the effect galleries. The icon can be created from a still from the demo video; however, depending on the type of effect you’ve created, it might be difficult to make out exactly what is happening at small sizes. In those cases, it’s recommended to use a custom icon image instead, as shown in Figure 9-3.

Figure 9-3.  Icon from a demo video (left) and icon from a custom image (right)

Custom icon artwork should be square and at least 200 pixels on each side. Icons should not have any transparency.

Project File Size

One aspect checked during the publishing process is the overall size of your project. Ensure the Assets panel in your project contains only files that are actively being used by the effect. Delete any unused files, such as textures you may have imported earlier but are no longer using. Test your effect on a device to ensure it loads quickly, without excessive loading times.

Project Capabilities

Another aspect checked during the publishing process is the technical capabilities of your project. This means ensuring that the Meta Spark Studio features you have used are in fact compatible with the platforms you’ve selected to publish to. Some platforms may not be able to run every type of feature or function you may have used when crafting a sophisticated effect, and the exact list of supported features will vary over time as platforms evolve and become more technically advanced. This is an important reason why testing is so key to the project development cycle: if there are elements of your effect that cannot run on a certain platform, you can catch that earlier in the development process by testing on a device.

Policies, Standards, and Guidelines

Every aspect of the effect submission, from the effect project file itself to the name, icon, video, and additional information fields, must not include anything inappropriate as defined by the Meta Spark Policies and the Community Standards and Guidelines for each platform you choose to publish on. Links to Meta Spark’s Terms and Policies can be found at the bottom of the publishing window, shown in Figure 9-4.

Figure 9-4.  Default publishing window

The Publishing Process

The publishing process is initiated within the Meta Spark Studio interface itself. To publish an effect, open the finished project file in Meta Spark Studio. On the far left side of the interface, find the Publish icon in the Toolbar; it resembles a rectangle with an upward-facing arrow. Clicking the icon brings up the publishing window (Figure 9-4).

At this step, basic requirements are checked, and you may record your video if you don’t yet have one prepared. Click the Upload button when ready to proceed. You’ll be taken to your browser next, where the remaining information required for publishing will be shown in a form (Figure 9-5).

Figure 9-5.  Form for publishing, appearing in a browser window

The options and fields available for an effect will differ slightly based on the platforms you have accounts for and the capabilities in your effect. Fill out the remaining form fields and click Submit when you are done. This sends your submission to be reviewed by Meta Spark Studio reviewers. You will generally be notified of the reviewer’s decision within ten days. If your effect is accepted, congratulations! If your effect is not accepted, you’ll be informed of the issues to resolve.

Summary

In this chapter, we learned the basics of publishing AR effects, including how to create optimal assets for your effects. Through publishing, you can reach a wider audience and share your creations across multiple platforms. In the upcoming final chapter, we’ll wrap up with inspirational resources and suggestions for the future of your Meta Spark Studio augmented reality journey.


CHAPTER 10

Conclusion

While the projects in this book have now concluded, your AR journey doesn’t have to end here! There are many ways for you to continue expanding your knowledge and experience in this field.

Finding Inspiration

If you’re looking to begin your next augmented reality project, there’s no shortage of concepts to try and endless possibilities for what you can create. That said, it can sometimes be tough to commit to a concrete idea for your next project. If you’re feeling stuck, here are a few places you can go for inspiration.

Learn from Seasoned Developers

A great place to start is becoming familiar with other augmented reality creations that are already on the market. By seeing what others have done, you can get more ideas of what’s possible with this technology.

Meta Spark Instagram: www.instagram.com/metaspark
This official Instagram account regularly showcases top AR creators from around the world, along with pro tips and tricks so you can learn from the best.

Inspiration from Various Platforms

While this book’s focus is Meta Spark Studio, it can be incredibly inspiring to see what else is being developed across multiple AR software and hardware platforms, not just the ones you are currently using. Some examples to check out include the following:

Augmented Reality from Apple: www.apple.com/augmented-reality/
This official website from Apple showcases AR apps created specifically for their proprietary operating systems: iOS and iPadOS.

Augmented reality from Google: https://arvr.google.com/ar/
Google’s official website for AR highlights apps that leverage their Search platform, AI capabilities, and even Google Maps.

Attend Events

Participating in events, such as conferences or hackathons, can be a rewarding way to learn more about the future of AR and meet other people who are interested in this space. Official hackathons generally have a particular theme or challenge to tackle, which is a helpful way to give yourself focus for a new project. Conferences frequently feature industry experts who share helpful techniques, and companies tend to announce new software releases or upcoming features at these events, giving attendees a sneak peek at the latest technologies they can try in the software. Additionally, many Meta Spark-related events are free to attend and are hybrid or fully virtual, enabling attendance from across the globe.

Meta Spark Facebook: www.facebook.com/MetaSpark
The official page, which showcases hackathons and outreach activities directly hosted by Meta.

Meta Connect: https://metaconnect.com/
The official conference from Meta, which frequently includes the unveiling of new features for Meta Spark Studio.

Leverage Your Niche

Think about your own interests and background outside of AR: what are you passionate about? How could you use augmented reality to enhance your favorite activities? Is there a particular opportunity in the niche you are interested in? Use this specific knowledge to brainstorm how AR might bring a new dimension to your hobbies. This could be something as simple as an AR effect referencing an in-joke that those in your interest group might recognize, or a new AR tool that makes a particular activity more fun or easier to do.

Conduct User Research

User research is the study of your prospective or current users’ behaviors, motivations, and interests. This type of research is commonly done through interviews, surveys, and feedback on concept sketches or prototypes. It can be incredibly valuable in any product creation process, so consider it for your future AR projects.

The user research process is a great way to come up with product ideas that you may otherwise overlook. By talking to people, especially those who represent a particular audience you may be interested in targeting, you can better understand their needs and generate new ideas that are optimally tailored to their priorities. In addition, this research can help you validate existing ideas and make sure they are actually desirable to your target user. Show research participants your conceptual AR effect designs, or even an early interactive prototype, to get preliminary feedback on what’s working well and what could be improved.

Some questions to consider when sharing an AR idea with research participants:

•  What appeals and doesn’t appeal to you visually in this AR effect?

•  Is there anything about this concept that was unexpected for you?

•  Why do you think someone would or would not use this?

•  What are three words you’d use to describe this AR effect?

•  Does this remind you of anything else you’ve used? If so, what was it, and which aspects of this AR effect reminded you of it?

•  If you could change one thing about this concept, what would it be and why?

This can be especially helpful when developing an ambitious new idea, as there is always a risk of investing significant time and resources in any project. By doing user research upfront, you may increase the chances of success for your AR effect idea.

Giving Back to the Community

Now that you have gained some experience in creating augmented reality effects, you have a valuable skill that can help other people who are interested in exploring the AR world. You can share your newfound knowledge in a number of ways, such as through direct mentorship, writing tutorial articles, hosting workshops, or helping others out in forums and online groups. Another great way to contribute to the AR community is by creating project templates for others to reuse and customize.

In addition to being a personally gratifying experience, giving back to others can also help to build your professional network and broaden your perspectives by hearing the creative ideas of others interested in the field.

Final Words

We’ve only just scratched the surface of the endless innovative opportunities that augmented reality can enable. I hope this book has inspired you to experiment further with this exciting technology. Thanks so much for joining me on this AR journey. I can’t wait to see what you create next!


Index

A

Add experience
  ads
    Instagram, 77
  sharing effect
    Facebook, 77
    reels, 77
    stories, 77
  video calling
    Instagram, 78
    Messenger, 78
Add Experience button, 77, 133
Advanced render options
  use depth test, 64, 153, 162, 169, 175, 185
  write to depth buffer, 64
Alpha channel, 47
And patch, 181, 187, 191, 192
Android, 7, 76, 79, 81
Animation, 25, 33
Animation patch, 33, 157, 158
Animation sequence
  Inspector, 154
Apple, 7, 10, 206
AR game, 188, 194
AR Library, 20
  ambient sounds, 23
  blocks, 26
  Color LUTs, 28
  music, 22
  patch effect, 24
  script packages, 27
  Sketchfab, 22
  sound effects, 23
  textures, 25
  3D object, 22
Assets panel, 41, 42, 44–47, 51, 59–61, 64, 67, 74
  audio, 12
  3D model, 12
  fonts, 12
  GIF, 12
  illustration, 12
  photo, 12
Audio patches, 33
Augmented reality (AR), 1
  from Apple, 206
  creation, 6
  creation software, 8
  differentiator, 3
  digital content, 1
  Effect House, 8
  experience, 4
  eyewear and headsets, 2
  from Google, 206
  industries, 1
  metaverse, 3
  technology, 2
  and VR, 3
  works, 6

B

background_canvas, 147, 148, 152
background_mat, 59, 60, 150
Background rectangle, 58–60, 148, 150, 151
Background segmentation
  texture extraction
    hair, 42
    person, 42–44, 47
    skin, 42
Body landmark patches, 34
Boolean, 165, 189

C

Camera object, 40, 41, 43, 145, 146
cameraTexture0, 41, 46, 149
Canvas, 147, 148, 151, 152, 161, 168, 173, 184
Character plane, 153, 155, 160
character_mat, 153, 154
Color grading
  color LUTs
    apply to camera, 74
    finding, 73
Color LUTs, 92, 93
Community, 208, 209
Concept sketch, 37, 38, 104, 140
Conditional Instruction input, 130–132, 192, 193
Conditional Instruction patch, 130–132, 192, 193
Counter patch, 182, 190
Custom texture, 64, 65, 96–98

D

Demo video, 196–199
Description, 176
Device patches, 34

E

Effect file, 39, 74
Effect, 37
End Check patch, 191, 192
End State patch, 191
End State Switch, 190
enemy_mat, 168, 169
Enemy plane, 168–170
Enemy X patch, 186
Equals patches, 180, 181, 183, 186, 187
Extended reality (XR), 3

F

Facebook, 6, 8, 10, 76, 77, 81–84, 135, 178, 195, 196
Face landmarks patches, 34
Face mesh, 63–66, 94, 95, 97, 98
  3D, 95
Face reference templates
  topology, 96
  face_template.png file, 96
Face Tracker, 62, 63, 66, 95, 152, 153, 156, 164
First Project.arproj, 39, 75, 87
Flat targets, 103
Frames per second (FPS), 155

G

Game Ends patch, 187
Game environment, 145, 150, 151
game_over_art object, 188
game_over_art rectangle, 184, 185
game_over_art Visible patch, 188, 190
Game over screen
  calculation, 186–188
  game_over canvas, 184
  game_over_art rectangle, 185
  game_over_art_mat, 185
  player_face_tracker, 184
  scene panel, 183, 184
Game project
  accurate scoring, 190–192
  breakdown, 140
  concept sketch, 140
  enemy character, 167–172
  environment, 145–152
  game over, 183–188
  multiple instructions, 192–194
  objective, 161–167
  playable character, 152–161
  restarting, 188–190
  score
    adding text, 176, 178
    counter creation, 179–183
    user interface elements, 173–175
  score points, 140
  setup, 141–144
  template, 194
Game_Project.arproj, 141
Get Started button, 10
GitHub, 20, 39, 60, 88, 90, 95, 105, 107, 134, 142

H

Hardware
  camera, 4
  webcam, 4
  computer, 4
  display monitor, 4
  touchscreen, 4
  HoloLens, 5
  laptop, 4
  Magic Leap, 5
  Meta Quest Pro, 5
  smartphone, 4
  ThinkReality, 5
Head Rotation patch, 156, 157
Headset, 2–5

I, J, K

Icon, 198, 199, 201
Image resources, 99
Import Free button, 22, 24, 26, 28
Import texture, 60, 61
Inputs, 6, 49, 118, 122, 157, 165
Inspector, 14, 41, 43, 44, 46, 47, 54, 57, 59–61, 64, 65, 67, 69–71, 145, 158, 162, 163, 168, 185
Inspiration
  Apple, 206
  events
    Meta Connect, 206
    Meta Spark Facebook, 206
  Meta Spark Instagram, 205
Instagram, 6, 8, 10, 76–81, 134, 135
Interaction patches, 34, 72
Interactivity
  adding and managing planes, 66
  positioning planes, 67–70
iOS devices, 7, 39, 141
iPhone 13 Pro, 39, 141

L

Less Than patch, 192
Logic patches, 35, 129
Loop Animation patch, 165, 170–172

M

Materials, 51, 53, 107, 114, 115, 119, 123, 124, 149, 150, 153, 162, 168, 174, 185
  applying textures, 45–48
  Assets panel, 59
  existing materials to objects, 56–58
  Inspector, 57
  and rectangles, 51
  visualization, 59, 60
Math patches, 35
Menu Bar, 16–17, 32, 39, 87, 105, 141, 156
Meta Connect, 206
Meta Spark Desktop Player, 76
  Mac, 76
  Windows, 76
Meta Spark Facebook, 206
Meta Spark Instagram, 205
Meta Spark Studio, 1, 8, 9, 10, 26
  account, 80, 83
  Assets panel, 12
  functions, 16
  Inspector, 14
  installation, 9
  Scene panel, 12
  Simulator, 17
  Toolbar, 15
  Viewport, 14
  welcome screen
    Blank Project, 11
    Create New, 11
Meta’s social platforms, 195
Metaverse
  avatar, 4
Mixed reality (MR), 3
Mobile Meta Spark Player, 76
Mouth Open patch, 72

N

Niche, 207

O

Objective Achieved patch, 181, 182, 191, 192
Objective plane, 162, 163, 165
objective_mat, 162
Objects
  add, 48–50
  canvas, 48–50
  duplicate, 49
  face mesh, 63–66
  face tracker, 62, 63, 66
  insert, 48, 49
  and organizing assets, 50–52
  plane, 66–70
  rectangle, 49, 50, 52, 58, 59
  size
    fixed, 54
    height, 54, 55
    relative, 54, 55
    width, 54, 55
  visualization, 52, 53

P

Particle system
  birthing, 109
  color, 114–116
  emitter, 109, 112, 113
  line, 113
  point, 112
  positioning, 110, 111
  shader type, 115, 116, 120
  shapes, 109, 113, 114
Patch
  animation, 33
  audio, 33
  body landmarks, 34
  browse, 32
  controls, 32
  counter, 179, 182, 190
  device, 34
  editor (see Patch Editor)
  Equals, 180, 181, 183, 186, 187
  face landmarks, 34
  generating and connecting, 71–73
  interaction, 34
    built-in options, 72
    Mouth Open, 72
  instructions, 193
    conditional instruction, 130–132
    find the image, 131, 132
  logic, 35, 128, 129
  Loop Animation, 165, 170, 171
  math, 35
  not patch, 129–131
  Pulse, 157, 182, 187, 189, 191–193
  Screen Tap, 164, 188, 189, 193
  shader, 35
  string, 31, 128–130
  Switch, 164, 188, 190, 193
  time, 35
  transition, 165, 166, 170, 179
  Unpack, 179–181, 186, 187
  user interface, 36
  utility, 36
  visible, 128, 130
Patch Editor, 71, 72, 74
  Loop Animation, 170
  Objective Achieved, 191
  Runtime, 192
  show, 32, 156
  target tracker patches, 128
  To String, 182
  transition, 159
  visibility patch, 128
Placeholder checkerboard pattern, 58
Plane, 152, 161, 168
  adding and managing, 66
  bubble, 66
  bubble plane, 67–71, 73
  and face mesh, 66
  face tracker container, 66
  positioning, 67–70
Playable character
  animated textures, 153
  Canvas, 152
  Face Tracker, 152
  Plane, 152
  player_face_tracker, 152
  user movement, 156
player_face_tracker, 156, 172
Position patch, 160, 166, 170, 179
Publishing, 85
  browser window, 202
  capabilities, 199, 200
  demo video, 196–198
  file size, 199
  guidelines, 200
  icon, 198, 199, 201
  name, 196
  options and fields, 202
  policies, 200
  preparation, 195
  standards, 200
  upload, 202
  window, 201
Pulse patch, 157, 187, 189, 192

Q

QR code, 80, 81, 83, 84, 195

R

Rabbit animation sequence, 155
Rectangle, 112, 115, 120–126, 148, 149, 173, 174, 184
Replacing
  Color LUTs, 92, 93
  textures
    replace, 89–91, 94
    swap, 89
Reset Game patch, 191
Rotation, 118

S

Saving, 87–88
Scene panel, 12, 14, 15, 40, 43, 48–51, 147, 148, 193
Score text object, 182
scoreboard, 173, 176, 178, 184
scoreboard_container, 173
scoreboard_container_mat, 174, 175
Search AR Library, 20
Search Patch, 188, 189
Segmentation, 40, 42–44, 48, 145–147
Shader patches, 35
Shader Properties, 46, 115, 120
Shaders, 25, 35
Shader type
  stars_mat, 116
  flat, 154, 162, 169, 185
Sharing Effect, 77, 78, 134
Show Patch Editor, 32, 156
Simulate Touch, 142, 143, 167
Simulator, 39, 40, 45, 52, 53, 55–58, 60, 61, 63–65, 67, 68, 70, 73, 74, 107
  camera, 143, 144
  effect user, 150
  preview, 132
  restart, 171, 183
  semi-transparent rectangle, 175
  touch settings
    camera, 143, 144
    Simulate Touch option, 142, 143
  updated material settings, 175
  user interface, 178
  Video, 18, 143
  Video Library, 18
  webcam, 19
Sketch, 104
Sketchfab, 13, 21
Spark AR Studio, 8
Software
  ARCore, Google, 7
  ARKit, Apple, 7
  Effect House, TikTok, 8
  Lens Studio
    Lenses, 8
    Snap, 8
    Snapchat, 8
  Meta Spark Studio
    Facebook, 8
    Instagram, 8
    Messenger, 8
    Meta, 8
  operating system, 5
  unity, 7
  Unreal Engine, 7
Start Play patch, 189–191
Switch patch, 164, 188, 190, 193

T

Target, 102, 104
  flat, 103
  ideal, 102
  quality, 102, 103
Target marker
  preview, 121, 123, 130, 132
Target tracker, 104, 106–108
  material’s texture, 107
  texture, 107
Target tracking
  detect, 101
  effect, 102
  effect, setup, 105
  recognize, 101
  test, 135, 136
Test Effect, 81
Testing
  sharing effect, 134
Test on Device, 75, 76, 78, 79, 81
Text
  Font Size, 176
  2D Text, 176, 178
  Typography, 176, 178
Texture, 107, 115, 124
  animated textures, 153–155
  animation sequence, 154
  import, 60, 61
Texture extraction, 145
  Assets panel, 41, 42, 44
  inspector, 41, 44
  person, 42
  Scene panel, 40, 41, 43
  segmentationMaskTexture0, 44
3D assets
  AR Library, 117, 119, 138
  heart, 117–119
  star, 116, 126
3D Face Mesh, 95
Timed Instruction patch, 193
Time patches, 35, 192
Tolerance level, 183
Tolerance setting, 181
Toolbar, 15, 18, 20, 67, 70, 75, 92, 110, 133, 134, 143, 167, 183
To String patch, 182
Transition patch, 158–160, 165, 166, 170, 179
Turn Your Head instruction, 193, 194
2D Face Mesh, 95

U

Unpack patch, 179–181, 186, 187
Upload button, 202
URL, 83–85
Usage guidelines, 99
User interface patch, 36
User movement tracking
  Animation patch, 157
  Head Rotation patch, 156, 157
  player_face_tracker, 156
  Transition patch, 157, 159, 160
User research, 207–208
Utility, 25, 36
Utility patch, 36

V

Value patch, 164, 165, 170, 182, 189
Video Library, 18, 40, 143, 144
Viewport, 14, 18, 32, 68, 70, 110, 114–116, 118–120, 142
Virtual cameras, 19
Virtual objects, 3
Virtual reality (VR), 3, 4
Virtual world, 3
Visibility patch, 73, 128, 188

W, X, Y, Z

Webcam, 4–6, 19, 143, 144