Workshops
Friday October 7, 2016
All workshops will take place at the Stratton Student Center, 84 Massachusetts Ave, Cambridge.
9:30am - 12:00pm
Laszlo Gombos (Samsung)
Winston Chen
MIT Student Center, Room 307 (Mezzanine)
NOTE: 9:30-10:30 am
WebVR allows anyone to experience and develop VR on the web. All you need is a compatible phone. Most major web browsers support an experimental version of WebVR (Samsung Internet, Google Chrome, Firefox, Oculus Carmel and Microsoft Edge).
Sandra Rodriguez, PhD (Eyesteel Films, MIT Open Doc Lab)
Compelling Immersive Storytelling & Human Centered Design
MIT Student Center, Room 491
The essence of every experience, particularly in immersive and interactive productions, lies in the ability to pique curiosity, inspire, invite play and, yes, tell a story. VR and AR projects have opened up new realms of possibility for interactive narration. But what makes a VR experience “good”? What makes it stand out? What attracts users (and sponsors)? What are the fundamental perceptual mechanics (visual, acoustic, haptic) that ensure comfortable immersion? And how do you ensure a compelling experience?
Marta Ordeig
Garage Stories: Innovation <> Storytelling Workshop
A hands-on workshop that will help you incorporate effective storytelling into your project, whether it is a film, a game, or even an app, to make sure that you are not only creating “fun experiments” but compelling and engaging experiences for your “visitors”.
Wiley Corning (MIT Media Lab, Fluid Interfaces)
Danger Donaghey (Defective Studios)
MIT Student Center, Room 407
A crash course in which we will attempt to touch on all the major features of Unity. Starting with an introduction to the editor, we will cover scripting, physics, scene setup and transitions, and building and running a Unity executable. We will end with specific tips for working with VR, and discuss your questions and scenarios.
Human-Computer Interaction expert specializing in UX/UI for Virtual Reality R&D, and in design and evaluation methodology for Collaborative Virtual Environments. Teaches Human-Computer Interaction Methods, VR, Human Factors, Statistics, and Digital Humanities in the Computer Science Department for the HCI Master’s program. Facilitates and mentors Study Abroad programs at several VR labs in Spain and the Netherlands. Initiated and coordinates the SUNY Oswego VR First lab.
The recent advent of untethered mobile virtual reality devices requires a drastic reduction in bus-bandwidth utilization as well as minimized render time through optimized shader computation. The extremely low latency required to reduce motion-to-photon delay puts added pressure on the GPU to be highly performant. Learn to better understand your underlying hardware and exploit the features in the core graphics APIs and their extensions for OpenGL ES and Vulkan.
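The motion-to-photon budget mentioned above comes down to simple arithmetic: everything the pipeline does must fit inside one display refresh interval. A minimal sketch of that budget math (the function names and the sample CPU/GPU timings are illustrative, not from the talk):

```javascript
// Per-frame time budget implied by the display refresh rate.
function frameBudgetMs(refreshHz) {
  return 1000 / refreshHz;
}

// Headroom left for compositing and display scan-out after CPU
// simulation and GPU render time have been spent.
function remainingBudgetMs(refreshHz, cpuMs, gpuMs) {
  return frameBudgetMs(refreshHz) - cpuMs - gpuMs;
}

console.log(frameBudgetMs(60).toFixed(2));           // "16.67" ms per frame at 60 Hz
console.log(remainingBudgetMs(60, 4, 9).toFixed(2)); // "3.67" ms of headroom
```

Miss the budget and the frame is reprojected or dropped, which the user perceives as added motion-to-photon latency.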
12:00pm - 1:00pm
Lunch Break - We will not be providing lunch, but there are plenty of places to grab a quick bite within walking distance of the venue!
1:00pm - 3:00pm
Andre Balta (Cubic)
Carl Dungca (Cubic)
MIT Student Center, Room 307
Cubic presents a broad overview of game development and how it applies to Virtual Training and Game-Based Learning. The following topics will be covered:
- Applications of Game Engines to Naval Training
- Applying the “Studio” model to DoD
- Using Game Design for Instructional System Design
- Technical Challenges with Game Based Learning
- And further concepts regarding how AR/VR/MR will be used to advance virtual training products
Absolutely any experience with VR or AR, whether for work or leisure, depends on a good interactive narrative. The virtual or mixed reality must (1) make sense to the user based on who they are, (2) respond to the user in a way that makes sense, (3) guide them through some process, like adventuring, building, or socializing, and (4) leave the user with a valuable takeaway. This can be done with any interactive medium, but VR and AR have particular advantages and limitations, making them genuinely new media. They are also different segments of the same continuum of immersive media; we will look at the big picture there. Attendees of the talk will come away with a basic framework for thinking about how to build VR and AR to engage, entertain, and get work done.
Join us for an overview of the HoloLens developer device, 2D & 3D holographic apps, the basics of holographic development, getting started with holographic development in Unity, working with the HoloToolkit for Unity, deploying Holographic apps to a HoloLens device or emulator and holographic demos!
2:00pm - 3:00pm
Our digital world is currently one that is devoid of emotional intelligence. Affectiva’s goal is to bring that missing element to our devices in order to enrich our experiences and close the gap in how machines understand and respond to humans. This talk will center on Affectiva’s SDKs and their capabilities so that hackathon participants can get the most out of them.
3:00pm - 4:00pm
Augmented reality has become increasingly well known, with three major hardware players now underway and a number of software toolkits to facilitate application creation. Meanwhile, surgical simulation has been increasingly incorporated into device and procedural training, with a broadening of use areas as well as growing sophistication in the computational physics and haptic interaction devices used in these experiences. Surgical simulation, at times referred to less precisely as virtual reality, is already mandatory for surgical robotics training and is widely used in endoscopic and catheter procedure training. A few examples of combining AR with surgical simulation have been created, and these will be increasingly deployed in multiple areas. I will talk about a couple of such projects that I led, including an interesting ear microsurgery simulator built for a medical device company that will sadly never see commercial use due to the demise of the product it was built to support. I’ll also discuss current and expected future developments in AR/VR relative to the expanding field of surgical robotics.
Thinking about creating a mixed reality application in the home decor space? Don't miss this workshop if you want to use the biggest collection of 3D models of home decor products available on Wayfair for your AR and VR apps! Get a sneak peek and beta access to Wayfair's 3D Model API and learn how to discover and use the models. We will go through basics of using the API, and walk through examples in Unity.
The rapid growth of 360° content, coupled with the emerging WebVR standard, opens the door to targeting VR with web content. Using familiar tools such as WebGL and Three.js, developers can create interactive 3D and 360° experiences. With the reach of the web, these experiences can reach a wide audience across VR, mobile, and PC platforms. Samsung Internet for Gear VR will be used to demonstrate how these new web-based experiences are available today in a mobile VR environment.
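At the core of a 360° viewer is the mapping from a view direction to a spot on the equirectangular image. A minimal sketch of that math (illustrative only; in Three.js you would typically just texture the inside of a sphere and let the renderer do this mapping per fragment):

```javascript
// Map a unit view-direction vector to (u, v) texture coordinates
// on an equirectangular 360 image; both u and v land in [0, 1].
function directionToEquirectUV(x, y, z) {
  const u = 0.5 + Math.atan2(z, x) / (2 * Math.PI); // longitude
  const v = 0.5 - Math.asin(y) / Math.PI;           // latitude
  return { u, v };
}

console.log(directionToEquirectUV(1, 0, 0)); // { u: 0.5, v: 0.5 } — image center
console.log(directionToEquirectUV(0, 1, 0)); // { u: 0.5, v: 0 }   — straight up
```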
4:00pm - 5:00pm
This talk will target both beginner and advanced Unity developers by providing an introduction to tools for Vive development including the SteamVR plugin, HTC's Unity plugins and additional HTC SDKs.
Google Tango is an enabling technology that lets devices understand their position relative to the world around them. Tango's computer vision platform empowers developers creating next-generation spatially aware application experiences. This workshop explains the Tango hardware and software platform, the technologies in use, how they work, and how to use them. The session will prepare developers to use Tango via the C, Java and Unity SDKs, and conclude with a worked example of building a Tango-enabled Augmented Reality application using Unity. A limited number of Tango Development Kit devices will be available for attendees' use.
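A device pose of the kind Tango reports pairs a position with an orientation quaternion, and placing virtual content means transforming points from device space into world space. A JavaScript sketch of that transform for illustration (the real SDKs expose pose data through their C, Java and Unity APIs; these helper names are hypothetical):

```javascript
// Rotate a point p = [px, py, pz] by a unit quaternion q = [x, y, z, w],
// using the standard expansion of p' = q * p * q^-1.
function rotateByQuaternion(p, q) {
  const [px, py, pz] = p;
  const [x, y, z, w] = q;
  const ix =  w * px + y * pz - z * py;
  const iy =  w * py + z * px - x * pz;
  const iz =  w * pz + x * py - y * px;
  const iw = -x * px - y * py - z * pz;
  return [
    ix * w + iw * -x + iy * -z - iz * -y,
    iy * w + iw * -y + iz * -x - ix * -z,
    iz * w + iw * -z + ix * -y - iy * -x,
  ];
}

// Apply a pose (orientation quaternion + translation) to a
// device-space point, yielding its world-space position.
function deviceToWorld(point, pose) {
  const r = rotateByQuaternion(point, pose.orientation);
  return [
    r[0] + pose.translation[0],
    r[1] + pose.translation[1],
    r[2] + pose.translation[2],
  ];
}
```

With an identity orientation the transform reduces to a pure translation, which is a handy sanity check when wiring pose data into a scene graph.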
Mixed Reality experiences such as those on HoloLens operate by blending virtual objects with the real world. This requires a new way of thinking about how developers and designers create graphical, audio and typographic assets, develop interactive scenarios and gameplay, provide action feedback, and generate object layouts based on the real environment and device requirements. In this session we will talk about existing experiences made for HoloLens and the ideas and decisions behind them. We will also look at generally recommended approaches to creating HoloLens experiences and common mistakes people make when starting to develop for HoloLens.
5:00pm - 6:00pm
Preparing 3D assets to be consumed by AR/VR devices can be a long and painful journey. Today there isn’t a simple content preparation pipeline available, and when it comes to industry-specific content, you sometimes get completely stuck. Autodesk Forge provides tools that can help in this area, and in this workshop we’ll present and discuss the various options and potential workflows. Finally we’ll present what we’ve been doing with the Oculus Rift and the HoloLens, as well as a simple workflow for WebVR.
Saturday October 8, 2016
9:00am - 10:30am
This Unity crash-course will give you a quick introduction to the editor, then delve into C# scripting, focusing on programming for VR and AR. Patterns and tips in this session will be applicable to any of the platforms: mobile VR, desktop VR, and AR. We will look at how to drive interactions using the user’s head and (optionally) hands, and how to build User Interfaces that work in 3D space. Time permitting, we can break down some interesting VR/AR apps and see how we could reproduce some of their designs. After this workshop, you will be armed with the foundation needed in the following platform-specific workshops.
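Driving interactions from the user's head usually reduces to a cone test against the gaze direction. The workshop itself works in Unity C#; this JavaScript sketch of the underlying math is illustrative only (in Unity you would typically use Vector3.Angle or a physics raycast):

```javascript
// Normalize a 3D vector.
function normalize([x, y, z]) {
  const len = Math.hypot(x, y, z);
  return [x / len, y / len, z / len];
}

// An object counts as "gazed at" when the angle between the head's
// forward vector and the direction to the object falls within a
// small cone (default 10 degrees).
function isGazedAt(headPos, headForward, objectPos, coneDeg = 10) {
  const toObject = normalize([
    objectPos[0] - headPos[0],
    objectPos[1] - headPos[1],
    objectPos[2] - headPos[2],
  ]);
  const f = normalize(headForward);
  const cos = f[0] * toObject[0] + f[1] * toObject[1] + f[2] * toObject[2];
  return cos >= Math.cos((coneDeg * Math.PI) / 180);
}

console.log(isGazedAt([0, 0, 0], [0, 0, 1], [0, 0, 5])); // true  — straight ahead
console.log(isGazedAt([0, 0, 0], [0, 0, 1], [5, 0, 0])); // false — 90° off-axis
```

Pairing a test like this with a dwell timer is a common way to build gaze-based selection when no hand controllers are available.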
10:30am - 12:00pm
High-quality mobile VR is demanding not only on GPUs but across the whole system. It requires close cooperation among many components: notably, low-latency, zero-copy paths between video, GPU, and display, as well as careful power management to ensure predictable timing within thermal limits.
1:00pm - 2:00pm
This session will cover high-level design guidelines for creating VR/AR setups derived from the different realities on a continuum; design guidelines grounded in human perception, to help create good experiences; interaction design guidelines based on the different layers of VR/AR content; and human-centered iterative design guidelines to help facilitate an effective and efficient design-team process for rapid prototyping and hackathons.
2:00pm - 3:00pm
Mixed Reality experiences such as those on HoloLens operate by blending virtual objects with the real world. This requires a new way of thinking about how developers and designers create graphical, audio and typographic assets, develop interactive scenarios and gameplay, provide action feedback, and generate object layouts based on the real environment and device requirements. In this session we will talk about existing experiences made for HoloLens and the ideas and decisions behind them. We will also look at generally recommended approaches to creating HoloLens experiences and common mistakes people make when starting to develop for HoloLens.
3:00pm - 4:00pm
Google Tango is an enabling technology that lets devices understand their position relative to the world around them. Tango's computer vision platform empowers developers creating next-generation spatially aware application experiences. This workshop explains the Tango hardware and software platform, the technologies in use, how they work, and how to use them. The session will prepare developers to use Tango via the C, Java and Unity SDKs, and conclude with a worked example of building a Tango-enabled Augmented Reality application using Unity. A limited number of Tango Development Kit devices will be available for attendees' use.
4:00pm - 5:00pm
This session will be extensive hands-on development for platforms not yet released to the public. This is absolutely epic and great for beginners! Whether you're starting your journey in VR with these devices or upgrading, this is a mind-blowing leap forward in technology. We'll dive right in and teach you how to make your own Daydream device for testing. Then we'll cover the Unity workflow and what's unique about Daydream. Finally, we'll actually build and run a Daydream app we make together, live.
5:00pm - 6:00pm
This talk provides an end-to-end introduction to setting up a scalable, distributed data-processing pipeline on Azure to store, analyze, and serve AR/VR data streams captured by AR/VR devices. It also covers how the cloud can orchestrate multiple devices together to enable exciting application scenarios.
This Unity crash-course will give you a quick introduction to the editor, then delve into C# scripting, focusing on programming for VR and AR. Patterns and tips in this session will be applicable to any of the platforms: mobile VR, desktop VR, and AR. We'll look at how to drive interactions using the user's head and (optionally) hands, and how to build User Interfaces that work in 3D space. Time permitting, we can break down some interesting VR & AR apps and see how we could reproduce some of their designs. After this workshop, you'll be armed with the foundation needed in the following platform-specific workshops.