As a software developer, I spend my time building websites and web apps, but for a long time, I’ve also had a side interest in virtual reality. I’ve named my Oculus Go “Betty” and get giddy talking about experiencing gondola rides through Venice, hopping on breathtaking roller coasters, and traveling through veins in the human body. Since I mainly use React, I was excited to learn I could develop virtual reality experiences with a library I already know and love. To try my hand at React VR, I recently created a virtual reality application called Find Your Zen, which allows the user to select an immersive meditation environment, each of which comes with its own mantra inspired by the very excellent show “The Good Place.” In May 2018, shortly after I built my app, Facebook released a revamped and rebranded version of React VR called React 360, with multiple changes and significant improvements.

As I ported my application over to React 360, I took note of some important differences between React VR and React 360. I wrote the following article for developers who possess a working knowledge of React. If you’re unfamiliar with the library, I recommend starting here first. If you want an introduction to React VR (as well as Recompose, whose utility functions help manage my application’s state), you can find that here and here.
## Viewing the finished demo code

```shell
$ git clone https://github.com/lilybarrett/find-your-zen.git
$ cd find-your-zen
$ npm i
$ npm start
```

## File structure

The basic file structure for React VR was as follows:

- `index.vr.js` = entry point for my app
- `vr` folder = stores the code that launches my app; includes `index.html` and `client.js` files
- `static_assets` = stores images, audio files, and other external resources

And here’s the new file structure for React 360:

- `index.js` = entry point for my app
- `client.js` = sets up the “runtime,” which turns my React components into 3D elements in our VR landscape
- `index.html` = as in the typical React application, provides a place for me to mount my React code
- `static_assets` = stores images, audio files, and other external resources

I set up the rest of my folder structure as follows:

```
- components // shared components
  - base-button
  - content
- consts
- providers // Recompose providers live here
- scenes
  - home-environment
    - components
      - menu
      - title
      - zen-button
      - zens
  - zen-environment
    - components
      - home-button
      - mantra
- static-assets
  - images
  - sounds
```

Shared components live in the top-level `components` folder. Stored in my `scenes` folder, `HomeEnvironment` — the first environment to load, where my user accesses a menu of meditation environments to explore — and `ZenEnvironment` each have their own sets of relevant components. My state management is handled by Recompose providers and functionally composed into each component that needs access to state.

## Mounting the app

In React VR, my `client.js` was pretty simple and didn’t give me too many configuration options:

```javascript
// React VR application -- vr/client.js
// Auto-generated content.
// This file contains the boilerplate to set up your React app.
// If you want to modify your application, start in "index.vr.js"
import { VRInstance } from "react-vr-web";

function init(bundle, parent, options) {
  const vr = new VRInstance(bundle, "FindYourZen" /* the component name registered in index.vr.js */, parent, {
    ...options,
  });
  vr.render = function() {
    // Any custom behavior you want to perform on each frame goes here
  };
  // Begin the animation loop
  vr.start();
  return vr;
}

window.ReactVR = { init };
```

In React 360, I can mount my application’s content to a surface or a location. Surfaces, as the docs say, “allow you to add 2D interfaces in 3D space, letting you work in pixels instead of physical dimensions.” In my case, I wrap the visual content of my application in an `AppContent` component, which I mount to React 360’s default, cylindrical surface. This surface projects the content onto the inside of a cylinder — centered in front of the user — with a 4 meter radius.

I can create my own custom surfaces in React 360, increasing or decreasing the radius or making the surface flat rather than cylindrical. I also mount the entire app itself to React 360’s default location, which allows my app to take advantage of React 360’s runtime.

The new runtime is one of React 360’s significant advantages over React VR. Why? Separating out the rendering or “runtime” aspects of the application from the application code improves the latency: the time between a user action and the time the pixels in the view update in response to that action. If the data transfer is too slow, it results in a choppy, disorienting view for the user — similar to buffering on a YouTube video or static on a television screen.

As the React 360 docs further explain, web browsers are single-threaded, which means that as one part of the app updates behind the scenes, that work can block or delay data transfer.
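To make the custom-surface option mentioned above concrete, here’s a sketch of mounting content to a flat surface instead of the default cylinder, using `react-360-web`’s `Surface` API. The pixel dimensions, angle, and component name are illustrative values of mine, not taken from my app:

```javascript
// Hypothetical client.js excerpt: a flat panel instead of the default cylinder.
import { ReactInstance, Surface } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, { ...options });

  // A flat 800 x 600 pixel panel (sizes are made-up values)
  const flatPanel = new Surface(800, 600, Surface.SurfaceShape.Flat);

  // Angle the panel 30 degrees to the right of where the user starts
  // (setAngle takes yaw and pitch, in radians)
  flatPanel.setAngle(Math.PI / 6, 0);

  // Render a registered component root onto the custom surface
  r360.renderToSurface(r360.createRoot("AppContent"), flatPanel);
}

window.React360 = { init };
```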
“This is especially problematic for users viewing your 360 experience on a VR headset, where significant rendering latency can break the sense of immersion,” the docs tell us. “By running your app code in a separate context, we allow the rendering loop to consistently update at a high frame rate.”

In my `index.js`, I register `MeditationApp` (see the second code block below) to mount to the default location — giving my entire application access to the runtime — while I register the content I want to display (again, stored in `AppContent`) to the default cylindrical surface.

```javascript
// components/content.js
// (import paths follow the folder structure described above)
import React from "react";
import { View } from "react-360";
import { withAppContext } from "../providers/withAppContext";
import HomeEnvironment from "../scenes/home-environment";
import ZenEnvironment from "../scenes/zen-environment";

const AppContent = withAppContext(() => (
  <View>
    <HomeEnvironment />
    <ZenEnvironment />
  </View>
));

export default AppContent;
```

```javascript
// index.js
import React from "react";
import { AppRegistry, View } from "react-360";
import { withAppContext } from "./providers/withAppContext";
import AppContent from "./components/content";

const MeditationApp = withAppContext(() => (
  <View style={{ transform: [{ translate: [0, 0, -2] }] }}>
    <AppContent />
  </View>
));

AppRegistry.registerComponent("AppContent", () => AppContent);
AppRegistry.registerComponent("MeditationApp", () => MeditationApp);
```

My `client.js` deals with mounting my components to locations and surfaces:

```javascript
// client.js
import { ReactInstance } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, {
    fullScreen: true,
    ...options,
  });

  // Mount the 2D content to the default cylindrical surface
  r360.renderToSurface(
    r360.createRoot("AppContent", { /* initial props */ }),
    r360.getDefaultSurface()
  );

  // Mount the app itself to the default location, inside the runtime
  r360.renderToLocation(
    r360.createRoot("MeditationApp", { /* initial props */ }),
    r360.getDefaultLocation()
  );

  // Set the initial background image
  r360.compositor.setBackground(
    r360.getAssetURL("images/homebase.png")
  );
}

window.React360 = { init };
```

## Playing audio

In my `consts` folder, I created a `zens.js` file in which to quickly store my data — including the correct audio file and image — for each environment:

```javascript
// consts/zens.js
const zens = [
  {
    id: 1,
    mantra: "Find your inner motherforking peace",
    image: "images/hawaii_beach.jpg",
    audio: "sounds/waves.mp3",
    text: "I'm feeling beachy keen",
  },
  {
    id: 2,
    mantra: "Breathe in peace, breathe out bullshirt",
    image: "images/horseshoe_bend.jpg",
    audio: "sounds/birds.mp3",
    text: "Ain't no mountain high enough",
  },
  {
    id: 3,
    mantra: "Benches will be benches",
    image: "images/sunrise_paris_2.jpg",
    audio: "sounds/chimes.mp3",
    text: "I want a baguette",
  },
  {
    id: 4,
    image: "images/homebase.png",
    text: "Home",
  },
];

export default zens;
```

To play audio in my React VR scenes, I used a `Sound` component, which took in a URL for a sound file in the `static_assets` folder as a `source` prop. To prevent audio from playing in environments where it didn’t belong — such as the home environment — I implemented logic via Recompose for “hiding” and “showing” the `Sound` component based on whether the current environment had an audio file associated with it.

```javascript
// React VR -- components/audio.js
import React from "react";
import { Sound, asset } from "react-vr";
import { hideIf } from "../providers"; // custom Recompose-based helper
import zens from "../consts/zens";

const hideIfNoAudioUrl = hideIf(({ selectedZen }) => {
  const zenAudio = zens[selectedZen - 1].audio;
  return !zenAudio;
});

const Audio = hideIfNoAudioUrl(({ selectedZen }) => (
  <Sound source={asset(zens[selectedZen - 1].audio)} />
));

export default Audio;
```

React 360 greatly improves upon this. For playing audio, I use the `AudioModule` native module. Its `playEnvironmental` method allows me to provide a path (to the audio in our assets folder) and a volume at which to play said audio, on a loop: once the audio file stops playing, it’ll start again.

Along the way, I realized I needed to tell my application when to stop playing a particular audio file when switching scenes. (Otherwise, while immersed in Find Your Zen, you may wind up listening to audio from your previous environment — i.e., church bells in a city square in Paris — after you navigate back to the home environment.) I accomplish this with the `AudioModule`’s `stopEnvironmental` method.

Keep reading to see this in action…

## Using Images

In React VR, I used a `Pano` component to display a 360 degree photo. To display a specific image, `Pano`, like `Audio`, took in an assets URL as a `source` prop. Based on which environment the user selected, the app’s state updated to display an image for that environment.
```javascript
// React VR -- components/wrapped-pano.js
import React from "react";
import { Pano, asset } from "react-vr";
import zens from "../consts/zens";

const WrappedPano = ({ selectedZen }) => (
  <Pano source={asset(zens[selectedZen - 1].image)} />
);

export default WrappedPano;
```

You may or may not have noticed that, in my React 360 application’s `client.js`, I write the following line after rendering my application’s components:

```javascript
r360.compositor.setBackground(r360.getAssetURL("images/homebase.png"));
```

This line of code, which immediately sets the background image when the app is first mounted, uses React 360’s asset utility to automatically look inside my `static_assets` folder for the correct image.

That’s all well and good, but I still want to change the image based on which environment the user selects. Thankfully, I can handle dynamic images from within a React event by using React 360’s `Environment` module. Here’s some sample usage:

```javascript
Environment.setBackgroundImage(asset(someImage));
```

To pull it all together, here’s how I dynamically set my background image and audio based on which environment the user selects, using Recompose’s `withState` and `withHandlers` functions:

```javascript
// providers/withStateAndHandlers.js
import { compose, withState, withHandlers } from "recompose";
import { asset, Environment, NativeModules } from "react-360";
import zens from "../consts/zens";

const { AudioModule } = NativeModules;

const withStateAndHandlers = compose(
  withState("selectedZen", "zenClicked", 4),
  withHandlers({
    zenClicked: (props) => (id, evt) => {
      Environment.setBackgroundImage(asset(zens[id - 1].image));
      if (zens[id - 1].audio) {
        AudioModule.playEnvironmental({
          source: asset(zens[id - 1].audio),
          volume: 0.3, // volume value illustrative
        });
      } else {
        AudioModule.stopEnvironmental();
      }
      props.zenClicked(id); // withState's updater sets selectedZen
    },
  })
);

export default withStateAndHandlers;
```

## Styling the app

React 360, like React VR, uses Flexbox to easily adapt the application’s layout to any display, whether it be a laptop’s web browser, a phone screen, or a VR headset. However, for parts of the application mounted to a location — like `MeditationApp` in my case — React 360 switches from Flexbox layout to a three-dimensional, meter-based coordinate system. That’s why you see this code in my `index.js`:

```javascript
// index.js
// other code goes here
const MeditationApp = withAppContext(() => (
  <View style={{ transform: [{ translate: [0, 0, -2] }] }}>
    <AppContent />
  </View>
));
// other code goes here
```

The values passed into the `transform` are `x`, `y`, and `z`, in that order. `x` represents the orientation of an object to the right of the user; `y` represents the orientation upwards or downwards; and `z` represents the perceived distance away from the user.

In the example above, the `View` should be in the center and 2 perceived meters ahead of the user. Transforms are all positioned relative to their parents.

## Practices that worked well for me

### StyleSheets

`StyleSheet` from `react-native` allowed me to use JavaScript to style my React components. See my code below:

```javascript
// scenes/home-environment/components/zen-button/style.js
import { StyleSheet } from "react-native";

// Style values here are representative; see the repo for the originals
export default StyleSheet.create({
  button: {
    margin: 10,
    padding: 10,
    backgroundColor: "#29ECCE",
    borderRadius: 10,
  },
  buttonText: {
    fontSize: 20,
    textAlign: "center",
  },
});
```

Here, I create and export a `StyleSheet` object that allows me to reference styles in a terse, DRY manner in my component itself.

```javascript
// scenes/home-environment/components/zen-button/index.js
import React from "react";
import { View, Text, VrButton } from "react-360";
import styles from "./style";

const ZenButton = ({ text, buttonClick, selectedZen }) => {
  return (
    // buttonClick comes from withStateAndHandlers
    <VrButton onClick={buttonClick}>
      <View style={styles.button}>
        <Text style={styles.buttonText}>{text}</Text>
      </View>
    </VrButton>
  );
};

export default ZenButton;
```

## State management

Because, at the end of the day, this is still just React, you can approach handling state in the same way you would in a typical React application: Redux, Recompose, MobX, etc. I chose to use Recompose because I love how it allows me to build functional components. As mentioned earlier, I wrote some posts about Recompose in the context of React VR, which you can find here and here. I did not need to change anything about my state management approach when porting my application from React VR over to React 360.

## Debugging React 360

When you use Inspect Element on the application, you’ll see that React 360 bundles all its files into one giant blob that isn’t super easy to grok. Fortunately, because React 360 supports sourcemaps, we can still access the original files, use `debugger` statements, etc.
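Circling back to state management for a moment: the app carries remarkably little state, which is part of why porting it was painless. Here’s a plain-JavaScript approximation of what `withState("selectedZen", "zenClicked", 4)` manages. The function and variable names here are mine, for illustration only:

```javascript
// Plain-JS sketch of the app's single piece of state: selectedZen
// defaults to 4 (the home environment), and picking a zen replaces it.
const initialState = { selectedZen: 4 };

// Illustrative updater mirroring the zenClicked handler
function zenClicked(state, id) {
  return { ...state, selectedZen: id };
}

let state = initialState;
console.log(state.selectedZen); // -> 4 (home environment on first load)

state = zenClicked(state, 2); // user picks the second environment
console.log(state.selectedZen); // -> 2
```

Everything else — which background image shows, which audio plays, which mantra renders — derives from that one `selectedZen` value and the `zens` data.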