A paranormal experience using Oculus Rift and Myo.
We are team LucidDream, a student project headquartered at Electronic Arts’ Redwood Shores campus. This semester, we are developing a virtual reality experience that maintains a user’s sense of immersion by combining three devices: the Oculus Rift, the Myo from Thalmic Labs, and Intel’s Creative Camera. Players will be able to interact with our constructed world using their hands, which adds to the realism we are striving for. Our final product will be a short one-to-two-minute demo showing how these devices can work together to create an utterly immersive experience for our players and reveal the potential of this amazing hardware.
Chris Cashman
Producer / Sound Designer
Shine Li
Co-Producer / Game Designer
Fan Su
Programmer / Game Designer
Benjamin Uscinski
Programmer
Akshaya Venugopal
Technical Artist
Yunxin Zheng
Programmer
Yue Zhu
Game Designer / 2D Artist
Week 15
Based on the feedback we got from the soft opening, we had a team meeting and made several design changes. The goal of this week was to polish the game as much as possible, including making the gestures easier, the calibration more accurate, and the game more responsive.
To achieve this goal, we changed the hand gestures from using a different gesture on each hand to using the same gesture on both hands, which is more intuitive for naive players. We also polished the tutorial and added the ending scene, so the experience feels complete. To make the game more responsive and provide more feedback to the players, we added many visual and sound effects, changed the design of the game’s final boss, and improved the accuracy of our in-game calibration.
We have also had our last meeting with one of our supporters, Thalmic Labs. In this meeting, we conveyed our passion to push this project forward with their assistance so that we can have our project be their demo on the floors of PAX later this summer.
Meanwhile, we have also been collecting the lessons we learned from this project into our post-mortem. We hope the experience we gained this semester will give future students who work on a related project something to draw inspiration from, along with whatever knowledge we picked up along the way.
Week 14
Throughout this week, we made it our mission to get our project as ready as possible for Soft Opening, which was this Friday. Since we had just received the raw-data SDK that lets us define our own gestures last week, we did our best to integrate it into the project. We added this feature at the same time as we implemented a second hand in the experience, so we can now calibrate both hands, and do it inside the game. This makes for a far more immersive experience and gives us the chance to build a tutorial that teaches players the mechanics. We also gave the project additional polish by adding more special effects and sounds to make the game as robust as possible.
With all of these factors in place, we had our demo ready for Softs. We set up in the heart of Building 250’s atrium on EA’s Redwood Shores campus, and we had a steady, long line throughout the day. EA employees and their guests stopped by during their lunch breaks to delve into our experience, and they provided valuable feedback. They were impressed by how they could interact with the virtual world with their hands and how great the environment looked, saying that these aspects made the experience extremely immersive. They did have ideas about how to improve the game, such as making our tutorial shorter and our gestures less complicated.

We noticed throughout the day that whenever people wanted to make a gesture on one hand, they would perform that gesture on the other hand as well. Since our recharging gesture had players spread the fingers on both hands while the firing gesture was a one-handed finger spread, playtesters would often accidentally recharge instead of firing. As a result, we are changing the recharge gesture to making a fist with both hands and the firing gesture to spreading the fingers on both hands. This will reduce gesture confusion and account for people’s tendency to automatically mirror a gesture on both hands when they are supposed to use just one. The other major thing we noticed was that people had a hard time hearing the sounds, since the atrium is prone to loud echoes, so if we take this to the floor of a major conference, we will need noise-cancelling headphones.
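The revised mapping boils down to a simple both-hands pose check. A minimal sketch in Python — the pose labels here are our own placeholders, not actual Myo SDK identifiers:

```python
# Hypothetical pose labels for illustration; the real Myo SDK
# exposes poses differently.
FIST, FINGERS_SPREAD, REST = "fist", "fingers_spread", "rest"

def classify_action(left_pose, right_pose):
    """Map matching poses on both hands to one game action.

    Both hands in a fist recharges; both hands with fingers spread
    fires. Anything else is ignored, so a stray one-handed pose no
    longer triggers the wrong action.
    """
    if left_pose == right_pose == FIST:
        return "recharge"
    if left_pose == right_pose == FINGERS_SPREAD:
        return "fire"
    return None
```

Because mismatched hands map to no action at all, a player who mirrors a gesture by accident can no longer recharge when they meant to fire.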
This feedback will help us focus on what we need to do in the upcoming (and last) week of production in the semester. We have a few things to tweak here and a few things to polish there, but we are excited to put the final touches on this amazing project.
Week 13
We learned several lessons over the weekend when we set up a booth at the Spring Carnival at Carnegie Mellon University’s Silicon Valley campus. Many people felt that the gestures were intuitive and enjoyed the environment we set up for them, but they felt the calibration took too long and that the scene needed more ghosts to make the experience better and more interesting. Also, there were times when the gestures were not properly registered.
This past week, we were able to accomplish a great deal. We got access to the raw-data SDK for the Myo, which will allow us to build a DLL that supports a calibration system within the game. This keeps the sense of immersion in our experience intact by removing the need to calibrate the Myo outside of the game environment, which was a long and complicated process.
Week 12
At the beginning of the week, ETC alum and audio designer Katelyn Mueller came by to help us with our 3D sound issues. One of the projects she worked on during her time at the ETC, Sonology, focused primarily on providing realistic 3D sound to a guest, and since she now works at EA, she was well-suited to stop by and offer tips and ideas on the issue at hand. She examined our audio setup, played our game, and offered valuable advice, such as using low-pass filters to make a sound appear muffled when you turn your head away from its source in the game. This gives the effect of how sounds change and get muddled when you are not directly facing them, or when they originate from behind you. We are going to implement this and the other ideas she offered to enhance the experience.
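As a rough illustration of the idea (not Katelyn’s exact method or any particular audio engine’s API), the low-pass cutoff can be interpolated from the angle between the listener’s forward vector and the direction to the source; the cutoff values here are placeholders we picked for the sketch:

```python
import math

def cutoff_hz(head_forward, to_source, min_cut=800.0, max_cut=22000.0):
    """Pick a low-pass cutoff from the listener's facing direction.

    head_forward: the head's forward vector; to_source: the vector
    from the listener to the sound source. Facing the source gives
    full bandwidth (max_cut); a source directly behind is muffled
    down to min_cut.
    """
    dot = sum(a * b for a, b in zip(head_forward, to_source))
    norm = (math.sqrt(sum(a * a for a in head_forward))
            * math.sqrt(sum(b * b for b in to_source)))
    cos_angle = dot / norm            # 1 = facing source, -1 = behind
    t = (cos_angle + 1.0) / 2.0       # map [-1, 1] onto [0, 1]
    return min_cut + t * (max_cut - min_cut)
```

Feeding this cutoff to the engine’s low-pass filter every frame makes sounds behind the player dull and muddy, which is exactly the cue that helps distinguish back from front.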
We also sat down with our advisors, who noted that while our experience was coming along well, we still had an issue with the motivations of the ghosts in the scene. To our advisors, the ghosts seemed like they were placed in the scene and performing just to get your attention. Nothing showed our players why the ghosts were doing what they were doing, especially the ghosts performing overly complex actions. As a result of this talk, we redesigned the game to be more user-friendly and straightforward. Instead of having ghosts perform several actions in different areas of the room, we are going to have each perform a single, yet entertaining, action that players can find with only a moderate amount of searching. Players only have one to two minutes to play the game, so this will keep us under that limit while still displaying the potential of our devices. This weekend, we will playtest this newer version of the game at Carnegie Mellon University’s Silicon Valley campus during the Spring Carnival to gather feedback and see how it functions as a booth experience.
Week 11
Halves and this week provided us with opportunities to playtest our experience with naive guests, and we took note of what they had to say. A group of graduate students from Norway were touring EA, and we were lucky enough to have them come by and play our game. Everyone who participated thoroughly enjoyed being able to enter a virtual space and interact with it using their hands, and they liked the idea of the game. We even scared a few people with our spooky experience. They did, however, notice that we have a few kinks to work out and polish. For instance, the gyroscope in the Myo would occasionally drift out of alignment and cause the arm in the game to move into awkward positions, breaking the sensation of immersion. Also, while they enjoyed finding the ghosts in the room, some of the specters were a bit difficult to spot and capture, most notably the mask ghost, which is the first one in the game. Sometimes, people would fail to look at it and start the game, so they would get frustrated. Other times, they would accidentally seal it before it had left its perch on the wall. The last main critique came from one player who noticed a seam in the ceiling’s texture.
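One common mitigation for this kind of yaw drift is to let the player re-zero the arm: record a reference yaw while they point straight ahead, then subtract it from later readings. A minimal sketch in degrees (the Myo actually reports orientation as quaternions, so this is only illustrative of the idea):

```python
def recenter_yaw(raw_yaw_deg, reference_yaw_deg):
    """Return the yaw relative to a stored 'forward' reference.

    reference_yaw_deg is captured while the player points forward;
    subtracting it cancels accumulated gyroscope drift. The result
    is wrapped to (-180, 180] so the arm never snaps the long way
    around.
    """
    yaw = (raw_yaw_deg - reference_yaw_deg) % 360.0
    return yaw - 360.0 if yaw > 180.0 else yaw
```

Re-capturing the reference whenever the arm visibly wanders keeps the in-game arm aligned without a full recalibration.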
In response to this feedback, we have made significant changes to the experience. We have removed the mask ghost and replaced it with a ball that moves around and bounces on its own. Before it moves, sound effects play from around the room, such as a xylophone and a piano, to draw the player’s attention and teach them to look at and examine everything around them, including objects in front of, to the side of, and behind them. With this setup, we are doing our best to make the 3D sounds realistic, but we are running into a few issues. Luckily, we got in contact with EA audio designer and ETC alumna Katelyn Mueller, and she has agreed to stop by next week and give suggestions on how to handle the situation. We look forward to meeting with her and polishing our experience further.
Week 10
After the whirlwind of GDC last week, we were greeted with the need to prepare for our Halves presentation, which happened this Wednesday. We deftly and effectively conveyed the progress we have made to our audience, thanks to our extensive rehearsals and feedback from our advisors, Carl and Salvador. Our level of preparation allowed us to answer questions expertly as they were asked. After the presentation, we invited our guests to playtest the latest version of our game; most found it an entertaining experience and provided us with invaluable feedback about the design and what needs work to properly polish it. Halves re-energized us, and we look forward to taking what we learned from the event and applying it to our game in the coming weeks.
Week 9
GDC happened this week, so we took the week off. See you next week!
Spring Break
While scheduled as our Spring Break, we spent the week refining our project and adding new features to the demo we are showing at Halves. Having a full week without classes or other external concerns allowed us to focus entirely on building our experience, and it was a pleasant change of pace. Three ghosts have been added to the experience, along with their corresponding audio/visual effects. The room has also been retouched to further accentuate the creepiness of the game, and we have a full model of a player character ready to rig. We now have a stable build that integrates all of these elements and a demo ready for Halves.
Week 8
On Monday, we received the exciting news that Thalmic Labs’ co-founder and CEO, Stephen Lake, was going to be in the area on Wednesday and would like to see what we have put together so far. With this information in hand, we added additional polish to the demo we ran last Friday so that we had something a little more refined for Mr. Lake. On Wednesday, we met with him and gave a short presentation on who we are and what we aim to accomplish with our project. After getting him up to speed on our development, we took him to our workspace to show off our experience, and Mr. Lake was impressed by what he saw. He offered a few suggestions on how to improve the game, as well as ideas on how to better incorporate the Myo into the experience, and he was impressed by the overall progress we have made and looks forward to what we create at the end of the semester.
Week 7
After Quarters last week, we ended our pre-production phase and moved into the next part of the production cycle. At the beginning of the week, we devised a schedule for the project to follow for the rest of the semester. We also fleshed out a story for the game and worked through several game design issues. Our player’s character will now stand in the middle of a haunted room, attempting to capture the ghosts lurking inside using supernatural powers. We have constructed a basic scary environment for playtesting centered on the sensation of being in this creepy space while wearing an Oculus Rift.
This playtest will show how people respond to the 3D sound. Late last Friday, we performed an internal test to see how we reacted to sounds coming from different directions. The experiment showed us that while people can easily tell where a sound is coming from if it emanates from their front left or front right, they have great difficulty localizing sounds coming from behind them. In some cases, the playtester thought a noise coming from behind was actually being made in front of them. This feedback gave us a lot to think about as we develop our experience further, so that we can craft a terrifying yet navigable world. We look forward to testing our rough design next week.
Week 6
We had our Quarters presentation this week, and it went over exceedingly well. We clearly conveyed our goal and vision for the project while fielding questions from our audience, and we received valuable feedback when people playtested our demo afterwards. Later that day, we talked with our client, Dan Driscoll, who gave additional comments on the direction we should take from this point on. He informed us that if it would make our lives easier, we could drop the Intel Creative Camera from the project; we should focus on making interaction with the Myo our top concern, and the camera was proving to be a distraction.
Another significant change we have made to the project is dropping Frostbite. Due to both the setup issues and the general difficulty of learning a completely new game engine within a single semester, we felt it would be in our best interest to revert to Unity so that we can rapidly prototype. With these important changes in place, we will continue to improve our project and have a new iteration ready before Halves.
Week 5
This week, we worked hard to make sure we are prepared for our Quarters presentation next week. We talked with Anthony Daniels to get his perspective on how our project should proceed. He offered excellent insight into how we should try to design our experience, both within the game world and outside in the real world. With the proper theming in place, we can ensure that our players will be fully immersed in the world we are creating.
We also worked on a new demo to show and playtest at Quarters. In this demo, the player is put into a room filled with boxes and three balls. Using the Myo, the player can either spread his or her fingers to push these items away or make a fist to pull them closer. While it still needs additional polish, it showcases the potential of our devices to convey a fun and immersive world to our guests.
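The interaction reduces to a signed force along the hand-to-object direction. A minimal sketch of that idea — the pose labels and strength value are our own placeholders, not Myo SDK names or our tuned gameplay numbers:

```python
def gesture_force(pose, to_object, strength=5.0):
    """Return a force vector to apply to an object for a given pose.

    to_object points from the player's hand to the object: a
    fingers-spread pose pushes the object away along that vector,
    while a fist pulls it back toward the player. Any other pose
    applies no force.
    """
    if pose == "fingers_spread":
        sign = 1.0       # push away from the player
    elif pose == "fist":
        sign = -1.0      # pull toward the player
    else:
        return (0.0, 0.0, 0.0)
    return tuple(sign * strength * c for c in to_object)
```

In the actual demo the resulting vector would be handed to the physics engine each frame, so held poses produce continuous pushes and pulls rather than single impulses.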
Week 4
Last week, we started installing and learning about Frostbite, but we unfortunately ran into a few issues setting up the game engine on our computers. At the beginning of the week, only two of our teammates were able to get the engine up and running; the rest of us encountered technical difficulties that prevented further experimentation in Frostbite. While this slowed the development process, we were able to come together and discuss possible game concepts to pursue for the rest of the semester. On Tuesday, we deliberated and narrowed the list of ideas down to four, then presented them to our advisors on Thursday for their feedback. With our list of game ideas in hand, we plan to present them to our client, Dan Driscoll, on Monday to see which he feels is the best fit for our experience.
We also had a phone conference with our associates at Thalmic Labs, the makers of the Myo. The conversation revolved around hammering out a few issues we encountered at the beginning of the semester. By the end of the discussion, we learned that not only would we be receiving two new, updated Myos, but we would also be granted access to the Myo’s raw data. This means we will now be able to define our own gestures for the project, which opens a much wider range of possibilities to explore in the game, and we are extremely grateful for this wonderful opportunity. We are excited by the news and look forward to exploring those possibilities next week.
Week 3
We gave a presentation on Tuesday to show off the capabilities of our devices. In attendance were Michael Duke, a senior gameplay engineer at EA, and Rich Hilleman, EA’s chief creative officer, as well as our client, Dan Driscoll, and our advisors. After detailing how we were able to integrate the three experimental devices into Unity, we had a demo set up to show what we accomplished in the span of two weeks and what these pieces of technology can do. After playing our demo, both Michael and Rich gave us valuable feedback on how to improve the experience and how to make the development process easier on ourselves, considering that we are implementing devices that are wholly unfamiliar to us.
After the presentation, Dan told us we would be getting access to the Frostbite game engine so that we could build our experience in it for the rest of the semester. This represents a monumentally important shift for us, as this game engine is entirely new to us. We spent the rest of the week installing Frostbite and reading tutorials, as it has a fairly steep learning curve. While getting the new software onto our computers, we have each been brainstorming experiences that would best utilize our devices and new engine. We will convene next week to narrow down our ideas and get feedback from our advisors on what direction to take.
Week 2
During this past week, we have been busy putting together our demo for next week. By incorporating the finger tracking from the Intel Creative Camera, the gesture detection from the Myo, and the 3D visualization from the Oculus Rift into Unity 3D, we have developed a demo that allows you to pick up a ball and throw it through a corresponding hole in a wall. This will activate switches on the wall that you can slide around to complete an electrical circuit. When this is accomplished, you activate a robot that is linked to the circuit, and you beat the demo. At this point, the demo world needs additional polishing, but we are confident that we will have what we need for Tuesday’s demo.
Week 1
As the semester begins, we have made great strides in getting set up here in our office at Electronic Arts. Our client, Dan Driscoll, has tasked us with creating a virtual reality world that maintains a user’s immersion in the experience. This means we need to construct a virtual experience that prevents players from losing their suspension of disbelief and makes them feel like they are really part of the world we have made. To accomplish this, we have been granted access to three pieces of cutting-edge technology: the Oculus Rift, the Myo from Thalmic Labs, and the Intel Creative Camera. We aim to create a world where a user can reach out and feel like they can touch and directly interact with the environment around them.
We are excited to be given this opportunity to be positioned right at the precipice of future tech and to have access to such amazing advancements. The first two weeks have been designated as time to get acquainted with the tech at our disposal, since each device is completely new to not only us but the developer community as a whole. We have integrated the devices into the Unity 3D game engine, and while that was happening, we also chose a team name: LucidDream. We feel this name best reflects the project’s goal of making a sustainable and immersive virtual world that a player has direct control over.
The next week will be focused on getting the finer movements and gestures of the player’s hand properly read and translated into the world so that we will have a demo to show other potential clients. We are very excited to not only be here, but to have the opportunity to work on technology that can take VR in a whole new direction.
Our 30 Second Promo
Our 3 Minute Promo