Detailed below is our postmortem.
1. Introduction and overview of the project
Project Thin Line is a mobile VR experience dealing with women’s reproductive rights. Through the use of interactive narrative, we aimed to encourage healthcare providers to identify their biases towards women who had an abortion, evoke empathy, and raise discussion around this topic.
2. What went well (or what went right)
2.1. Pre-production
Being a pitch project, some parts of the pre-production happened before the semester started. For example, we knew we would be basing our design on previous prototypes, we had already defined the barriers and goals, and we knew our medium. Even though we hadn't started working on the actual product before the semester began, we were ready to go and started playtesting in the first week.
Being set on the design, and knowing that we would iterate on the details but not on the overall structure, allowed us to focus on figuring out those details and polishing them.
For our experience, we enlisted the help of actors. The writing process took longer than it should have (more on that below); however, by the time we got to the recording sessions, the script included 20 minutes' worth of content for a 10-minute experience. That included feedback and hints on how players should proceed, written to fit the story. Since we had been playtesting constantly, we were able to identify how much guidance people might need and incorporate it into the script.
2.1.1. Research
Before the semester started, we had researched various ways to encourage a change in perception. We looked into other transformational experiences such as Spent and Peacemaker. With the help of Jessica Hammer, we chose to follow Geoff Kaufman's research on perspective taking, which helped define our design.
This was a student pitch project; however, with the help of Drew Davidson, we were able to partner with Kristen Kamp from the Ryan Program. In a way, she served as a subject matter expert and as a connection to other subject matter experts. This topic, sensitive and in some ways unapproachable, required a substantial amount of research and someone on hand to confirm that we were not straying from realistic scenarios. Having Kristen helped significantly and reduced some of the workload. We held regular meetings with her, and even though that sometimes felt time-consuming, it also meant she was able to give feedback quickly. She connected us with family planning residents, counselors, nurses, etc.
In addition to Kristen’s help, we also read and watched as many pieces as possible about women’s experiences with abortion from pro-choice and pro-life websites.
2.1.2. Playtesting
Throughout the semester, we held playtesting sessions every week from the very first week, and collected a huge amount of data on narrative, interactions, graphic design, etc., to help us develop our experience. We playtested with ETC students, faculty, visitors, registered playtesters, and our target audience: health care providers working with women who have had abortions.
We divided our playtests into three different types. The first was playtesting the story, usually done in a paper-playtest format: players would sit at a desk and be read segments of the text based on which objects they chose to interact with. We used that method throughout the semester to test the following:
- How much information can players retain?
- Can players make sense of the non-linear story?
- What was boring, what was incredibly interesting, and what was missing?
- How did they feel about the character?
From playtest to playtest we would update the story, change the phrasing, and try new techniques to enhance the emotional connection. This helped us define where to focus.
The second type of playtest centered around the interactions and the objects. The Google Daydream is relatively limiting, with only gaze, voice recognition, and a controller. Our goal was to strike a balance between easy to understand and still interesting. These playtests were done in VR and were meant to test the following:
- Was it easy to do and did the players understand how they did it (or did it happen by mistake)?
- Was it rewarding or frustrating?
- Do players feel they have the right amount of agency?
Lastly, we sent out surveys. Those were used to test different types of stories, gather information for brainstorming, or try to understand others' perspectives.
In total, we had 126 playtesters. We playtested almost every week until halves. It helped create milestones for us and gave us tools to move on to production.
2.1.3. Prototyping
Since we were playtesting almost every week, that also meant a new prototype each week. Not playtesting the interactions on paper was the right decision: they will never feel the same outside of VR, and they are too significant to the experience. The more prototypes, the better.
2.2. Development
2.2.1. Narrative
For our main character and her story, we wanted someone relatable but not an extreme case. We wrote several short stories and sent them out as surveys. Each story had different elements, ranging from a character who was young, alone, and unable to support herself to one who was married, financially stable, and did not want children.
We didn't choose any one of those stories, but we did note what worked, what people liked, and what they hated. We also met with Geoff Kaufman, whose research our design is based on, who suggested we keep the character a medical student to strengthen the in-group relationship and enhance the connection with our target audience, medical residents.
The story was composed of segments that could be listened to in any order (per room): the character's memories, one for each object she interacted with. A good idea was to define whom the character is talking to in each segment and keep track of what she knows and what she doesn't. For some objects she talked directly to them (the teddy bear, the ballerina), and for others she ignored the object and talked about the memory itself (the pocket watch, the notebook).
The objects themselves were chosen either based on the story or based on playtesting, with the story written afterward. Choosing the objects before the story allowed the designer and the programmer to start working on the interactions while the story was still in progress. Playtesting helped identify which objects people liked, either because they were unpredictable and therefore interesting or because they were fun to play with.
2.2.2. Programming
Towards the end of the pre-production phase, we made a technical decision to rework the existing system for better structure, decoupling, and debuggability. We finished the new architecture in a week, and it ran well from then on. It provided us with:
- A modularized, component-based structure – Unity is itself a component-based engine, so to take full advantage of it we modularized and encapsulated our code as much as possible, which let us work on different features in parallel. Each feature could simply be plugged in and run, and was more reusable and safer.
- Complete decoupling – there were no public references anymore, which made it easy to make changes without worrying about breaking other things. Everything ran on its own. It also meant we could let designers and sound designers make changes and test out concepts faster and more safely.
- Event-driven – since the experience itself is event-driven by nature, it made sense to build an event-driven system to support it. This made the system more CPU-efficient and empowered the designers: they could raise an event with a single button click in the Editor to check individual responses and interactions.
- Data-driven – everything was data-driven and saved in data files; even the events were now just pure data flowing through the system. Data and implementation were completely separated, which made the implementation more reusable and made bugs easier to spot and fix.
- In-Editor tools – some tools visualized positions, events, paths, etc., to empower the designers; others served pipeline purposes, such as importing and exporting assets. We also customized the Editor for our specific design and QA needs.
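The decoupled, event-driven, data-driven pattern described above can be illustrated with a minimal event bus, where events are plain data and components communicate only through the bus. This is a sketch in Python rather than the project's actual Unity C#, and all names here are hypothetical:

```python
# Minimal sketch of a decoupled, event-driven system: events are plain
# data objects, and components communicate only through the bus, never
# through direct references to each other.
from collections import defaultdict
from dataclasses import dataclass, field
from typing import Callable

@dataclass(frozen=True)
class Event:
    name: str                          # e.g. "ObjectPickedUp"
    payload: dict = field(default_factory=dict)

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_name: str, handler: Callable[[Event], None]):
        self._subscribers[event_name].append(handler)

    def raise_event(self, event: Event):
        # Dispatch to every subscriber; publishers never know who listens.
        for handler in self._subscribers[event.name]:
            handler(event)

# Usage: an audio component reacts to an event without any direct
# reference to the interaction code that raised it.
bus = EventBus()
played = []
bus.subscribe("ObjectPickedUp", lambda e: played.append(e.payload["clip"]))
bus.raise_event(Event("ObjectPickedUp", {"clip": "teddy_bear_memory"}))
```

Because the events are pure data, the same mechanism supports the designer-facing workflow the team describes: raising an event from an Editor button is just constructing one of these data objects and handing it to the bus.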
Identifying performance issues early in the process through playtesting allowed us to tweak performance parameters, focus on draw calls, and decide how to render art before the production phase.
2.2.3. Design
- The research we based our experience on helped us identify our high-level goal and barriers and shaped our design pillars.
- To keep the artists and programmers collaborating, we made flowcharts for each of our object interactions. This helped everyone stay on the same page once team members started working within their own discipline.
- Instead of stopping at a high-level concept, each idea was broken down and written up in detail, including edge cases and possible error conditions, to help with programming. Each day began with the programmer and designer discussing which feature would be implemented; the feature was then tested right after its implementation.
- Weekly playtests created weekly checkpoints that helped us rapidly prototype ideas rather than just debating in meetings whether they would work.
- Weekly playtests also shed light on the technical constraints of the product early in the process. Because we identified the performance issues early, we were able to make smarter decisions about creating art during the production phase to reduce draw calls.
- Having a programming background helped us make faster decisions about what was and wasn't possible on our platform. It also enabled rapid logic development: proposing the logic directly to the programmer rather than merely the idea.
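Written out with edge cases and error conditions, even a simple object interaction becomes an explicit state machine. As a rough sketch (in Python for brevity rather than Unity C#; the states and event names are hypothetical, not the project's actual design), one object's flowchart might translate to:

```python
# Illustrative sketch of one object interaction spelled out as explicit
# states and transitions, including an error state, mirroring the
# per-object flowcharts described above. All names are hypothetical.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    GAZED = auto()
    HELD = auto()
    PLAYING_MEMORY = auto()
    ERROR = auto()

# Allowed transitions, agreed on by programmer and designer up front.
TRANSITIONS = {
    (State.IDLE, "gaze_enter"): State.GAZED,
    (State.GAZED, "gaze_exit"): State.IDLE,
    (State.GAZED, "trigger_press"): State.HELD,
    (State.HELD, "trigger_release"): State.PLAYING_MEMORY,
    (State.PLAYING_MEMORY, "audio_done"): State.IDLE,
}

def step(state, event):
    # An unexpected event in a given state is an explicit, handled edge
    # case rather than an undefined behavior.
    return TRANSITIONS.get((state, event), State.ERROR)

s = step(State.IDLE, "gaze_enter")   # State.GAZED
s = step(s, "trigger_press")         # State.HELD
```

Spelling the interaction out this way makes the edge cases (what happens if the player releases the trigger mid-memory, say) visible before any engine code is written.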
2.2.4. Art
- To make the build run in mobile VR, we kept each scene under 5,000 polys (in quads). To reduce draw calls in Unity, it is important to have as few objects in the scene as possible: combine objects in the Maya scene to reduce the object count.
- Baked lighting is a good way to enhance the performance.
- In Unity, it is impossible to bake lightmaps with custom shaders such as a toon shader. It is possible, however, to create toon-shader-looking lightmaps in Maya using the Arnold renderer.
- Link for toon shader in Arnold https://youtu.be/4hcrd0FFa6
- To bake lightmaps in Maya, make sure each object's UVs aren't stacked.
2.2.5. Motion Capture
- The key to success in motion capture is preparation.
- Before starting, measure the size of the environment and props in Maya, and double-check whether the units are centimeters or inches. Objects are easy to measure in Maya using Create > Measure Tools > Distance Tool.
- Bring correctly sized props, storyboards, audio, or recorded voice-overs to capture precise data. It helps reduce the workload later in the process.
- It is best to keep the character rig as close as possible to the skeleton provided by the motion capture studio.
- After importing the motion capture data into Maya, it is important to set the Character Definition correctly in the HumanIK menu. When creating the Character Definition, set the time slider to frame zero. The motion capture rig and the target rig should both face positive Z in world space. When applying the motion capture data to the target rig, make sure both skeletons are in T-pose and the root joint faces positive Z in world space.
- You can adjust the animation in both Maya and MotionBuilder.
2.2.6. QA
- We began QA early in the semester, which helped us build a firm architectural framework.
- Weekly playtests also contributed to weekly QA.
- After a feature was coded by one programmer, it was QAed by another programmer and tweaked until it worked flawlessly, while the first programmer continued working on the next feature.
- Every guest or ETC student who wanted to try our experience served as an excellent QA tester, and their feedback was always incorporated.
- Towards the end, we invited people from the ETC to come to our room and try to break our game. This was an open invitation that lasted for days, which also meant the phone needed to have a working build at all times.
2.2.7. Sound
The project ended up needing more attention to sound than we expected. We worked with a sound designer who was very committed, in part because this was also his project for another class. He worked with us for ~30 hours, not including work he did on his own.
Important notes to remember:
- Finding a committed sound designer is important. ETC students won’t have the same amount of time as our sound designer had. John Dessler directed our sound designer to us since he was a student in his class looking for a project.
- Making a list of all the sounds that were required is important.
- Prepare for recording sessions to take longer than you think. We had enough money in our budget for another session; however, we were lucky with scheduling, since one wasn't planned.
- You can speed up the process by editing the sound files in advance in Premiere (or similar software) and then exporting to OMF for the sound designer to make corrections.
- For a mobile VR experience: non-spatialized sound can be an MP3 to reduce the overall size of the project.
- Being able to play several sounds at the same time and adjust their levels in Unity helped speed up the process.
2.3. Deployment
2.3.1. Publishing
The experience is currently on the Google Play store and the Oculus store. Since we kept in mind that this might not be an easy process and that the SDK might need to meet certain criteria (such as size), the process turned out to be relatively easy and, most importantly, possible. However, it does require setting aside time for both a programmer and an artist.
2.3.2. Website
We built a website as part of our deliverable for the following reasons.
First, our team felt that a debrief session after the experience would be a good way for participants to discuss their thoughts and emotions, since our client wanted to use the experience as part of their training program. For this, we needed an efficient way to deliver the discussion guide to the facilitators.
Second, we wanted an efficient way to collect data. Our client wanted the participants to go through a post-survey to measure the impact of the experience.
Lastly, most of our target audience would have little or no experience using VR, so we needed to provide step-by-step instructions for them.
By having a website, we were able to provide everything to users in an efficient and simple way.
We discussed the layout at length and, after multiple iterations, settled on two different pages: one for individuals and one for facilitators. Since we did not have funds to maintain an expensive website, we decided to develop our own and host it on Github Pages instead of paying for prebuilt services.
For the website, we used Jekyll, since it is easy to host on Github and fast and easy to maintain.
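As an illustration of how little such a setup requires, a Jekyll site on GitHub Pages can be driven by a single `_config.yml`; the values below are hypothetical, not the project's actual configuration:

```yaml
# _config.yml – hypothetical minimal configuration for a Jekyll site
# hosted on GitHub Pages (all values illustrative)
title: Project Thin Line
description: Setup instructions, facilitator discussion guide, and post-survey link
theme: minima          # one of the themes GitHub Pages supports out of the box
# Audience-specific pages live as markdown files in the repo root,
# e.g. one page for individuals and one for facilitators, each served
# at its own URL.
```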
3. What could have been better (or what went wrong)
3.1. Pre-production
We spent too much time playtesting and not enough time making decisions. In addition, at the beginning of the process no one was in charge of the design, and decisions were made by everyone. They took a while to make, and not everyone wanted to be part of the process, so it wasted their time.
- Asking in advance what each team member wants to do and deciding who is the go-to person can save time and frustration.
- One-on-one meetings and daily check-ins were not held on a regular basis. As a result, some problems were not addressed as soon as they arose.
- The writing process took more than half the semester, which meant making additional art assets for each playtest that were later scrapped.
- Scheduling the motion capture and the artists could have been done sooner even if the story itself wasn’t done. If anything, redoing the motion capture would have been a “nice to have”.
3.1.2. Playtesting
- We never playtested with our target audience but that was also never part of our scope. Our client was in charge of arranging those for us and that took time. We could have tried making those efforts ourselves as well.
3.2. Development
3.2.1. Narrative
- Just from a time perspective: having the story done earlier would have allowed us to make more changes after it was recorded.
3.2.2. Motion Capture
- We used CMU's motion capture studio. There is almost no alternative in the Pittsburgh area (unless you know someone taking an IDeATe course). It is run by one person, and if he is unavailable (which usually isn't the case), that can be a problem. Luckily for us it all turned out well, but there was one week in which we couldn't capture, or retrieve the data we had already captured.
- We assumed that because the characters don't have faces, they needed to be more active. It ended up looking a little strange for them to move so much. Record both a stationary and a non-stationary option so you have both to choose from.
3.2.3. QA
- QA efforts can be divided between team members. That way they also try out the game and see where it is.
- ETC students are also willing to QA (and not just playtest). We mostly assumed they wouldn't be, so we only used them at the end. People like breaking things even more than they like playtesting.
- There wasn’t documentation for the QA process.
3.2.4. Sound
- Sound recording will take longer than you think: a 20-page script will take 4 hours, not 2.
- The sound designer is not an editor and does not want to be part of the decision making process.
- Define all the sounds at the beginning and then go over the game again to spot sounds you might need.
4. Lessons learned and conclusion
At the end of the semester, we ended up with a polished 10-12 minute mobile VR experience and a website to help run it. Having a dedicated and talented team was a great contributor to that.
Being a pitch project, some of our lessons learned were a result of knowing what our design was before the semester started, having our equipment in the very first week (which allowed us to start making prototypes right away), and knowing our target goal and audience.
Some of the lessons learned are:
- Playtesting often and early:
- Even if you're not sure what you're making, playtest something every week. By the end, we had covered most options and could say that each design decision we made was a good, if not the best, option.
- When the prototyping phase is over, there is enough information about what people liked and didn't like to guide the development process.
- Communication:
- Daily and weekly group updates help keep everyone informed.
- One-on-one meetings to identify people's goals and wants are very important.
- Task management and documentation:
- Tasks should be updated as soon as possible. Even if they are small or minor. It helps to keep track of time.
- Documenting the process as often as possible is never a bad idea.
- Identify the limitations of your medium fast (and then assume they are even more restricting than you think):
- We ran tests to see how much the Daydream could handle. We ended up prioritizing having the game look nice over, for example, the possibility of mirroring the screen.
- Decisions are important:
- Being a pitch project, we knew what our overall design was. Even though faculty had feedback on it, we knew that changing it meant going back months. Sticking to it helped.
- Know why you are making the decisions you are making, whether because of playtesting, research, limitations, or anything else; what matters is that there is a justification.
- Having someone in charge of the decision making for different aspects of the game saves time.
- Hard deadlines are useful:
- We used the weekly playtest as a deadline at first; that forced us to make very specific progress and define our tasks for the week.