Since we had effectively eliminated the Meta 2 and HoloLens from consideration the previous week, we needed to make sure that we, as a team, could actually develop for the Magic Leap and ARKit. We wanted to experiment with some of ARKit's environment mapping and gesture possibilities; however, the Apple Developer Portal happened to be down this week, so we were unable to get accounts. This put mobile AR testing, and its viability as a platform, on temporary hold.
This meant that we had to figure out the Magic Leap. This was the first time the platform had been made available to ETC projects, so none of our colleagues had ever developed for it. Since the platform was so new, the same held true in the outside world. There was very limited documentation outside of Magic Leap's own regarding common practices, pitfalls, tips and tricks, and suggestions. In addition, because the firmware was developing so rapidly, many of the questions found on forums were already outdated, still unanswered, or slated to be addressed in a follow-up firmware release. This was definitely worrying, as we were effectively a very small four-person team on our own. Needless to say, our programmers pressed on and attempted to create a series of “Gold Spike” builds to prove to the team that developing for the platform was feasible. And so, after hours of tinkering, we arrived at this picture of the first successful build compiling and running on the Magic Leap hardware.
It might not have been much, just a floating cube that rotated based on hand position and gestures, but it gave us a huge boost of confidence. Our instructors urged us to make a platform decision as soon as possible, and since we couldn’t afford to wait for the iOS developer portal to come back online, we ultimately decided on the Magic Leap as our final platform.
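For anyone curious what a “Gold Spike” of this sort looks like in code, here is a minimal sketch, not our actual build, of a Unity script that spins a cube based on a tracked hand. It assumes the Lumin-era Magic Leap Unity SDK and its MLHands API; the class and member names are from memory and may differ depending on the SDK version you have installed.

```csharp
// Minimal hand-driven cube sketch for the Magic Leap (Lumin-era Unity SDK).
// Attach to a cube GameObject in a Magic Leap-configured Unity scene.
// MLHands / MLHandKeyPose names are assumptions based on that SDK generation.
using UnityEngine;
using UnityEngine.XR.MagicLeap;

public class GoldSpikeCube : MonoBehaviour
{
    [SerializeField] private float rotationSpeed = 90f; // degrees/sec per meter of hand offset

    private void Start()
    {
        // Begin receiving hand-tracking data from the headset.
        MLHands.Start();
    }

    private void OnDestroy()
    {
        // Stop the hand-tracking subsystem when the object goes away.
        MLHands.Stop();
    }

    private void Update()
    {
        // Only rotate while the right hand is visible and making a fist.
        if (MLHands.Right.IsVisible && MLHands.Right.KeyPose == MLHandKeyPose.Fist)
        {
            // Use the hand's horizontal offset from the cube to drive rotation.
            float offset = MLHands.Right.Center.x - transform.position.x;
            transform.Rotate(Vector3.up, offset * rotationSpeed * Time.deltaTime, Space.World);
        }
    }
}
```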
This week was also key in establishing our team’s design language. ETC requires each project team to have a logo and a team poster. We liked a vector art style, so we came up with a stylized book-pages logo and an accompanying poster in a similar vein.
Finally, we really needed to get rolling on the story front. We are working with a CMU School of Drama student, Kate Busatto, who is helping us formulate a story and write the script. Since we were no longer stuck in the loop of choosing platform first then story, or story first then platform, we were finally able to discuss story and purpose ideas with Kate in the context of headset AR. Following our instructors’ guidance, we ended up with the following main guiding principle:
AR exists at the intersection between the virtual, created world and the familiar space and items of the guest’s real world. The interaction between the two is what makes the experience magical and compelling.
This, we believe, is what separates AR from VR, and what we will need to capitalize on as we move forward with style and story. Because a number of snow days cancelled class and made travel hazardous, we held a fruitful Skype conversation with our team and our writer to discuss these ideas and come up with some general, higher-level story concepts.
A question and concern that frequently came up was where the threshold of “game” lay. We wanted meaningful interactions within our story ideas, but the risk of those interactions turning our experience into a game was always on our minds.
This was a question we wanted our instructors to help us clarify as we moved into week 4.