This week marked the beginning of our wrap-up for the semester. Development on prototypes is officially over, and our focus has shifted completely to documentation and to what this project will become after we are finished with it. One piece of documentation we proposed earlier in the semester is a set of design concepts for song mixing tools we were unable to get to during development. To this end, we have decided to create design documentation for both Signal Flow and Compression, using all of the VR song mixing knowledge we have gained from our previous prototypes.
We have also begun our outreach to the community to inform them about our project. This includes posting to Reddit, LinkedIn, and other social media. We are in the process of uploading our project to GitHub, and should have that finished by the end of next week.
Perhaps our biggest outreach success came from a meeting we held with Ben Opie, a professor at Carnegie Mellon’s School of Music. We found out that he will be teaching a course called “Computer Technology in Music Education” in the spring, and we wanted to get his thoughts and feedback on our approach to educating players using VR. He was so excited by our project that he is willing to make it part of his course. This would be a great opportunity to expand Mixthesia beyond just a semester-long project at the ETC. If our project ends up being adopted in an academic space, it would be a clear marker of success for us.
Looking ahead to next week, our focus will be entirely on preparing for Final presentations, finalizing our open-source project, and finishing our introduction video.
Tech
This week’s tech focus was twofold:
- Cleaning up the Unity project so that we can put a clean version of the source code up on GitHub, along with reading up on licensing options for the source code
- Writing a tool that developers can use to import their own custom track stems, easily hook them up to our project, and run a build with them
Most of the work went into researching how to achieve this custom stem import for our project. We finally came up with a solution that uses two different scripts:
- The first is an FMOD JavaScript script that can be run from FMOD Studio and generates a ScriptableObject for Unity based on the song selected in FMOD. The inspiration for this came from here: FMOD ScriptableObjects.
- Once that data is created in the Unity project, a custom Unity editor script can be run that uses the generated ScriptableObject as a reference to create the new scene files, and supports building out the Quest and Rift S builds from the editor script itself (a rough sketch of this Unity side follows below).
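To make that second step a little more concrete, here is a minimal sketch of what the Unity side could look like. The type, field, and menu names below (SongStemData, SongBuildTool, the build output paths) are hypothetical illustrations rather than our final API, and the per-song scene setup that the real editor script handles is only stubbed out here.

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;
#endif

// Hypothetical data container that the FMOD-side script would generate,
// holding everything the editor script needs to wire up one song.
[CreateAssetMenu(fileName = "SongStemData", menuName = "Mixthesia/Song Stem Data")]
public class SongStemData : ScriptableObject
{
    public string songName;        // display name of the track
    public string fmodEventPath;   // e.g. "event:/Songs/MySong"
    public string[] stemNames;     // one entry per stem (drums, bass, vocals, ...)
}

#if UNITY_EDITOR
// Hypothetical editor helper that consumes the generated assets,
// sets up a scene per song, and kicks off the Quest and Rift S builds.
public static class SongBuildTool
{
    [MenuItem("Mixthesia/Build Song Scenes")]
    public static void BuildFromStemData()
    {
        // Find every generated SongStemData asset in the project.
        foreach (string guid in AssetDatabase.FindAssets("t:SongStemData"))
        {
            string assetPath = AssetDatabase.GUIDToAssetPath(guid);
            var data = AssetDatabase.LoadAssetAtPath<SongStemData>(assetPath);
            Debug.Log($"Setting up scene for '{data.songName}' with {data.stemNames.Length} stems");
            // Per-song scene creation and stem hookup would happen here.
        }

        // Quest build (Android) and Rift S build (Windows standalone).
        BuildPipeline.BuildPlayer(EditorBuildSettings.scenes,
            "Builds/Quest/Mixthesia.apk", BuildTarget.Android, BuildOptions.None);
        BuildPipeline.BuildPlayer(EditorBuildSettings.scenes,
            "Builds/RiftS/Mixthesia.exe", BuildTarget.StandaloneWindows64, BuildOptions.None);
    }
}
#endif
```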
Next week’s focus will be to finish both of these tasks, and to make a couple of changes to our build so that we can use it efficiently at the upcoming virtual BVW festival!