We cannot believe that it is already week 14. Next week, on April 29th, our project will be showcased at the inaugural Manufacturing Futures Forum on the main campus, so we are working hard to prepare for it.
On Monday, we had our soft opening. Faculty members came and playtested our product in virtual reality. We also guided them through the PC prototype and explained the student's learning experience step by step. The faculty gave us a lot of valuable advice, including pointing out some minor bugs found during the playtest and suggesting a feature that lets students choose which step they want to start with.
For the programming part, we finished implementing all the textures and adjusted the environment lighting to make it as realistic as possible. We also reworked the beginning tutorial and the ending scene to make the experience more consistent.
For the art, our artist has finished polishing all the small interactable objects for the experience.
For the PC version, we worked hard to finish the most difficult part, the leveling step, before Friday, and we made it.
On Friday, April 26th, our client, Sandra, came to visit us and gave us her final suggestions about the project before the end of the semester. Sandra was happy with what we have achieved this semester and would love to see us showcase the project at the forum.
There is only one week left before the soft opening, and we are wrapping up our deliverables. Our project consists of two components: the PC version and the VR version.
In the PC version, only 4 steps need Unity interactions from us. Because we have to build our Unity interactions on top of the website framework done by HCII, we could only finish 2 of the 4 steps before the soft opening. However, we will make sure to deliver a complete product by the end of the semester.
Here is a peek at our PC demo for the first 2 steps:
Our artist has finished adding the textures for the machine and some interactable items, and the programmers have implemented the new textures. Let's take a look:
We also fixed a few bugs, but there is no time for big changes or new features in the VR version. We are ready for the soft opening.
After the soft opening, we will modify our prototype based on the faculty's feedback. We will also finish the PC version and start writing documentation before the final delivery day.
After gathering all the feedback from last week's playtest sessions, we fixed and improved many parts of the experience and were ready for another round of playtesting. On April 9th, 6 ETC students came and playtested. We were glad to see how our modifications improved the experience. The follow-up questions were the same as last week, and our goal was to collect more information from the players.
Based on their feedback, we improved our VR training and were ready for the client’s onsite visit on Friday, April 12th.
On Friday, April 12th, we showed our client, Sandra, the latest VR demo. She also tried it with the hints turned off.
We collaborated with the researcher from HCII on our PC tutorial. Team Hot Metal is responsible for all the interactions in Unity and for implementing the communication of interactions to TutorShop, while the researcher from HCII is in charge of the PC instruction and the HTML implementation. During the client meeting in week 12, we presented “step 4: EOS Layout Inside” of our PC version.
We found that building up and improving the PC version is far more complicated than we expected, and we only have one programmer responsible for this part. We are confident that we can finish the PC version by the end of the semester. However, considering the quality and quantity required, we are afraid we can only finish half of the steps for the Soft Opening.
The semester is coming to an end. Our project is content-heavy with an overwhelming workload, but our team is going to work even harder to achieve our goals in the following weeks.
In week 11, we built a complete version of our VR training and were ready for playtesters. Throughout the week, we conducted ten playtest sessions with ETC faculty, ETC students, and Ph.D. students from the client's lab. Our target audience is graduate students at the engineering school, so we were very selective about the backgrounds of our playtesters. ETC students and faculty share a similar background with a decent amount of VR experience, so their feedback focused on the understanding and intuitiveness of our VR training, while students from the client's lab could help us verify the clarity of the procedures. All of their feedback is quite helpful for our later improvements. Here are some photos from the playtesting:
Before the playtest, we gave them a brief introduction to our project:
“We are team Hot Metal. We are working on a VR tutorial for a 3D metal printing machine for the CMU Engineering Department. What we are going to show you is our latest VR demo, which will be used by engineering students after they have taken the class and gained enough knowledge from our PC tutorial. This demo is meant to be played by students who have already learned the operation, so if you are confused, let us know.”
One student acted as a guide to help the player go through the demo if they needed help, and two students acted as observers who wrote down everything they saw during the playtests.
After the playtest, the observers asked the players the following questions:
Do you know what you are doing during the playtest? If no, then why?
Are all instructions clear for you? If no, what are they?
Do you have trouble finding some objects? If yes, what are they?
Do you understand the controllers’ interaction? If no, then why?
Do you feel you are learning something from the experience?
How do you feel about the sound effects?
Do you have any other suggestions or points of confusion?
We found playtesting very helpful for our project. Based on the observations and feedback from the playtests, we could find not only programming bugs but also design defects. Now we are very clear about what we need to improve next week.
In Week 10, we continued polishing our VR project based on the feedback from the clients' visit last week. We set our goals for the week on Monday:
1. Showcase a small interaction between the website and Unity
2. Polish and fix the problems listed above and get the VR version ready for the playtest
3. Lock in what we are going to deliver by the end of the semester and what the scope of work for the summer will look like
We worked hard to make the experience more intuitive and smooth while taking our clients' suggestions into consideration. Our programmers are continuing to work on the VR demo to prepare for next week's playtest.
We changed the interaction of the doors and switches in the scene. Instead of dragging a door by its edge, the guest can now click on any part of the door to open or close it automatically. The algorithm is shown below:
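As a rough sketch of the idea (not our exact code), a click anywhere on the door's collider can simply toggle a smooth rotation toward the open or closed angle. The ClickableDoor name, the angle and speed values, and the mouse-click entry point below are illustrative assumptions; the VR build would forward the controller's ray click instead:

```csharp
using UnityEngine;

// A collider covers the whole door, so the guest can click any part
// of it instead of dragging the edge. Clicking toggles the door, and
// the hinge eases toward the target angle every frame.
[RequireComponent(typeof(Collider))]
public class ClickableDoor : MonoBehaviour   // illustrative name
{
    public float openAngle = 110f;   // hinge angle when fully open (assumed value)
    public float speed = 120f;       // degrees per second

    private bool isOpen;
    private float currentAngle;

    // Entry point for a mouse click. In VR, the controller's ray
    // interactor would call Toggle() instead of OnMouseDown.
    private void OnMouseDown() => Toggle();

    public void Toggle() => isOpen = !isOpen;

    private void Update()
    {
        // Ease the hinge toward the open or closed angle.
        float target = isOpen ? openAngle : 0f;
        currentAngle = Mathf.MoveTowards(currentAngle, target, speed * Time.deltaTime);
        transform.localRotation = Quaternion.Euler(0f, currentAngle, 0f);
    }
}
```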
Here is the latest build we had at Week 11.
Besides the VR version, our team also communicated with the researcher at HCII about what they need from us in the PC version. After discussion, we set the scope of the PC version, which includes the Unity interactions for these 4 specific steps:
04: EOS Layout Inside Machine
06: Mounting and Leveling of Build Plate
07: Cleaning Lens - Double Check all Setup Conditions
08: Start Build
We will also be responsible for implementing the communication of interactions to TutorShop, while the researcher at HCII will be in charge of the PC instruction and the HTML implementation.
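As a rough illustration of what the Unity side of this communication could look like, here is a hedged sketch that posts one interaction event as JSON to a web endpoint. The endpoint URL, the TutorShopReporter name, and the payload fields are all placeholders of ours; the real message format is defined by TutorShop/CTAT:

```csharp
using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

// Sends one interaction event (e.g. "student opened the inner door")
// to the tutoring backend. URL and payload shape are placeholders.
public class TutorShopReporter : MonoBehaviour   // illustrative name
{
    [SerializeField] private string endpoint = "https://example.org/tutorshop/event"; // placeholder

    [System.Serializable]
    private class InteractionEvent
    {
        public string stepId;
        public string action;
    }

    public void ReportInteraction(string stepId, string action)
    {
        StartCoroutine(Post(new InteractionEvent { stepId = stepId, action = action }));
    }

    private IEnumerator Post(InteractionEvent evt)
    {
        byte[] body = Encoding.UTF8.GetBytes(JsonUtility.ToJson(evt));
        using (UnityWebRequest req = new UnityWebRequest(endpoint, "POST"))
        {
            req.uploadHandler = new UploadHandlerRaw(body);
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/json");
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
                Debug.LogWarning("TutorShop event failed: " + req.error);
        }
    }
}
```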
In conclusion, our project is making progress every day and we are looking forward to our first-ever playtest session next week.
In order to make up for the 5 days we are going to miss for GDC, we chose to work on our project for the first 5 days of spring break.
We went through the feedback we got from our half presentation. Since we finished our linear VR experience, aside from the final art assets, by the end of week 9, we are confident that we can finish our project on time by the end of the semester. For the hinting, beginning in week 10 we will invite students from both the NextManufacturing lab and ETC to playtest our prototype. We will iterate on and modify our design according to the playtest feedback.
For the programming part, we continued finishing up and debugging our build. We used placeholders for the safety equipment and implemented our latest 3D model in the current build. For the art part, our artist is unwrapping our 3D model before adding textures to it. Let's take a peek:
Our designer has started working on the PC design, but we still need to talk more with our clients about the requirements.
On Tuesday, March 12th, our clients, Sandra, Bruce, and Lu, came to ETC and we presented our latest demo to them.
In week 8, we mainly focused on our half presentation. We divided the presentation into different parts, and each person was responsible for their delegated slides. We rehearsed several times throughout the week.
We talked about our project goal, design progress, level management, CTAT implementation, the 3D model, and our plan for the rest of the semester. We did a pretty good job on the presentation.
On the day of our half presentation, March 7th, Nick, the lab technician, John, the Ph.D. student, and Lu, the HCI researcher, came to watch. We also presented our latest demo to them. They were very satisfied with our presentation and thought it was well rehearsed and informative. Nick and Todd also provided some minor suggestions about our current build. With the half presentation done, we could focus back on our VR build.
During week 7, our team was still focusing on the VR version. Our artist has made significant progress on the model of the machine. It is now complete with two doors, one inner and one outer, plus the monitor and the small table on the right-hand side of the machine. Here are the screenshots:
Our programmers continued working on the step-by-step procedures. We have implemented the text instructions on the tablet and the highlights used for hinting. We also added a name tag to each object the student touches, to provide more information. Here is our demo video recorded for the week 7 client meeting.
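As an illustration of how such a name tag could be wired up, here is a minimal sketch. The ObjectNameTag name and the hover hooks are our assumptions, not the exact code in the build, which may route this through the VR interaction toolkit instead:

```csharp
using UnityEngine;

// Shows a floating world-space label with the object's name while the
// controller hovers over it, and hides it again on exit.
public class ObjectNameTag : MonoBehaviour   // illustrative name
{
    [SerializeField] private TextMesh label;  // text object floating above the item

    private void Start()
    {
        label.text = gameObject.name;
        label.gameObject.SetActive(false);
    }

    // Hooked up to the controller's hover/raycast events.
    public void OnHoverEnter() => label.gameObject.SetActive(true);
    public void OnHoverExit()  => label.gameObject.SetActive(false);
}
```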
During week 7, we also had several groups of people come to visit our project. On Thursday, February 28th, Heather Kelley and John Sharp came and provided a lot of suggestions about how to display text information in the VR environment.
On Friday, March 1st, representatives from Deloitte Digital and Anthony Daniel came and expressed interest in our project as well. Deloitte Digital showed us similar work they had done, and Anthony Daniel played our current build and raised several concerns about the design of the tablet. We will take these suggestions into consideration and improve on them in the next version of the build.
In Week 6, we continued working on the linear VR experience. Here are some screenshots of our latest progress:
The above picture shows the step controller we used for level management. We use a stepManager system as the underlying structure for the linear level system. Every step has several parameters and events that are triggered when the step starts and ends. When a step is created, we register its start and end events and store the step and its following step in two dictionaries. We then expose an AddStep interface so that levels can be stored and added through a .json file.

For every step, we have a structured method that takes the step's ID. We look up the glowing objects and outline objects for that step in the .json file and use specific functions to provide visual feedback. In the .json file, objects are stored by name and separated by commas, so the SetStep function parses the strings, finds the objects in the scene by name, and calls the relevant methods.

Finally, we expose interfaces both for moving to the next step and for calling a specific step. In most cases, when the system receives the event that the guest has finished the current step, we call the function that moves to the next step. In some other cases, we need the player to go back to specific steps and loop through several steps until they finish the last requirement; inside such a loop, we call a specific step, passing the step index as a parameter, instead of moving to the next step.
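To make this structure concrete, here is a minimal sketch of what a step manager along these lines might look like in C#. The class and member names (StepManager, StepData, StartGlow, ShowOutline) are our illustration and assumptions, not the exact code in our build:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// One step as loaded from the .json file; object names are stored as
// comma-separated strings, exactly as described above.
[Serializable]
public class StepData
{
    public int id;
    public string glowObjects;     // e.g. "InnerDoor,PowerSwitch"
    public string outlineObjects;  // e.g. "Tablet"
    public int nextStepId;
}

public class StepManager : MonoBehaviour
{
    // Each step and its following step are stored in two dictionaries.
    private readonly Dictionary<int, StepData> steps = new Dictionary<int, StepData>();
    private readonly Dictionary<int, int> nextStep = new Dictionary<int, int>();

    public event Action<int> OnStepStarted;   // fired when a step begins
    public event Action<int> OnStepEnded;     // fired when a step finishes

    private int currentStepId;

    // AddStep interface used while parsing the .json file.
    public void AddStep(StepData data)
    {
        steps[data.id] = data;
        nextStep[data.id] = data.nextStepId;
    }

    // Jump directly to a specific step (used when looping back).
    public void SetStep(int stepId)
    {
        OnStepEnded?.Invoke(currentStepId);
        currentStepId = stepId;
        StepData data = steps[stepId];

        // Split the comma-separated names, find the scene objects by
        // name, and trigger their visual feedback.
        foreach (string n in data.glowObjects.Split(','))
            GameObject.Find(n.Trim())?.SendMessage("StartGlow", SendMessageOptions.DontRequireReceiver);
        foreach (string n in data.outlineObjects.Split(','))
            GameObject.Find(n.Trim())?.SendMessage("ShowOutline", SendMessageOptions.DontRequireReceiver);

        OnStepStarted?.Invoke(stepId);
    }

    // Called when the system receives the "current step finished" event.
    public void MoveToNextStep() => SetStep(nextStep[currentStepId]);
}
```

Keeping the steps in dictionaries keyed by ID is what makes both "move to the next step" and "jump back to a specific step" equally cheap, which is exactly what the looping steps require.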
Above is our latest machine model. Before the half, we need to add more detail to the outer door and build the right half, which contains the machine's monitor and the safety knots.
We have a draft UI for the tablet that the student will hold in their left hand during the VR experience. Let's take a peek:
Week 7 is the last week before the half presentation, so we made a priority list of what we are going to do on our prototype:
a. The whole interaction experience, including the art assets we have; placeholders for the rest.
b. Instructions implemented on the tablet.
c. The name tag on each object when the controller hovers over it.
d. Prioritized visual feedback.
e. Sound effects.
Our whole team is excited about what we are going to present at the half.
On Monday, we discussed the feedback we got from the Quarter Sit-down last Friday. Together, we looked into similar past ETC projects and related companies, including Labster, M-lab, and MediSIM, and discussed what we could learn from them and what we should avoid in our design. We separated the tutorial content into three layers: the procedures that students need to perform, the procedures that the lab technician needs to perform, and any other information. We prioritized the content and tasks according to the project's needs:
Things students must touch in VR
Less important but cool content to implement in VR
The overall system working on PC
We agree that we need to keep in mind which parts are the most important and which steps are the hardest for students to understand. These are the places where illustration and animation can really help.
After prioritizing our tasks, we are much clearer about what we need to do this week: the programmers will finish the most important interactions for steps 5-8 in VR, the designers will research VR instruction display further, and the artist will continue working on the model of the chamber.
We worked hard for the demo on Friday and here are some highlights of our progress:
On Friday, our clients, Sandra, Nick, and Todd, came and we presented our progress and demo to them. It was the first time we showed our VR demo. The clients were surprised by how efficient and effective training in VR could be and were very excited to see our future progress. Here are some photos of their visit:
With courage and support, the Hot Metal team works hard and looks forward to our next demo!