As I finished last week’s blog post with Monday’s happenings, this week began on Tuesday morning, with an early start at the studio (as per usual). At the studio, I continued working on the mathematical version of Bernie’s mechanical prototype, first making it so that the wave’s main counter resets when it reaches the end of the wave’s total length, rather than a screen’s width afterwards (a change that I realised I had to make while writing last week’s post). After this, I briefly tried to make it so that the displayed wavelength could be changed separately from the individual wavelengths, but the offsets of the different parts of the test wave meant that I couldn’t have a ‘one size fits all’ solution. Eventually, however, I put the sine waves from the test wave into the Desmos Graphing Calculator, tweaked some values, and realised that I was able to change the wavelengths by globally changing the x coefficients (as I’d wanted) as long as all of the x offsets were positive, so I changed the negative offsets in my code to positive ones. From here, I copied the wave values I had into a new wave, making all of the wavelengths equally shorter, and saw that it worked perfectly well. Therefore, I added the wave size variable that I’d initially intended to implement (to determine how much of the wave is visible on-screen at once), and accounted for it in all of the necessary calculations. However, I realised that I was dividing by it in places where I should have been multiplying by it, and after fixing this, it worked as intended. I also thought that it’d make sense for the wave size variable to adapt the wave’s increase speed, so that the wave moves at the same rate regardless of how much of it is visible, so I added this functionality in.
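To give a rough illustration of how the wave size variable fits in (this is a minimal sketch rather than the project’s actual code, and the field names and wave values are placeholders), the counter’s advance and the visible span of the wave are both scaled by the same value:

```csharp
using UnityEngine;

// Hypothetical sketch: a wave whose visible span and scroll speed are both
// governed by a single waveSize variable, so the wave appears to move at the
// same rate however much of it is shown on-screen.
public class WaveSketch : MonoBehaviour
{
    public float waveSize = 10f;     // how much of the wave is visible at once
    public float baseSpeed = 1f;     // base advance rate of the main counter
    public float totalLength = 100f; // total length of the wave
    private float counter;           // the wave's main counter

    void Update()
    {
        // Scale the counter's advance by waveSize (multiplying, not dividing)
        // so the wave scrolls at a consistent on-screen rate.
        counter += baseSpeed * waveSize * Time.deltaTime;

        // Reset at the end of the wave's total length, rather than a screen's
        // width afterwards.
        if (counter >= totalLength)
            counter = 0f;
    }

    // Example composite wave: the wavelengths can be scaled globally via the
    // x coefficients, provided all of the x offsets are kept positive.
    float WaveHeight(float x, float wavelengthScale)
    {
        return Mathf.Sin((x + 1f) * wavelengthScale)
             + 0.5f * Mathf.Sin((x + 3f) * 2f * wavelengthScale);
    }
}
```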
After this, the task was to make this new version of the prototype interactive, like the last one. Therefore, I added a 2D cursor object to the scene (in world space, rather than in the UI layer), using the circle sprite that I’d made for the prior version of the prototype, and I attached it to the camera (so that I could just manipulate its local position values, relative to the camera’s position and direction, such that it’d always occupy the same space on-screen, even while the camera’s moving). I then started writing a cursor management script, taking a while to determine the variables that I’d need for the cursor’s movement and thumping, and for the input mode switching. After adding the variables, I went upstairs and waited for the Reflective Journal session to start, thinking of how to word my idea for the essay’s topic, settling on “Are video games affected by cultural exclusivity?” (although I may slightly adapt this if I feel it’s necessary).
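For the curious, the camera attachment works roughly like this (a minimal sketch — in the project the cursor is simply a child of the camera in the scene, but here the parenting is done in code so the example is self-contained, and the names and distance are assumptions):

```csharp
using UnityEngine;

// Hypothetical sketch: a world-space cursor parented to the camera, so that
// only its local position needs to be set for it to stay in the same spot
// on-screen while the camera moves.
public class CursorFollowSketch : MonoBehaviour
{
    public Transform cameraTransform;      // the camera to attach the cursor to
    public float distanceFromCamera = 5f;  // how far in front of the camera it sits

    void Start()
    {
        // Parent the cursor to the camera, keeping its local values as-is.
        transform.SetParent(cameraTransform, false);
        transform.localPosition = new Vector3(0f, 0f, distanceFromCamera);
    }
}
```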
The Reflective Journal session began with James speaking to people who shared their topics, with people chipping-in with ideas and suggestions where they could. For my topic, James brought-up how controversial it can be for certain societies to be portrayed in particular ways in games, how people can be upset by altered representations of historical events or people (and how historical events and people are often interesting enough to not require adapting), and the questions of whether we need a diversity of representation in games, and whether games have to be “approachable” to those who are foreign to the culture they represent. He also said to look at the Reflective Journal 2 structure guides from last year, and for us to consider what we seek to get out of our research and essays. After a short break, Seth Giddings took over the session, speaking about the process of tackling the subject of an academic essay (talking about toys as an example). He said to have an idea of the issues surrounding our topics, considering their significance, and to generate key conceptual terms to influence our searches for sources and generate new issues and arguments themselves. He also said to avoid trying to answer our questions immediately, instead keeping things fluid and allowing the information we find to influence our arguments. Then, he spent a while giving us information as to how and where to find sources for our essays, academic or otherwise. You can see all of my notes from the session below.
After the session was over, we returned downstairs and I spent some time with the others before heading home for the evening, feeling fairly unhappy. After some standard evening occurrences, I programmed the wave management script to store the current sine wave values at the leftmost position of the wave on the screen (being where the cursor is) each frame, storing the values when the counter on the FOR loop for calculating them is zero. From here, the current values can be passed to the cursor management script, so that the current position of the cursor can be compared against the current vertical position of the wave, determining whether the player is successfully synchronising. I then wrote subroutines for cycling between input modes, setting the current stick deadzone based on the current input mode (as the two rotational input modes need a wide deadzone, and the vertical input mode needs no deadzone) and taking the inputs of the two analogue sticks. Then, I added the two stick inputs to the project by adapting the project’s input settings. Afterwards, I programmed the three input mode behaviours to set a value for the cursor’s displacement from its origin, writing the code from scratch (rather than just copying it from the last prototype) to make it cleaner and to accommodate a Boolean I’d decided to use to determine whether the left or right stick is currently in-use (so that I can have a separate deadzone for detecting deliberate movement in the sticks, as having no deadzone in the first input mode makes it tricky to determine which stick to use at any one time). I then created a subroutine to set the cursor’s position based on its displacement value, and called all of the subroutines so far in each frame’s update.
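To show roughly what I mean by the mode cycling and deadzone handling, here’s a minimal sketch — the axis names, thresholds and structure are assumptions rather than the script’s actual contents:

```csharp
using UnityEngine;

// Hypothetical sketch of the input mode handling: cycling between three
// input modes, picking a stick deadzone per mode, and reading both analogue
// sticks, with a separate threshold for deciding which stick is in use.
public class CursorInputSketch : MonoBehaviour
{
    public enum InputMode { RotationA, RotationB, Vertical }
    public InputMode currentMode = InputMode.RotationA;

    public float rotationalDeadzone = 0.5f; // wide deadzone for the rotational modes
    public float verticalDeadzone = 0f;     // no deadzone for the vertical mode
    public float intentThreshold = 0.25f;   // separate deadzone for detecting deliberate movement
    private float currentDeadzone;
    private bool usingRightStick;           // which stick is deliberately in use

    void Awake()
    {
        currentDeadzone = rotationalDeadzone;
    }

    public void CycleInputMode()
    {
        currentMode = (InputMode)(((int)currentMode + 1) % 3);
        currentDeadzone = (currentMode == InputMode.Vertical)
            ? verticalDeadzone
            : rotationalDeadzone;
    }

    void Update()
    {
        // Axis names are placeholders for whatever the project's input
        // settings define for the two sticks.
        Vector2 leftStick = new Vector2(Input.GetAxis("LeftStickX"), Input.GetAxis("LeftStickY"));
        Vector2 rightStick = new Vector2(Input.GetAxis("RightStickX"), Input.GetAxis("RightStickY"));

        // A wider threshold decides which stick is deliberately being moved,
        // since the vertical mode itself has no deadzone.
        if (rightStick.magnitude > intentThreshold) usingRightStick = true;
        else if (leftStick.magnitude > intentThreshold) usingRightStick = false;

        // Apply the current mode's deadzone to whichever stick is in use.
        Vector2 active = usingRightStick ? rightStick : leftStick;
        if (active.magnitude < currentDeadzone)
            active = Vector2.zero;

        // 'active' would then feed the current input mode's behaviour, which
        // sets the cursor's displacement from its origin.
    }
}
```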
After testing that the cursor movement was working (it was), I changed the direction in which rotating the stick moves the cursor in the second input mode, and then widened the deadzone for both rotational input modes. Then, as it felt a bit too hectic and jerky to match the test wave I’d initially created, I turned down the wave’s speed and slowed down the lerp for the cursor’s displacement value, which made it feel smoother and generally better (to me, at least). I then used the Desmos Graphing Calculator again to determine a thumping motion that could be driven by the wave counter, and programmed a line of code to calculate the distance between the cursor and the wave, remembering to make it only calculate this distance once the wave counter has surpassed its initial offset value. I then programmed the cursor thumping subroutine, changing the cursor’s scale using the modulus of the sine of the wave counter, and then realised that I was dividing by the thumping rate when I should have been multiplying by it. I also realised that, when calculating the distance between the wave and the cursor, I hadn’t been using the current sine wave values that I’d set up earlier, instead using the values that had last been set (usually being the ones on the opposite end of the wave on-screen), so I fixed that. I then added an offset to the thumping motion (so that it can be timed with the beat), and slowed down the lerp for its scale to make it smoother. After that, I called it a day and headed to bed.
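Put together, the thumping behaviour looks something like this (again a hedged sketch with placeholder names and values, not the actual subroutine):

```csharp
using UnityEngine;

// Hypothetical sketch of the thumping behaviour: the cursor's scale follows
// the modulus of the sine of the wave counter, multiplied (not divided) by a
// thump rate, offset so it can be timed with the beat, and lerped so the
// motion reads as smooth.
public class CursorThumpSketch : MonoBehaviour
{
    public float thumpRate = 2f;       // how fast the thump pulses
    public float thumpOffset = 0f;     // shifts the pulse to line up with the beat
    public float baseScale = 1f;       // cursor scale when not thumping
    public float thumpIntensity = 0.3f;
    public float scaleLerpSpeed = 8f;

    public void UpdateThump(float waveCounter)
    {
        // |sin| gives a repeating 0..1 pulse driven by the wave counter.
        float pulse = Mathf.Abs(Mathf.Sin((waveCounter + thumpOffset) * thumpRate));
        float targetScale = baseScale + pulse * thumpIntensity;

        // Lerp towards the target scale rather than snapping to it.
        float current = transform.localScale.x;
        float next = Mathf.Lerp(current, targetScale, scaleLerpSpeed * Time.deltaTime);
        transform.localScale = new Vector3(next, next, 1f);
    }
}
```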
On Wednesday morning, I went to the studio yet again. After sorting-out some CBT-related stuff, I programmed the cursor thumping such that it only occurs when the distance between the cursor and the wave is within a certain threshold, and made it so that the thump intensity is relative to their proximity within this threshold. I realised that this change meant that I also had to halve the intensity, so I did that. Adam then came into the studio, and I showed him and spoke to him about the prototype I’d been working on. He said that the thumping is quite distracting (which I’d say is fine, considering it’s a placeholder for the visual that Bernie actually wants for indicating success), but I should get some audio into the prototype to see how it feels with that. He then told me that Bernie needs to find a simple sound or musical section to transform into a waveform for testing. When he asked me if I had any simple music that could easily provide a waveform, I thought that electronic music would probably do the job, and the first song that came to mind was Moby’s Porcelain. I tried importing the song into Audacity to make a loop, but Audacity doesn’t support the M4A file format, so I sent the file to Adam for him to make a loop with Logic on his MacBook. He did a good job of making the loop, and sent it back to me when he was finished. Adam then said that I need to speak to Bernie about how he creates waveforms that I can work from (I suggested that he uses grid paper, with thick lines denoting beats horizontally and different wave values vertically, and Adam suggested that he uses long rolls of paper to be more free with the waves he generates before turning them into something I can work from). I also showed Adam the Desmos Graphing Calculator, to demonstrate how I come up with, test and manipulate the values for the waves. After Adam left the studio, I imported the Porcelain loop he’d made into the Unity project and attached it to the manager game object as an Audio Source component.
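Going back to that morning’s thumping change, the proximity-based intensity can be expressed as a small helper along these lines (a sketch with placeholder names, assuming a linear falloff within the threshold):

```csharp
using UnityEngine;

// Hypothetical helper continuing the thump sketch: the thump only fires when
// the cursor is within a success threshold of the wave, its intensity scales
// with proximity inside that threshold, and the result is halved so it isn't
// overpowering.
public static class ThumpProximity
{
    public static float Intensity(float distanceToWave, float successThreshold, float maxIntensity)
    {
        if (distanceToWave > successThreshold)
            return 0f; // outside the threshold: no thump at all

        // 1 when exactly on the wave, 0 at the edge of the threshold.
        float proximity = 1f - Mathf.Clamp01(distanceToWave / successThreshold);
        return proximity * maxIntensity * 0.5f;
    }
}
```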
After lunch, I added three circles to a new UI canvas object to act as indicators for the current input mode, like with the previous version of the mechanical prototype. I programmed the circles to set their alpha levels based on the currently selected mode, and after correcting a typo that stopped one of them from changing, they worked as intended. After going to my final CBT session, I headed back to the studio and we discussed names for the game with Adam. He said that, as we were indecisive (and none of the names we’d been choosing between jumped out at him), it might be best to use a placeholder on Instagram until we settle on something. After this discussion, I felt unhappy and headed home. Eventually, I got to making the difficulty test that Bernie had requested, with waves of decreasing wavelength and differing amplitudes to test how difficult certain wavelength and amplitude combinations are to match. I started by putting three sine waves of varying wavelengths into the Desmos Graphing Calculator to see where they intersected, determining my ranges for the different wavelengths. I then programmed sine waves of differing wavelengths into a single wave, and copied this wave into four different waves with different amplitudes. Then, I added a text element to the UI canvas to reflect the currently selected wave, programmed the system to switch between the different waves, and programmed the text to be displayed, and after briefly widening the text box, the system worked as intended. After watching a Nintendo Direct (with some solid announcements), I headed to bed, feeling very unhappy having thought about certain things.
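For reference, the input mode indicator circles from after lunch work roughly like this (a sketch — the alpha values and field names are assumptions, not what’s in the project):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the input mode indicators: three UI circles whose
// alpha reflects whichever mode is currently selected.
public class ModeIndicatorSketch : MonoBehaviour
{
    public Image[] modeCircles;        // one circle per input mode
    public float selectedAlpha = 1f;
    public float unselectedAlpha = 0.25f;

    public void SetMode(int selectedIndex)
    {
        for (int i = 0; i < modeCircles.Length; i++)
        {
            // Only the alpha channel changes; the circles' colours stay put.
            Color c = modeCircles[i].color;
            c.a = (i == selectedIndex) ? selectedAlpha : unselectedAlpha;
            modeCircles[i].color = c;
        }
    }
}
```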
After heading to the studio on Thursday morning (and watching some more Nintendo Direct-related stuff), I made a build of the second version of the mechanical prototype. However, I remembered that the thumping was out of sync for the difficulty test waves, so I set the thump offset in the wave management script based on the current wave and made a new build of the prototype. I then spoke to a few of the others in the studio about my work schedule and daily routine, and they spoke to me about the importance of taking breaks and avoiding burnout. After this, I started thinking about the variables I’d need for the new movement script, which would utilise the state system that I’d planned the previous week. While doing this, I started to feel quite unhappy, and headed to the lecture theatre.
At midday, we attended a lecture hosted by Stuart from Sennep, in which he spoke to us about applying for jobs in design studios. Speaking about cover emails, CVs, portfolios and websites, Stuart gave us a pretty large number of specific tips, speaking from his experience of looking at applications to Sennep. Due to how many there were, I won’t go over or even try to sum-up everything he said, but you can find all of my notes below. It’s safe to say that the lecture was a good one, and I appreciate Stuart coming over to give it to us. After the lecture, I was feeling very unhappy, but after my mood was alleviated (thanks to some much appreciated help), I returned to the studio. I showed Bernie the prototype build that I’d made for him, and he said that the intensity with which the circle’s size was changing was distracting, and that the enlarged circle was making it hard for him to tell where the circle was in relation to the wave, so it needed to be toned-down. He also said that the success threshold needs to be smaller, and that the score tracker in the previous version of the prototype was helpful for seeing how well testers were doing, so I needed to add it again.
After Adam spoke to us about testing opportunities, I tweaked the thump’s success threshold and intensity values for Bernie, doing so via trial and error. I then added a white score circle to the UI canvas as a radial fill image, with a black circle on top of it to cover the inner portion and make it look like a ring (without having to make a new sprite). George then showed me the environment he’d made for the brass civilisation, and we spoke about what we’d be working on and how assets could be changed if needed. Then, I realised that I’d miscalculated the total lengths of the difficulty test waves, so I reduced them appropriately. I then spent a bit of time helping Richard out with a movement system that accounts for the directions of the camera and the analogue stick, and spoke to George about what was left to work on for the current interaction vertical slice. At the same time, I’d been programming a score management script, setting and increasing the player’s score when necessary, and updating the fill amount of the score indicator to reflect the current score as a fraction of the maximum score.
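The score ring amounts to something like this (a minimal sketch — the actual script has more to it, and the names here are placeholders):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the score ring: a white radial-fill Image with a
// black circle layered over its centre to fake a ring shape, with the fill
// amount set to the current score as a fraction of the maximum score.
public class ScoreRingSketch : MonoBehaviour
{
    public Image scoreFill;    // Image with its type set to Filled (Radial 360)
    public float maxScore = 100f;
    private float score;

    public void AddScore(float amount)
    {
        score = Mathf.Min(score + amount, maxScore);
    }

    void Update()
    {
        // fillAmount expects a 0..1 fraction of the ring to be filled.
        scoreFill.fillAmount = Mathf.Clamp01(score / maxScore);
    }
}
```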
At this point, we discussed how branding for the game was going, and when Ella mentioned that we hadn’t decided on a name, George suggested that “Babel” could be the name, and Bernie liked the idea. Ella and I, however, disagreed, saying that not only was it already used by other games, but that its religious reference makes it non-universal, and its link to a pre-existing story implies that it’s supposed to represent that story, even though it really doesn’t (the story of the Tower of Babel was an influence for the game, but we took it in a different direction). Stuart heard us arguing about the name, and said that we could show people the names that we had to choose from alongside a description of the game and see which they liked. He also told us that the name isn’t a very important part of the game, so we shouldn’t dwell on it too much if it’s not perfect. He then looked through the potential names that Ella had written-down, and gave us feedback on them, saying that “Mellowdee” is too literal and annoyingly misspelt, “Musceo” is too mucus-related, “Resonote,” “Resotone” and “Simitone” are good, and “Gloria” would be interesting, as it’s a name, and the connotations of it referring to the female protagonist and her violin would make for a good wildcard. After explaining the annoyance of seeing existing words within fake words and the simple origin of Sennep’s name, Stuart headed-off, and we really appreciated his help throughout the day. At this point, we seemed to settle on “Gloria” as the name for the game, feeling that we could switch to something else should we decide upon a better name later.
After the discussion, I felt that I’d finished the score management script, and was ready to begin tweaking the score increase rate. However, after grabbing some food, I realised that the score indicator wasn’t actually being referenced, so there was no actual representation of the score on-screen. After fixing this, I also moved the maximum score calculation to be called every frame, so that it updates based on the current wave properties. Also, through trial and error, I tweaked the score increase rate so that the bar being filled roughly lines up with the end of the wave being reached (for the initial test wave, at least). After heading home, I eventually looked for some information to help me decide how to approach the new movement system. Since rays aren’t accurate enough (unless cast in high quantities), I looked into how to check whether the player is grounded, and learnt that capsules can be cast in the same way as rays, so I considered that as a possibility. Then, after some searching, I looked into the Character Controller component, finding that I could use its built-in Move function to handle player movement while accounting for collisions and without having to worry about being affected by undesired physics. I also found out that transform.TransformDirection exists for obtaining the world-space directional vector for a vector relative to an object’s direction, which could be useful for making future code more efficient. After this, I went to bed.
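Roughly, the approach I was reading about fits together like this (a hedged sketch under my own assumptions — the axis names, radii and distances are placeholders, and this isn’t the movement script itself):

```csharp
using UnityEngine;

// Hypothetical sketch: moving the player with CharacterController.Move,
// converting input into a world-space direction with TransformDirection, and
// checking for ground with a cast that has volume rather than a single ray.
[RequireComponent(typeof(CharacterController))]
public class MovementSketch : MonoBehaviour
{
    public float moveSpeed = 5f;
    public float groundCheckDistance = 0.2f;
    public float groundCheckRadius = 0.3f;
    private CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // Stick input relative to the object's facing direction...
        Vector3 localInput = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));

        // ...converted into a world-space direction.
        Vector3 worldMove = transform.TransformDirection(localInput);

        // Move handles collisions without the player being pushed around by
        // undesired physics.
        controller.Move(worldMove * moveSpeed * Time.deltaTime);
    }

    bool IsGrounded()
    {
        // A cast with volume (here a sphere) is more forgiving than a single
        // ray when checking for ground beneath the player.
        RaycastHit hit;
        return Physics.SphereCast(transform.position, groundCheckRadius,
                                  Vector3.down, out hit, groundCheckDistance);
    }
}
```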
On Friday morning, I headed to the studio and tried to set up social media accounts for the game based on our newly-decided name, though I struggled to find a consistent name that wasn’t taken. While I couldn’t find a simple Gmail account name, I was able to secure @PlayGloria for Twitter and @playgloria for Instagram, making for a decent level of consistency. Sam Green then came into the studio, met George and Bernie for the first time, and told us that he’d got a member of the new God of War’s audio team (called Eric Pratt) on-board to help out with the implementation of the game’s audio. Then, after we got our results for the prior semester, everyone added their game’s social media accounts to a shared Microsoft Word document that Adam had set up. Adam then explained the importance of user testing, and said that we’d be having biweekly group testing sessions (on top of any user testing that is carried-out for our games individually). Then, he explained next week’s documentary workshop, and made way for the studio’s guest for the day, the musician and musical tutor Alex Ayling.
At this point, Alex gave us a talk about music in games, explaining that “game music” is more of an idea than an actual thing (since game music is actually just music). One of the core ideas of the talk was that we tend to associate sound and music with environments and spaces, and that we should bear this in mind when designing the sound for our games. We were also told to consider how audio cues and feedback influence how players interact with the game, and how music creates and changes context. The entire talk was very interesting, and Alex did a great job with it, but there were so many points that I won’t try to sum everything up here. Instead, you can find all of my notes from the talk below.
Before heading to town for lunch, we spent some time explaining some of the concepts of the game to Sam. Adam then showed us some of the documentaries for last year’s FMP games, demonstrating the audio that they used. He then went around the class and had a few people explain their games and their current audio directions, so that Alex (and the class) could give them feedback. After this, Alex (and Kim, who’d also come to the studio) began speaking to some of the teams about their projects, offering suggestions where they could. At this point, I gave Bernie the controller to show him the skating system from the old build of the game, as he didn’t seem to be aware that it was a thing. While he tried it out, Sam came over and showed me the interface for Wwise, as well as explaining how it all works and how it’s integrated with Unity. We also agreed that we need to set-up version control for the project, probably using GitHub, so that we can all work on it simultaneously.
Bernie then gave me feedback on the skating system, saying that losing momentum while turning in the opposite direction didn’t make sense, as the player is able to quickly change directions when using the standard movement system. He also said that while turning, the player should preserve their momentum in the direction in which they were moving, skidding a bit (like with ice skating). I also showed Bernie some footage of the skating movement in Super Mario Galaxy 2 to show him what influenced my desire for the skating movement in the first place (you can see this footage below). Bernie also gave me some feedback on the standard movement system, saying that the reduced movement speed from turning around was too aggressive, so I doubled the player character’s acceleration rate and that apparently made it better. Adam and Richard also got their hands on the controller, and they pointed out that the sudden boost in speed when moving in cardinal directions was a bit too much, so I need to make sure the speed is capped and consistent in all stick directions when I code the new movement system. Bernie then suggested that we could have the player “ksssh” out of the skating motion, skidding for a duration dependent on the speed reached and gradually (and quickly, I imagine) coming to a halt. He also suggested that I could piddle about with the movement system, as at the moment, the main movement is walking around and the skating system is like holding the sprint button in a platformer, but if we have the skating as the game’s sole movement system, we could have the player manually accelerate and decelerate.
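As a rough illustration of the speed capping I have in mind (a sketch under the assumption that the burst comes from the combined stick input exceeding a magnitude of 1 in some directions — I’ll need to confirm the actual cause when I write the new movement code):

```csharp
using UnityEngine;

// Hypothetical sketch: clamp the combined stick input to a magnitude of 1
// before scaling by the move speed, so no stick direction can exceed the
// capped speed.
public static class StickInputSketch
{
    public static Vector3 CappedMove(Vector2 stick, float moveSpeed)
    {
        // Without this clamp, some stick directions can report a combined
        // magnitude above 1 and feel like a sudden burst of speed.
        Vector2 clamped = Vector2.ClampMagnitude(stick, 1f);
        return new Vector3(clamped.x, 0f, clamped.y) * moveSpeed;
    }
}
```

The returned vector would then feed whatever ends up handling the actual movement, so the cap applies regardless of input mode.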
After speaking to Bernie, I helped Richard fix his camera script, and when I noticed that Ella and Bernie were speaking about how to handle the Triangle Man encounter, I popped into the conversation to say what would be possible. Alex then came over to our team, but said that he didn’t want to intrude too much, as we already have a composer sorted (being Sam Green). He said that the game idea is good, and suggested looking at Music of the Spheres, and considering how the game mechanics could be influenced by different aspects of the instruments in the game. Adam then spoke to our team about the project, saying that he wanted me to be able to get through the linear coding processes as soon as possible, so that I could get to the iterative processes of responding to testing. He suggested that more of the asset adjustment work be on George’s end than mine, and then began to speak about animations and how we should handle designing and creating them (he spoke about how animations convey character, and how it’s not about how smooth they are but the strength of their keyframes, so we should look to older animations for inspiration). The only active work I’d done on the project in this time was adding a couple of components to the player parent game object for the new movement system, so nothing really worth noting.
After the discussions, Ella sent me the concept art for our game’s main protagonist (now canonically called ‘Gloria’), and I set about using Photoshop to turn it into a circular profile picture for Twitter and Instagram. I had to clean up some edges, give it a background (with the background colour suggested by Ella) and give Gloria a white outline, and the clean-up job did take a fair while (since I was having to use a mouse, due to how I don’t usually take my graphics tablet to the studio). After setting it as the game’s Twitter profile picture, I found that Instagram needed a larger image, so I re-edited the picture in Photoshop with a larger canvas size. Once this was all sorted, I headed home and got no more work done for the rest of the evening.
Between Saturday morning and afternoon, I came up with a description for the game, so that Adam could add it to the winchester.games website’s listing of the game. It took a bit longer than expected to sort out, and at the others’ suggestion, I had to remove references to the main character’s name being Gloria from it, as that’s meant to be left to the player to determine. What I came up with can be seen below. I was intending to start this week’s blog post during the afternoon or evening, but I ended up feeling incredibly unhappy, and found myself incapable of getting anything done at all.
Divided by their musical dialects and surrounded by the sounds they’ve come to understand, civilisations have become secluded. You’ve grown-up to the tunes of your hometown’s string instruments, foreign to all other music. However, compelled by an unfamiliar melody echoing from the distance, you see no choice but to seek its source. With your violin in-hand, you’ll venture to the far tower, attain understanding through harmonisation, and learn that music knows no barriers.
In Gloria, you will explore various towns, experience a variety of musical styles and meet a charming cast of characters. Through the harmonies you build, you’ll come to understand their hardships, and help them to overcome the loneliness they feel.
This brings me to Sunday, which I’ve spent working on this blog post (in fact, it’s past 01:00 on Monday as I’m writing this). For the week ahead, I need to finish-up a basic version of the movement system (likely without the jumping, gliding or beat-based mechanics), program a basic script to change the volume of an audio track based on proximity, and then tie the movement together with the synchronisation mechanic via logic and a detection and interaction system. This is so that we can have our vertical slice for the brass civilisation’s main interaction finished for testing (and potentially changing) soon, as it was meant to be finished this week, which wasn’t the case (likely due to how I had to spend more time than I’d planned working on the mechanical prototype for Bernie, as well as finishing last week’s blog post throughout Monday). I also need to make some progress with Reflective Journal, as the deadline’s getting ever-closer. Anyway, I’ll see how it all goes, and let you know in next week’s post.