I began the week on Monday with the usual trip to the studio, eventually continuing to work on the wave mechanic prototype. I first added a Canvas Group component to the score circle objects to set their alpha value to zero, so that I could see the prototype without the score being visibly tracked. I then opened Ella's key art for the game in Photoshop, so that I could find the RGB values of the different characters' main colours and use those of the player character and the brass character to define the initial colours of the two lines. I then calculated the midpoint of the two RGB values, hoping for the two colours to gradually merge into that colour, but it looked pretty disgusting. Therefore, I asked Ella to pick a better green on my laptop and sent an image of the colour to myself over Slack, so that I could check it on my (properly calibrated) phone screen. Being satisfied, I set this as the target colour for the merger.
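For anyone curious, that set-up amounts to something like the sketch below; the component references and colour fields are placeholders rather than the actual values sampled from the key art:

```csharp
using UnityEngine;

// A minimal sketch of the two tweaks above: hiding the score circles via a
// Canvas Group, and computing the midpoint of the two characters' colours.
// The field values here are placeholders, not the actual key-art colours.
public class PrototypeColourSetup : MonoBehaviour
{
    public CanvasGroup scoreCircleGroup;   // Canvas Group on the score circle objects
    public Color playerColour;             // sampled from the key art in Photoshop
    public Color brassColour;

    void Start()
    {
        // Hide the score circles without removing the score tracking itself.
        scoreCircleGroup.alpha = 0f;

        // The midpoint is just each RGB channel averaged, i.e. a 50% lerp;
        // in practice it looked muddy, so a hand-picked green replaced it.
        Color midpoint = Color.Lerp(playerColour, brassColour, 0.5f);
        Debug.Log(midpoint);
    }
}
```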
Ella then asked me for some clarification about the circular mechanic that had been theorised the previous week, specifically which side of the circle's circumference the shapes would fall towards the player from. She said that it'd make more sense if the shapes fell only from the outside than if they fell from both sides, as that would indicate that a single character is being synchronised with (rather than two), and it would also theoretically feel less intense. I also thought that once the player synchronises with the character, the player's movement circle could expand around the character's, so that the player is then dropping shapes onto the character. After this conversation, I showed George the state of the prototype at the time and told him what I was working on (namely programming the synchronisation and its visual feedback). After speaking to George, I programmed a subroutine to increase or decrease a synchronisation value (between 0 and 1) based on whether the player is within vertical range of the wave, using the code I'd written to increase the score value as a frame of reference. I then added some code to keep the value between 0 and 1, and created a subroutine to calculate the colours at the ends of the two trails, interpolating from the initial colour values towards the target colour according to the current synchronisation value.
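In rough terms, that logic looks something like the sketch below; the field names, rates and "vertical range" threshold are assumptions rather than the prototype's actual values:

```csharp
using UnityEngine;

// A rough sketch of the synchronisation logic: nudge a 0-1 value up or down
// depending on whether the player is within vertical range of the wave, then
// blend each trail's end colour towards the shared target accordingly.
public class SyncTracker : MonoBehaviour
{
    [Range(0f, 1f)] public float sync;       // synchronisation value, kept between 0 and 1
    public float gainPerSecond = 0.5f;       // placeholder rates
    public float lossPerSecond = 0.5f;
    public float verticalRange = 0.5f;       // how close counts as "in range"

    public Color playerStart, characterStart, targetColour;

    public void UpdateSync(float playerY, float waveY)
    {
        bool inRange = Mathf.Abs(playerY - waveY) <= verticalRange;
        sync += (inRange ? gainPerSecond : -lossPerSecond) * Time.deltaTime;
        sync = Mathf.Clamp01(sync);          // keep the value between 0 and 1
    }

    // Colours at the far ends of the two trails, moving from their initial
    // colours towards the target as synchronisation increases.
    public Color PlayerEndColour()    => Color.Lerp(playerStart, targetColour, sync);
    public Color CharacterEndColour() => Color.Lerp(characterStart, targetColour, sync);
}
```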
I then tried to set the current values of the start and end colours of the trails, but when this didn't work, I tried to instead determine gradients for the trails to set their colours based on. That also didn't work, however, so looking for possible reasons online, I saw that someone had set the material of their line to use a particular shader when setting its colours. Therefore, I reverted to setting the colours at the trails' start and end points and created a new material (with the legacy additive particle shader), and attaching this material to the line renderer components made the colour setting work. I then made the material use a non-additive shader (as the lines had a translucent quality that I didn't really want), and removed the fade to zero alpha, realising that the colour fade itself wasn't clear enough. At this point, I made a build for Bernie to test with in the planned two-hour testing session, planning to add the music to Bernie's wave and synchronise the two while he carried-out testing. Bernie tried-out the prototype build I'd made for him, but he didn't like the slower lerp speed I'd given the cursor's vertical movement (as the cursor wasn't immediately where he wanted it to be), so I made him a new build with a faster lerp. This was when people began coming in to test our games, and the prototype wasn't immediately running well on Bernie's laptop, so Jamie (from Year 2) tested the build on my laptop while Bernie restarted his laptop and resolved the issue (by disconnecting his laptop from the TV). After this, I was free to continue programming while people tested the game with Bernie and, after a short while, George.
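The shader fix mentioned above boils down to something like this sketch; the shader names are Unity's standard built-in ones, so the project's actual material may well differ:

```csharp
using UnityEngine;

// A minimal sketch of the material fix: the line colours only took effect once
// the renderer used a material whose shader respects vertex colours.
public class LineColourSetup : MonoBehaviour
{
    void Start()
    {
        var line = GetComponent<LineRenderer>();

        // The additive particle shader made the colours work, but gave the
        // line an unwanted translucent, glowing quality:
        // line.material = new Material(Shader.Find("Legacy Shaders/Particles/Additive"));

        // A non-additive (alpha-blended) particle shader instead:
        line.material = new Material(Shader.Find("Legacy Shaders/Particles/Alpha Blended"));

        // With a suitable material attached, these calls now actually show up.
        line.startColor = Color.blue;
        line.endColor   = Color.green;
    }
}
```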
This was when I made it so that the music that Bernie designed his wave from starts playing when the wave counter reaches a certain value, depending on whether or not the wave has already looped. This meant also importing the audio file, of course (to replace the unused, placeholder Moby Porcelain loop). I also made it so that the line trails are cleared when switching between waves, as the character trail was remaining static until the wave counter surpassed the wave offset. Daryl (from Year 2) then came over and gave me his feedback on the prototype, saying that it needs more visual feedback for successfully synchronising. He said the feeling of matching-up reminded him of the intense rhythm game osu!, and suggested possibly using particle effects to indicate success, likening it to the idea of the player being “on fire.” Bernie then came over and spoke to Ella and me about the feedback he'd received, summing-up some of the main points, but agreeing to go back over what everyone had said to give us as accurate and concise a representation of the feedback as he could. While the testing was happening, I'd been adjusting the speed of the wave to make it line-up with the music it was designed with.
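The music trigger itself is simple; in outline it's something like the sketch below, though the counter, offset and loop handling here are assumptions rather than the prototype's exact code:

```csharp
using UnityEngine;

// A loose sketch of the music trigger: start the imported track once the wave
// counter passes a chosen value, taking into account whether the wave has looped.
public class WaveMusicTrigger : MonoBehaviour
{
    public AudioSource brassTrack;     // the track Bernie designed his wave around
    public float musicStartValue;      // counter value at which the track should begin
    public float waveCounter;          // advances as the wave scrolls
    public bool waveHasLooped;         // only fire the trigger on the appropriate pass

    void Update()
    {
        if (!brassTrack.isPlaying && !waveHasLooped && waveCounter >= musicStartValue)
            brassTrack.Play();
    }
}
```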
Once the audio was lined-up as well as I could manage (since the track itself starts very slightly too late), George came over and summed-up the feedback that he'd received. Since a lot of the negative feedback he'd received came down to players misunderstanding the intention of the mechanic (assuming it to be a test of skill), as the prototype lacks any context, George said that we should aim to make more of a vertical slice, so that we can see whether the intended experience comes across there. Adam then came over and said that we need to think about what our team can most quickly make to test the effectiveness of everything in context, saying that we need to integrate the wave mechanic prototype into the context of the 3D environment we've developed, even if it hasn't got everything we want just yet. He also said that we need to come up with a schedule, and that the next testing session on 11th March would be a good deadline to aim for. After this, I felt unhappy, and I was unable to get anything done, not only for the remainder of my time in the studio, but throughout the evening as well.
On Tuesday morning, I headed to university early and eventually switched the colour-setting code for the trails to gradient-setting code again, so that I could control how quickly the trails transitioned to the green colour. To make the green more obvious, I made the trails transition to it sooner, and I also made the ends and individual points of the lines rounded (so that unpleasantly jagged edges are kept to a minimum). Adam then came and tried-out the prototype as it was, asking me to make the character's line thicker and the player's thinner. After I'd made these changes, he said that it felt easier to play (even though the changes were only cosmetic). I also accidentally left the far end of the character's wave slightly thinner, and Adam said he liked it, so I felt I may as well taper it to a thickness of zero at the end (to make it even more gradual). Adam also liked this, and so did I (I felt it made the wave feel less artificial). I also slightly adjusted the z value of the player's trail, so that it always sits in front of the character's wave (and the player always knows where they are).
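For reference, those cosmetic tweaks (the adjustable gradient, rounded ends and tapered width) amount to something like this, with illustrative numbers rather than the tuned values:

```csharp
using UnityEngine;

// A small sketch of the trail's cosmetic settings: rounded caps and corners,
// a gradient whose green transition point can be moved, and a width curve
// that tapers the character's wave to nothing at its far end.
public class TrailLook : MonoBehaviour
{
    [Range(0f, 1f)] public float mergePoint = 0.25f;   // how far down the trail the green takes over

    void Start()
    {
        var line = GetComponent<LineRenderer>();

        // Round off the ends and joints so jagged edges are kept to a minimum.
        line.numCapVertices = 8;
        line.numCornerVertices = 8;

        // A gradient gives control over where the transition to green happens.
        var gradient = new Gradient();
        gradient.SetKeys(
            new[]
            {
                new GradientColorKey(Color.blue,  0f),
                new GradientColorKey(Color.green, mergePoint),
                new GradientColorKey(Color.green, 1f)
            },
            new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(1f, 1f) });
        line.colorGradient = gradient;

        // Taper the line's width down to zero at the far end.
        line.widthCurve = AnimationCurve.Linear(0f, 0.4f, 1f, 0f);
    }
}
```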
George then said that the wave prototype needs to be placed in the context of the wind to be improved, as it doesn't work very effectively with the brass music it's currently designed around. He also said that he could design the trees of the wind civilisation to have adequate spaces between them to accommodate the wind and the camera, and I said that there could possibly be one larger tree in the civilisation that the camera orbits around, with the wind in front of the camera. Bernie then came to me and described all of the main areas of feedback from Monday's playtesting session, having compiled and condensed what people had said. He mentioned that people were split on whether they preferred the first or third input method, so I said it'd probably be best to stick with the first, since it allows for being controlled similarly to the third (and also more closely and intuitively represents the vertical movement of the wave mechanic). He also said that three of the eight players found it annoying that the cursor's position resets when the analogue stick is neutral, which probably means they'd prefer to be telling the cursor to move up and down rather than directly setting its vertical position, so we decided that, if I have time later, I can try to program that input method (it should be pretty simple) for Bernie to test. He also said that most people found the prototype to be fun, with people finding fun both in matching the character's wave and in simply moving the player wave around to make shapes. After I'd spoken to Bernie, I tried to help Fred out by giving him suggestions as to how to approach transitioning between two gameplay modes. We'd also been waiting for Ella to come to the studio before speaking to James as a team, but shortly before lunch, we were informed that she wouldn't be able to make it in, so we didn't have a team discussion with James this week.
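For reference, the "relative" input method that Bernie's testers seemed to be asking for would look something like the sketch below; the axis name, speed and range are all assumptions:

```csharp
using UnityEngine;

// A quick sketch of the relative input method: the stick nudges the cursor up
// and down instead of setting its height directly, so a neutral stick leaves
// the cursor where it is rather than snapping it back to the centre.
public class RelativeCursorInput : MonoBehaviour
{
    public float moveSpeed = 4f;
    public float minY = -3f, maxY = 3f;

    void Update()
    {
        float stick = Input.GetAxis("Vertical");   // analogue stick's vertical axis

        Vector3 p = transform.localPosition;
        p.y = Mathf.Clamp(p.y + stick * moveSpeed * Time.deltaTime, minY, maxY);
        transform.localPosition = p;
    }
}
```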
After moving over to the studio, I made it so that rather than storing only the y positions of the points on the player's trail, I store each point's initial vertical displacement from the origin, so that I can manipulate those values before setting the y positions in world space. When I tested the change, it wasn't working due to the displacement array indices being out of range, so I created a subroutine to set the length of the array based on the current number of samples for the trail each time that number changes, storing the array values in a temp array of the new length and then copying the temp array back to the main one (as per a suggestion found online). This solution fixed the issue. After this, I headed upstairs to attend our scheduled talk.
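As an aside, the resize fix I described above is essentially the classic temp-array copy; something along these lines, with placeholder names:

```csharp
using UnityEngine;

// A sketch of the out-of-range fix: whenever the trail's sample count changes,
// rebuild the displacement array at the new length, preserving existing values.
public class PlayerTrailDisplacements : MonoBehaviour
{
    float[] displacements = new float[0];   // initial vertical displacement of each point

    void ResizeDisplacements(int newSampleCount)
    {
        if (displacements.Length == newSampleCount) return;

        // Copy the old values into a temp array of the new length, then swap.
        float[] temp = new float[newSampleCount];
        int copyCount = Mathf.Min(displacements.Length, newSampleCount);
        for (int i = 0; i < copyCount; i++)
            temp[i] = displacements[i];

        displacements = temp;   // System.Array.Resize(ref displacements, n) would do the same
    }
}
```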
Our talk for the afternoon was with Emma Reay, a PhD researcher who bridges the gap between game studies and children's literature studies, and who'd given us a lecture last semester. Her talk focused on why people write about games, how people do it, the differences between games journalism and games criticism, and how we are part of the generation that can establish new norms for games criticism (to distance it from the arbitrary, traditionalist values of other academia). She also pointed-out some good sources of essays to cite, and ended the session by providing feedback and pointers for a few people's ideas for essays and journalistic pieces. I won't explain everything she spoke about, but it was definitely a good talk and a useful session, and you can see all of my notes below.
After the talk, we returned to the studio, where George and Bernie looked at the prototype. They wanted the colour merger to occur further down the trails than I'd made it (so that there'd be a longer head of the player's colour), so I tweaked it to occur a third of the way down, rather than a quarter. When George said the colour differences weren't obvious enough (as he's colourblind), I put that down to the muted quality of my laptop screen, so I sent a build of the prototype to him, Bernie and Adam for them to test on their screens. Richard suggested that we should aim to be able to differentiate between the colours in greyscale, and he sent me a link to a GDC talk in which he'd heard that principle. Bernie said that I could adjust the colours to be more differentiated on my screen, so that we could account for bad monitors. We then briefly spoke about the possibility of using unlit shaders in the game for a particular visual style. While all of this was happening, I'd been programming it so that a counter increases over time and offsets the vertical positions of points on the trails along a sine wave, whose amplitude and frequency increase both with the synchronisation value and with distance along the trail. This makes the two trails look like they're entwined during synchronisation, with the two trails' sine waves (at different x offsets) opposing each other. I also made it so that the fade to green occurs further down the trails, as Bernie said that there wasn't enough blue on the player's wave anymore. After this, I went home and eventually recorded some gameplay footage of the different builds of the wave prototype, and then edited the footage together to demonstrate the iteration and progression in a video that could be posted to social media. You can see this video below. After that, I headed to bed.
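As a side note, the entwining effect I mentioned reduces to offsetting each trail point from its stored base height by a sine wave whose amplitude and frequency scale with both the synchronisation value and the distance along the trail, with the two trails on opposite phases. A rough sketch, with all of the constants being illustrative rather than the tuned values:

```csharp
using UnityEngine;

// A rough sketch of the entwining effect: each point is pushed off its stored
// base height by a sine wave that grows stronger and faster both as the
// synchronisation value rises and further down the trail; the character's
// trail uses the opposite phase so the two appear to wrap around each other.
public class TrailEntwine : MonoBehaviour
{
    public LineRenderer playerTrail, characterTrail;
    public float[] playerBase, characterBase;   // stored vertical displacements per point
    [Range(0f, 1f)] public float sync;          // current synchronisation value
    public float maxAmplitude = 0.3f;
    public float maxFrequency = 6f;
    float counter;

    void Update()
    {
        counter += Time.deltaTime;
        int count = playerTrail.positionCount;

        for (int i = 0; i < count; i++)
        {
            float t = i / Mathf.Max(1f, count - 1f);            // 0 at the head, 1 at the tail
            float amplitude = maxAmplitude * sync * t;
            float frequency = maxFrequency * sync * t;
            float wave = Mathf.Sin(counter * frequency + t * 10f) * amplitude;

            SetY(playerTrail, i, playerBase[i] + wave);
            SetY(characterTrail, i, characterBase[i] - wave);   // opposite phase
        }
    }

    static void SetY(LineRenderer line, int index, float y)
    {
        Vector3 p = line.GetPosition(index);
        line.SetPosition(index, new Vector3(p.x, y, p.z));
    }
}
```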
On Wednesday morning, I headed to the studio with the intention of getting work done for Reflective Journal. I struggled to find the focus, since I was feeling quite unhappy, but I made some steady progress. I briefly spoke to Adam about my Reflective Journal topic, and he suggested focusing on just a few parts of it, agreeing with me when I said that I could look into everything and see which areas would yield the best results. He lent me his copy of the book for the V&A Design, Play, Disrupt exhibition, and I generally did research for Reflective Journal until I went home, after which point I continued doing the same thing until I went to bed.
On Thursday morning, I went to the studio early and did some more Reflective Journal research. After becoming quite unhappy, I headed to the lecture theatre before the lecture scheduled for midday. The afternoon's lecture was from author and games industry veteran Fergus McNeill, who spoke to us about writing narratives and writing for games. The lecture was really good, and Fergus went over a number of things, including how to manipulate the audience and control how they're feeling (via various narrative elements), as well as how to tell whether the pacing of a story is correct, though I won't rewrite everything he said. You can find all of my notes below. I still felt very unhappy afterwards, but I was helped out of that during lunchtime, for which I'm very thankful. After lunch, I did some more Reflective Journal research, and then we had a team discussion, initially speaking about how we could have a visual language of musical symbols for each civilisation/character. Then, we spoke about how we'd approach reflecting progression throughout the game and the game's final encounter, questioning whether the characters should follow the player after synchronising or just stop at points along the way for the player to encounter them again, specifically wondering how we'd display the relationship between the ‘Triangle Man’ and the wind player. I said that programming the characters to follow the player wouldn't be an issue, but doing it on top of programming bespoke behaviours for the different characters to interact with and explore the environments (as Bernie had suggested, saying that The Last of Us' Ellie works because of her interactions with the environment) would be too time-consuming. After some discussion, we settled on having characters wait ahead of the player after synchronising, and then show up behind the player at the top of the tower for the ‘Piano Man’ interaction, although Bernie was concerned that the player wouldn't know how or when the characters followed behind.
After the team discussion, we spoke to Fergus as a team, looking to get some feedback on our game’s idea and narrative. He liked our ideas for our game and narrative (which was relieving and appreciated), and the biggest suggestion he gave for us to consider was that we could sit down, list all of the game’s plot events and determine what the player needs to know at each point in the game, using that to determine how we should go about conveying necessary information to the player using visual and aural cues. He also suggested that rather than trying to explain how the player can’t communicate with NPCs from other towns, we could demonstrate and emphasise the connection between the player character and the characters in their hometown, so that we can show that that connection is missing. He also said that it’d be great if there were something clearly visually or aurally different about the music of the ‘Piano Man’ that would make the player want to see its source, such as far-reaching ripples almost connecting with characters but not quite. He also suggested early in the discussion that we could use a narrator to help the player to understand parts of the narrative, possibly cutting-out narration once the player reaches the tower, but he acknowledged that it could be a lazy way of getting the story across, and that we don’t need words to tell our story. The discussion was extremely helpful, and Fergus made for a great visitor to the studio. You can find all of my notes from the discussion below.
After the discussion with Fergus, I did some more Reflective Journal research, and eventually went home, feeling incredibly unhappy. I was incapable of getting any work done that evening, and ended up heading to bed feeling hopelessly unhappy. This continued into Friday morning, when I went to the studio and, before the day’s session had started, went home with Adam’s permission. I was at a new low point that morning, feeling worryingly depressed, though I won’t go into details. After a long talk with a good friend in the afternoon, I ended up feeling reasonably okay, and was able to get some Reflective Journal research done for a while. However, I was unhappy again by the time I went to bed.
The time that I spent working on Saturday went towards gathering research for Reflective Journal, which has proven to be a frustrating time sink. This brings me to Sunday, which is when I've been writing this blog post. I was hoping it would take a bit less time than it has, but there's not much I can do about that now. Over the coming week, I'll have to continue working on my Reflective Journal essay and journalistic piece (the latter of which I'm saving for when the former is complete), but I also need to program at least the main portion of the game's movement system, as well as integrate the wave mechanic properly with overworld traversal, such that all of the features of an environment can be tested in a vertical slice. I'll see how that all goes, as I'll definitely be prioritising the Reflective Journal work over everything (since that has to be handed-in on Monday-week, and contributes an unreasonable amount to our final grades, considering how little time we're able to spend on it). If all goes well, we'll have something new to show-off in next week's testing session, and perhaps I'll be able to either further the development of the wave mechanic or start on one of the other synchronisation mechanics (which would be fun). I'll let you know how everything goes in next week's post.