The Second Week

I finished last week’s blog post on Monday evening, and covered Monday’s work there, so this week’s post starts with Tuesday. After getting to the studio early, I imported the new version of the wave model that Bernie gave me, as well as the audio track to go with it. I had to substantially scale-up the model (to match its vertical scale with that of the older wave model), and I switched out the wave’s material for a solid white one. I also changed the material for half of the panels that marked a second’s worth of wave behind it, since I couldn’t see the panel against the black background of the scene, though I then realised that this was actually down to the direction the panel was facing, so I reverted the change. Once I’d imported things properly, I decided it’d be best to separate the two versions of the prototype, so I copied everything over to a new scene in Unity and lined-up the new wave with the old one, before removing all unnecessary game objects from the new scene. Then, I made a new version of the wave management script: everything was copied over, with the unnecessary sections of code (concerning switching between which ball and wave models to show) removed, along with the setting of variable values at the start of the script, so that they could be fully determined in the Inspector.

After this, I moved the new wave from side to side while watching through the game camera to see when it moved off-screen, so that I could determine its end position. Then, I attached the new version of the wave management script to the manager object, and made a new version of the score management script to reference the new wave manager (instead of the old one). After halving the scale of the wave model (as it was too easy to match it with the cursor), I realised that I also needed to make a new version of the score increase script to reference the new score management script (a mildly tedious snowball effect). Then, I noticed that the score counter wasn’t constantly increasing while the cursor was over the wave, particularly when the wave went through one end of the cursor and out the other side without any other part exposed, which led me to believe that the collision between the two wasn’t being detected properly.

At this point, we spoke to James as a team about the project (we had to do so as soon as we could, as Ella had to leave later in the morning). As he’d seen the basis of Bernie’s mechanic while I was implementing it the previous week, James likened the mechanic to the physical games of moving a hoop along a wire, which made sense. When Bernie described how he’d come up with the form of the wave (by observing the volume of some music over time), James said that having a link to frequency and amplitude is nice, and that we could use pitch shifting in Unity for the player character’s violin (to make the player’s cursor movements affect the audio output). He said that the mechanic itself is ‘fine and dandy,’ but he believed that we could come up with something more interesting. Essentially, he suggested that we could think about how the player could have to traverse the wave, or even morph it, rather than abstractly match it. He said that if we were deciding that the synchronisation wasn’t a literal musical interaction, then we could have fun with it, rather than adhering to it making musical sense. James also said that we shouldn’t go out of our way to please Adam, but rather make sure we’re making the game that we want to make, and that we’re worrying too much about ‘the solid canon’ of our universe. He told us not to let worrying about ramifications get in the way of trying something new, and that he could be a launchpad for us to bounce ideas off of. Then, James suggested that we might have been sucking all of the fun out of our fun idea, and that we should be revelling in the fun a bit more, retroactively incorporating instrument features into other areas of the game (such as having moving trumpet valves instead of staircases in the brass society). He said that we should not just be thinking of the instruments, but how they’re played, and how their respective civilisations would solve various problems with what they know. 
James then suggested that our synchronisation mechanic could be inspired by harmonics, possibly having to manipulate sound waves to get them to match, and that we should avoid making the development of the game into a checklist (else we’ll hate it), but we should rather make the game that we’d want to play.

After we’d spoken to him as a team, James suggested a possible solution to my collision issue (to make the synchronisation scoring actually work). He said that I could use the cross product to see whether a point on the wave is within the space of the cursor, though I pointed out that this would mean having to know the elevation of a number of points on the wave, which could be a tedious process (though I suggested that I could do this with a preliminary pass of upwards-pointing ray casts from beneath the wave, storing the distances to it at fixed intervals). After all of that, we decided that it’d be much simpler to try just setting a Boolean to true when the wave enters the cursor’s collider, and to false when it leaves. Below are the diagrams that James made while explaining his potential solution.
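For anyone curious, James’ cross product idea is essentially the standard point-in-convex-polygon test: if the point sits on the same side of every edge (every cross product shares a sign), it’s inside. Here’s a rough sketch in plain Python (not our actual Unity code, and the shape used is hypothetical):

```python
def point_in_convex_polygon(px, py, vertices):
    """Return True if (px, py) lies inside the convex polygon.

    vertices: list of (x, y) corners in a consistent winding order.
    Checks that the 2D cross product of each edge with the vector to
    the point has the same sign for every edge.
    """
    sign = 0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False  # point is on the other side of this edge
    return True
```

For the cursor’s rectangular region, the four corners would come from its collider bounds; the elevation-sampling alternative I suggested would instead store wave heights from that preliminary pass of upward ray casts.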

After speaking with James, I started writing an email to reply to Sam Green, attempting to answer as many of his questions as I could, with George and Bernie there to help me if I needed them. Then, Bernie sent me a more amplified version of the wave (to make it cover more of the playing area, and thus be more interesting and challenging to play), and I imported it and aligned it with the prior version. While testing out this new version, we found that the sudden dips in the wave were very difficult to hit, so Bernie made and sent over three less-amplified versions (to varying degrees). This was when we headed upstairs to attend a talk from Adam Rosser, a games journalist for BBC Radio 5 Live. The talk was both interesting and entertaining, and Adam definitely made for a great visitor, giving us a number of things to consider for our work. While you can find all of my notes from the talk below, a few core concepts stood out as the biggest takeaways. Firstly, thanks to ‘survivor bias,’ we tend to hear more about successful games than failures, but things that fail are just as important as things that survive. Secondly, when we’re making games, the games are secondary to the ideas, so we need to be able to sell the ideas simply and succinctly, especially when trying to sell them to the press (as journalists constantly receive numerous emails to trawl through). Thirdly, in the current industry environment, indie developers are in front of their games, so it’s important for us to show who we are as people on social media, being open, posting relatively frequently and aiming for times of peak traffic (to be noticed). From there, games can also be an insight into the people who made them; likewise, journalists don’t just want to hear about the game itself, but about its development and the people behind it.

After the talk, Adam Rosser came back to the studio to speak to us while we were working, and we showed him the prototype that I’d been working on for Bernie’s mechanic. After this, I imported the three less amplified models that Bernie had sent earlier, and I set their values appropriately (to be lined-up with the older models). I then realised that I hadn’t generated colliders when importing one of the models, so I made sure that they all had colliders generated via the model imports. Switching-out the wave models, Bernie and I tested how difficult each one was to synchronise with. We determined that any steep drops are too difficult to keep up with while the wave is moving sideways, and I explained that the satisfaction of the mechanic might come through a feeling of rhythm, and therefore the waves could be timed with the rhythm of the music, using sine waves with alternating vertical positions, amplitudes and wavelengths at different points. Bernie then tried to cover the horizontal extremities of the screen with his hands while I played, seeing whether that made the game more exciting to play. Having a portion of the wave covered made the game a bit more reaction-based, rather than prediction-based.

After this, Adam (Procter) had us show Adam (Rosser) the initial mechanical prototype that I’d made, with the player having to synchronise the vertical position of a white bar with that of another white bar, moving constantly in a sine wave motion. Adam R. said that the visual feedback that the prototype had felt good, and the prototype was inherently more interesting for it. He also suggested that controller vibration could be used in the newer prototype, based on the player’s distance from the wave, to make the player feel more uncomfortable when they’re further from it. Afterwards, he left, and I’ll definitely say that he made for a great visitor. I then continued to work on replying to Sam Green’s email, eventually finishing it after having to discuss a few details about the game with George and Bernie, as well as how to respond to certain questions. Bernie then proceeded to put two EDGE magazines at the horizontal edges of the screen while I played the prototype, occluding a large chunk of the wave and making the game much more reaction-based and a bit more stressful. I suggested that the wave could initially be confined to a small visible area, but have more be revealed gradually to make the experience become gradually more comfortable, giving the player a growing sense of understanding.

img_3470

After I headed home, I received another email from Sam, replying to what I’d sent him earlier. Rather than replying immediately, I continued working on the prototype, switching from relying on collisions staying inside the cursor’s collider to toggling a Boolean when a collision enters and leaves the collider (in the new version of the score increase script). This proposed fix didn’t work, as the cursor still reported no collision while the wave was inside it, and switching the script to being attached to the waves instead of the cursor didn’t work either, so I switched it back. This was when I moved to my backup plan: making a version of the score increase script that uses rays cast towards the cursor from in front of it, rather than relying on colliders. This raycasting solution worked, which was a relief. After this, I added the audio clip to the manager game object as an Audio Source component, and programmed the audio to play when the wave reaches a certain position (though I did realise that I was referencing the wrong wave management script, so had to fix that). I also briefly tried to synchronise the audio track with the wave, though I realised that it was set by default to start when the wave was at a position of zero, which happened to be perfectly lined-up with the start of the wave (even if the wave’s movement speed wasn’t properly synchronised yet). Eventually, I headed to bed, feeling very unhappy.
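Stripped of the Unity specifics, the raycasting fix boils down to a simple height check: the ray reports where the wave’s surface sits at the cursor’s horizontal position, and scoring only ticks while that height falls within the cursor’s vertical span. A sketch of the same logic in plain Python (the names and numbers are illustrative, not the real script):

```python
def cursor_on_wave(cursor_y, cursor_half_height, wave_y_at_cursor):
    """True if the wave's surface height at the cursor's x position
    lies within the cursor's vertical span (the span the forward
    ray cast is effectively testing against)."""
    return abs(wave_y_at_cursor - cursor_y) <= cursor_half_height


def score_delta(on_wave, rate, dt):
    """Score only accumulates for this frame while the ray check passes."""
    return rate * dt if on_wave else 0.0
```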

On Wednesday morning, I headed to the studio early (as usual), though I was still feeling quite unhappy. After adding links for my new blog to my portfolio website, I started to feel terrible, and spent a large portion of the day incapable of getting any work done. In the afternoon, I showed Bernie the version of the prototype with audio added, though he said that the desynchronised music was distracting him and making it more difficult to play. After that, I spent even more of the day incapable of getting anything done, until I had to go to my CBT appointment (it was meant to be the last session, but my sessions were extended for an extra week). After my CBT session, I spent a bit of time speaking with Ella about her character and environment designs before heading home and eventually replying to Sam Green.

On Thursday morning, I headed to the studio early again and continued working on the prototype. I created a plane game object, made it incredibly thin and gave it a new red material, placing it in the position of the cursor. I then re-enabled the tiles that Bernie made (to represent one-second increments) and timed them moving past the red line using my phone’s stopwatch, so that I could synchronise the wave speed with the audio (with the bars taking a second to pass). While settling on a speed for the wave, I noticed that the latter sections of the wave didn’t seem to be synchronised with the audio. Once I’d finished with the wave speed tweaks, Bernie told me that he and Ella had been speaking about an idea where a trail could be left behind the player’s cursor, with different base colours for the player’s wave and the target wave; if the player synchronises successfully, the two colours blend to leave a trail in the colour between them (as a means of visual feedback). After explaining this, Bernie tried out the version of the prototype with the synchronised audio, and he also noticed that the later section was out of whack. However, we decided that it wouldn’t be worth changing the wave to fix it (since the music being responded to was just a placeholder for testing), so I made a build of the project and moved on. I then started working on a new Unity project for the beat-focused movement system (so that I could separate the experimentation from the main build). Once I’d created it, I spoke to George about how to handle textures with the ground (as the brass town would likely need a paved or cobbled floor), and while George felt that it’d be better to have one whole textured floor (allowing for seamless blending between different textures), I believed that it would be better to have multiple separate parts textured individually (to save on VRAM consumption by culling unnecessary textures from memory).
I then spoke to George about how we could approach our team name, deciding that it needs to be both snappy and memorable, and then the whole team had a brief discussion about the names for the team and the game.

In the afternoon, we attended a lecture from Games Design & Art alumni, Bobbie and Claudia, who are currently part of educational games website MangaHigh. The lecture was about their experiences developing their final major project, Hurry Hurry Heal Me, and the work they carried-out afterwards. While they spoke about a number of things (you can see my notes below), the main points that I pulled away were: the importance of working backwards from the goal to determine what needs to be done and how quickly; the freedom of adaptability and testing when you give yourself a month-long testing period before hand-in; the need to rewrite code to keep it clean and efficient; the ability to use footage of people testing the game in promotional material; the importance of knowing when to give up with a process or try a different solution; the importance of knowing when to post on social media, and which hashtags to use at those times; the importance of giving people a keepsake to hold onto, so that they remember the game; the possibilities in outsourcing and collaboration that arise from attending developer events; and the need to take a laptop, controllers and the game everywhere to make the most of testing opportunities.

After the lecture, we returned to the studio and Ella asked the team which words came to mind when thinking about the game (so that she could work on a toolkit for branding). We then continued trying to come up with names for the team and the game, and I was briefly pulled into the other room to help Jamie out with his third-person camera script. After this, I placed objects from the main Unity project into the project for the beat-focused movement system, setting their values appropriately. Afterwards, Adam, Claudia and Bobbie came to the studio, and Adam asked everyone to come up with a “two-liner” (from Adam Rosser’s talk) to describe their game. Adam said that George’s was too long, so I offered my own: “Overcome loneliness and attain understanding by harmonising musically.” After this, Bobbie and Claudia spoke to people around the room, while we worked in the meantime. In preparation for rewriting the player’s movement script, I spent some time determining all of the player’s various states, and which Booleans would be needed to denote them. Then, I spoke with Bernie about the idea he’d brought-up concerning visual feedback for synchronisation, and we spoke about the possibility of the player having to hold the button corresponding to the correct emotion while moving the cursor: the other character would change the colour of their wave, with different colours corresponding to different emotions for each character, and the player would have to choose which emotion (and therefore colour) to play, with the colours only mixing properly if the emotion is correctly matched.
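To give an idea of what that state-determination exercise looks like, here’s a rough sketch of mapping the underlying Booleans to a single authoritative state; the state names and priorities here are hypothetical placeholders rather than my final list:

```python
from enum import Enum, auto


class PlayerState(Enum):
    GROUNDED = auto()       # walking or idle on the floor
    AIRBORNE = auto()       # jumping or falling
    SYNCHRONISING = auto()  # locked into the wave-matching interaction


def resolve_state(is_grounded, is_synchronising):
    """Map the underlying Booleans to a single authoritative state.
    Synchronising takes priority, since normal movement would be
    suspended during the interaction."""
    if is_synchronising:
        return PlayerState.SYNCHRONISING
    return PlayerState.GROUNDED if is_grounded else PlayerState.AIRBORNE
```

Having one resolver like this (rather than scattered Boolean checks) is the main appeal of the state-based rewrite.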

For a while, I then spoke with George and Bernie about how to handle sections and checkpoints in the synchronisation interactions. I felt that the interaction should be divided into multiple smaller parts, as the wave has to somehow correspond with the music, and I feel it’d be tedious for the player to have to start over from the beginning of the track if they failed to synchronise by the end (especially multiple times in a row). Bernie suggested that wave checkpoints could be stored, but that the music needs to continue from the beginning to the end, looping if necessary, as he felt that the story that the music is telling can only be told as one whole. George felt that the entire song and wave should be played from the beginning to the end, and if the player fails to synchronise at any point, the interaction is halted and restarted from the beginning. After discussing these things (and basically settling on seeing what was feasible and felt necessary through play-testing how frustrated players get), I spoke to Bernie about how I felt that the wave needs to be synchronised with the rhythm of the music, using the example that Guitar Hero feels terrible to play when the audio and the gameplay aren’t properly in-sync. I said that Bernie could determine the wavelengths (in terms of beats), amplitudes and vertical offsets of the different parts of the waves, and I could generate the waves using adapted sine equations through code, though that would mean programming a new solution for handling the waves. Bernie said that the wave’s motion rhythmically matching the music would be intuitive, referring to how people instinctively tap along to music, and would likely feel good (based on the experience of testing the previous version of the prototype).
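The ‘adapted sine equations’ I was describing are just the standard sine with per-section parameters, with the horizontal axis measured in beats so that the wavelength stays locked to the rhythm. A sketch (the parameter names are mine, not final):

```python
import math


def wave_y(beats, amplitude, wavelength_beats, beat_offset, y_offset):
    """Height of the target wave after `beats` beats of music.

    wavelength_beats: how many beats one full oscillation takes,
    which is what keeps the wave's motion in sync with the rhythm.
    """
    phase = 2.0 * math.pi * (beats - beat_offset) / wavelength_beats
    return y_offset + amplitude * math.sin(phase)
```

For example, with a four-beat wavelength the wave peaks one beat in, so peaks and troughs land on the beat grid.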

After talking to the others, we spoke as a team to Claudia and Bobbie about our project. After we described our game to them, they gave us some feedback, and we generally discussed things. They told George and Bernie to handle the hard aspects of modelling first, saving the easier, quicker tasks for later, and they said to figure out a pipeline for the others to be sending me assets for implementation. They also suggested saving the Unity project over the cloud for simultaneous work on it, with George possibly sorting out environments while I’m working on other aspects. Also, when we described the plan for the camera direction to suggest the direction of the goal to the player, they told us to test how successful this is. They also asked whether the characters that are synchronised with follow the player afterwards, suggesting that gradually building up instruments would give a good sense of progression in a musical manner, and Bernie accepted that this would be indicative of our intended “upwards spiral.” In terms of schedule, the pair reiterated that we should work backwards from what we want to determine what needs doing, and they suggested having an open schedule structure, where everyone’s on the same page, everyone can have a hand in deciding the schedule and we’re able to determine what’s blocking what. We were also told that it’s important to plan breaks, which is something that I admit I’m not very good at. After speaking with us, Bobbie and Claudia left the studio, and we got back to work. Their experience of having developed a successful FMP and secured jobs in game development definitely made them useful and insightful visitors, and their presence was appreciated.

img_3492

After speaking with Claudia and Bobbie, I thought about how I’d be able to implement a mathematical solution for Bernie’s mechanic, using variables applied to sine waves within different ranges of a wave counter. This meant looking at Unity’s API for manipulating the LineRenderer component, making sure it’s able to do what I wanted it to, and it seemed to be what I needed. After looking into it, I headed home for the evening, and watched Game Maker’s Toolkit’s video about designing games with cognitive disabilities in mind (it’s a really good video, and I recommend watching the whole Designing for Disability series, if not everything on Mark Brown’s channel). I decided that it’d be easy to incorporate an “easy mode” into our game to make it more accessible, making success windows wider for synchronising and for jumping in time with the background music. I then created a new Unity project for the second version of Bernie’s mechanical prototype, using maths and real-time line rendering, and once again looked at Unity’s API for line rendering. It seemed relatively simple and sensible, and I saw that I’d be able to use it to draw lines in batches of vertices, referencing particular indices for points on the line and even setting individual colours at those points. Afterwards, I headed to bed.
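As a rough sketch of why the easy mode would be cheap to add: widening success windows is essentially a one-line multiplier. The numbers and names here are illustrative, not tuned values:

```python
def success_window(base_half_width, easy_mode, easy_multiplier=2.0):
    """Half-width of the band in which synchronising (or a timed jump)
    counts as successful; easy mode simply widens it."""
    return base_half_width * (easy_multiplier if easy_mode else 1.0)


def within_window(error, half_width):
    """True if the player's error (distance from the target) falls
    inside the success window."""
    return abs(error) <= half_width
```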

On Friday morning, I went to the studio early and started making a new wave management script for the new version of the prototype, considering all of the variables I’d need to be using when drawing the waves. When the morning’s session began, I helped Adam out with carrying some drinks for people, and then took part in the National Student Survey (as we all did in the class). After that, I continued working on the wave management script, adding subroutines for toggling playing the wave and for increasing and resetting the wave counter (using subroutines I’d previously written as guides). I then spoke to the others about potential names for the game, and when George suggested ‘Legato’ (the Italian word describing when a musician makes a transition through several notes without a pause), I said that it wouldn’t be a good name. I explained that most of the game’s audience wouldn’t understand what it means, and therefore it’d give no indication as to what our game is, making it useless. Ella also pointed-out that it would mean tying the game to a single language, which would go against the intended universal nature of the game, so she said that it might be worth making-up a name (like with Gorogoa, for example). After this discussion, I added a Line Renderer component to the manager game object, and looked around to see which aspects could be tweaked in the Inspector.
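The wave management subroutines I’d added at this point boil down to a play toggle plus a counter that advances while playing and can be reset; a sketch of the same logic outside Unity (all names are placeholders):

```python
class WaveCounter:
    """Drives playback of the generated wave."""

    def __init__(self, rate):
        self.rate = rate      # counter units advanced per second
        self.value = 0.0
        self.playing = False

    def toggle_playing(self):
        self.playing = not self.playing

    def update(self, dt):
        # Only advance while the wave is set to play.
        if self.playing:
            self.value += self.rate * dt

    def reset(self):
        self.value = 0.0
```

In Unity, `update` would run per frame with `Time.deltaTime` as `dt`.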

02 28

After a soggy trip to Tesco, I continued thinking about, and adding variables to, the wave management script, deciding to factor-in an initial offset for the wave. I then started to feel quite unhappy, however, and tried to help Ella to come up with names for the game, seeing whether there were any interesting synonyms for words that we associated with the game. After Bernie showed me a 3D model representation of the colours of waves combining during synchronisation, I continued trying to come up with names, dividing key words into parts (taking out suffixes and prefixes) and combining them to see whether anything stuck. I then started working on a diagram to figure out how the different variables would be used for generating waves in the new wave script, though I didn’t finish it at this point.

As a team, we then had a discussion with Adam, initially telling him what we’d each been working on and what our problems with the project were (mine being a worry about the feasibility of creating and implementing all of the game’s animations). As well as stressing the importance of us soon having a good pipeline for designs being provided for George and Bernie to create 3D assets from (with Ella needing to make sure the final assets are consistent with her designs), Adam told us that we need to be testing things as much as possible (in small parts), seeing whether players have enough of a sense of direction when playing. For example, we can test whether certain shapes or changes in audio volume influence players’ direction, though Adam made it clear that we need to test things early, so that we don’t cling to things too tightly to change them later on. He also said that he’s worried that the parts between the musical interactions won’t be exciting enough, and that a large part of our game could end up being a ‘walking simulator’ or a redo of Super Mario 64. He therefore suggested scaling-back to fit all of the game’s story into a single area, so that everything that’s there can be fun and exciting. After he told us that having warning signs to foreshadow ‘Triangle Man’ could be too blunt, he suggested making a small, preliminary prototype of the series of interactions with the character, just to see how effective it would be as a part of the narrative. However, I suggested that the section relied heavily on aesthetics and bespoke events for its effect, and that making a prototype of it without the aesthetic elements would be a waste of time, as it simply wouldn’t be able to communicate its ideas effectively enough to be tested.
We therefore decided that the narrative in the brass civilisation would be a better fit for a small prototype, so we all determined what we’d be working on to get it done, and set that as the plan for the following week (with Adam assigning people tasks to be carried-out on particular days). It was decided that I’d work on finishing the mathematical wave generation script, then implement the main part of the movement script into my new state-based design for it, then work on getting scene audio to change its volume based on the player’s proximity to the character to be synchronised with (as she’s meant to be in a quiet area), then make sure everything works together and is properly triggered. After going into some details, Adam told us that we have a great project, but we just need to work together, make sure we’re all always working on something (as there’s always something to work on), and go back to the core of what the game is to us. As well as my notes from the discussion, you can find Adam’s notes and proposed schedule for the following week below (with the latter photos taken by Bernie, so thanks to him for those). After the discussion was over, I headed home and felt incredibly unhappy, getting no more work done before heading to bed.
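For the proximity-based audio task, the mapping could be as simple as a linear falloff between two radii. Unity’s Audio Source has built-in rolloff curves, but a bespoke mapping for the quiet area might look like this sketch (all parameters made up):

```python
def volume_for_distance(distance, full_volume_radius, silence_radius):
    """Scene-audio volume as a function of the player's distance from
    the character: full volume inside full_volume_radius, silent
    beyond silence_radius, and linear in between."""
    if distance <= full_volume_radius:
        return 1.0
    if distance >= silence_radius:
        return 0.0
    span = silence_radius - full_volume_radius
    return 1.0 - (distance - full_volume_radius) / span
```

Whether the ambience fades out on approach or fades in, the shape of the mapping is the same; only which audio source it drives changes.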

On Saturday morning, I continued to think about and take notes on the implementation of the mathematical wave system, considering that I’d need to store which line point index is being referenced when a point’s position is set each frame (as the wave would be constantly changing, and I can’t be having awkward lines between the unset points and the points that are actually set). After getting some general weekend stuff done, I continued thinking about the implementation, continuing and finishing my work on the diagram I’d started on Friday. I briefly fixed the calculation for determining the distance between point samples, so that it factored-in the allocated world space for the wave, rather than its total length. I also changed the wave counter to only reset once it’s more than the width of the allocated world space off-screen, though in hindsight, I realise that this is a mistake that might need fixing. Then, to remove any unnecessary complications that could result from tweaking it, I decided to remove the variable for adjusting the wave’s wavelengths. After this, I started writing the calculations for determining the wave point positions, though part-way through I realised that the total amplitude of the waves only covered half of the vertical play area, as opposed to all of it, so I had to reconsider some calculations.
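For reference, the spacing fix amounts to dividing the allocated on-screen width, rather than the wave’s total length, by the number of gaps between samples. A tiny sketch (names are mine):

```python
def sample_spacing(allocated_width, num_samples):
    """Distance between consecutive line points. The fix: divide the
    allocated world-space width (what the points must span on screen),
    not the wave's total length, by the number of gaps."""
    return allocated_width / (num_samples - 1)


def sample_positions(allocated_width, num_samples):
    """The x positions the line points end up at, spanning exactly
    the allocated width from 0 to allocated_width."""
    step = sample_spacing(allocated_width, num_samples)
    return [i * step for i in range(num_samples)]
```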

img_3496

Eventually, after much more thinking time than actual writing time, I finished programming the main position-determining calculations, though I couldn’t test them without coming up with the logic and variable values (being the wave’s x offset, y offset, amplitude and wavelength) across different sections of the wave. I then moved the Line Renderer component from the manager object to the camera, and set it to use local space instead of world space, so that the wave would be able to stay in place while the camera moves. After having dinner, I created a subroutine to cycle between wave numbers, so that extra waves can be created and switched between for testing. I then started setting some values based on the currently selected wave (with the wave being set to blank when the wave number is set to zero), and moved a cube around in front of the camera to determine some of the wave’s boundaries and its total amplitude. I then spent some time using Desmos Graphing Calculator to help me to generate functional test values for the wave generation system, trying to line-up the peaks and troughs of the adapted sine waves for a constant, gradual motion. After I eventually finished filling-in the values for the test wave, I called the subroutine in a condition within the loop for setting the point positions (so that it’s only called if the wave point sample in the for loop falls within the range occupied by the wave).
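Conceptually, those ‘logic and variable values across different sections’ form a lookup from wave-counter ranges to sine parameters. The numbers below are made-up stand-ins, not the values I actually derived in Desmos:

```python
# Each entry: (start, end) of the wave-counter range, then the sine
# parameters (x offset, y offset, amplitude, wavelength) used inside it.
TEST_WAVE_SECTIONS = [
    ((0.0, 4.0), {"x_off": 0.0, "y_off": 0.0, "amp": 1.0, "wl": 4.0}),
    ((4.0, 8.0), {"x_off": 4.0, "y_off": 1.0, "amp": 0.5, "wl": 2.0}),
]


def section_params(counter, sections=TEST_WAVE_SECTIONS):
    """Return the sine parameters active at this counter value, or
    None outside the wave (where no point should be drawn)."""
    for (start, end), params in sections:
        if start <= counter < end:
            return params
    return None
```

Lining-up peaks and troughs between sections then comes down to choosing the offsets so that adjacent sections agree at their shared boundary counter value.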

I then realised that the wave values weren’t being initially set, and this was because their setting could only happen if a condition that required them to already be set was satisfied. Therefore, I moved the setting of the initial values (being the constants for each wave, which didn’t need changing based on the value of the wave counter) into another subroutine, which I decided would only be called when the wave number is changed. After this change, I could finally put my test values through the wave generation script, and it seemed to work seamlessly immediately, which was an enormous relief. I then tweaked some values to make the wave a bit nicer, changed some conditions in the wave counter increase subroutine to make it increase to the correct value for it to loop (which I’ll need to change back again later), added a condition for the playing toggle to be on for the wave counter to increase, and made the wave counter reset each time the wave number is changed. After all of this, I sent a short video of the system working (seen below) to the team, and after preparing some images for the blog, I went to bed.

Sunday was spent with a mixture of working on this blog post and feeling incredibly unhappy, and there’s not really anything else that’s worth talking about on this blog. I didn’t finish the post on Sunday, though, which meant that the work had to carry over to Monday. On Monday morning, I headed to the studio early and continued to work on the blog post. I briefly showed Bernie the prototype and explained how it works (so that he could provide me with waves to make using it), and he showed me 3D models he’d made of waves with varying wavelengths and amplitudes, which I could program as an ongoing sequence to test the difficulty of following certain wave motions. After continuing with the blog post a bit more, I sent builds of the two previous versions of Bernie’s prototype to him, so that he could have people test them in one of the other studios. I also spoke to Bernie about how he could handle the tests, saying that it’s probably best to simply ask people how they feel in general about the prototype while we already know of additions we want to make to it, and once we’ve made those additions and changes, we can ask specific questions to get more specific feedback and direction for improvement. After continuing with the blog again for a while, I showed George the new version of the prototype and explained what the wave creation process is like. He said to make sure that I have a simple, clear mode of visual feedback implemented to reflect successful synchronisation, even as a placeholder until I can spend more time on Bernie’s idea for colour-based indication. He also said to make sure this feedback is placed on or around the cursor, as that’s where players will be partially focusing, so I suggested that I could have the cursor thumping on a sine wave to the rhythm of the music (based on the wave counter value), with the intensity dependent on the distance between the cursor and the target wave.
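As a formula, the thumping-cursor idea might look like the sketch below; the direction of the intensity mapping (stronger when further from the wave, echoing Adam Rosser’s vibration suggestion) and all of the names here are assumptions:

```python
import math


def cursor_thump_offset(beat_phase, distance_to_wave, max_distance,
                        max_amplitude):
    """Vertical offset added to the cursor so that it pulses on the beat.

    beat_phase: position within the current beat, 0..1 (derivable from
    the wave counter). Intensity scales with distance from the target
    wave, capping at max_amplitude at max_distance or beyond."""
    intensity = max_amplitude * min(1.0, distance_to_wave / max_distance)
    return intensity * math.sin(2.0 * math.pi * beat_phase)
```

So a perfectly synchronised cursor would barely thump at all, while a badly placed one would pulse visibly on every beat.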
After continuing with the blog a bit more, Bernie showed me the feedback he’d received from the play-testing session, with the most important finding being that the first and second input methods were the most comfortable for the player who tried the prototype. Then, stretches of blog work were interspersed with looking for potential placeholder music for different parts of the brass civilisation prototype we were working on. Eventually, I headed home and continued with the blog post there, which takes me to this moment in time, in the early hours of Tuesday.

Over the rest of the coming week, I intend to continue working towards finishing the brass civilisation’s early prototype. This means adding actual control to the new version of the mechanical prototype, as well as (placeholder) visual feedback; putting base movement (with physical lateral movement and bespoke vertical movement, unless getting that sorted proves to be a stretch) into a state-based movement script (which should also allow for toggling into the synchronisation mechanic); working on gradual changes in audio source volume based on position; and sewing everything together with functional logic and test values. This is on top of any small tweaks that need to be made to existing scripts, as well as the actual progress that I need to be making at some point with my Reflective Journal. Anyway, I’ll see how it all goes, and let you know in next week’s post. For now, I’m off to bed…