Cyborg: The Musical
To create a space for spontaneous collaboration and interdisciplinary improvisation. To explore a platform often associated with disconnection from reality, toxic anonymity, glorified violence, and combative exchanges, and to repurpose elements of that platform as an artistic medium.
“Cyborg: The Musical” is a fast-paced, whimsical, side-scroller video game. Friendly aliens and robots endlessly run on randomized sound producing blocks and use lasers for maximum music-making merriment.
Anthony Pedroza (programming, game design, visuals) Game Maker Studio
Shomit Barua (sound mapping and audio design, visuals) Max 8/MSP/Jitter
Christian Cuciniello (sound effects) Ableton Live 10
Blert Rizvaloni (characters and sprites) Sprite Creator
3x Piezoelectric Pickups
3x Xbox Controllers
Methodology (Version 1):
Program the gaming environment: a side-scroller with three lanes for three separate players. Select sound samples. Map elements of the game conventionally reserved for the accumulation of points to musical instruments, triggered when a player jumps on a block, shoots a laser, or collects a coin. One lane is designated drums, the second bass, and the third melody. A fourth player effects changes in the tempo and mood of the level by selecting color palettes and varying instrument sets and samples.
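The Version 1 event-to-instrument mapping can be sketched as a simple lookup. This is an illustrative stand-in, not the Game Maker Studio implementation; all lane numbers and sample filenames are hypothetical.

```python
# Sketch of Version 1's mapping: per-lane game events trigger samples.
# Lane assignments follow the writeup (drums / bass / melody); the
# specific event-to-sample pairs and filenames are assumptions.

LANE_INSTRUMENTS = {0: "drums", 1: "bass", 2: "melody"}

# Each (instrument, event) pair triggers a different sample file.
SAMPLE_MAP = {
    ("drums", "jump"): "kick.wav",
    ("drums", "laser"): "snare.wav",
    ("drums", "coin"): "hat.wav",
    ("bass", "jump"): "bass_c.wav",
    ("bass", "laser"): "bass_g.wav",
    ("bass", "coin"): "bass_slide.wav",
    ("melody", "jump"): "lead_c.wav",
    ("melody", "laser"): "lead_e.wav",
    ("melody", "coin"): "lead_g.wav",
}

def on_game_event(lane: int, event: str) -> str:
    """Return the sample a game event would trigger (stand-in for audio playback)."""
    instrument = LANE_INSTRUMENTS[lane]
    return SAMPLE_MAP[(instrument, event)]
```

Because every point-scoring action fires a sample immediately, with no shared clock, this structure also shows why the Version 1 result was chaotic: nothing constrains when the triggers land.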
Too many options were given to players. Most of their attention went to executing specific tasks (jumping, flying, shooting lasers, collecting coins), so music-making was secondary. Samples triggered chaotically; beats, basslines, and melody did not sync, and the "challenge" of aligning them proved too difficult and ultimately not very rewarding.
Simplify player mobility, reconsider gameplay, quantize notes, lock to keys of C & G.
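The "quantize notes, lock to keys of C & G" fix can be illustrated by snapping arbitrary pitches to the nearest tone in a chosen key. A minimal sketch, assuming MIDI note numbers; the function name and candidate-search strategy are mine, not the group's:

```python
# Snap a MIDI note to the nearest pitch class in C or G major,
# illustrating the key-locking fix from the Version 1 notes.

SCALES = {
    "C": [0, 2, 4, 5, 7, 9, 11],   # C major pitch classes
    "G": [0, 2, 4, 6, 7, 9, 11],   # G major (F# replaces F)
}

def quantize(midi_note: int, key: str = "C") -> int:
    """Return the scale tone of `key` closest to midi_note."""
    pcs = SCALES[key]
    octave, _ = divmod(midi_note, 12)
    # Consider scale tones in this octave and its neighbors.
    candidates = [12 * (octave + o) + p for o in (-1, 0, 1) for p in pcs]
    return min(candidates, key=lambda n: abs(n - midi_note))
```

Routing every triggered sample through a step like this guarantees that the three lanes at least share a harmonic frame, even when triggers are rhythmically loose.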
Methodology (Version 2):
Designate the shooting of all three lasers to the fourth player, and eliminate side-scrolling action to create a static environment (OG Mario Bros., Donkey Kong). Two loops are mapped to each lane (two drum loops, two basslines, two melodies, six in total, tempo-locked), one at each end of the lane. Each lane is assigned a spectrum of amplitude values between its loop pairing, with intermittent noise-making blocks mapped to natural instruments. A player running between the left and right ends of the screen fades between the loop pairing, filling in with the noise blocks.
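The position-to-amplitude spectrum described above amounts to a crossfade driven by the player's x position. A minimal sketch, assuming a normalized lane width and an equal-power curve (the actual curve used in MAX is not specified in the writeup):

```python
import math

# Sketch of the Version 2 lane mapping: a player's x position crossfades
# between the two tempo-locked loops at either end of the lane.
# The equal-power (cos/sin) curve is an assumption, chosen so perceived
# loudness stays roughly constant across the lane.

def loop_gains(x: float, lane_width: float = 1.0) -> tuple[float, float]:
    """Gains for (left loop, right loop): x=0 plays left alone, x=lane_width right alone."""
    t = min(max(x / lane_width, 0.0), 1.0)   # normalized position, clamped
    gain_left = math.cos(t * math.pi / 2)    # fades out as player runs right
    gain_right = math.sin(t * math.pi / 2)   # fades in as player runs right
    return gain_left, gain_right
```

At mid-lane both loops sit at about 0.707, which is the equal-power point; a linear crossfade would instead dip in loudness at the center.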
Cyborg:The Musical (version 2)
The game has become too static. Everything is too neat and tidy. The musical elements are not expressive of player interactions: despite player movements being mapped to specific sounds, there is not enough variance to distinguish player input. The gameplay is not dynamic, and players don't "make" music so much as "press play" and control the volume of pre-existing music (no offense to the DJs out there). In regard to the primary goal of this project, there is almost no room for improvisation.
*Important to note: while workshopping, our group demonstrated the platform as an improvisational space but did not give anyone the opportunity to experiment with it themselves. As presented, the class became mere viewers, which in turn added a new dimension to the players. In this new relationship, the players' initial objective of simply playing the game is superseded by their emergent role as performers.
Pivot. Explore ways to heighten the drama onstage. Make gameplay secondary: use the gaming platform less as a space for collaboration and instead emphasize elements of the video game as the medium of performance.
(reference viewership of gaming Twitch videos)
Methodology (version 3):
Players now divorce themselves from the outcome of "proper" gameplay. They face away from the gaming screen, which is projected behind them as a visual element for viewers; the players themselves are entirely unaware of their progress in the game. Side-scrolling action is reintroduced to give a sense of momentum, and points are earned through arbitrary movements.

OSC data from the controllers were initially mapped to parameters of synthesis, but once again players became fixated on "proper" music-making and deemphasized expressivity. As an alternative, piezo mics were taped to the undercarriage of each controller, transforming them into electroacoustic instruments. The sound of joystick movements and the clicks of button presses is amplified through three separate channels on the MOTU, then digitally processed in MAX. The signal from each mic controls one of three independent granular synthesis engines, each loaded with dramatically different samples (melody, bass, percussion).

The combined output of the three granular engines is fed through an audio-reactive patch projected on a short screen in front of the players to accentuate their acoustic expressivity. Referencing the content of Twitch gaming videos, the players' faces are illuminated from behind the scrim as though by the glare of a gaming monitor. The jerky movements of gameplay now constitute a sort of "dance," and players roughhouse as if actually immersed in a traditional gaming environment. The fourth player adopts the role of conductor, orchestrating the gameplay by advancing rounds and responding to the intensity of play with correspondingly intense visual effects on the screen. These effects combine video-processing glitches with a screen-refresh rate fixed at zero, so players leave a striated trail of their animated movements.
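One way to picture the piezo-to-granular control path is an intensity value (an envelope follower on the mic signal) steering grain parameters. This is a conceptual sketch, not the group's MAX patch: the parameter ranges, grain count, and the rule "louder playing means denser, shorter grains" are all assumptions.

```python
import random

# Illustrative sketch of piezo-driven granular control: the level of the
# contact-mic signal (0.0-1.0) sets grain duration, so harder button-mashing
# and joystick clatter yields shorter, choppier grains from the loaded sample.
# All numeric ranges are hypothetical.

def grain_schedule(piezo_level: float, sample_len: int, n: int = 8, seed: int = 0):
    """Return n (start, duration) pairs, in samples, for one grain cloud."""
    rng = random.Random(seed)
    level = min(max(piezo_level, 0.0), 1.0)
    dur = int(2000 - 1500 * level)           # grain length shrinks with intensity
    grains = []
    for _ in range(n):
        start = rng.randrange(0, max(1, sample_len - dur))
        grains.append((start, dur))
    return grains
```

Running one such scheduler per controller, each pointed at a different sample bank (melody, bass, percussion), mirrors the three independent engines described above; summing their outputs is what the audio-reactive patch would then visualize.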
Cyborg:The Musical (version 3) - FINAL
Granular and Audio Reactive patch
Final Test Run
The players as performers were far more compelling than the players as participants. A number of thought-provoking dynamics materialized: viewing the gaming platform as a plastic object allows one to conceptualize and manipulate its various elements more freely than viewing the platform as a space to inhabit. When untethered from the conventional constraints of the space, the players' relationship shifts; they do not interact within the space but instead interact with the space, and the space becomes an entity that also "performs."
Our group did not complete our primary objective; we did not create the space we envisioned. Instead, the project itself required spontaneous collaboration and interdisciplinary improvisation. All group members had varied skill sets that were plotted together in the earlier stages but developed independently. In essence, we engaged in parallel play: each member explored his medium in isolation, and then we threw the pieces together and made adjustments to fit. This editing process drained the presentation of vitality. The interdisciplinary improvisation occurred during our final test run, when we agreed to decontextualize the gaming aspect and view each element as a performative feature. The end result was dynamic and hyper-stimulating while remaining thematically cohesive.