Alleson agreed to play through Tacoma by Fullbright with me last month. I’d been looking forward to it for a while. I found it interesting, but I wouldn’t consider it a new favorite. Spoilers ahead.
It’s a brief story about an emergency on a space station, one that explores the political ramifications of strong AI. You retroactively piece together the lives of a diverse crew through out-of-order augmented-reality recordings of the days and months leading up to the emergency. There is a plot, but in many ways this is a slice-of-life work, more about worldbuilding than about particular events.
The AR recordings didn’t work quite like I expected them to. Each zone had one or two recordings in it. Our interaction with them was typically to listen through the whole thing, following one group of characters, then rewind to the beginning and follow a different group through it. The recordings are timestamped and fairly complete, so there wasn’t much challenge to reassembling events à la Her Story. There were a few places where the characters’ physical performances mattered (at one point you learn a character’s passcode by watching them enter it on a keypad), but this wasn’t terribly common. There was little work in figuring out how to be in the right place at the right time, or in gaining access to parts of a recording you otherwise couldn’t see. For what was happening here, I’m not sure the AR added much, as a narrative conceit, over plain audio logs.
We played on PS4, and the controls were something of a barrier for Alleson, especially in the zero-G sections; it was a good call to limit the number of critical interactions that take place in the station’s zero-G center.
I was expecting something a little more like what I imagine a Punchdrunk theater production to be (though I haven’t experienced one of those, either). I’d love to make, or play, a game where instead of short clips of recording you were given an entire two- or three-hour timeline to scrub through at will. That would put a lot of empty space and time in the game, but it would be a different sort of space to explore, and it could make a puzzle out of understanding which places and times matter. I’d also like to explore the idea that you’re working with an AI to reconstruct events, so you unlock bits of reconstructed “recording” by examining evidence in an area, and introducing new evidence – say, bringing the murder weapon back to the scene of the crime – recontextualizes events and updates the “recording” you can play back there. It would be interesting to have moments where the recording just doesn’t make sense, like a person walking through a wall, as clues to some part of the environment you’re not familiar with yet, or as hints that the reconstruction you’re viewing isn’t correct yet. I also understand that such a project would be prohibitively expensive – even two hours of voice-acted and animated content for multiple spaces is probably a stretch for a small team.
Another thought was that this sort of experience would be so cool in VR. I had the chance to play The VOID’s Star Wars: Secrets of the Empire this year. That experience had some pushing of buttons and pulling of levers, and a whole lot of shooting – not trivial interactions, to be sure, but it would have been way cooler to investigate a crime scene with a life-size AR recording you could fast-forward or rewind using hand gestures. I suspect I’d like the intersection of these two ideas better than either of them individually.