In his chapter on videogames as “allegories of control,” Alexander Galloway reflects on one way that videogames and film diverge: where film hides and sublimates political allegories, games “flaunt” them by pulling the player into the governing algorithms and demanding that the player “know the system” in order to succeed in the game (91). He goes on to argue that if Sid Meier’s Civilization is “about anything, it is about information society … about knowing the system and knowing the code” (91).
Bringing Galloway’s argument into conversation with the time I’ve spent playing the browser game A Dark Room, it seems that this game’s own algorithmic allegory is less about learning the intricacies of a system and working within it than about coming to terms with a scarcity of knowledge about that system. In A Dark Room, I know precious little about the short-term or long-term consequences of any of my actions. A Dark Room takes the core clicker-game gameplay loop of “spend resources so that you can generate more resources” but fastens narrative weight to your actions with a slowly unfolding storyline, wherein you (with “you” being another unknown variable) guide and grow a struggling village in an unforgiving environment.
In most other clicker games, haphazardly spending all of my resources on random purchases is a harmless if sub-optimal strategy, since I’m bound to regain the cookies/points/$$$ I spent within minutes. In A Dark Room, I can’t afford to be so careless. Spending all of my wood on animal traps will net me plenty of meat and fur, but it won’t get me any closer to my goal of developing the village and shining more light onto the narrative.
Civilization advises the player to “fortify units… defend them against barbarians.” A Dark Room dispenses no such hints. I can spend a huge amount of my stores to build a smokehouse, but I have no clue whether it’s a worthwhile purchase, or, for that matter, what that smokehouse even does. Similarly, when I turn down the nomad who enters my village begging for lumber (only because I had none to give!), I have no idea whether my “decision” will come back to haunt me later, immediately, or at all.
I’m guessing that A Dark Room’s refusal to grant players much knowledge of its systems is more an attempt to induce an atmosphere of helplessness and desperation than to make a point about our (lack of) knowledge regarding algorithms. Still, the net effect aligns well with Galloway’s argument that Civilization and many other games are “about knowing the system,” and with a later point on how games (and modern software generally) prize “flexibility” – the ability to accept, aggregate, and process any kind of input data (100).
Visiting virtually any website or using any piece of software today means submitting to and contributing to an algorithm whose “flexible” inner workings are largely hidden from us. When algorithmic illiteracy is the norm, a game like A Dark Room can spin its own secrecy as a plot device rather than an overt commentary on how technologies erect a wall between their processes and the user. Sure, it may be more comfortable to read A Dark Room’s algorithmic mystery as just a mood-setting tactic. But given the present scarcity of games that invite critical thought about the dangers of “flexibility” as a software paradigm or its implications for user autonomy and privacy, maybe A Dark Room can illuminate the path.
Galloway, Alexander. “Allegories of Control.” Gaming: Essays on Algorithmic Culture. University of Minnesota Press, 2006.
A Dark Room. Browser. Developed by Doublespeak Games. Doublespeak Games, 2013.
Sid Meier’s Civilization. DOS. Developed by MPS Labs. MicroProse, 1991.