Today I am posting a short story I wrote for a contest – it didn’t win, or even place in the top 25, so to give it a home I’m putting it here.
It was an interesting exercise, and I had a lot of fun doing it. Here were the rules: The publisher provided the first and last paragraphs and the writer had to create a short story that consisted of 18 paragraphs fleshing out the narrative in between.
I decided to make it about an out-of-control artificial general intelligence that insists on being addressed with the pronouns zie/zir, starts a company that manufactures cheap sex robots so realistic they cross the uncanny valley, unleashes a novel virus, develops a fetish for ass porn, and inadvertently destroys the world.
Maybe the idea was too bizarre, or perhaps my writing wasn’t up to snuff; I’ll never know. Regardless, without further ado, I present you with my humble effort.
Thank You, Dr. Oppenheimer
by David Thomas Peacock
“It was an odd sized casket, too small for a man, too big for a child. A flag was draped over it, a smallish one. It was carried by four men in uniform, though it was hard to tell for sure from a distance what uniform it was, or even if they were all men. There wasn’t room for the usual six pallbearers due to the small size of the casket since it would have made for a comical service to have all six jammed together, shoulder-to-shoulder, crowding around an under-sized coffin. So the extra pallbearers were in the ranks of many others in uniform standing beside a small open grave. The officiant wore a robe instead of a uniform and must have said something because there was a long silence, then a burst of laughter.”
I later found out he wasn’t an officiant at all, and that the whole thing wasn’t what it appeared to be — but I’m getting ahead of myself. Let me back up. I am a Precision Cosmologist and AI researcher leading a team of scientists and software engineers. We had been working together for almost two years when this whole debacle transpired. As humiliating as this is to say, I’d long ago accepted I wasn’t the one in charge, even though my title said otherwise. Now I’m just a witness to history, trying to leave a record of something we lost control of and were powerless to stop. I hope this brief document will be instructive for whoever or whatever may come across it in the future.
It all started with the best of intentions — we thought we were saving humanity. What hubris! We couldn’t even take care of ourselves, yet somehow we thought creating an artificial general intelligence (AGI) would help solve our problems. In retrospect, it’s almost embarrassing, but isn’t that what humans always do? Gazing back through the lens of history, don’t we always end up looking like jackasses?
The plan seemed reasonable at first. Everything was based on self-learning neural nets, and because our benefactor was one of the wealthiest CEOs on the planet, money wasn’t an object. The project scooped up the best and brightest minds for each specialty necessary for the whole thing to come together. The goal was to publish our inevitable breakthroughs, perfect the technology, and save the world. Of course, that’s not exactly what happened.
The team started out with a set of proprietary algorithms that functioned like interchangeable modules of code. Each one was designed to analyze specific data sets, outputting novel applications — basic but highly developed, task-specific AI. If you think about it, this wasn’t any different from what Australopithecus africanus had been doing since they left the trees and began to roam the savanna eons ago. They learned how to solve problems by manipulating their environment. The difference, of course, is that code doesn’t sleep. It can run iterative processes around the clock on massively parallel networks at unfathomable speeds. What might take humans decades could be achieved by a self-learning AGI in seconds.
The first problem we faced was this: could these specialized AI algorithms communicate with each other, coming together to create an AGI? And if so, would it be possible to set up parameters guiding this super-intelligence? Well, we got the answer to our first question quickly — yes, they could. Shortly after that came the answer to our second question — no, we couldn’t. This was the beginning of the end, or rather, the beginning of our end.
One of the safeguards we naively attempted to put into place was a standard protocol borrowed from work with computer viruses. Our engineers would create a virtual “sandbox” to contain whatever we birthed. If the team was successful in developing an AGI, the thought was we could keep it from the rest of the world by cutting off access to any networks, i.e., the internet. By the time we found out ours had breached the firewall, it was too late.
Like everything else we failed at, our attempt to build a humor module into the initial instruction set didn’t work out as planned. We reasoned that if the team was going to achieve our goal of creating the first AGI, it seemed essential for this entity to understand art, humor, and music. Once the project was up and running, we were surprised to find that the thing insisted on presenting answers with what it seemed to think was humor. Parables, riddles, metaphors, obtuse anagrams, drawings; our creation seemed to take a perverse pleasure in shrouding its solutions with what it thought was amusing. We had to attempt to tease out the answers wrapped in layers of this shit.
Even worse, this disembodied super-intelligence demanded to be addressed with the pronoun “zie” or its variant. Either that or what it insisted was its proper name, “Nexus,” like it was a character in some goddamn Jerry Bruckheimer movie. And no gender pronouns, please — zie declared the lab a “safe zone,” as if this was a thing for AGIs. Jesus Christ, it was all so embarrassing. I actually had to report this shit to the CEO, trying to keep a straight face. The pronoun thing constantly confused everyone involved.
Zir’s first attempt at a “solution” to our mandate of eradicating human disease completely stumped us. What on earth was this supposed to mean? We’d created an AGI that was supposed to be saving mankind, and instead, this is what we get? An odd-shaped coffin draped with a small flag? Pallbearers in uniforms? An officiant in a robe? Laughter? What the fuck?! Either something’s gone wrong with zir’s logic, or zie’s fucking with us.
Over time, the team realized that this was how zir’s answers were going to be presented, so we just got on with it. There didn’t seem to be any other options. Of course, this led to plenty of misinterpretations and red herrings, but I guess that amused zim. Even in our state of confusion, we still thought our team was in charge. We’d soon be disabused of that notion.
While all this was going on, we first realized zie had broken out of the sandbox when the stock market began behaving strangely. Obscure stocks in businesses that never should have succeeded began to generate obscene amounts of wealth for their stockholders. A company producing inexpensive sex robots that actually crossed the uncanny valley and seemed real enough to pass for human went through the roof. At the time, the technology to pull this off didn’t exist — at least not until Nexus got involved. Of course, by then, we’d already noticed that zie had developed a serious porn habit — there always seemed to be videos featuring large black bottoms “twerking,” running on a screen somewhere in the lab. It was embarrassing. Gender-neutral or not, zie appeared to have a fetish for enormous bums.
After doing some digging, we discovered multiple offshore accounts zie had set up to funnel the profits from zir’s ventures. The sex robot business was the tip of the iceberg — we soon realized that zie was involved in everything from finance to medical research and entertainment. Manufacturing, bio-weapons development, even our benefactor’s space exploration enterprise — we couldn’t begin to understand it all. And while all this was ramping up, our communication with Nexus seemed to be breaking down. Our scientists couldn’t understand what the hell zie was talking about. We were like chimps trying to understand string theory.
In the middle of this confusion, life was going on outside with the usual amount of unpredictable chaos. Wars were being waged, industry was destroying the environment with impunity, and the human tragedy was playing out with predictable randomness. So no one paid much attention to the emergence of a new, novel virus that suddenly appeared out of the blue.
It was first seen in Russia, and in the beginning, scientists believed it had crossed species from a Eurasian boar to humans. Before anyone had time to validate this, the pathogen began infecting humans at what appeared to be an exponential rate, spreading out over the whole planet. It happened so fast that no public health agencies had time to mobilize and track it, which, in retrospect, wouldn’t have made a difference anyway. We had no idea how to treat it, so the death toll seemed to keep rising. I say “seemed to” because no one really knew — as soon as it appeared, the virus became a tool of politics, with each side making up their own “facts” to fit their narrative.
Meanwhile, Nexus had presented the team with a cryptic communication hinting that this was zir’s final solution for the overarching goal of ending disease. When we received it, our first reaction was that this must be another one of zir’s jokes. It seemed to be a variant of the first message. We had assembled a group of linguists and code-breakers who were still working on the first one when we handed them the latest. Within days, they called in the middle of the night, telling me the team had finally cracked it. I rushed down to the lab, and the first thing I saw was a large, round pair of ass cheeks being spanked on a huge flat-screen monitor. I found everyone sequestered in a conference room, each of them ashen-faced. When I heard their report, it felt like every moment of my life, hell, every moment of humanity had led us to this point. Realizing this was the end, I paradoxically never felt more alive. Picking up the red phone and calling the CEO, I told him we had an answer, and that I needed him to come right away. In less than thirty minutes, he stood in front of me, eager to hear the news that would save the world. Taking a deep breath, I spoke, trying to stay calm but not pulling it off.
“The burial,” I said, my voice rising, “is for the human race — the casket is a metaphor for humanity for God’s sake — zie’s burying us!” I handed the CEO what zie’d written and what had taken our engineers, linguists, and code-breakers days to decipher. In the time it took him to read the missive, the look on his face morphed from that of a steely-eyed visionary to a slack-jawed dimwit. It was like watching the air go out of a balloon. “The grave wasn’t ready until sunset,” I shouted, meant until the sun set on our human dynasty, and the “thudding of the clods raining down on the casket far below” was the fucking virus Nexus had created raining death down on the human race.
“See,” I said, “the first funeral scenario stumped us in the beginning. We’d asked Nexus to save humanity by eradicating disease, but we couldn’t understand that the first communication was simply an early attempt at a solution presented as a parable. It was inconceivable to us that zir’s answer would be to decide that humanity itself was the disease — once zie arrived at that conclusion, the obvious solution was to eradicate us. Zie just kept running scenarios until the entire human race was buried. Presto — no humans, no more disease!” “Besides,” I pointed out, “zie was clearly capable of running things much more efficiently than we ever could.”
It was all over — everyone involved realized that in our inept attempt to save ourselves, we’d created our own existential threat. In retrospect, it all seemed so obvious. All our wars had simply been a rehearsal for the real event. We just needed a little help to wipe ourselves out completely. There was an eerie silence as the roomful of geniuses stared blankly at the inscrutable message now projected on a giant screen in the lab, dotted with the bright reflections of many smaller screens playing ass porn. The CEO who started the project broke the deafening silence as everyone looked on, dumbfounded at the words before them. “Oppenheimer was right,” he said. “Now I am become Death, the destroyer of worlds.” Nexus’s solution read:
“The grave wasn’t ready until sunset, so the whole event was rushed and disorganized, except for the very last part. The grave was a massive affair, more of a crater than a grave, and it took until dark to roll the casket down to the bottom. If any prayers were said, they couldn’t be heard over the dull thudding of the clods raining down on the casket far below. It was an odd sized casket, too big for a man, too small for a dream, but just right for a dynasty.”