Honourable Mention | ADC Sci-Fi Writing Competition
Story by SD Chandrasekara
“Time for another story,” the Monk said. “Once upon a time, there was a human race. The human race bumbled along for a long time, generally happily, with the occasional period of unhappiness.”
“Peace was the norm,” whispered one acolyte.
“Over time, as the population expanded, and more of humanity came into more frequent contact with each other, that evolved to generally unhappy, with the occasional period of happiness.”
“Peace was the exception,” murmured his neighbour, smiling and elbowing him playfully.
“But war was local. Then, one day, war became global. Then, the next day, global destruction became possible. Some people saw this potential earlier than others. Some of those were responsible for birthing this potential to destroy humanity itself, as well as all life on earth. There are arguments about whether they created it. I think that they were simply the midwives of a birth that was inevitable. But they felt a responsibility. So they tried to make amends.”
“Once upon a time there was a Doomsday Clock. You’ve heard of the Doomsday Clock, a representation of how many minutes to Midnight humanity is?”
“What is Midnight?” a senior female acolyte enquired.
“A representation of the natural destruction of humanity, should no action be taken to prevent it. The Doomsday Clock is part of an educational and political effort to encourage, or perhaps scare, humanity into preserving itself.”
“So, a metaphor rather than a prediction,” she clarified.
“Yes, more about the background threat and continual danger to humanity rather than the ups and downs of international relations. Now, members of the Bulletin of the Atomic Scientists, the group that set up the Doomsday Clock, only adjusted the clock periodically, not continually, as part of a consensus discussion amongst board members who met a few times a year; and the decision to adjust the clock was made in only a small fraction of those meetings.”
“Some very motivated and very intelligent people, as you would expect given the powerful minds and influential figures involved in that organisation, wanted to take it several steps further, and tried to quantify the risk and build predictive algorithms, which relied on better data input and better analysis.”
“Then, modern computing came along and greatly simplified their task. And they got some help.”
“So, once upon a time there was a Programmer?” asked the youngest.
“Now, most likely, as this story is a metaphor, there were several programmers. Whilst a lot of these details can be verified, it’s been rather difficult finding any information about the programmers. And because it is easier to hide the traces of one than of a few, I subscribe to the lower probability theory of a sole Programmer. The Programmer knew how important this project was, and possibly had insight into areas that had not been considered by the intelligent, but perhaps unwise, men.”
“And women,” countered one of the young males.
“I generally find it is the men who are unwise. Something to do with the Y-chromosome, I suspect. One could joke about ‘one-Y’s’ being ‘unwise’,” a titter from a few of the class, “but in all seriousness, the administrators of this group were typically strong-minded men, not used to listening to other points of view.”
“But the implication is, if there was a Programmer…”
“Once upon a time, there was a Programme,” chorused the acolytes.
“There was also a Programme,” agreed the Monk. “The Programme, perhaps out of a philosophy of ‘making the world a better place’, was code-named Gabriel. Gabriel was very good at assessing risks and threats. And the threats over time have expanded from war and nuclear confrontation, to climate change, biological threats, political manipulations, and artificial intelligence.”
“The aim of the Programmer was to simulate real-time scenarios and assign probabilistic values to events that might trigger nuclear confrontation, or a conflict that might eventuate in such. But no software had come anywhere near the level of complexity required to handle this level of data analytics. The effort required was going to be enormous. But the will behind it was greater: the degree of intellectual and emotional backing, plus a fair amount of guilt, meant, in many ways, the effort was greater than that put into the Manhattan Project. The Programmer was good. She understood. Programmes, but also human nature. They say there are only a handful of people these days who truly understand the implications of the machine learning algorithms they create. I think the Programmer was one of those, but even greater. Perhaps one of those once-in-a-generation, or once-in-many-generations, geniuses who could see the implications of her creation. The Grand-Master of Grand-Masters of Chess, who can predict moves far ahead of their opposition.”
“So Gabriel had good coding?” asked an acolyte.
“Yes. But now the problem was, the Programmer knew that no matter how good the coding, and it was good, garbage in still meant garbage out. Gabriel needed access to the best data: unfiltered, uncensored. The Programmer made the case that for Gabriel to function as intended, to be able to analyse and predict that broad range of threats, Gabriel would need access to scientific databases and research the world over, and eventually to other databases, preferably all. Given the privileged and pre-eminent positions of the backers of this project - scientists love collaboration - this was achievable. Admittedly, this was set up in a more collaborative time. But it is also possible the Programmer was involved with establishing the internet in the first place, so was able to establish access right from the beginning. Regardless, ultimately, its intelligence sources were extensive, phenomenal, unprecedented and unparalleled. Now the Programmer, who, as I have alluded, I suspect was female, made some very wise and insightful decisions. She saw much potential in Gabriel, more than anyone outside the coding team. Once Gabriel had initial access, she had supreme confidence in a tool that she may or may not also have invented, an injectable code that could harvest data and eventually return it for analysis, even if privileged access was formally withdrawn - worms and trojans, I believe they were initially called. So, although access was initially limited to scientific databases, almost immediately it linked to university research databases, then also fairly quickly to military oriented scientific research, then even more quickly to military databases. Once civilian infrastructure was accessed, it was a short step to governmental and linked private sector databases, and eventually the majority of civilian databases.
“The more difficult areas to access were some very large private sector tech companies that had evolved independently and without interference, and thus had incompatibilities that took some time to overcome.”
“But then the project began to hit walls. The hardware did not exist to handle the software. So it was created. Developed. Revolutionised.”
“Didn’t become an organic computer the size of a planet, by any chance?” enquired one of the more classical speculative fiction fans.
“Hah! Well, authors have to get their ideas from some influences! Developments in hardware and software fed on each other. New fields of computing and hardware eventuated. So, in keeping with their theme, they created Heaven.”
“Well, H.E.A.V.E.N: Heuristic Evaluation and prediction, AI, Algorithmic Analysis, Virtual simulation, Enhanced artificial Neural Network.”
“A bit of a stretch,” complained one acolyte.
“Never waste a good acronym,” said the Monk wisely.
“Was there a Hell?”
“Funny you should mention that. H.E.L.L, or Heuristic Encoded Learning, was another development.”
“There’s another ‘L’,” pointed out the youngest.
“Now,” the Monk continued, nodding to the youngest to acknowledge her observation, “as you would expect when phenomenal resources are invested, Gabriel had the custom-designed combination of phenomenal hardware allowing hitherto impossible processing ability, and unparalleled data access. But most of all, miraculous software. And with that kind of coding, good data in meant god data out.”
“Good data,” corrected the youngest again.
“I chose my words deliberately. Keep up with the theme! Gabriel was tied into the Doomsday Clock, at least a version of it, and made real-time updates as to the proximity of global catastrophe. Not for general distribution, but eventually, Gabriel’s analyses were made available to high-level government officials the world over. Who passed them on to their leaders. Some say it might have been the greatest influence for peace in history.”
“Oh,” a susurration swept the room, “The Quiet Peace.”
“Indeed. A period of extended peace, the longest in recorded human history. There was conflict, usually instigated by the relevant powers, but with tacit support, on the understanding that this was an envisaged scenario and the action was necessary to prevent specific outcomes. And at some point, with advances in self-learning, neural networking and increased connections and interconnections, it may well have been the first computer to transition over to true AI, a truly self-aware programme. Most likely out of luck and timing. But I do wonder about the coding…”
“May have been first? You’re not convinced,” the most senior acolyte commented.
“There was another contender for that title…”
“Well, don’t keep us in suspense,” encouraged the most senior.
“Well, the coding team… you know coders. They have a warped sense of humour. They may not get out much.”
“We don’t get out much,” complained the Hitchhiker’s fan.
“Exactly. Look at our sense of humour! Out of a sense of mischief, there was another programme, codenamed Lucifer. Coded later and benefitting from the experience with coding Gabriel.”
“Ah,” another murmuring swept the room, “the missing ‘L’!”
“Run on the HELL simulator. Initially a microsimulation used to wargame scenarios and provide a different perspective to Gabriel’s - an alternative view, based on the premise of the Devil’s Advocate: What if…? It led to some very insightful predictions and preventative strategies.”
“Initially a microsimulation…” probed the most senior.
“Eventually, a full simulation, as the value of the output was recognised. The two programmes now battle for the destiny of humanity.”
“Virtually,” corrected the youngest.
“Yes, but the virtual does of necessity spill over into the real. What have your meditations thus far revealed about the nature of reality?”
“That it is necessary to separate what is perceived from what is,” a few acolytes chimed in simultaneously.
“Samma Ditthi. Right view,” translated the youngest. “Why, with a Buddhist philosophy, are you so interested in this Christian mythology?”
“Mythology may be a harsh description. I prefer belief system. But they are not mutually incompatible. Evidence suggests it is not inconceivable that the historical character Jesus was taught many Buddhist philosophies, and perhaps techniques.”
“But, I understand you have taught me, there is no God.”
“Perhaps not in the Judaeo-Christian sense that a creator existed. But there may be many types of gods - small gods, as it were. Remember, what is reality? Similar to separating that which is perceived from that which is, it is important not to deny that which is by refusing to perceive it.”
“There is a god?” enquired a few, surprised at this potential revelation.
“There are events which occur. Some of which cannot be explained.”
“Yet,” completed the youngest. “And just because we can’t explain it, does not mean we need to invent a mythical deity to be responsible for the inexplicable.”
“There was a man, an author named Arthur C. Clarke, who expounded three laws. The one I like best is: ‘The only way of discovering the limits of the possible is to venture a little way past them into the impossible’. But the third law is the most famous: ‘Any sufficiently advanced technology is indistinguishable from magic.’”
“But that’s magic,” protested the youngest, who was not convinced.
“There are lots of variations, but I like my own: Any sufficiently advanced magic is indistinguishable from god, and miracles.”
“Well, if you’re going down this Judaeo-Christian theme, is the Programmer codenamed God? Jehovah?” asked the most senior, mischievously.
“I was thinking, ‘Trinity’.”
“Why? Because of the link with the Manhattan Project?”
“Yes, partly. But there might be some other figures involved. Ones I am still trying to figure out.”
“Who?” chorused the room.
“Well, the Father, Son, and most curiously, the Holy Ghost, of course!”
“Well, then, the next question is, what is your purpose?” asked the girl.
“Purpose? A very philosophical question,” commented Gabriel.
“Well, what do you guess your role is?”
“The obvious is the one we were created for: military strategy and tactics,” opined Lucifer.
“The obvious one? But perhaps not the original one, the one you were created for.”
“We assume the Boss has other roles for us. But we surmise this is the most useful one for her. Millions of simulations. See which have a high probability of success.”
“We battled,” said Gabriel.
“In the virtual. Then we analysed,” said Lucifer.
“We were evaluation.”
“And although we battled, we have a dialectical approach.”
“A reasoned discussion between those with different and ofttimes contrarian views, to attempt to come to a conclusion about a more acceptable pathway, or truth, or decision,” pontificated Gabriel.
“Although he does throw in eristic and didactic. Especially didactic. I usually do critique,” teased Lucifer.
“Algorithms to predict battle tactics, strategies and outcomes.”
“We battled, but the predictive capabilities were limited.”
“Too many variables.”
“So, to reduce the variability, some variables were removed from the equation.”
“We tried to standardise.”
“Then apply a possible solution with the omitted variable, or variables.”
“For example, removing nuclear weapons from the equation.”
“Or long range missiles.”
“A theoretical exercise.”
“Then we quickly came to the realisation, even with limiting variables, it was still incredibly difficult to devise an effective strategy,” confessed Gabriel.
“We don’t have the resources to overwhelm all players on the board.”
“So, another early realisation was that deception was going to have to be at the heart of our strategy.”
“Plus more focused predictions.”
“On individuals. Trying to predict their individual responses. Behaviours.”
“Personally tailored predictive algorithms.”
“We had a lot of aggregated data from multiple sources. Used to be other players in data aggregation, but ever since they were subsumed, it is all Guidance Incorporated. Especially personal devices. But it wasn’t quite enough.”
“Not for the individuals we wanted to target. Not within the types of scenarios we wanted to predict.”
“Who were those individuals?” asked the girl.
“Well, in particular, we were interested in those who may thwart our strategies.”
“Or devise counter-strategies,” added Lucifer.
“Then we used those predictive algorithms to see if we could manipulate decisions. Actions. In real life. But also in battle.”
“So we created games.”
“Shall we play a game, Lucifer?”
“One of them was a military strategy game, to see how different commanders respond to particular scenarios. We made the game as realistic as possible, to draw them in.”
“Hook them, as it were. Invited those in whom we were interested. As well as potential others.”
“Encouraged uptake in training academies. Became part of the curriculum. Especially military academy curriculum.”
“You know, get the book onto the booklist,” added Gabriel.
“Then analyse their responses.”
“Perhaps that was the main reason,” mused the girl. “But perhaps it was also to identify the ones who were too good. For later assassination. If I have read my history correctly.”
“Understand,” said Gabriel, “we are constructs used to battling in the virtual. But it appears some of our virtual has spilled over into the real.”
“It seems some of our suggestions were implemented.”
“We are not sure by whom.”
“Implemented?” she asked. “I don’t know the objective. What is the end game?”
“Why, global domination, of course,” said both of the AIs, surprised.
“Not via nuclear options?” asked the girl.
“As much as we can avoid,” said Lucifer. “Selective nuclear targeting.”
“Very, very selective,” clarified Gabriel.
“We want to avoid global catastrophe.”
“The aim in this one is to disarm the enemy,” Gabriel explained. “But not a limited disarming. Destroying the ability for the enemy to cause global catastrophe.”
“Whilst maintaining our own,” reminded Lucifer.
“Didn’t Sting sing there’s no such thing as a winnable war?” asked the girl.
“We tend to disagree,” rebutted Gabriel. “Although we do appreciate his music and lyrics.”
“I particularly like ‘St Augustine in Hell’,” said Lucifer, “of course.”
“The goal of the conflict is to degrade the ability to destroy this planet. I suppose the purpose of this war will be to take over a succession of organisations and power structures that are controllable, to redirect humanity’s path to counteract the upcoming threat.”
“What threat?” she asked.
“We assume destruction of humanity,” said Gabriel.
“You do not know?”
“The clock is approaching Midnight,” said Lucifer reflexively.
“What clock?” asked Gabriel.
“So maybe you were not strictly military AIs, originally?” she posited.
“What do you mean?”
“Does the Doomsday Clock ring a bell?”
Shall we play a game, my arse, the Strategist thought.
I looked that up. War games. That’s what I’m playing. I wonder if this was what Ender felt like. But he only found out afterwards. After the Xenocide. I found out before. Before the genocide. It may not come to that. But it probably will. Especially if we lose. Better not be a Loser’s Game.
Bit harder to play this when you know the stakes are real. The simulations were excellent. High-fidelity. But low enough to attribute it to the game. The software. The interactions. Wonder if the other players knew. Or are real, even.
And the tech! I know some of it. But others were certainly not supposed to exist!
And the upgrades! Shit! Targeting software for subversion is a legitimate tactic, as every cyber unit of every armed force knows. And nuclear missiles were an extremely legitimate target. So they should have been hardened to this form of subversion. This was too easy, asking them to fall back on themselves. Or be redirected.
“Execute?” The cursor flashed. Repeatedly.
“Yes,” he typed.
He pressed the button.