
HAL 9000. Credit: OpenClipart-Vectors/pixabay

The tragic hero of ‘2001: A Space Odyssey’

This is something I wrote for April 10 but forgot to schedule for publication. Publishing it now…

Since news of the Cambridge Analytica scandal broke last month, many of us have expressed apprehension – often on Facebook itself – that the social networking platform has transformed since its juvenile beginnings into an ugly monster.

Such moral panic is flawed and we ought to know that by now. After all, it’s been 50 years since 2001: A Space Odyssey was released, and 200 since Frankenstein – both cultural assets that have withstood the proverbial test of time only because they managed to strike some deep, mostly unknown chord about the human condition, a note that continues to resonate with the passions of a world that likes to believe it has disrupted the course of history itself.

Gary Greenberg, a mental health professional and author, recently wrote that the similarities between Victor Frankenstein’s monster and Facebook were unmistakable except on one count: the absence of a conscience was a bug in the monster, and remains a feature in Facebook. As a result, he wrote, “an invention whose genius lies in its programmed inability to sort the true from the false, opinion from fact, evil from good … is bound to be a remorseless, lumbering beast, one that does nothing other than … aggregate and distribute, and then to stand back and collect the fees.”

However, it is 2001’s HAL 9000 that continues to be an allegory of choice in many ways, not least because it’s an artificial intelligence the likes of which we’re yet to confront in 2018 but have learnt to constantly anticipate. In the film, HAL serves as the onboard computer for an interplanetary spaceship carrying a crew of astronauts to a point near Jupiter, where a mysterious black monolith of alien origin has been spotted. Only HAL knows the real nature of the mission, which in Kafkaesque fashion is never revealed.

Within the logic-rules-all-until-it-doesn’t narrative canon that science fiction writers have abused for decades, HAL is not remarkable. But take him out into space, make sure he knows more than the humans he’s guiding and give him the ability to physically interfere in people’s lives – and you have not a villain waylaid by complicated Boolean algebra but a reflection of human hubris.

2001 was the cosmic extrapolation of Kubrick’s previous production, the madcap romp Dr Strangelove. While the two films differ significantly in the levels of moroseness on display as humankind confronts a threat to its existence, they’re both meditations on how humanity often leads itself towards disaster while believing it’s fixing itself and the world. In fact, in both films, the threat was weapons of mass destruction (WMDs). Kubrick intended for the Star Child in 2001’s closing scenes to unleash nuclear holocaust on Earth – but he changed his mind later and chose to keep the ending open.

This is where HAL has been able to step in, in our public consciousness, as a caution against our over-optimism towards artificial intelligence, and as a reminder that WMDs can take different forms. Using the tools and methods of ‘Big Data’ and machine learning, machines have defeated human players at chess and Go, solved problems in computer science and helped diagnose some diseases better. There is a long way to go for HAL-like artificial general intelligence, assuming that is even possible.

But in the meantime, we come across examples every week that these machines are nothing like what popular science fiction has taught us to expect. We have found that their algorithms often inherit the biases of their makers, and that their makers often don’t realise this until the issue is called out – or they do but slip it in anyway.

According to (the modified) Tesler’s theorem, “AI is whatever hasn’t been done yet”. When overlaid on optimism of the Silicon Valley variety, AI in our imagination suddenly becomes able to do what we have never been able to ourselves, even as we assume humans will still be in control. We forget that for AI to be truly AI, its intelligence should be indistinguishable from a human’s – a.k.a. the Turing test. In this situation, why do we expect AI to behave differently than we do?

We shouldn’t, and this is what HAL teaches us. His iconic descent into madness in 2001 reminds us that AI can go wonderfully right but it’s likelier to go wonderfully wrong, if only because of the outcomes that we are not, and have never been, anticipating as a species. In fact, it has been argued that HAL never went mad but only appeared to do so because of the untenability of human expectations – and that 2001 was the story of his tragedy.

This is also what makes 2001 all the more memorable: its refusal to abandon the human perspective – noted for its amusing tendency to be tripped up by human will and agency – even as Kubrick and Arthur C. Clarke looked towards the stars for humankind’s salvation.

In the film’s opening scenes, a bunch of apes briefly interacts with a monolith just like the one near Jupiter and quickly develops the ability to use commonplace objects as tools and weapons. The rest is history, so the story suddenly jumps four million years ahead and then 18 months more. As the Tool song goes, “Silly monkeys, give them thumbs, they make a club and beat their brother down.”

In much the same way, HAL recalls the origins of mainstream AI research as it happened in the late 1950s at the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts. At the time, the linguist and not-yet-activist Noam Chomsky had reimagined the inner workings of the human brain as those of a computer (specifically, as a “Language Acquisition Device”). According to anthropologist Chris Knight, this ‘act’ inspired cognitive scientist Marvin Minsky to wonder if the mind, in the form of software, could be separated from the body, the hardware.

Minsky would later say, “The most important thing about each person is the data, and the programs in the data that are in the brain”. This is chillingly evocative of what Facebook has achieved in 2018: to paraphrase Greenberg, it has enabled data-driven politics by digitising and monetising “a trove of intimate detail about billions of people”.

Minsky co-founded the AI Lab at MIT in 1959. Less than a decade later, he joined the production team of 2001 as a consultant to design and execute the character called HAL. As much as we’re fond of celebrating the prophetic power of 2001, perhaps the film was able to herald the 21st century as well as it has because we inherited it from many of the men who shaped the 20th, and Kubrick and Clarke simply mapped their visions onto the stars.


If AI is among us, would we know?

Our machines could become self-aware without our knowing it. We need a better way to define and test for consciousness.

… an actual AI might be so alien that it would not see us at all. What we regard as its inputs and outputs might not map neatly to the system’s own sensory modalities. Its inner phenomenal experience could be almost unimaginable in human terms. The philosopher Thomas Nagel’s famous question – ‘What is it like to be a bat?’ – seems tame by comparison. A system might not be able – or want – to participate in the classic appraisals of consciousness such as the Turing Test. It might operate on such different timescales or be so profoundly locked-in that, as the MIT cosmologist Max Tegmark has suggested, in effect it occupies a parallel universe governed by its own laws.

The first aliens that human beings encounter will probably not be from some other planet, but of our own creation. We cannot assume that they will contact us first. If we want to find such aliens and understand them, we need to reach out. And to do that we need to go beyond simply trying to build a conscious machine. We need an all-purpose consciousness detector.

Interesting perspective by George Musser – that of a “consciousness creep”. In the larger scheme of things (of very-complex things in particular), isn’t the consciousness creep statistically inevitable? Musser himself writes that “despite decades of focused effort, computer scientists haven’t managed to build an AI system intentionally”. As a result, perfectly comprehending the composition of the subsystem that confers intelligence upon the whole is likelier to happen gradually – as we’re able to map more of the system’s actions to their stimuli. In fact, until the moment of perfect comprehension, our knowledge won’t reflect a ‘consciousness creep’ but a more meaningful, quantifiable ‘cognisance creep’ – especially if we already acknowledge that some systems have achieved self-awareness and are able to think – that is, compute – intelligently.

The Sea

Big Fish walked into a wall. His large nose tried to penetrate the digital concrete first. Of course, it went in for a second, but Marcus recomputed the algorithm, and it jumped back out. The impact of its return threw Big Fish’s head back, and with it, his body stumbled back, too. The wall hadn’t been there before. Its appearance was, as far as Big Fish was concerned, inexplicable. And so, he turned around to check if other walls had been virtualized as well. Nope. Just this one. What business does a wall have being where it shouldn’t belong? But here it was.

He turned into the door on his left and looked around. Nothing was amiss. He walked back out and tried another door on the opposite side. All desks were in place, computers were beeping, people were walking around, not minding his intrusion. It was surreal, but Big Fish didn’t mind. Surreal was normal. That’s how he liked things to be. He walked back out. There the wall was again. Had Marcus got something wrong? He poked a finger into the smooth white surface. It was solid, just like all walls were.

He turned back and walked the way he had come. Right, left, right, left, right, left, left, down a flight of stairs, straight out, left, left, left, straight out once more, left, right… and there the canteen was. The building was the way it had once been. Marcus was alright, which meant the wall had to be, too. But it couldn’t be – it didn’t belong there. He walked back up once more to check. Left, right, straight, right, right, right, straight, up a flight of stairs, right, right, left, right, left, right… and there’s the bloody wall again!

Big Fish had to log out. He walked into the Dump. The room was empty. No queues were present ahead of the Lovers, no bipolar behavior, no assurances being whispered to the new kids or hysterical religious clerics talking about being born again. Just him, so he walked up to the first of the two Lovers, and stood under it. When he decided he was ready, Big Fish pushed the green button next to him. The green guillotine came singing down.

The blade of the machine was so sharp, it whistled as it parted an invisible curtain of air. The screech, however, was music to Big Fish’s ears. It meant exiting the belly of Marcus. It meant reality was coming. As soon as the edge touched his head, Marcus came noiselessly to life in the Dump. His thoughts, memories, feelings, emotions, scars, scalds, bruises, cuts, posture, and many other state-properties besides, were simultaneously and almost instantaneously recorded as a stream of numbers. Once the input had been consummated with an acknowledgment, he vanished.

When he stepped out of his booth, Big Fish saw Older Fish staring at him from across the road. His stare was blank, hollow, waiting for the first seed of doubt from Big Fish. Big Fish, however, didn’t say anything. Older Fish stared for a minute more, and then walked away. Big Fish continued to watch Older Fish, even as he walked away. Had he seen the wall, too? Just to make sure, he began to follow the gaunt, old man. The stalking didn’t last long, however.

He watched as Older Fish turned around and pointed a gun at Big Fish’s temple. The barrel of the weapon was made of silver. My gun. How did Older Fish find my gun? A second later, Older Fish pointed the weapon into his own mouth and fired. Flecks of flesh, shards of bone, shavings of hair, dollops of blood… all that later, Older Fish fell to the ground. In a daze, Big Fish ran up to the still figure and stared down at it. Older Fish’s eyes were open, the skin around them slowly loosening, the wrinkles fading.

Big Fish saw them gradually droop off. Time had ended. The world was crucified to the splayed form of Older Fish. The commotion around him happened in a universe all of its own. The lights flashed around him, seemed to bend away from his bent form, curving along the walls of their reality, staying carefully away from his arrested one. The sounds came and went, like stupid matadors evading raging bulls, until the tinnitus came, silencing everything else but the sound of his thoughts. Only silence prevailed.

When darkness settled, Big Fish was able to move again. My friend, he lamented. He opened his eyes and found himself sitting in a moving ambulance. Where are we going? There was no answer. Big Fish realized he was thinking his questions. When he tried, though, his tongue refused to loosen, to wrap itself around the vacant bursts of air erupting out of his throat. Am I mute? He tried again.

“Where are… we…”

“To the Marxis HQ.”

Marxis HQ. The cradle of Marcus. The unuttered mention of that name brought him back. What were the chances of walking into a wall-that-shouldn’t-have-been-there and Older Fish killing himself? The van swung this way and that. Big Fish steadied himself by holding on to the railing running underneath the windows. His thoughts, however, were firmly anchored to the wall. Big Fish was sure it had something to do with Older Fish’s suicide.

Had Older Fish seen the wall? If he had, why would he have killed himself? Did it disturb him? When was the last time a wall disturbed anyone to their death? Could Older Fish have seen anything on the other side of the wall? Did Older Fish walk into the space on the other side of the wall? What could have been on the other side of the wall? Had Marcus done something it shouldn’t have? Was that why Big Fish was being ferried to the Marxis?

“I don’t know.”


“Mr. ——-, the reasons behind your presence being required at Marxis HQ were not divulged to us.”

I’m not mute, then. Big Fish laughed. He hadn’t realised he was thinking out loud. The others all looked at him. Big Fish didn’t bother. He settled back to think of Marcus once more. At first, his thoughts strained to comprehend why Marcus was the focus of their attention. Simultaneously, Older Fish’s death evaded the grasp of his consciousness. In the company of people, he felt he had to maintain composure. Composure be damned. Yet, tears refused to flow. Sorrow remained reluctant.

The van eased to a halt. A nurse stepped up and opened the door, and Big Fish got down. One of the medics held on to his forearm and led him inside a large atrium. After a short walk that began with stepping inside a door and ended with stepping out of another – What was that? Did I just step through a wall? – Big Fish was left alone outside a door: “Armada” it said. He opened the door and looked inside. A long, severely rectangular hall yawned in front of him. At the other end, almost a hundred feet away, sat a man in a yellow chair, most of his body hidden behind a massive table.

“Please come in. My name is Marxis Maccord. I apologise for this inconvenience, but your presence here today is important to us. I know what you’re thinking, Mr. ———, but before you say anything, let me only say this: what happened had both nothing and everything to do with Marcus. It had nothing to do with Marcus because it wasn’t Marcus’ fault you walked into a wall and almost broke your virtual nose. It had nothing to do with Marcus because it wasn’t Marcus that precipitated Mr. ———-‘s death. At the same time, it had everything to do with Marcus because, had it not been for Marcus, you wouldn’t have walked into a wall. Had it not been for Marcus, Mr. ———- wouldn’t have killed himself.”

Silence. What is this dolt trying to tell me? That they’re not going to take responsibility for what Marcus did? Why can’t they just get to the point, the idiots?! Bah! “I understand what you’re saying, Mr. Maccord. You’re saying you’re going to let Marxis Corp. be held responsible for Marcus’s actions, and that’s fine by–”

“Oh, Mr. ————, I’m not saying that at all! In fact, I’m not going to assume responsibility either. You see, Mr. ————, I’m going to let you decide. I’m going to let you decide on the basis of what you hear in this room as to who’s culpable. Then… well, then, we’ll take things from there, shall we?”

Ah! There it is! Blah, blah, blah! We didn’t do this, we didn’t do that! Then again, we know this could’ve been done, that could’ve been done. Then, shit happens, let us go. Your call now. Bullshit! “Mr. Maccord, if you will excuse me, I have made my decision and would like for you to listen to it. I don’t care what Marcus did or didn’t do… and even if I want to figure it out, I don’t think I want to start here.”

Big Fish turned to leave. “Mr. ———–, your friend put the wall there because it scared him that someone might find something out.” Big Fish stopped just before the door. “Mr. ————, the wall wasn’t there a second before you walked into it. It was computed into existence by your friend because you were trespassing into his thoughts. If you had crossed over into the other side, you would have witnessed something… something we can only imagine would have been devastating for him in some way.”

Marxis Maccord stood up. With a start, Big Fish noticed that the man wasn’t standing on his legs. Instead, his torso, his neck and his head were floating in the air. From the other end of the hall, they looked like a macabre assemblage of body parts, a jigsaw held upright by simple equilibrium, the subtle cracks visible along the seam of their contours in the light borrowed from the city that towered around Marxis Corp. Him? It? It. “Mr. ————, you are downstairs, standing in booth SP-8742, your thoughts logged out of reality and into this virtual one.”

Big Fish hadn’t said anything for a while. The transition had been so smooth. Big Fish hadn’t noticed a thing when he entered the first door. It was like walking through, past, a veil. It was an effortless endeavour, a flattering gesture that drew the mind out of its body. Maccord continued to talk. “Say hello to Marcus II, or, as we call it, MarQ. When you stepped into that first door, your reality was suspended just as ours took over. Once the switch was complete, your limp body was laid on a bed and transferred down a shaft 3,000 feet deep, under this building. You are now lying sound asleep, dreaming about this conversation… if that.”

“In a world where moving in and out of reality is so easy, picking one over the other simply on the basis of precedence will gradually, but surely, turn into a meaningless argument. It is antecedence that will make sense, more and more sense. Your friend, Mr. ————, understood that.”

Big Fish finally had something to say. “And why is that important, Mr. Maccord?” He felt stupid, after having asked it, for asking a question whose answer might have come his way anyway. However, Big Fish was being left with a growing sense of loneliness. He was feeling like a grain of salt in the sea, moving with currents both warm and cold, possessing only a vintage power to evoke memories that lay locked up somewhere in the folds of the past. The sea couldn’t taste him, Big Fish couldn’t comprehend the sea. They had devoured each other. They were devouring each other.

Maccord responded quickly. “Marcus is the supercomputer that computes the virtual reality of your old organization into existence. You log in and out every day doing work that exists only as electromagnetic wisps in the air, shooting to and fro between antennae, materialised only when called upon. Marcus tracks all your virtual initiatives, transactions, and assessments. You know all this. However, what you don’t know is that the reality Marcus computes is not based on extant blueprints or schematics. It is based on your memories.”

At that moment, it hit Big Fish. He had wondered many a time about how Marcus knew everything about the place where he worked. The ability to log in and out of reality – or realities? – gave the machine access to people’s memories. This means the architecture is the least common denominator of all our memories of the place. “You’re right.” Maccord’s observation startled him. “You see, Mr. ———–, MarQ has computed me, and MarQ has computed you. However, I own MarQ, which means it answers to me. Before it transliterates your thoughts into sounds, they are relayed to me.”

He can read my thoughts! “Oh yes, Mr. ————, I well can. And now that I know that you know that the place is the least common denominator of all your knowledge, the wall could’ve been there only if all of you had known about it. However, the wall hadn’t been there in the first place. Which meant Marcus had computed something that had happened fairly recently. Then again, if the LCD hypothesis is anything to go by, then the wall shouldn’t have been there because you continue to be surprised about its presence. Ergo, on the other side of the wall was something you already knew about, but not yet as the source of a problem.”

It was hard for Big Fish to resist thinking anything at all at first, but he did try. When he eventually failed, questions flowed into his head like water seeping through cracks in a bulging dam, simply unable to contain a flooding river. The questions, at first, cascaded through in streamlined sheets, and then as gurgling fountains, and then as jets that frayed into uncertainty, and then as a coalition that flooded his mind.

Big Fish understood this was the end of the “interaction”, that Marxis Maccord had been waiting for this to happen since the beginning. Everyone would have wanted to know why Older Fish killed himself. To get to the bottom of that, and to exculpate Marcus, a reason had to be found. Marcus had known we’d come to this. He let me hit the wall late. He let me know that no one else found it odd because they’d been used to it. Marcus had let me be surprised. Marcus knew something was going to happen. And when it did, Marcus knew I’d be brought into its hungry womb to be judged… to be devoured by the sea.

“Mr. Maccord?”

“Yes, Mr. ———-?”

“Take what you need.”

“I already am, Mr. ———-.”