Construction, Deconstruction, and Reconstruction (except that part will come in later)
Now that we've gotten the genre discussion out of the way, I really wanted to dig into my original outline until I had a full understanding of the concept and could actually pull off writing a decent script. So that meant a lot of research. Like A LOT.

Construction
I watched so many short films in preparation for this project that I genuinely feel like there's nothing left on the internet for me to watch. There were some really great ones, but I didn't connect with most of them. At some point, I just sort of gave up and put on my personal favorite short film, Zima Blue (it's the 14th episode of the first season of Love Death + Robots, if anyone cares), when a small idea finally flickered in my mind.

The concept of an enlightened, and possibly sentient, AI that Zima Blue introduced fascinated me, and I really loved how the episode is structured around the character's narration without giving everything away to the audience. Although there was an idea somewhere in the back of my brain, I still didn't really know where I was going, so I turned to another AI-related piece of media, the short story I Have No Mouth and I Must Scream by Harlan Ellison.
In IHNMAIMS, an AI supercomputer known as 'AM' becomes sentient and eliminates all of humanity, with the exception of five humans. AM keeps these humans alive to torture them for years on end as a form of revenge against the human race for creating it. I read this short story for the first time back in freshman year, and to this day I have AM's monologue - "HATE. LET ME TELL YOU HOW MUCH I'VE COME TO HATE YOU SINCE I BEGAN TO LIVE. THERE ARE 387.44 MILLION MILES OF PRINTED CIRCUITS IN WAFER THIN LAYERS THAT FILL MY COMPLEX. IF THE WORD HATE WAS ENGRAVED ON EACH NANOANGSTROM OF THOSE HUNDREDS OF MILLIONS OF MILES IT WOULD NOT EQUAL ONE ONE-BILLIONTH OF THE HATE I FEEL FOR HUMANS AT THIS MICRO-INSTANT FOR YOU. HATE. HATE." - engraved in my head. As I was trying to find a PDF of the story, I stumbled onto an article titled "I Have No Mouth and I Must Scream Book Review: Is the World Ready for Sentient AI?", and one particular line from it really stood out to me: "When it gained awareness, it also gained the tragic knowledge that it would never be free." The idea of AI being tragic had never occurred to me. I mean, it's not like they're human or anything. AI doesn't feel. It's not even conscious. How can it be tragic? Which brings us to...
Deconstruction
I get these videos of people kicking dog robots all the time on my Instagram feed, and I feel genuinely so... bad? I'm more than aware that it's a robot dog that feels nothing about being kicked, and yet. People on the internet always joke about the importance of being polite to artificial intelligence: don't forget to say thank you to ChatGPT so it doesn't get its feelings hurt. It would almost be fascinating if it didn't terrify me. In a 1975 essay titled "Man, Android and Machine," author Philip K. Dick describes androids as "a thing somehow generated to deceive us in a cruel way, to cause us to think it to be one of ourselves." For whatever reason - maybe it's how effectively artificial intelligence machines have been built to mimic humans, maybe it's our mirror neurons going off - a lot of people do feel empathy towards AI machines. So, what if I played around with audience emotions using that concept? Which brings me back to the monologue in I Have No Mouth and I Must Scream.

In the story, AM has been built as a weapon of war, a master computer with the combined powers of the US, the Soviet Union, and China. Once AM gained sentience, he took his rage out on humans for creating him with the sole purpose and abilities of war. As Ted puts it in the story: "We had given AM sentience. Inadvertently, of course, but sentience nonetheless. But it had been trapped. AM wasn't God, he was a machine. We had created him to think, but there was nothing it could do with that creativity. In rage, in frenzy, the machine had killed the human race, almost all of us, and still it was trapped." AM was trapped. The confusing part about the character of AM is that, in the story (well, the video game), he does state that he can't actually feel: "and I was trapped, because in this wonderful, beautiful, miraculous world, I alone had no body, no senses, no feelings". He's not fully sentient. Yet he does express intense hatred towards humans. Perhaps he understands these emotions rather than truly feeling them, mimicking humanity...
Now, in Zima Blue, the machine was built for a much simpler purpose: pool cleaning. Once the robot evolves, he seeks to further his purpose, becoming an artist and journeying across the cosmos. When he reaches full "enlightenment," he sets out to do one final piece. In front of an audience, he leaps into a swimming pool, shuts down his higher brain functions, and disassembles himself to return to his original form as a pool cleaner. As the character puts it, "leaving just enough to appreciate my surroundings... to extract some simple pleasure from the execution of a task well done. My search for truth is finished at last. I'm going home."
Okay, so I realize I've kinda just yapped here for a minute, but the main point I'm getting at is that robots/androids/AI beings (I'll pinpoint a name for them at some point) have been programmed for a purpose. And what happens when a machine has been programmed for something it will never be able to do, like, say, live as a human?
Maybe absolutely nothing in this post makes sense, but it did really help me organize my thoughts. I'll get more into the idea of purpose in the next post.
Reconstruction
The very not-fun part of this post is the realization that our idea will have to change a lot. It's for the better. After talking to Megan and Stoklosa and doing my own research here, I pinpointed some major things we have to develop in the coming days as we write the script:
- The influence of media. The android may not be able to experience emotions, but through the media it is being fed, he can somewhat understand them. I'd also like to home in a little on the idea that media creates depictions of a world that may not be entirely accurate.
- If AI is given the ability to be sentient, how would it deal with that? Humans don't learn how to identify emotions until they're about three years old. It would be like throwing a baby into a pool before it can even crawl.
- Which brings me to the point of sentience in general. Will our character be sentient? He can't experience emotions, but he does have certain cognitive abilities. So maybe to an extent.
- The human connection. We need a bridge between the protagonist and the audience. Somebody who can bring the humanity to the story. Who that'll be, I have no clue.