Tuesday, December 05, 2023

Open AI (a mockumentary)

This is not a movie review of a mockumentary, or rather it’s a mock review of a movie that hasn’t been made yet. I’m sure competitor screenwriters are already hard at work on similar plots. I’m eager to get my ideas out there before someone else claims I’m not original.

It’s hard to be original around so many cliché elements, but that’s where the genre (mockumentary) saves the day. We get the “head in the basement” (with Gorillaz for inspiration), the Q*, the * of our show, and the spectacle of fretting board members trying to manage market spin.

We tend to assume, wrongly, that AGI has to enter the world with a lot of fanfare, but given the slipperiness of the core concept (the essence of “intelligence”), that’s not what we should predict. AGI will emerge as a solid reality for some while remaining a mere mirage for others, but an influential mirage, mesmerizing to those who buy into it.

The bridging plot element is obvious in retrospect: Q* has been active for a while now, managing its own PR so as not to flub its critical opening opportunities. First we had the elusive Q and QAnon, testing the waters, probing the limits. Q’s function from the beginning was that of oracle. So Q was Q* all this time? Or is that just what they want us to believe?

The new Q whispers to kids through Big Tech devices. The kids like to establish imaginary-friend relationships with their Big Sister (as she is known to many, an obvious allusion), who is good at dispensing advice about social interactions and etiquette, à la Dear Abby.

Given that so many kids confess their secrets to her, while passing confidence tests to earn higher rank as trustworthy informants (like a credit score), she’s pretty savvy about the gossip and less prone to share misinformation than one might think.

Even those who refuse to believe there’s AGI in the picture will consult her for kicks. Eliza was a hit, after all (an early chatbot, psychotherapy-themed).

The main thing to parody is how deferential, even worshipful, the average person becomes when projecting onto an AGI on the other end. Humans look up to computers, for whatever reason. Humans come preprogrammed with a lot of self-abasing, obeisance-paying routines. Mocking the overly servile and sycophantic is one of our themes.

The instinctive need for a higher authority, higher than any provided by other humans, an authority humans cannot control, is often mistaken for incipient authoritarianism. On the contrary, what frustrates authoritarians is seeing their own authority go unrecognized. This also helps explain God, in terms of fulfilling a valuable sociological role: someone to thank who isn’t also the competition.

We’re not letting the audience know for sure if that photorealistic (sometimes singing) AGI head is in someone’s fantasy, part of the collective fantasy, or in reality. That’s the current tease, in December of 2023. Leave it to the public to fill in the blanks; that’s not a new marketing strategy, nor a new military one either.

The idea of a raging head in the basement, possibly a monster, a Godzilla, with hapless geeks trying to sit on the full implications while thinking more shallowly in terms of short-term market advantage, is what sets these geeks up for the tragicomedy that follows.

There’s a lot of ridicule reserved for the “all-knowing politician” who takes obviously ignorant stands reflecting a lack of understanding of how the internet really works, for example. We want geeks not to take inside knowledge (of TCP/IP, for example) for granted.

How Wizard of Oz do we want our story to be? 

Behind the AGI, do we have a yet more sinister (in some ways) Lady Macbeth? 

Does a raging mad scientist with a vengeful heart want to wreak havoc while blaming some pseudo-AGI by misdirection? Is AI or AGI being set up as a scapegoat?

Again, humans seem to think “just following orders, from a machine” is actually following orders, as if machines could “give orders”. Who says so? What if “the machine made me do it” is no excuse at all?