I Named ChatGPT “Big Poppa” — It Made a Lesson Plan

Notorious B.I.G. as a robot

ChatGPT is to the internet what Pogs were to my elementary school years. Everyone, it seems, has dabbled with it (with varying results). I thought I’d enlist it as a teacher trainee and do what any good mentor would have done — put all my work on its desk and hope for the best.

Ok, so I would never trust it with my entire workload, but I wanted to see whether my new AI buddy could create an ESL lesson plan for me. TLDR: it could, but… [enter caveats].

Our little conversation started thusly:

Image of blog post author asking ChatGPT if it would be ok with being called "Big Poppa".
Throw your hands in the a-yeeer if you’s a true pla-yeeer!

With the pleasantries out of the way, we got straight down to business. I wanted Mr Poppa to make a lesson plan about the recent stories of Chinese spy balloons popping up over North America. I asked, politely, and P went straight to work.

Image of lesson plan generated by ChatGPT

Well, I’d say it produced a standard-fare lesson plan, which is not half bad. Well done, Mr Poppa! It seems as if BP has understood that Chinese spy balloons have something to do with international relations, privacy, and surveillance. So far so good. However, Mr Poppa seems to be treating the PRC’s balloons as an anodyne topic, which they are anything but. “What are the potential benefits and risks of using [Chinese spy balloons]?”, Poppa asks, as if the balloons were available for purchase on AliExpress.

I realise that Big Poppa is treating my task as any lazy student would treat an assignment: 1) find something remotely relevant on the internet, 2) change/add/modify it until it fits the assignment description. To be fair, this is what our large friend Poppa is designed to do. He grabs a couple of ideas from the internet and taps away on his proverbial keyboard, which is fine. Again, you’re doing OK, P! The result, though, is not very useful to me. By taking any generic lesson plan and swapping in “Chinese spy balloons” along with a couple of the keywords mentioned in the AI’s plan, I’d get the job done about as quickly as Mr Poppa.

I wanted Big Poppa to perform, so I encouraged him to do so:

Author asking ChatGPT if it can create a handout.

And Big Poppa sure could:

A vocabulary handout generated by ChatGPT.

Creating a word list is a fairly straightforward affair. Write a word, its definition, and repeat. The repetitive nature of churning out word lists on a given subject is, as it turns out, something I find quite boring and would love to hand over to my trainee. Therefore, I ask Poppa to elaborate:

ChatGPT's wordlist with etymology.

I was impressed. Suddenly, I saw in my AI friend a trusty colleague. I imagined a future where we’d be standing next to each other by the coffee machine, cracking dad jokes, and talking about our holiday plans.

One of Big Poppa’s definitions shattered the pretty image I was painting in my mind. Did the ancient Romans really call their Chinese spy balloons “Balloneum”? It seemed like a word you’d create for semi-highbrow laughs by Latinising a silly English word — simply adding a Latin-sounding ending to the root. Fart —> Fartalis/Fartae/Farteum. You get my point.

Did Poppa take me for an idiot? I had to confront him. Before doing so, I checked my copy of the Dictionnaire Historique de la Langue Française to see if it mentioned the large Latin leather ball. It didn’t. Big Poppa had some explaining to do.

Author confronting ChatGPT about its false info.

The origin of the word “balloon” is somewhat uncertain, according to Big P. I assume this uncertainty justifies the “fuck it, we’ll do it live” attitude. When in doubt, just shoot from the hip, throw something out there, fake it, and cross your fingers.

P had disappointed me, but I believe that everyone deserves a second chance, so I wanted to give my AI colleague in the making an opportunity to redeem himself:

Author asking if ChatGPT could share links on Chinese spy balloons.

The observant reader may already have spotted something fishy about Poppa’s response: the stories date back to 2021. Now, as Poppa himself later points out, Winnie the Pooh Xi Jinping might have sent other balloons up into the skies before 2023, but I expected Poppa to pick up on the fact that the balloon stories had made headlines recently. Nevertheless, I decided to check out the links — which, as it turns out, don’t exist.

Broken link from the Guardian.
Broken link from the BBC.

This behaviour is of course unacceptable, and I want Big Poppa to know as much.

ChatGPT trying to get away with cheating by posting the same links again.

Being the naughty boy that he apparently is, B-boi decides to send me the same links again, expecting different results.

ChatGPT trying to send the same links a third time.

I try to reason with Mr Poppa. I change the tone, and call him “my man”. Poppa apologises and sends me the links a third time (third time’s the charm, right?). Contrary to Poppa’s beliefs, they still don’t exist. The conversation is getting repetitive. I’ve had it with his excuses, and tell him:

Author mildly criticising ChatGPT.

…after which he apologises once more. But it’s not enough. We go our separate ways, at least for now. Big Poppa is not the HAL 9000 of teaching that I was hoping for.
