What exactly is ChatGPT, how does it work, and why is everyone giving so many shits about it?
Firstly, let's forget about ChatGPT per se. The real meat in the sandwich is OpenAI's GPT, or Generative Pre-trained Transformer. That's the tech behind ChatGPT, Bing's weirdo chatbot, and Google's up-and-coming Bard.
Ultimately, GPT-n and other pre-trained transformers learn by gobbling up heaps of text data (read: the internet), and creating a crazy database of associations. As an example, if I say "Green eggs and... " you think of "ham", right? That's an association, and GPT makes trillions of them, and stores them in a huge database.
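To make that "database of associations" idea concrete, here's a toy sketch in Python of the crudest possible version: just counting which word follows which in a tiny corpus. (A real GPT learns vastly richer statistics with a neural network across billions of parameters, not a literal lookup table — this is only the intuition.)

```python
from collections import Counter, defaultdict

# A toy version of "making associations": count which word follows which
# in a tiny corpus. A real GPT learns far richer statistics than this --
# the lookup table here is just to show the basic idea.
corpus = "green eggs and ham green eggs and ham and spam".split()

associations = defaultdict(Counter)
for word, next_word in zip(corpus, corpus[1:]):
    associations[word][next_word] += 1

# The word most often seen after "and" in our toy corpus:
print(associations["and"].most_common(1)[0][0])  # -> ham
```

Feed it the whole internet instead of eleven words, and you start to see where the trillions of associations come from.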
(If you didn't come up with "ham", then you didn't read enough Dr Seuss when you were a kid.)
(I know, I know, his stuff is all racist now 1, but when I was a kid he was fine.)
Anyway, after all those associations have been made, the model gets fine-tuned for specific tasks like chatting, summarizing, translating, or just about anything that has anything to do with words (including writing code, which sucks / is awesome for me... more on that later).
At its heart, GPT is a language model that never stops asking "What word should I put there next?"
When GPT gets down to business, you ask a question (called a "prompt"), the model has a think, spits out a word, and then keeps picking the most likely next word until it has cooked up the perfect response.
Ultimately this tech comes down to a big game of "According to everything I've ever read on the internet, and what I've been told by my coding overlords, what word should I put there next?".
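That loop — spit out a word, then ask the question again — can be sketched in a few lines of Python. The probabilities below are invented for illustration; a real model computes them on the fly from billions of learned parameters, and doesn't always greedily grab the single most likely word. But the shape of the loop is the same.

```python
# A minimal sketch of the "what word should I put there next?" loop.
# These probabilities are made up for illustration only.
next_word_probs = {
    "green": {"eggs": 0.9, "light": 0.1},
    "eggs": {"and": 0.8, "are": 0.2},
    "and": {"ham": 0.7, "spam": 0.3},
}

def generate(prompt: str, max_steps: int = 10) -> str:
    words = prompt.split()
    for _ in range(max_steps):
        options = next_word_probs.get(words[-1])
        if not options:  # no known continuation, so stop talking
            break
        # Greedy decoding: always grab the single most likely next word.
        words.append(max(options, key=options.get))
    return " ".join(words)

print(generate("green"))  # -> green eggs and ham
```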
For a more in-depth, better written, and more technical explanation, check out the . (If you can't see the page, download and you should be good to go. And then consider subscribing to the NY Times... it's one of the few publications I'm cool to pay for.)
So what's the big deal? Why so many shits given?
It's a fair question... After all, new tech is coming out on a daily basis... Elon Musk even has reusable space rockets, which is wicked cool... why should this new tech be causing so much panic?
Well basically, not all that many people work in rocketry, and these models are getting ridiculously good at predicting what word comes next.
So good, in fact, that Google's LaMDA model totally convinced one of their engineers it was a real person. So convinced that he went rogue, created a media firestorm, and ended up getting fired. 2
That's totally awesome, and hardcore, at the same time.
(On the whole "is this shit ALIVE??" question, we're not going to deal with that just yet. It is, however, an awesome topic, and we're def getting into it a little bit later.)
These models are getting so good at predicting what word comes next, they're damn close to being able to take over a whole bunch of white-collar, in-office type jobs.
Need to fill out a grant application? No problem, just fine-tune the model and it'll pump 'em out for you. No need to pay that grant writer $70k a year. Need an article written? All good, takes about 5 minutes and it'll be better than just about anything you've ever written (I'll show you how to do that in a bit).
Even more impressive, these models can write delicious code. I've been writing code for a loooong time, and I regularly use ChatGPT (I have the paid version) to optimise my code. It's a game changer.
Yeah yeah, automation's gonna take all our jobs, been hearing that for years and I've still got a job... So seriously, why all the bitching, and panic?
Coz this time it's real... let's take a real-world example.
Let's say I need a login form built in React using TypeScript. Watch how damn quick and easy it is to get ChatGPT to produce almost perfect code in less than 3 minutes. (If you're not a nerd, don't sweat it, there's still plenty to enjoy here.)
-- Insert Video --
(If you're a React nerd, there's a little hint in there as to why all is not as it seems - pretty soon the best of us are going to be making even better daily rates than we do now.)
Now... if I had a dev working in a BPO in the Philippines doing some code for me, I'd be paying them $10 an hour, and it'd take about 2 hours to write that code. That's $20 for that one task, not including the time it takes me to write the ticket, explain what I'm after, and then review the code.
ChatGPT did it, beautifully, super fast, including all requested revisions and updates. And it cost me whatever 3 minutes & 12 seconds' worth of a $US20-a-month subscription works out to. Which is basically nothing.
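For the curious, the back-of-the-envelope maths (assuming a flat $US20/month plan and a 30-day month):

```python
# Back-of-the-envelope cost of a 3 minute 12 second ChatGPT session,
# assuming a flat $US20/month subscription and a 30-day month.
monthly_fee = 20.00                    # US dollars
seconds_in_month = 30 * 24 * 60 * 60   # 2,592,000 seconds
task_seconds = 3 * 60 + 12             # 192 seconds

cost = monthly_fee * task_seconds / seconds_in_month
print(f"${cost:.4f}")  # -> $0.0015
```

About a seventh of a cent, versus $20 for the offshore dev. That's the gap everyone's staring at.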
I know a lot of devs who are silently sweating, looking at their mortgages and considering a career swap to cobbler or surfboard shaper, figuring AI's never gonna take those gigs away from them.
But it's not just coders, it's writers, and accountants, and lawyers, and doctors, and... well, pretty much anyone who works in an office.
As an example, if you've worked in a large corporation or in government, you'll know all about the tendering process.
There are whole departments in large companies all around the world filled with people who do nothing but write tender documents. And these things suck to write. They're long, boring, often very technical, and you have to be super careful to make sure you don't miss anything. Landing a big tender can be the difference between a company surviving or going under; there's often millions of dollars on the line.
With a little training and fine-tuning, ChatGPT can write tender documents that are just as good as anything a human could write. And it can do it in a fraction of the time. For bugger all cost.
And if you're an article writer for a website? Fuggedaboutit!
(Watch how I write a quick couple of paragraphs for this very website in just a few minutes)
-- Insert Video --
And that's me physically writing in the prompts. I'm already aware of at least 3 apps out there that make this process a breeze. And what's Google going to do with the tsunami of GPT generated articles that are about to flood the internet?
(Don't get me started on why I really don't recommend going into Search Engine Optimisation any time soon!)
And that's why there's a collective puckering around the world. History has shown the whole "automation will take away jobs but it'll create more in the long run" line isn't an accurate read on what actually goes down. Maybe after a while jobs get created and the workforce reaches some form of parity, but only after a long period of disruption... and averages derived from large datasets hide the impacts happening right now to individuals and their families.
I suspect, and I'm far from being alone in this 3, that heavy shit is going to go down in the next five years. Whole industries are going to be completely transformed, and those who don't adapt will lose out big-time.
On the other hand... there are fortunes and reputations waiting to be made.
So let's not reach for the whisky bottle just yet, and instead brighten the mood by having a chat about the creation of the internet, how money really got made during the gold rushes, & personal hyper-productivity.
Oh, and why this time, being an old bugger is a total, epic, game changing advantage.
Dr. Seuss Books Are Pulled, and a 'Cancel Culture' Controversy Erupts. Alexandra Alter and Elizabeth A. Harris, The New York Times, March 4, 2021. ↩
Google Fires Engineer Who Claims Its A.I. Is Conscious. Nico Grant, The New York Times, July 23, 2022. ↩
Elon Musk and Others Call for Pause on A.I., Citing 'Profound Risks to Society'. Cade Metz and Gregory Schmidt, The New York Times, March 29, 2023. ↩