Bold headline, right? An existential threat to what exactly? Well, um, everything compute-related. But let’s narrow this missive down to Airtable itself (for example).
Unless you live in a cave, you’re aware of ChatGPT. It’s in the process of roto-tilling pretty much everything compute-related. While it’s not generally seen as a threat to much yet, we can all see the writing on the wall: the computing world will change profoundly with LLMs, and the rate of advancement toward AGI demonstrated in recent weeks and months makes that plain.
Yesterday, OpenAI opened the kimono to reveal yet another massive game changer: GPT Plugins.
Let that sink in.
While all aspects of this new GPT capability are significant, one flies in a stratosphere above all the others - the ability to run computations. While everyone is drawn toward “access up-to-date information,” mesmerized like a ship caught in a Star Wars tractor beam, the most stunning part of the announcement will be vastly overlooked.
We crave the day when GPT can see the Interwebs in real time, and with yesterday’s announcement, GPT plugins make that happen in an instant. But as cool as that is, it will ultimately be seen as a distraction from the tectonic plates that are about to collide.
Run Computations
What will the future be like when you can ask a system to build a process (as code), provide it with some data, and then run that process against it?
As demonstrated here, OpenAI has revealed:
- ChatGPT is a full compute stack.
- ChatGPT has CPUs and GPUs available for everyone.
- ChatGPT has a file system.
- ChatGPT can transform your words into processes and execute them.
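To make that concrete, here’s a rough sketch (in Python) of the kind of process ChatGPT might synthesize and execute from a plain-English request like “total my March 2023 sales by product type.” The file name and column names (sales.csv, date, product_type, amount) are hypothetical placeholders; this illustrates the idea, not OpenAI’s actual plumbing.

```python
# Sketch of code a model might write and run against user-supplied data.
# sales.csv and its columns are hypothetical placeholders.
import pandas as pd

# Load the data the user supplied alongside the prompt.
sales = pd.read_csv("sales.csv", parse_dates=["date"])

# Keep only March 2023 and total revenue by product type.
march = sales[(sales["date"] >= "2023-03-01") & (sales["date"] < "2023-04-01")]
summary = march.groupby("product_type")["amount"].sum().sort_values(ascending=False)

print(summary)
```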
Let that sink in a little deeper.
Imagine Zapier comes along and says to OpenAI -
Hey, with that new plugin architecture, what if we made it possible to use our five thousand connectors in natural conversations to get data, documents, and other information that the LLM can use to answer questions, write reports, and otherwise serve as the dynamic gateway to everything the LLM wasn’t trained on?
This hypothetical is not fiction; it’s a done deal.
Airtable Threat
And then Zapier says -
BTW, what if we could also provide the database backend for all manner of natural-language requests to store or transform data?
This part is hypothetical but a likely reality in the near future. I warned here not long ago that Zapier is coming for more than your adhesive dollars; it wants your data.
Suppose you can interface with any datastore, retrieve data, perform computations on that data, and build the computational aspects of your apps using natural prompts in any language. Who (or what) stands to be disrupted?
For starters - everything related to integration. GPT can understand how to POST or GET data to and from any endpoint based purely on API documentation. It has this capacity today. GPT plugins take this innate integration ability to another level - they can execute data interchanges, build log files, track performance, and report exceptions.
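To illustrate, here’s a minimal sketch of the kind of endpoint a plugin might put in front of ChatGPT. It uses FastAPI (my choice, not anything OpenAI prescribes) because it automatically publishes an OpenAPI description of its routes - exactly the sort of API documentation the model reads to figure out how to GET or POST. The route, fields, and in-memory store are hypothetical.

```python
# Hypothetical plugin backend; FastAPI serves its OpenAPI spec at /openapi.json.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Sales lookup (hypothetical plugin backend)")

class SaleRecord(BaseModel):
    region: str
    product_type: str
    amount: float

# Stand-in for a real datastore.
SALES: list[SaleRecord] = []

@app.post("/sales")
def add_sale(record: SaleRecord) -> dict:
    """Store one sales record (a POST the model could compose from the docs)."""
    SALES.append(record)
    return {"stored": len(SALES)}

@app.get("/sales/{region}")
def sales_for_region(region: str) -> list[SaleRecord]:
    """Return sales for a region (a GET the model could compose from the docs)."""
    return [r for r in SALES if r.region.lower() == region.lower()]
```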
Let that sink in.
On the Upside
GPT plugins make it possible to ask ChatGPT for information stored in Airtable. The mechanism for building features that interface directly with Airtable is relatively simple. On this vector alone, we can anticipate massive integration opportunities. It will soon be possible to make queries like this:
Collect all records from my Sales table for March 2023 and give me a synopsis of the most significant sales in Montana; analyze those sales and produce a summary of trends by product type.
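Behind a prompt like that, a plugin would make something like the following call against Airtable’s REST API and hand the rows back to the model for the synopsis and trend analysis. The base ID, table name, field names, and filter formula below are hypothetical placeholders; only the general shape of the Airtable API is real.

```python
# Rough sketch of the Airtable call a plugin might make behind that request.
import os
import requests

BASE_ID = "appXXXXXXXXXXXXXX"   # hypothetical base ID
TABLE = "Sales"

url = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}"
headers = {"Authorization": f"Bearer {os.environ['AIRTABLE_API_KEY']}"}
params = {
    # Limit to March 2023 records; assumes a {Date} field in the table.
    "filterByFormula": "AND(IS_AFTER({Date}, '2023-02-28'), IS_BEFORE({Date}, '2023-04-01'))",
    "pageSize": 100,
}

records = requests.get(url, headers=headers, params=params, timeout=30).json()["records"]

# Hand the raw rows back to the model, which writes the synopsis and trends.
for rec in records:
    print(rec["fields"])
```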
When Airtable users see what GPT plugins can do, they will demand it. They will expect it.
Airtable could easily provide GPT plugins of its own. However, it’s the Airtable aftermarket companies, consultants, and users themselves who will likely step in to meet this vastly underserved market that is about to explode.