When ChatGPT launched on November 30, 2022, it captured attention in a way that other AI products have never quite managed.
Within five days, it surpassed one million users (a feat that took Netflix three and a half years), and almost immediately, the speculation began. Amid the excitement of experimentation, people expressed concern about what this tool meant for the future of work and, specifically, the future of jobs. Will AI make all white-collar jobs obsolete? Or will it just sweep away writers? Or designers, too? What about lawyers? Software developers?
AI isn’t going to replace developers, but it is going to change software development forever. AI pair programming assistants will become integral to the development process, taking over certain time-consuming tasks, delivering massive productivity gains, and allowing developers to focus on the more creative aspects of their projects.
AI’s impact on the future of work will be profound, but it will fit within a progression of existing trends. Like past technologies from the PC to the cloud, AI will fuel a surge in demand and experimentation before ultimately becoming a normalized part of the development process.
As people, we don’t handle change well
The latest uproar over AI is a reminder that humans aren’t good at dealing with the uncertainty of change. We tend to dismiss or catastrophize new technologies. In the Phaedrus, Plato has Socrates warn that writing would eliminate the need for memory, which, when you think about it, isn’t that far removed from the tweets and TikToks forecasting that AI will wipe out entire professions.
When considering technology’s impact on jobs, we can draw two main takeaways from past innovation cycles.
First, new technologies tend to create more jobs than they erase. For example, the rise of the PC and the internet wiped out about 3.5 million jobs in the US from 1980 through 2018, according to McKinsey estimates. But that same rise also created 19.3 million jobs, for a net gain of 15.8 million jobs.
Second, new technologies are adopted over a longer period of time than we tend to assume. Technology doesn’t just spring into existence. Jobs don’t just disappear. Major technological shifts can take years to play out. The smartphone, arguably the fastest-adopted technology in human history, took four years to reach 40% market penetration.
As the U.S. Bureau of Labor Statistics observes, “[the] immediate effects [of technological change] are probably smaller than anticipated and their full impact unfolds gradually over a longer timeframe than recognized”.
It’s critical to note that these are precedents, and there’s no guarantee that what’s happened in the past will hold in the future. AI will probably follow the usual course of innovation. It will probably upend whole sectors and create new ones, eliminate some jobs but create more. And it will probably do so on a time scale that allows us to adapt.
Just remember that probably does not equal certainly. AI may advance a lot faster than we anticipate, or stall out and have nowhere near the impact most think. Regulated industries and the likelihood of future regulations could impact adoption. There’s also the possibility that AI spins out of control and we end up in some kind of doomsday scenario.
AI is the next step in the march of progress
ChatGPT didn’t come out of nowhere. AI has been under development for quite some time. One early breakthrough in neural networks, training an algorithm to detect cats in YouTube videos, occurred in 2012. And the Transformer network architecture that underpins today’s large language models (LLMs) was first proposed by Google in 2017.
Transformer-based models like GPT-3 and DALL-E are writing essays, turning natural language prompts into images and code snippets, and even identifying protein relationships to speed up drug discovery.
But they’re also building on a steady advance of innovation.
Programming itself has been evolving into something that increasingly resembles English. Python, one of the most popular programming languages today, has long been praised as being about as close to written English as a programming language gets. ChatGPT moves even closer to English by allowing users to generate code from natural language prompts.
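To see what "close to written English" means in practice, consider a small sketch (the shopping-list scenario here is invented purely for illustration):

```python
# A hypothetical example of how Python code can read almost like a sentence.
inventory = ["apples", "bread", "milk"]

# "If milk is in the inventory, say so" translates nearly word for word.
if "milk" in inventory:
    print("Milk is in stock")

# A list comprehension reads like English too:
# "keep each item in the inventory that starts with 'b'"
b_items = [item for item in inventory if item.startswith("b")]
print(b_items)
```

A natural-language prompt to a tool like ChatGPT ("check whether milk is in my inventory list") closes the remaining gap: the user describes the intent in English and receives code much like the above.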
What’s more: technology is always evolving in ways that allow developers to work at increasingly higher levels of abstraction. The higher the level of abstraction, the fewer granular, in-the-weeds details the developer needs to think about.
Every programming language is an abstraction of the 0s and 1s that all computers actually run on. Cloud is an abstraction, allowing developers to have selective ignorance of the distributed systems they’re deploying to. Containerization, elastic load balancing, low code environments, intelligent auto-complete tools like GitHub Copilot—all of them take some tedious, manual element of work and abstract it away.
AI represents the next step in these two converging trends: the blurring of programming and natural language, and the progression to higher levels of abstraction.
Where does this take us?
What does the growth of AI mean for the future of work? For developers, it’s going to mean evolving their skills, embracing new technologies, and shifting their conception of what it means to write code.
AI will abstract away an increasing share of basic but time-consuming coding tasks—think debugging, compatibility testing, and documentation. This coding grunt work can eat up a lot of time, and AI tools like GitHub Copilot, Replit Ghostwriter, Mintlify, and others are already demonstrating significant productivity improvements.
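As a hypothetical illustration of that grunt work (the function below is invented for this example, not output from any particular tool): a developer might write only the core logic, while an assistant drafts the docstring and the edge-case handling that documentation and debugging would otherwise demand.

```python
def normalize(scores):
    """Scale a list of numeric scores into the range 0.0 to 1.0.

    This docstring and the explicit edge cases below are the kind of
    routine work an AI assistant might draft: an empty list and a
    constant list are handled up front to avoid division by zero.
    """
    if not scores:
        return []
    low, high = min(scores), max(scores)
    if high == low:
        return [0.0 for _ in scores]
    return [(s - low) / (high - low) for s in scores]

print(normalize([2, 4, 6]))  # → [0.0, 0.5, 1.0]
```

None of this is intellectually demanding, but multiplied across a codebase it is exactly the time sink these tools are starting to absorb.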
Conversational interfaces like ChatGPT are showing a lot of potential and future versions could become mainstream tools in the development process, particularly for initial prototyping. It’s not hard to imagine such tools shortening development timelines by weeks. In such a scenario, the emerging skill of prompt engineering would become essential for many developers.
AI advancements will also have a major impact on future developers. Abstraction tends to increase accessibility, and therefore adoption. Think about what graphical user interfaces did for PCs, or what the cloud has done for deploying software. In the field of machine learning, abstractions created by frameworks such as PyTorch and TensorFlow opened up opportunities to students in undergraduate and masters programs, not just PhDs. AI can do the same across the board. It can help new developers get those early wins that can get them hooked. It can enable personalized learning at scale, and it can help established developers extend their skills and learn new ones.
AI seems poised to fuel a surge in software development. While it remains to be seen how fast AI will be adopted and how far its capabilities will grow, it will bring developers along with it. AI tools will make the field more accessible, deliver productivity gains, and enable developers to focus on more creative and challenging problems. All of that may mean that certain skills, even some we consider foundational, lapse in the coming years. And that is totally natural. That is what innovation is supposed to look like. It’s supposed to take us forward—to empower us to work at higher levels of abstraction.