When Apple revealed its new programming language Swift in 2014, some prophesied it would disrupt the iOS development job market and bring down salaries. Luckily for iOS developers, that didn’t turn out to be the case.
The forecasters were off the mark because they made a crucial mistake in assessing what iOS developers are paid to do. I want to tell you this story because I see the same mistake being made by the crowd that today worries AI will eat up knowledge work and make us all redundant.
Before the introduction of Swift, iOS and macOS apps were predominantly written in Objective-C1. This programming language was developed in the early 1980s, and many steered away from it because of its dated syntax and the peculiar use of square brackets for invoking methods2. Swift represented a breath of fresh air for developers on the Apple platform. Compared to Objective-C, the syntax felt lean and modern. Apple had successfully smoothed the learning curve and lowered the barrier to entry.
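To see what the fuss was about, here is a small, hypothetical comparison: the same pair of calls written as Objective-C message sends (shown in comments) and as ordinary Swift dot syntax. The string and variable names are mine, purely for illustration.

```swift
// Objective-C: each message send needs its own pair of square brackets,
// so chained calls nest outward:
//
//   NSString *greeting = [[@"hello, " stringByAppendingString:name] uppercaseString];

// Swift: the equivalent reads left to right as chained method calls.
let name = "swift"
let greeting = ("hello, " + name).uppercased()
// greeting == "HELLO, SWIFT"
```

Whether the brackets were genuinely hard or merely unfamiliar was hotly debated at the time, but the contrast in surface syntax is what most newcomers noticed first.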
Many observers thought developers from other backgrounds could now quickly jump into the Apple ecosystem thanks to Swift. They expected this to produce a sudden increase in supply in the job market without a comparable change in demand, and salaries to go down as a result.
But Swift didn’t affect the supply and demand in the marketplace because, as every enthusiastic developer attempting to jump into iOS development soon realized, there’s much more that goes into shipping an app than knowing how to code in the recommended programming language.
Swift might have made the coding more accessible, but it didn’t remove the hurdle of learning how to converse with the iOS and macOS runtime, draw UIs, store data, or interact with the various other Apple APIs and frameworks. The new programming language didn’t help navigate Apple’s infamous process for signing applications for distribution, nor did it make its arcane errors easier to understand. For all its improvements, Swift did not help developers get past the strict App Store review process.
Making it easier to write the code to build Apple apps did not disrupt the iOS development market because writing the code was and still is only a tiny fraction of what it means to work in the Apple ecosystem.
Fast forward to the present day, and we can observe similar prophecies for how ChatGPT will disrupt many professions that rely on writing or how image generation AIs will leave many artists out of a job. What a superficial analysis of the marketplace. What a demeaning mental model for what creative knowledge work means.
[ChatGPT] is unlikely in its current form to significantly disrupt the job market. Much of what occurs in offices, for example, doesn’t involve the production of text, and even when knowledge workers do write, what they write often depends on industry expertise and an understanding of the personalities and processes that are specific to their workplace.
Generative AIs will not steal your knowledge job because your job was never about pushing buttons to generate artifacts. Your job is to think critically about which buttons to push and to collaborate with other creative minds to make timely, context-specific decisions that drive the generation of artifacts.
1 – There were ways of writing apps in other languages, such as Ruby with RubyMotion, but they always remained niche. That no alternative syntax ever gained sizable traction is in itself telling of how the programming language of choice is only one of the many factors that affect an ecosystem’s learning curve.
2 – I had many conversations back in the day with people who disliked the square brackets. I always found that to be a poor approach to software development. Any software engineer worth their title should be willing to pick up any language necessary to get the job done. As a commentator put it in a heated Hacker News discussion on why Objective-C is hard, people fret over language syntax too much.