The automation of intellectual labour: creative or otherwise, we’re all just workers in the end

In a recent blog post, Tom Campbell (2013) pondered the end of what has come to be known, since Richard Florida (2002), as the ‘creative class’ (or as Florida himself might prefer, the ‘Creative Class’). For those of you who don’t know, this social group is supposed to consist of ‘people who add economic value through their creativity’, including various kinds of ‘knowledge workers, symbolic analysts, and professional and technical workers’ who ‘engage in work whose function is to “create meaningful new forms”.’ (Florida, 2002, p. 68). Florida suggests that this class cannot be associated with the bourgeoisie of classical Marxist analysis because it is not defined by possession of property as Marx would have understood it: ‘Most members of the Creative Class [sic] do not own and control any significant property in the physical sense. Their property… is an intangible because it is literally in their heads.’ (ibid.) The latter statement seems remarkable only if one takes a superficial reading of Marx to be the last word on class. In fact, it describes a general characteristic of skilled non-manual workers, including members of the old professions: people whose income derives not from capital they possess but from work they perform, yet whose work commands a relatively high price on the labour market because its performance depends upon scarce forms of expertise. This describes the cool, smart, and quite possibly collar-less white-collar workers Florida lauds no more and no less than it does doctors and accountants – and teachers too, whose work is precisely to develop expertise in others. These people belong to what Tony Bennett and colleagues (2009) prosaically call the ‘professional-managerial class’, which is – after the distant elite of politicians, high-ranking executives, celebrities, and the super-rich – the most dominant group in western societies today.

Florida is most famous for his proposal that the creative class can bring about economic regeneration. Campbell’s (2013) contrary point is that while skilled non-manual work – including creative work – appears to have maintained its value better than skilled and unskilled manual work in post-industrial service economies, that situation may not continue indefinitely. Given current advances in computer technology (the increasing trend towards overseas outsourcing is something I don’t have space to consider in this essay), it does not seem fanciful to speculate that the expertise required for professional work of all kinds – from medical diagnosis to graphic and product design to investment banking – could one day be so comprehensively minimised that the distinction between skilled and unskilled labour virtually disappears, with the market value of the former falling close to that of the latter. As Campbell writes,

there are growing indications that those same forces that proved so destructive to the British working classes are starting to inflict comparable damage on the rest of society.

To put it bluntly, it seems that high-skill occupations can be mechanised and outsourced in much the same way as car manufacturing and personal finance. In recent decades, we have become accustomed to the notion that manual labour in the UK has been rendered obsolete, uncompetitive or poorly paid. But are we now prepared for the same thing to happen to skilled labour, to white-collar workers, to the creative classes?

(Campbell, 2013, parag. 4-5)

Predictably, Campbell’s essay aroused some indignation, at least a little of which came from designers (the professional group he spends most time discussing) who resented what they took to be the implication that mere computers might enable those lacking their hard-won expertise to produce work at the level to which they aspire. They had a point, but it may not have been the most pertinent one: the cutting edge demands the best, but cost-cutters demand the best value for money. How brilliant will a designer have to be in order to compete with a piece of software that permits an untrained person to produce passable work for nothing? It may not take a synthetic Neville Brody or Philippe Starck to knock a hole in the design economy.

Any designer who doubts that his or her particular form of skilled labour could one day be ‘rendered obsolete, uncompetitive or poorly paid’ should look just a little distance downstream and consider what has happened to printing. Within living memory, printers have been transformed from craftspeople who would actually position words on a page, making countless aesthetic decisions in the process, to digital device operators whose task is only to produce accurate hardcopy renditions of page images supplied by the client – often a designer who sets type onscreen, defining the overall parameters and intervening here and there, but generally letting the software do what a skilled human being would once have done. Can algorithms position type as artfully as a trained and experienced compositor? Perhaps, perhaps not; it hardly matters. Old-style letterpress printing continues, but it employs a minuscule number of workers, and accounts for an infinitesimally small proportion of the developed world’s total printed text output. A connoisseur would have no difficulty in distinguishing digitally composited and printed books from their artisan letterpress equivalents, but the average literate adult is unlikely ever to come into contact with even a single example of the latter. The conventional reader leafs through books that were wholly digital right up until the point when the offset plates were exposed (if print-on-demand: when the ink hit the paper); the e-book reader swipes through digital files whose text re-flows onscreen without the slightest input from a human compositor. The printer – in the sense that the technician at my old art school once used to be a printer – is already a relic of history.

Indeed, for most people in the developed world, the word ‘printer’ now refers not to a person at all, but to a computer peripheral: a plastic box of electronic components that sits on the desk and unremarkably spews out touchable, foldable, tearable representations of whatever was on the screen, noticed only when it jams or needs reloading with paper and ink. Can we rule out every hint of the possibility that one day a ‘designer’ will be a software application?

And what of the less glamorous white-collar occupations Campbell did not consider – for example, teaching, where some of the expert’s traditional responsibilities are now beginning to be outsourced or assumed by computer programs? Take marking. Given a humanist view of education, there can be no doubt that thoughtful evaluation of student work by appropriately trained teachers is preferable to algorithmic auto-grading: not because humans are necessarily always going to be more accurate graders but because pedagogy involves a social relationship enacted not only through classroom discussion but through the production and judgement of student work. This point was well made by a graduate teacher of electrical engineering:

I find reading through 60 essays just as tedious and time consuming as the next out-of-place grad student in a department that doesn’t value teaching, but I also recognize that reading those essays is a valuable way for me to gauge how I’m doing. Are the concepts that I think are important showing up? Are there any major communication issues? What about individuals, are some struggling, what can I do to help? How will I learn my students’ personalities and how that might affect their personal engagement with the material? How will I learn to be a better educator?

(lilengineerthatcould, 2013, parag. 4)

An intelligent and heartfelt critique. But how can cost-cutting university administrators be persuaded to give arguments of this nature more weight than the argument that auto-grading is cheap? For years, we’ve let them persuade themselves – and us – that the balance sheet trumps everything. Auto-grading is on the rise precisely because it is what the above teacher condemns it for being, i.e. ‘a tool that will enable the continual ballooning of class size.’ (lilengineerthatcould, 2013, parag. 12) Larger classes mean less individual attention for students, but they also mean reduced teaching costs per student. In some cases, educational institutions may even choose to pass those savings on – as they do with MOOCs, the courses with the worst staff-student ratios in the history of education (though see above on the best and the best value for money) – but if they do, it will be with the intention of capitalising on other potential sources of income. MOOCs, for example, may turn out simply to be a hook for student recruitment that will – like every other form of advertising – ultimately have to be paid for by the people who sign up for the ‘real’ courses. Or they may become the standard form of curriculum delivery, with other aspects of the higher educational experience being ‘unbundled’ and treated as separate (and separately chargeable) services. The latter strategy works as follows:

You want to actually meet the Brad Pitt of robotics instruction? It will cost you. You want the LeBron James of nuclear physics to examine your research paper? It will cost you. You want to chat during virtual or real office hours, get comments on your essay, or secure a coveted recommendation? Those will all cost you.

(Crotty, 2012, p. 2)

Such aggressive monetisation may seem far from the egalitarian ideal that ‘open education’ is often taken to imply. But doesn’t it at least quite splendidly contradict what I’ve been saying – isn’t it, at the end of the day, an affirmation of the value of human expertise? Not at all. In fact, it’s more or less the same scenario suggested above for the design business: a hundred thousand students for the ‘Brad Pitt of robotics instruction’ means no students at all for hundreds or even thousands of less successfully branded educators. Will more tears be shed for out-competed teachers than for de-skilled printers or for blue-collar workers whose jobs were automated out of existence or shipped overseas?

If one thing is clear, it is that a situation in which a plutocratic elite of institutions contracts a charismatic elite of star professors to ‘teach’ what would not so long ago have been considered an inconceivably vast army of students can quite successfully be passed off as a breakthrough for democracy if most of those students are getting their educational product for free. And should these things come to pass, any diminishment in the number of individuals able to earn a crust through teaching may seem a small price to have paid: particularly if it turns out that there are no good jobs left for their would-have-been students to apply for anyway.

That future is not inevitable. But the fact that it hasn’t happened yet should be no cause for complacency. Teachers: where steelworkers have gone, we yet may go. As the boring, un-hip shadow of the ‘creative class’, perhaps it falls to us to remind our cooler brothers and sisters that we’re all just workers in the end.


Bennett, T., Savage, M., Silva, E., Warde, A., Gayo-Cal, M., and Wright, D. (2009). Culture, Class, Distinction. London: Routledge.

Campbell, T. (2013). ‘The end of the creative class?’ Accessed 12 April 2013. Available at:

Crotty, J.M. (2012). ‘The coming age of the teaching megastar’. Accessed 14 April 2013. Available at:

Florida, R. (2002). The Rise of the Creative Class: And How It’s Transforming Work, Leisure, Community, and Everyday Life. New York: Basic Books.

lilengineerthatcould (2013). ‘Both sides of auto-grading argument miss the point’. Accessed 12 April 2013. Available at:

6 thoughts on “The automation of intellectual labour: creative or otherwise, we’re all just workers in the end”

  1. As a software developer, I am always trying to make tools that will enable a smaller number of people to do more, better work with less effort in less time.

    Is the demand for work increasing fast enough to outstrip these efficiency improvements? The evidence suggests not. In particular, a recent report looking at the Californian economy found that corporate profitability is soaring, but wages and employment (particularly among those who are not software engineers) remain stagnant.

    As Paul Graham (IIRC) said, “Software is Eating the World”, and, as Ray Kurzweil may well say, “The Economic Singularity is Upon Us”.

    Can we transition peacefully to an automated, post-scarcity economy? Or will there have to be economic collapse and famine first?

  2. Thanks for your perspective, William! It’s great to hear from one of the people who is ‘mak[ing] tools that will enable a smaller number of people to do more, better work with less effort in less time’. Doing that is definitely a good thing, but society works on the assumption that a person only deserves a living for as long as it remains more efficient to pay him or her to do a job than to outsource or automate it. So when you ask whether ‘the demand for work [is] increasing fast enough to outstrip… efficiency improvements’, that’s the killer question.

    Right now I’m putting together a budget for a project that will involve professional quality audiovisual production. ‘Professional quality’ means I can’t propose to do it myself with iMovie and a camcorder! But how many professionals do I actually need if I want to end up with a professional quality finished product? It turns out that – thanks to advances in hardware and software – the answer is one. Yes, I could get an even more professional product if I contracted a whole team, as once I would have had no choice but to do – but not so much more professional that it would justify the extra expense on a small project like this. What’s the overall outcome going to be? More video to watch – or fewer professionals able to earn a living by producing it? The most likely answer is: both.

    And of course, you’re right to see a danger of collapse. What happens to a consumer economy when too many of the consumers lose their income?

  3. This is an excellent post – far more developed and thoughtful than my original article!

    You have brilliantly articulated the point about technology I was trying to make in my article (and which, in failing to be clear enough, provoked a hostile reaction from many readers). As you say, the point is not that digital technologies will exactly replicate creative practices, but rather that they may well (economically) replace them. As with industrialisation, the key thing is the efficiencies and reduction in costs that new tools bring. The fact that machines can’t match the highest levels of craft and artistry matters little if they can produce acceptable work at a fraction of the cost. Hence, even though many of us would doubtless prefer our possessions to be hand-made by artisans, the objects that fill our lives and houses are almost all mass produced in factories.

    I was pleased to see that you discussed film/TV production, which I didn’t mention in my article but actually was very much in my thoughts when I wrote it. Friends of mine who are corporate video makers have seen their production budgets (and hence incomes) reduced markedly in recent years as new entrants have started to compete. Doubtless many of these new entrants are less experienced and skilled, but they use the very latest digital equipment and editing software, and are able to make films of only marginally lower quality at a much lower cost. It’s hard to see why the same process will not erode the economic value of many other occupations in the creative industries (design, music etc). Again, your discussion of printing is particularly relevant here.

    1. Thanks very much for such a generous comment, Tom, and sorry for taking a little while to reply. When I read your article, it was a real ‘I wish I’d written that’ moment: you were powerfully articulating concerns that had been preying on my mind for some time, both as they relate to my own industry (i.e. higher education) and as they relate to the industries that I study from that vantage point or otherwise interact with. So what else could I do but try to take it a little further?

      Outsourcing is a related issue that you discussed and I didn’t, by the way, and one that applies to education as much as it does to the creative industries: there’s been a lot of talk recently about ‘MOOCs’, ‘wraparounds’, and ‘blended learning’. You may have seen the open letter from the philosophy department at San José State University, explaining why it refused to pilot such a programme of outsourcing, or the open reply in which the professor who originated the outsourced content stated that the concerns raised in the first letter were legitimate.

      Your comparison with craft goods is also apt, and I considered mentioning them while drafting the above post. Artisan furniture, etc., seems incredibly expensive, but the people who produce it generally subsist on very low incomes. I spoke to a gallery director who had recently exhibited the work of one of the UK’s top basket weavers; considering the amount of work that went into these pieces, the prices were very low, but she still got customer responses of the ‘But these only cost £25 in IKEA’ type.

      As for the hostile reactions to your article, I think those are impossible to avoid online, in part because of an ingrained culture of flaming and counter-flaming. It seems that some people will read an article no more deeply than they must in order to decide which side of an imagined flamewar it belongs to. I guess that one could write less provocatively, but would one then be read at all?

      Anyway, I’m glad you found time to stop by!

    1. Thanks very much for the tip, Anthea – Enrico Moretti’s work looks very interesting and I will indeed check out his book.

      I’ve been thinking intensively about these issues recently, due to the current row over arts funding in the UK. Important though culture is, it’s just as important not to resort to false arguments in its defence – especially if those arguments lead to the use of public funds to support activities that benefit very few people.

      When I can get a moment, I’ll be posting on the topic again, so I’ll look forward to more of your helpful comments!
