Image by Richard Duijnstee from Pixabay

Writing’s not exactly easy for me. Words don’t flow from my brain to fingertips so much as they get squeezed and wrung out. But it’s through squeezing and wringing that I find value in writing. By forcing myself to think clearly and cogently about a topic, I’m left with (hopefully) better-organized and more informed views on something I care about, along with the cathartic joy that can only come from self-expression.
If all that matters is text as output, there’s no doubt that an A.I. could produce more of it, faster, than I ever could. But the text isn’t entirely the point.
A.I. helps us get to outputs and outcomes faster. But what do we lose from the process along the way?
* * *
Ted Chiang has famously described ChatGPT as a “blurry JPEG of the internet.” In that same article, he makes a compelling case for writers doing hard things themselves, without A.I.:
> If you’re a writer, you will write a lot of unoriginal work before you write something original. And the time and effort expended on that unoriginal work isn’t wasted; on the contrary, I would suggest that it is precisely what enables you to eventually create something original. The hours spent choosing the right word and rearranging sentences to better follow one another are what teach you how meaning is conveyed by prose.
Writing’s just one example of a whole class of creative/mental labour in which white-collar professionals like myself engage on a daily basis. But paradoxically, many of us are rushing to embrace the technologies that promise to displace us. In doing so, we not only threaten our futures — we also cheapen the work we do today.
Here are a few examples from my own line of work that I’ve been thinking about:
Case 1: ChatGPT can write your code for you — it promises to make coding easier. When I write R to perform data analyses, there’s a lot of backtracking, rewriting, and Googling for packages or for help. But “rewriting” is another way to say “refactoring.” “Googling” is another way to say “learning something new.” And throughout this process I gain an understanding of my particular data set’s ins and outs, its quirks and nuances — information that is critical for me to appropriately analyze and interpret the data.
Case 2: It’s possible to recruit A.I.-generated users for your user research studies. That definitely makes recruitment easier. I’ve struggled a lot with the recruiting service our team uses, but through those struggles I’ve learned much about the incentive structures those users face. Overcoming those challenges has allowed me to build up my stakeholders’ confidence in my research findings.
Case 3: Sentiment analysis tools let you measure the positive/negative valence of a large set of texts, like Reddit comments or tweets. Read those tweets yourself, and you’ll have a very visceral sense of what your users find joy and delight in, as well as your users’ trials and tribulations, their pain, frustration, anguish, and anger. It’s harder to form a similar emotional connection with the topline numbers that an A.I. tool coolly spits out for you on a dashboard.
In these cases, what troubles me is not the quality of the output of A.I. tools — though, to be sure, the quality leaves much to be desired. ChatGPT’s answers are full of factual errors. A.I.-generated users have absurd, nonsensical characteristics. Sentiment analysis tools use basic dictionaries of word–valence associations that make it hard to correctly measure negation or sarcasm. But output quality can and will improve over time.
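To make that dictionary limitation concrete, here’s a toy sketch of word–valence scoring (in Python rather than R, purely for illustration; the word list and scoring function are invented for this example, not any real tool’s method):

```python
# A tiny, made-up valence dictionary: positive words score > 0, negative < 0.
VALENCE = {"good": 1, "great": 2, "love": 2, "bad": -1, "terrible": -2, "hate": -2}

def naive_sentiment(text: str) -> int:
    """Sum the valence of each known word; unknown words contribute 0."""
    return sum(VALENCE.get(word.strip(".,!?").lower(), 0) for word in text.split())

# Works on straightforward text:
naive_sentiment("I love this great app")            # 4 (positive)
# But negation flips nothing — "not good" still scores positive:
naive_sentiment("This update is not good")          # 1 (positive)
# And sarcasm reads as glowing praise:
naive_sentiment("Oh great, it crashed again. I love that.")  # 4 (positive)
```

A bag-of-valences approach like this has no notion of context, which is why negation and sarcasm slip straight through it.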
Even if an A.I. can spit out perfect lines of R for me, I won’t have gained the knowledge of packages and of my own data set that comes from working through it myself. An A.I. that writes a Medium story for me doesn’t help me think better about a topic, or give me the satisfaction of having expressed myself in my own writing.
And an A.I. that perfectly measures sentiment still can’t make me feel for my users.
* * *
I don’t mean to entirely glamourize hard work. In that oft-used analogy of technological change, no doubt the automobile was a welcome respite for the horse.

At the same time, we live in a world full of people choosing to do things the hard way. For example, people choose to knit and sew when modern machinery can produce clothing faster and cheaper. Maybe you’ve been gifted a scarf knitted by a loved one. It surely would have been easier for that person to just buy a scarf, but making it themselves was a sign of their appreciation for you. It’s appreciated because it’s hard.
Using A.I. to automate work tasks comes with a cost — we forfeit whatever value is inherent in the process of doing those tasks, value we reap only when we do them ourselves. In many cases that might be a more-than-okay trade. But other times it might not be.
Let’s be attentive to those costs, to what we give up. When we struggle — to write, to design, to code, to create, to think — and when we overcome that struggle, we gain something in return. A.I. can make work easier, but easy isn’t the only thing that matters.