I should be racked by internal conflict. As a software engineer by profession, I have been given a mandate to use AI. Regardless of whether AI is scuttling the careers of junior engineers, or whether we’re in a bubble to rival that of the dotcom boom of the late 90s, there is nowhere I could ply my trade that would permit me NOT to use it.
As a writer by vocation, I refuse to use AI to write for me. The thought of removing myself from the process of putting word to page makes me queasy. Even if the outcome were a decent paycheck from Amazon sales ($0 earned thus far) for a week’s worth of AI hackery, I cannot bring myself to embrace AI writing.
Even though I reside in both technical and artistic provinces, each with vastly different relationships to the technology, I maintain a strict delineation regarding acceptable AI use: AI does the tedious work; it does not do the meaningful work. Caveat lector: the definitions of “tedious” and “meaningful” are tied only to my own ambitions and interests.
There is, in fact, a precedent to this relationship in the technical world.
✦ AI does what tech workers have been doing for time immemorial ✦
Anyone who’s ever built a web site has ripped off HTML from another site. Copy/paste, boom: scrolling marquee! Tiled background! Alert popup! It’s a rite of passage. It’s how we learned. You would be hard-pressed to find any software engineer who a) hasn’t done this and b) isn’t thankful to their predecessors for the opportunity.
Proprietary enterprise products at the largest companies in the world are built on open-source software, either off the shelf or heavily modified. It costs nothing, and no credit is given! There are often license restrictions, but these usually don’t prevent anyone from taking the code and using it to make a monetizable product.
Tech has had an enduring culture of sharing, so it’s not a surprise that AI use is acceptable and encouraged, even if it’s been trained on other people’s code. Most of that code was offered freely as part of this culture.
For my personal projects, I find AI liberating. It enables me to write software I otherwise wouldn’t have been able to write. I’m a terrible UI designer; despite years of attempts, I haven’t managed to cultivate the skill. AI, however, has become a front-end collaborator that can create more-than-serviceable UIs for projects that have either languished half-completed or been too embarrassing to show off.
AI augments my 20+ years of professional experience. It’s a coworker, not a surrogate.
It’s important to note that even though I haven’t written all the code by hand, there’s no point at which I’m absolved of the responsibility of knowing what I’m doing. I’m 100% responsible for the output: its correctness, its look and feel, its shortcomings. I’m responsible if paying customers do not have the experience to which they’re entitled. There’s no substitute for professional experience and human presence. Someone, a human representative, has to own the output. Someone at some point, maybe several layoffs prior, told an AI agent what to build, and no matter what, there needs to be a human in charge, even if it’s the know-nothing CEO.
✦ AI shouldn’t do the fun stuff ✦
AI models didn’t get to be as capable as they are without a ton of training material. There’s no shortage of news stories about the lack of ethics at AI companies as they eagerly pirated copyrighted work to use in their corpora. (And now Anthropic scrambles to thwart copyright issues on code they accidentally leaked? The universe does have a sense of humor!) Software code is, in large part, openly available, and reuse is acceptable. It’s NOT legal to use software against its license, work is still copyrightable, and IP law still applies, but the ethics seem to apply less to snippets of code than to the final products themselves. (I am emphatically not a lawyer.)
When these models are tasked with writing in the style of a specific author, it’s as if the author’s entire business, even their identity, is being hijacked. The IP is their work, their voice, their perspective. That these models were trained on an author’s body of work feels like an attempt to supplant them in the same way that creating a social network called Facebook with Likes and Friends and Pokes and all that would be infringement.
Even if AI tools are writing without explicit instructions to impersonate, authors haven’t opted into this relationship willingly. There was no pre-existing culture of sharing as there was in the software community.
It should go without saying that of course I don’t want anyone to impersonate me. Training ethics and fair use aside, as an artist I don’t want the robots to make art. I want to make the art. I want it to take a long time. I want to struggle with the creative process.
Software is more about the end product for me. It satisfies a functional desire. Not so with writing. The process is the thing. The output is a byproduct of the process and a reflection of the author. It says something about the person who made it, whereas most software doesn’t. (Cue someone chiming in with “Software can be art!” I agree, but mostly it’s functional.) When we think of Google’s products, aesthetics, commentary, metaphor, and evocation of ideas aren’t the first things that come to mind. That’s not the company’s raison d’être.
Far be it from me to yuck your yum if you want to read the stuff that AI agents write. (I will remain a silent judge in this regard.) I want to read books by people so I can experience the joy in the mutual recognition of quirky observations about life! It’s cool if a robot makes an interesting observation, and it doesn’t make the observation any less true, but it’s not special. It’s not a shared experience. Something is lost in the interaction.
The artist in me objects to the tool because writing is fun, even if talent eludes me. Art is for me! Robots should do my laundry, wash my dishes, clean my house, and relieve me of the toil in my life so I can self-actualize, or at least do a puzzle.
✦ I certainly didn’t ask for this ✦
I’m not going to argue about the ethics of how AI models are trained. By now that matter has been adjudicated to death and the news cycles have turned over and AI companies have grown so large and influential that their technology is being shoved down our gullets by money-grubbing CEOs holding our careers hostage. It was created unethically and it’s here to stay. The systemic issue has become a quagmire that we’ll all have to navigate over the long term.
The only way I could see myself breaking out of the sphere of influence is if I became a turnip farmer in the Yukon wastelands, and I would love nothing more, but I have kids and I doubt the school districts are any good.
The best I can manage is begrudging acceptance. The road will be long, and I’m still bitter about the idea that I need a smartphone.