A writer’s verdict on AI tools – hype or the real deal?

My career in journalism has just ticked over the 30-year mark. The job has changed beyond recognition, driven largely by the advance of digital technology. I have always been a tech evangelist, excited by each innovation that has helped me work smarter and get the news out faster. So I was keen to see how well artificial intelligence (AI) tools can write, how accurate they are, and whether they can align with the brand voice of our clients.

This is what I found. 

Note: Formative Content has made the decision not to use AI tools for content creation at this stage. We have a cross-functional AI working group running a series of sprints to test what might be possible in the future, but for now, we are very much in trial mode. 

Can AI do my job better than me? 

ChatGPT is the generative AI platform that started all the hype.

I asked it to write a blog summarising the key themes that emerged from the World Economic Forum’s Annual Meeting in Davos in January 2023. I had just returned from a busy week in Davos working for a client, so I had clear expectations of what would be in the blog.

In less than a minute, ChatGPT had delivered its version of events in the Swiss mountains. On the surface, the blog looked plausible – but the content was bland and the tone generic. It had none of the colour and flair that you would expect from a human writer who had been in the thick of it at the world’s most exclusive networking event. 

More worryingly, the article contained inaccuracies. As I was using ChatGPT-3, which is trained only on data that existed up to 2021, it essentially made the whole thing up. Perhaps it should have confessed its ignorance of any events beyond its training data, rather than trying to please with inaccurate information.

I had asked ChatGPT to include a quote from a prominent speaker who had made headlines at Davos. The words attributed to the speaker looked plausible, but, of course, they weren’t – and couldn’t have been – from the 2023 event.

The importance of safe hands

It’s a cause for concern that anyone might put their faith in the platform to deliver content for publication without a great deal of human intervention, given the reputational risks.

There are other AI-powered tools out there claiming to be the writer’s new best friend, but each one I tried set all my journalist’s alarm bells ringing. One platform produced a one-click blog about the new EU/UK trade deal. It was riddled with the sort of inaccuracies that would get a human author fired. 

All content creators – whether in agencies or in-house teams – have a huge responsibility in protecting the brands they work with. For all the reasons given above, the risks currently associated with AI-generated content significantly outweigh the benefits.

Looking to the future

At Formative Content, we are looking to adopt AI in a way that will support human enterprise and hone skills. We are not there yet, so we continue to test. 

When we reach a point where we have confidence in the technology and our governance around its use, we will consult with each of our clients individually to align with their tolerances and expectations. 

Of course, we’re not alone in our quest to figure out the best use cases for AI within a well-defined governance framework. US tech magazine Wired published details of its evolving approach to content creation with the help of AI. In the article, Wired states it will not publish stories with text generated by AI, either in whole or in part, citing the fact that “AI tools are prone to errors and bias”. 

There’s little doubt AI will become a powerful tool to help us with written content creation. We are already experimenting to see how it might perform – suggesting headlines and keywords, or a structure and key themes for articles. However it evolves, I can’t see a time when AI is operating without a significant degree of human experience in the mix, to ensure it is working to the same high editorial standards we expect of our people. 

As we progress cautiously with AI, even the people at the sharp end of developing the technology are calling for a considered and careful approach. In an open letter, almost 1,500 tech luminaries called for a six-month pause on the development of the next generation of AI, citing potential risks to human society. The signatories include Apple co-founder Steve Wozniak, Turing Award winner Professor Yoshua Bengio and Stuart Russell, Director of the Center for Intelligent Systems at Berkeley. Elon Musk also signed the letter. (Note: The letter has since attracted controversy due to questions over its content and signatories.)

What’s next for writers? 

When the ChatGPT hype went into overdrive late last year, it left me wondering what would become of writers and journalists just starting their careers. We asked ourselves the same question almost 30 years ago, when the internet arrived. It’s likely that AI, just like all those other digital tools, will accelerate our performance and enable us to get the job done faster. Its impact may be greater, but I don’t see human writers being replaced by machines any time soon.

That’s partly because everything I have seen from AI so far appears derivative, drawing on what humans have already created for its outputs. I truly believe human creativity, in all its boundless originality, will prevail.

 


 

About the author: Simon Torkington is a Senior Writer at Formative Content, with over thirty years in journalism. Connect with him on LinkedIn here.
