Longer online version: sub-edited for print here

‘They’re not going to care until we stand up and make them care‘

Union colleagues fight to defend rights from AI

It's only right that freelance journalists will want to unpick the impact of "artificial intelligence" (AI), over and over again, as the insidious tech is increasingly unmasked as an unrepentant scraper of our intellectual property. But for its April meeting, London Freelance Branch reached out to colleagues from other trades within the Federation of Entertainment Unions (FEU) for a fresh viewpoint. Do they see AI as full of potential rather than threat?

Invited to speak to the branch were Laurence Bouvard from the actors' union Equity and John Sailing from the Writers' Guild of Great Britain.

Laurence Bouvard


Laurence Bouvard, a professional actor and voiceover artist, and the chair of Equity's Screen and New Media Committee, did not bring glad tidings for those whose livelihood depends on selling the original content they create. "AI hasn't created all the issues that we face," she began, "but it has made them a lot worse. The desire to build these new systems by taking our data is relatively new."

This comes from someone who could hardly be dismissed as a Luddite: Laurence holds a Master's degree in computer science.

Sony Walkman and cassette

For younger readers - this is a Sony Walkman cassette player, with an apparently copyright-breaching home-recorded cassette

A key problem, she noted, was the obsolescence of the Copyright, Designs and Patents Act, passed in 1988 – back when an urgent problem was the copying of "cassette tapes".

Politicians don't understand the tech behind what the likes of OpenAI, maker of ChatGPT, are doing. They only see it in broad terms of IT sector growth and innovation, and they don't like being annoyed by the little people whose work is being stolen to make it happen.

"Now, as a computer scientist, I know how much fun it is to build those machines. You go ‘Oh, I need some data. I'll just take it. Look at this cool thing I built!' without actually thinking about the people affected."

This has led to a tipping point, she said: tech companies now have to be forced to accept that they cannot just take our work and train their machines on it for free. To this end, Equity launched a campaign named "Stop AI Stealing The Show".

Its long-term aim is to get legislation changed: but resistance from politicians means this will take time. So there is a medium-term aim of obtaining collective union agreements – that is, a return to collective bargaining – to treat creative workers decently.

Key to this is the cleaning up of contracts that vaguely refer to perpetual rights on yet-to-be invented platforms and media. Laurence recounted how a colleague had recorded something for IBM years ago, only to find his voice on a website advertising cheap AI-generated voiceovers. IBM had effectively sold his voice on to a third party, and it was all legit according to the original contract he'd signed.

To help counter such abusive contracts, Equity has compiled an AI resource toolkit. Alongside this the union runs an awareness campaign that asks for simple things from those seeking to insert cheeky AI clauses into contracts: "We want consultation, consent, attribution and compensation."

Consent is important, she said, noting that as a voice artist she records dictionaries. "For all I know, somebody's taking my voice and using it for some porn websites somewhere that I don't know about. That's not right."

And it's the same for journalists, she warned. If your writing is already out there, AI tech companies will take as many samples as they want without paying you, and then replicate your style and skill, again without paying you. And the groundswell of recent legal challenges to this kind of widescale copyright theft reveals not just how commonplace this is in the tech industry but also the arrogance of it all.

"They're not going to care until we stand up and make them care."

Laurence reckoned the general public wouldn't want to read everything that's written by AI anyway. It's inaccurate and unreliable. "This is the big secret: the technology is not as good as AI tech companies keep trying to make out it is. But they don't care. For them, it's just cheaper."

Unfortunately, while AI may never be good enough to replace original journalism and performance art, she said, long before it could ever get to that point we'd all be out of work anyway. So the skills that creatives have and the value they provide right now are worth fighting for.

"We should work together on this."

John Sailing

John Sailing on screen

John Sailing, a senior organiser at the Writers' Guild of Great Britain, where he is responsible for advising members on contractual and copyright issues, said he preferred to think of AI as ‘artificial imitation'.

"It's not intelligent, it's not sentient, it's not its own thing," he observed. "It's predictive text with a dial ramped up to the absolute maximum."

And yet he noted a study that estimated there was an exposure risk of about 68.8 per cent for poets, lyricists and writers being replaced by AI. Another, by consultants KPMG, estimated 43 per cent of tasks associated with authors, writers, and translators could be automated. The threats of job displacement and wage suppression are very real.

Where he thought AI might struggle is replicating the ingenuity and humanity of skilled storytelling.

But back to copyright. AI, he said, requires "not millions, not billions, but trillions of pieces of data in order to predict the next word in a sentence, and the next sentence in a paragraph. And we know that AI developers have taken content from anywhere that they possibly can."

He referred to an investigation in The Atlantic that revealed how 183,000 books had been scraped for content without permission from the authors and without any payment: "Outright copyright infringement on a mass scale as far as we're concerned."

The Writers' Guild's policy is that authors must give their express permission before their work is used to train any type of AI. That means an opt-in system, not an opt-out one – even though many of the conversations about AI-training licensing models currently taking place between copyright holders, legislators and other stakeholders favour the latter.

"There must also be proper transparency when it comes to the work that is used to train AI," John advised, describing the current method as "a black box system" in that the outputs won't necessarily tell you what has gone into it. He said he looked forward to seeing more online tools such as haveibeentrained.com which lets visual artists search for their own work to see whether it has ended up being used to train an AI.

Transparency, yes, but payment too. "If you're having your work subsumed into AI training, you need to get paid for that," he said. "It's the basis of how we've operated copyright law within this country for a long time. And I don't see any reason why that should change."

Another matter championed by the Guild is the "right to review". John gave the example of how HMRC is hoping AI will automate the processing of tax returns. "You know as well as I do that HMRC are pretty bad at getting freelance tax affairs correct," he said. "But AI can only produce good outputs based on good inputs. So when bad decisions are made by AI, there needs to be a way of challenging them and having them reviewed by a human."

John noted some positive concessions in the collective agreement that ended last year's Writers Guild of America strike. For example, he said, a producer can't just get ChatGPT to throw together a few storylines and hand them to a writer to write a full script while insisting they will not be fully paid since "ChatGPT did half the work". Instead, producers and writers must agree jointly when AI is to be used and how it's to be used. Writers can't be forced to use AI. After writers submit their work, it cannot then be used to train an AI.

Back home in the UK, one of the problems at the moment, he said, is that a lot of people are waiting for courts to make decisions on matters such as whether AI output can be copyrighted. The Writers' Guild view is that machines should not benefit from copyright protections. Such a thing would devalue the whole basis of intellectual property being the result of intellectual human endeavour.

It doesn't help that the British government refuses to take a stand on data scraping as copyright infringement, he regretted. "They try to hold a rather ambiguous line, to keep the 'tech bros' happy."

On a final note, John warned about complacency in letting AI be used for copy-polishing, in which it handles the formatting, grammar correction and so on – the supposed low-hanging fruit. "This is where a lot of writers learned their craft," he said. "If we close off those entry routes, there's a real risk that writing will, once again, become the preserve of those who can afford it. It will push out the working class, it will push out women, it will push out writers of colour."

And all this to benefit a technology that is often questioned over racist and misogynistic bias.

Questions and answers

Q: Is it enough to have a human making the publishing decisions when the stories themselves are written by AI?

Laurence Bouvard: "The quality of AI tools is actually still not very good. So yes, they virtually all need human eyes on them. The problem with post-editing is the same as in the field of translation: it is paid very poorly or it is farmed out to people living in low-pay countries. Once people start asking for proper rates for those jobs, then maybe they will dial back a bit."

Q: What kind of legislation are you hoping for? Are there any politicians who understand the problem?

LB: "The Copyright, Designs and Patents Act desperately needs work. It's not fit for purpose any more, particularly for performers. There's also the Beijing Treaty [dealing with intellectual property in audiovisual performances] that everybody signed up to. John, you probably know more about this."

John Sailing: "There are some technical questions around the implementations of particular provisions, which I won't go into now. But talking about the legislative side, Lord Holmes introduced a Bill in the House of Lords on AI that does contain a lot of things that we are asking for. But it's a private member's bill and does not have government backing. To be honest, I don't think it will get passed because it flies in the face of the prime minister's awe for innovation and the seeming complete disregard of creative industries and the billions [of pounds] that we generate.

"If an AI is able to just take our work for free without any formal remuneration for our members, it risks taking the legs out from underneath us. I think that is going to damage the UK economy – and indirectly too, through all the secondary jobs that our industry brings.

"Internationally, the EU has passed the AI Act. Unfortunately, we're not in the EU any more, but that goes some way to helping with protections there. And in the US, President Biden made a presidential order around AI as well, which also helps give some kind of steer on things.

"At the Writers' Guild, we're members of the International Affiliation of Writers Guilds (IAWG). We are engaging with WIPO – the World Intellectual Property Organization – around some of the ways in which it needs to harmonise copyright legislation within an international framework.

"The UK is falling behind in this area. In an election year, we need to be knocking on party doors and having those conversations about what we'd be asking for in any manifesto they publish later this year."

LB: "Most politicians know nothing about tech. It is very hard for them to resist the siren call of ‘Tech! It's the future! It's lots of money!' and blah, blah, blah. It is incumbent upon us all to speak to our MPs. This country has soft power: the arts are the only field in which this country is in the number one place worldwide, but they are being hollowed out. That's not caused by AI; but it's being made worse by it, so you risk ending up with an industry of wannabes doing it as a hobby, or superstars, and nothing in between."

JS: "On a positive note, within the creative industries at large, we are all talking to each other quite a lot about these issues. Last year, there was a proposal by the government to change some of the copyright law to introduce an exception that basically meant it would be completely legal for companies to scrape data without it being considered copyright infringement.

"As an industry – not just unions, but also producers, publishers and the like – collectively we said no, that would just be terrible for us, we can't have that. We worked together to make sure that didn't happen, and it didn't. So we do actually hold quite a lot of sway."

Q: If an AI uses our work, or our style, how can individuals find out and what can we do about it right now?

LB: "There's no recourse. It's one of these things where nobody really thought about it. An article in the New York Times observed that Google had changed the wording of its consent form to allow it to scrape any Google Docs or anything else you keep on Google Drive.

"An AI can recreate your style specifically by training on your work. But then equally, it could take all romance writers' work and publish its own Mills & Boon.

"So there are two things going on and both of them are damaging. The first is damaging because it's you and it's being identified with you or your style – like when Tom Hanks got very angry because they used his AI data to sell toothpaste or insurance or something.

"But equally, walk-on background artists get very upset because [when they show up for a job] they have to go through this machine that copies their data, and they're paid maybe an extra £10. And then they are turned into generic background people and they can't say ‘Hey, that was me!'"

Q: Is the big problem that the general public simply doesn't care what AI does as long as it's cheaper?

LB: "Here's the thing: just because an AI can write a book, do we want it to? Because the other hidden thing is that AI is terrible for the environment. There's a dispute happening in the US because they are draining one of the rivers. AI works on computers; computers get hot; they need to be cooled with water.

"Do we want AI to discover cancer when it's in its first stage by looking at thousands of slides? Or do we want AI to write a movie? We need to start being sensible about what we're going to use this tool for."