This is for the future of humanity
EDITOR OF the Freelance Mike Holderness introduced Isabelle Doran, the co-vice-chair of the Creators' Rights Alliance (CRA), to the February meeting. The CRA is the body that brings together the National Union of Journalists, Equity, the Association of Photographers, the Society of Authors and other organisations that represent "those of us who make our living from copyrighted works". Mike is a former chair of the CRA.

Isabelle Doran with the Brave New World? report
Isabelle Doran: I am the CEO of the Association of Photographers and have been for four years. I'm also co-vice-chair of the Creators' Rights Alliance, with Deborah Annetts, the CEO of the Independent Society of Musicians. Our chair, Anna Ganley, is Nicola Solomon's replacement as the CEO of the Society of Authors.
For creators of all kinds, things have got harder. We've had a number of crises: with Brexit, with COVID and now, on top of that, generative AI. So the work of the Creators' Rights Alliance has never been busier.
I think we were very fortunate last year with the combination of the three of us. We had a policy and public affairs person, Lindsay Ford, who has subsequently moved over to work exclusively for the Society of Authors, and we now have Jenny. We are in a good position to really start trying to engage with ministers and push the point across about the work of creators in all its forms – as a journalist, photojournalist, author, illustrator, musician, performer, or actor.
One of the meetings we held was a Labour Party drop-in, so we made friends with Northamptonshire MP Lee Barron, who is not necessarily known to many people: he is a former postie, is very much for the cause of creators, and was very keen to find us a room in which we could have the Labour drop-in.
I want to run through with you all what we did in 2025.
Government's option 3
We started the year with the government having launched a consultation on copyright and "artificial intelligence" (AI). There were various options presented to us. Zero: do nothing. One: set up licensing [for use of copyright works by AI], which most of us supported even though the conditions of that weren't ideal: we're still for that, rather than doing absolutely nothing.
Option 2 was to literally hand over everything to big tech, which obviously we weren't going to support. And then Option 3 was the option that the minister in charge at the time, Chris Bryant, had decided was the route that we were all meant to take.
I think he's been surprised by not only the response, but also that most people said “no” to Option 3. It would allow commercial text and data-mining – with an opt-out.
As we all know, whenever we post anything online, we don't have control of that work once it goes further afield. If you're writing for a newspaper, they will distribute it, they'll licence it elsewhere. So, to rein that back in and take control of that is particularly challenging.
So thankfully we had a lot of support from those with the technology background to really push back on this third option with an opt out.
Overall, there were 11,500 responses, which is huge for a consultation.
So, I sent in our submission. What we all agreed was that we really need to push a lot of focus on Parliament and parliamentarians – because it's all well and good us putting forward our concerns, but we have to try to convince ministers. It seemed that they saw us as a “nice to have” [rather than a pillar of the economy]. But this is our careers, our jobs on the line.
So a lot of campaigning came out of that.
We had a meeting with James Frith, who is a supporter; John McDonnell was also at the meeting, putting forward our concerns about the creators' rights campaign.
Now, to be fair, the newspapers decided to push forward on this campaign. Back in May, all the newspapers for one day had a blue “Make it fair” wrap-around. We stood across the way from Parliament, waving our banners for a photo call.
We caught the attention again of the press, but also importantly, we caught the attention of ministers and civil servants, who were quite impressed.
They weren't expecting the collaboration across the creative and news media representatives, so in that respect it was a very good opportunity to share the same concerns, and to share it loudly. We got a lot of interest, and we had a drop-in with the LibDem party. Victoria Collins (LibDem MP for Harpenden and Berkhamsted) stood up in Parliament and referenced statistics from the Creators’ Rights Alliance and from its member organisations, so we got a lot of attention, and for good reason.
By that point, in July, I think, Chris Bryant realised that Option 3, allowing free use of our work unless we opt out, was not going to work. He is a fan of the creative industries. He's a fan of writers and musicians, photographers; he was there to listen to our concerns. And it was Lee Barron MP [Lab, Corby and East Northamptonshire] who arranged for us to have a room so we were able to share our concerns with Labour MPs at the time when they were voting on the Data (Use and Access) Bill.
Data Bill
I don't know how many of you were aware of this particular bill that was going through parliament. [It was] a very stuffy, boring bill that didn't seem to do very much. But we knew it was an opportunity to push for transparency [in the use and access of our data]. All our works are going into these AI programmes – we don't know which of our works, and we're assuming it's all of them – but which programmes?
So one of the big things we're asking for is transparency over that. If we're looking for remuneration and if we're looking for compensation, then we need to know what's gone into these programmes and how much of it and how it's used.
We were hoping to be able to add it to the Data (Use and Access) Bill. We did a lot of campaigning, and one Baroness stood out: Baroness Beeban Kidron, who was briefly named the Creators' Champion because her speeches were extremely eloquent and cut the mustard; she really homed in on exactly what the problem was.
She actually united the House of Lords to stand up for creators and the creative industry. It is fantastic to have that champion there for us.
The Lords went through five rounds of trying to put amendments into the legislation [mostly to mandate transparency] and five times the government pushed back and said "no, this isn't going to happen", with whatever reasons they decided to give, and that was that: they blocked it.
However, the government made two concessions: one was that they would publish the results of the consultation, because we've been waiting for a very long time to see it. The other was that they were going to do an impact assessment. That is due to be published before 18 March.
And it's extraordinary to put into law the fact that they're going to do an impact report. I don't think anybody's ever done that before. So that's what they committed to in order to get that Bill passed. Those were the only two concessions but, again, all of this pressure matters. So, of course, we've kept going with that.
Pennie Quinton: Could I ask a question on the impact report? Were you given a heads up on who was doing it, the budget that was allocated for it, how much time and what was the deadline for the impact report?
Isabelle Doran: As we get into the latter part of 2025, in summer recess ministers are all away, MPs are all on holiday, and we had the great privilege of being asked to attend the House of Lords Communications and Digital Select Committee to be able to explain the challenges that individual creators are facing.
This was in front of Baroness Kidron and various industry representatives, including Anna Ganley of the UK Society of Authors; Owen Meredith from the News Media Association. There have been a number of meetings and evidence sessions. Around Christmas we saw the [outline] results of the consultation published. I think Options 1 and 2 got something like 93 per cent or 95 per cent in favour; the government's preferred Option 3 got something measly like 3 per cent or 5 per cent.
It was very clear to the government that they needed to listen to creators and that we would all push back very hard on the opt-out. In January we began to find out that the government had put out a call to various different organisations to do some research.
So, there are two consultancy firms that have been tasked with doing this impact assessment: there's Alma consultants and then there's one called BOP consulting.
I don't know what “BOP” stands for. They are working with another party called Technopolis who apparently deal with a lot of stats to do with technology and AI.
Knowing that this report was coming down the track, some members of the Creators' Rights Alliance got together to produce a report called Brave New World. We were concerned about the representation of the creative industries, and particularly of creators, because obviously the government isn't necessarily going to paint the right picture, particularly of the impact [of AI] on all our jobs.
All our members' jobs: writers, photographers, illustrators, musicians, performers. So five of our organisations got together: the Society of Authors, the Association of Illustrators, the Independent Society of Musicians, the AoP and Equity – starting from the member surveys that we've done over the last few years.
It's down to the brilliance of a PhD graduate who is working at ISM and has an AI and copyright background, who has done all of this and produced this report, which is fantastic and hopefully very timely – because it’s hot off the press, probably about a week old.
We are hoping that it will form part of the impact assessment, because what we were worried about is, if it's just left to government they'll just look at the opportunities of AI and not consider the impact that's happening on the ground, on individual creators.
So it's been a pretty busy end to 2025 and start to 2026 already.
Freelance Champion
At the Creators' Rights Alliance we're really trying to focus on "the year of the creator": trying to pull the focus onto us, our workforce, with research and obviously supporting the initiation of the Freelance Champion. The Department for Culture, Media and Sport has been looking at appointing a Freelance Champion. I came from a meeting earlier today that was supposed to be with Minister Ian Murray – but obviously there are other things, as we know, that are happening – so it was one of the civil servants who held the meeting.
The Freelance Champion is supposed to be a representative for our industry. It'll be a difficult job, because they're only going to be working four days per month to really get to know and understand the creative industries and represent our interests. So today was about giving feedback on what that role might be.
However, it's important that we do have a Freelance Champion. It is important that we have somebody that represents our interests as well as all organisations representing our interests. So that's something that should be coming down the track this year.
Photographers losing work
Generative AI is having a huge impact, just from our perspective: the Association of Photographers did a survey of our members in January, repeating one we did last year. Last year 58 per cent had lost work to generative AI. It's the same this year at 58 per cent, but the amount of turnover lost has more than doubled. Our members are directly competing with generative AI.
AoP members often work on a commission basis; the majority are in advertising, design and editorial [illustration], fewer in journalism. Some, when they're pitching for work, are being asked to pitch against an AI proposal – this is happening on the ground and it's shocking. 58 per cent is huge.
This is why we do these surveys: so we can push it back on government and say "this is from our own survey". We found that 100 per cent of our members want transparency. It wasn't 99 per cent or 98 per cent; it was 100 per cent. That was particularly positive and we'll use that information to try again to push the government into listening to creators.
Obviously, we're still looking for fair compensation. We still want to make sure there's consent. Licensing is much more complex: certainly, from our members' perspective, they don't want to license their works for AI use – and I understand why.
Suppose you've got a supplier who's basically saying, "you know what, we're just going to take all of this and we're not going to pay a bean – but if you have licensed [your photos], maybe down the line we'll give you something." I think they've essentially caused so much upset that our members are saying, "well, why should I license my work unless I get compensation? Maybe I'll consider it, but I want to have the right to opt in, I want consent and I want to preserve my rights, which I think are very important."
Obviously we're going to continue with our opportunities for members to present concerns to MPs and Lords. For example, this report is one focal point for members to be able to write to their MPs and it's not exclusive – if you still have concerns, we recommend you write to your MP and write to them as often as possible.
MPs must reply; that's their responsibility. Even if it is formulaic, they have to do it. The more letters they get, the more they have to go to their ministers and say, “look, I keep getting hundreds of emails from my constituents and they're all saying the same thing.” So it does work. It puts pressure where we need pressure to be put.
Mistaken agents
Surprising – and unsurprising at the same time – is what "agentic AI" systems are doing: systems that complete tasks independently or with minimal human supervision, as we see with all these summaries on our social media and on our search engines. They are coming back with responses – but one of the things they're not drawing on is BBC News. That is quite interesting, bearing in mind that BBC News is, globally, one of the most respected and sought-after sources of news, whether it's national, international or local news.
I don't know how the BBC is going to benefit from going on to YouTube. Channel 4 has a small presence on YouTube.
Certainly generative AI is having an enormous impact on news media as we know it and obviously there are some alarming things that are coming out from the democratic western world. Look at the US, what it stood for, how it presented itself among its allies. Suddenly we see a US administration that is quite prepared to publish fake images, to publish racist images, to publish images that aren't real. It's just extraordinary and I think we are living in absolutely extraordinary times.
Questions: agency merger
Julio Etcchart: Thanks for sharing the wonderful work you're doing. I'm on the NUJ Photographers’ Council. One of the things we've been doing recently is raising concern about the merger of Getty and Shutterstock… while Getty were committed to not using AI in reportage images … we haven't heard from Shutterstock... So, have you got any clue what’s happening there?
Isabelle Doran: I don't have any news specifically on Shutterstock, but certainly the combination of Getty Images and Shutterstock is quite a challenge for us from a collective bargaining position. It makes it much more difficult for individual photojournalists to be able to negotiate contracts, to be able to have any bargaining power. It's not ideal, because I think… the two of them combined take over 80 per cent of the [stock image] market, which is just extraordinary, to place all that power under one combined force.
Associated Press (AP), the Press Association and Reuters will all be affected, I would imagine, from the pressure that it places on individuals, not just photographers, but also writers for those news agencies. But if you look at the wider picture, we know why the two are combining – because Getty Images has got a huge amount of debt.
Shutterstock has bought lots of companies, so you could see that either one of them was going to pay, particularly in comparison to their biggest competitor, which is Adobe. On top of that, you've got OpenAI with ChatGPT. Probably OpenAI is the largest. In comparison to OpenAI and Adobe, Getty and Shutterstock are quite small.
This is always going to affect individual creators, writers and photographers; there's no doubt about it. But in the greater scheme of things, probably the worst [concern] would be ChatGPT, because of course it's not only text, it's also images. OpenAI has done another deal with Adobe recently, so you can now use Adobe products in ChatGPT, which is extraordinary. So, it impacts us enormously, but in the grand scheme of things, they're tiny in comparison to the bigger challenge, which is OpenAI.
Mike Holderness: What is the regulator for this? Is it the monopolies authority in the US, or the UK Competition Authority, and does it have any power to say “no”?
Isabelle Doran: The CMA (the Competition and Markets Authority) certainly has, and they are investigating as far as I'm aware. They've also received lots of written submissions saying that there's a problem with the Getty Images and Shutterstock merger. I would imagine that Reuters, the PA and AP would be getting in touch to say that 80 per cent of the market is too much. It all depends on how the CMA decides to investigate and whether they feel that they've actually got the opportunity to push back and say it's too big.
Calling out theft
Phil Sutcliffe: As far as I can see the AI companies have already scraped virtually everything that has been made up till now and which is online. Which is most of the stuff that's ever been made in word or picture form... So, they've already stolen all of that.
Do the campaigners, the lobbyists supporting us, constantly use words like "steal" and "theft"? That is what they're doing to us. We need those words in there to the government, in making the arguments in public. We need to say to government: "Say 'yes, we approve of theft. We approve of billions of quids' worth of wholesale theft by giant corporations from little people'." Will government say that to everyone in this room, actual and virtual?
We've got to force them to say it in public if that is what they're going to do… As far as I can see, one part of this situation is that we've lost, because everything has been scraped, so all you can do is try to get compensation, even though you wouldn't have said yes to it had you been asked about the scraping. But law can still change in the future and affect everything we create from now on, and everything that is only a twinkle in the eye of unborn children now. So that's how important it is, and why we can't take a defeatist perspective on this…
Isabelle Doran: You're absolutely right: it's about pursuing compensation. Just the small group that published this report – that's exactly our ambition, to push for compensation. I've heard comments back, whether from other creative industry representative organisations or from civil servants, that the line is "We need to move on… We need to look at the opportunities."
And we keep saying “No, we can't – we need compensation. It is the one thing that's important.”
They're saying that everything has been scraped. Obviously, we're still producing work, so there are new works that are going up.
We know that these programs are statistical pattern-predicting programmes. They are "hallucinating", chucking out garbage, because they are machines. They are machines that don't know the difference between "yes" and "no", right and wrong.
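To see what "statistical pattern-predicting" means at its simplest, here is a toy illustration – a bigram counter that predicts the next word purely from how often words followed each other in its training text. It is a vastly simplified stand-in for what large models do with billions of parameters, and the training sentence is made up for the example; it knows nothing about truth, only frequency.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    # Count, for each word, which words followed it and how often.
    words = text.split()
    counts = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, word):
    # "Generate" by emitting the most frequent follower of `word`.
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

model = train_bigrams("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # → cat
```

The point the speaker is making falls out directly: the program has no notion of right or wrong, only of which pattern was most common in what it ingested.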
A year ago Vogue put Ariana Grande on the front cover, and they used generative AI. She ended up with six fingers.
We need to be in it for the long haul. We need to push back, and we need to keep pushing for compensation. We need to keep pushing that this is illegal, that it is theft, and that we are the David. We know Goliath is big tech over in the US. We have a battle on our hands, but bring it on – absolutely, bring it on.
Pennie Quinton: I know that you've got the concession that they would do an impact assessment. But effectively that's an internal tool of lobbying that you've won. Compared to the money that Google has, that Musk has, is there any way you can demand a certain amount of expenditure to run that impact assessment?
Because there's no way there's equality in an impact assessment. They can just get some intern in the civil service to do it if they want to. How can we hold the actual level of their investigation into the impact on our industry to account? Because otherwise it's just going to be rubbish.
Isabelle Doran: That's one of the reasons for pooling all our resources and information together – we can all challenge whatever report comes out if we're not satisfied with it.
There's an opportunity to challenge them on whether it's heading in the wrong direction. One thing I think is extraordinary, and something to consider whenever an article is being written, is that when Google was putting forward its claim of bringing wealth to the United Kingdom through productivity as opposed to jobs, they were making up their figures. That's why we need to push back, and that's why it's so important to do the surveys and to respond, whether to a consultation or to a survey: the more evidence we can provide, the more we can push back and the less they can dismiss it. But the thing that is coming down the track is definitely productivity versus jobs.
Productivity just increases the profitability of the company. They get rid of more jobs and, as we're seeing, unemployment is rising. Those who are trying to get a foothold in the creative industries, or any other industry, are struggling because all those early opportunities for young people starting out are just not there. A lot of the jobs are being swapped out for generative AI.
The government has some of those stats; but what they're desperately looking for is "opportunity". One of the concerns we have is that their focus will be on the film sector, because it has been using generative AI quite a lot in the production of films – that's where they'll see the "opportunity". So "createch" is another word to add to our vocabularies.
Susan Wallace: Also, it seems that speed is of the essence, isn't it? Because AI is absolutely everywhere; it's a runaway train, and the danger with surveys and with waiting for MPs to pick things up can be that it all happens regardless. And my question is: technically, or in practical terms, how does AI work differently with photographs rather than text? Because they seem so completely different.
Isabelle Doran: That's a good question. One of the things that I had to do very quickly when we were trying to push back from a photographic perspective was to understand the technology – because if you're going to form an argument against it without understanding it, you're just going to be labelled as "living in the past" and told "you need to get with the times; this is what's happening, the genie is out of the bottle now".
So, you've got text-to-text generative AI and then text-to-image. "Large language models" (LLMs) generate text in response to a text "prompt". Generating images from a text prompt uses a "diffusion model". These were much harder for technologists and AI developers to understand and figure out. Systems such as Midjourney and Stability AI use "stable diffusion" models: what they do is, layer by layer, add noise to an image, and then reverse that noise [drawing on the model trained on other images]. It's pattern recognition for machines.
So, the way to think about it is that memorisation definitely exists in these machines. They are having to recognise a particular pattern, so you're changing something from what we would visually see into something recognised from zeros and ones. The machine is translating – as we do when translating from English to French. It's a translation of that image for the machine to be able to read it and "understand" it and recognise the pattern.
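The "adding noise" half of the process described above can be sketched very simply. This is a toy illustration only – the function, the step constants and the gradient "image" are made up for the example, not taken from any real system. It shows the forward direction: a clean image is progressively buried in random noise until it is essentially static. A real diffusion system trains a network to undo each of those steps, so that running the reversal from pure noise produces a new image.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(image, steps, scale=0.1):
    # Each step shrinks the signal slightly and mixes in fresh noise.
    x = image.astype(float)
    for _ in range(steps):
        x = np.sqrt(1 - scale**2) * x + scale * rng.standard_normal(x.shape)
    return x

clean = np.linspace(0.0, 1.0, 64).reshape(8, 8)   # a simple gradient "image"
lightly = add_noise(clean, 1)     # still recognisably the gradient
heavily = add_noise(clean, 500)   # essentially random static

def similarity(a, b):
    # Correlation between pixel values: 1 = identical pattern, 0 = unrelated.
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

print(similarity(clean, lightly) > similarity(clean, heavily))  # → True
```

After one step the image still correlates strongly with the original; after many steps almost nothing of the pattern survives – which is exactly the state a trained model learns to work backwards from.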
There are other approaches that have been used in the past, such as "generative adversarial networks" (GANs), in which trained models compete with each other.
We mustn't confuse these generative AI systems with AI in general. AI is a much broader term, and AI has been doing things like looking at X-ray images. The danger is where it starts replacing the journalists who are reporting.
Tim Gopsill asked about deals between AI companies and publishers.
Isabelle Doran: You saw a lot of the big deals done with the newspapers. Then they've gone to book publishers. Some of the deals are fine. Some are not fine. I mean, they're still all terrible when you consider that these programs displace our jobs.
Displacement of work means that they should pay a lot more. They should pay almost a living-wage amount, instead of the £300 or £30 or whatever an author may get from a publisher.
But one of the concerns that I have is that you have programs that are going to compete with us in the market place – but also, if you say "no" to your work being appropriated when it's already been scraped, the AI companies insist that their machine can't un-learn it. They say that would be like trying to unlearn the alphabet: how are you possibly going to do that?
In some respects, that payment should be made regularly, every year, as we have with our royalties and with collective rights management. It should not be a one-size-fits-all "there's your payment, off you go."
It must be a continual payment.
The AoP has members who are not members of the collecting societies. Everybody whose work is up online needs to be a member of a collecting society. Every professional needs to be earning an income from that.
So, there are definitely hurdles to overcome. This is just the start of the journey. The big issue would still be whether the big tech companies are even going to bother to pay.
Where we do know that there's interest and where there's likely to be much more of a reciprocal arrangement would be… with new AI developers in the UK – those who feel that there's a niche that nobody's covered yet that they could fill, particularly with high-quality work. There's an opportunity there, but again, the use of our works has to be with consent.
It has to be paid on an annual basis but I haven't seen anything like that yet.
A member asked: What is preventing generative AI from saying “this was the riot that happened in Parliament Square”? Can you tell us about the “separation of church and state” so that if you're an on-the-ground reporter and you're sending in your filing first – is generative AI at some point going to want to file for you? Is this coming into the newsroom and taking the place of a journalist who's on the ground?
Isabelle Doran: You have RAG, retrieval-augmented generation, as part of the large language model landscape. Say I'm using ChatGPT (by the way, I don't). Say I ask a current question, such as "what's happening in Ukraine right now?" It will pull that information from the news media. Currently I think the challenge is, as you say: as soon as you've got this layer of agentic AI, where does it start changing that information? How do we know it's true? How do we know it's being validated?
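The "pulling information from the news media" step works roughly as follows. This is a toy sketch only – the articles, identifiers and scoring are invented for illustration (real systems use live news feeds and vector search, not word counting) – but it shows the shape of RAG: fetch relevant documents first, then paste them into the prompt the model answers from.

```python
import re

# Hypothetical stand-ins for a live news index.
articles = {
    "ukraine-frontline": "Fighting continued in eastern Ukraine on Tuesday.",
    "budget-2026": "The Chancellor set out new spending plans for 2026.",
    "ukraine-talks": "Diplomats met to discuss a possible ceasefire in Ukraine.",
}

def tokens(text):
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query, k=2):
    # Crude relevance score: how many query words appear in each article.
    q = tokens(query)
    scored = sorted(articles.items(), key=lambda kv: -len(q & tokens(kv[1])))
    return [doc_id for doc_id, _ in scored[:k]]

def build_prompt(query):
    # Retrieved text is prepended so the model answers from current
    # sources rather than from its stale training data.
    context = "\n".join(articles[i] for i in retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(retrieve("what is happening in ukraine right now"))
```

The worry raised in the discussion sits exactly at this seam: once an agentic layer rewrites or summarises the retrieved context before the reader sees it, nothing in this pipeline verifies that the final answer still matches the source articles.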
This is concerning because we know that information can be fabricated. We know that information can be distorted. I think it's incumbent on us all still to go to our newspapers, to go to our news sources, to work with our news journalists, to make sure that we verify information with those sources rather than relying on search engines, rather than putting in a question and relying on agentic AI to be able to give the answer.
We've just seen what's happened with the Washington Post, where they've got rid of so many journalists who would have been reporting on the ground on what's happening. Something I think we do need is for newspapers to be much more ethical and much more considered, and to say "this is for the future of humanity".