Artificial Intelligence makes it easier than ever to create images, videos, and sounds. But just because the technology exists doesn’t mean it’s legal to use it however you want. In January 2026, the laws in the UK changed to keep up with new AI tools and how they’re being used.
The Online Safety Act 2023
In 2023, the Online Safety Act became law. It’s a set of rules for big tech companies like TikTok, Instagram, and Snapchat.
Before this law, if someone posted something harmful, it was often left to users to report it and hope it got taken down. Now, the law says these companies have a “duty of care.” This means they legally have to build their apps in a way that stops illegal content, like fake sexual images, from appearing in the first place. If they don’t, they can be fined.
The Data (Use and Access) Act 2025
Now, a law called the Data (Use and Access) Act 2025 has created new rules about how people themselves use AI. The law now covers not just what you share, but what you choose to create in the first place.
Because of these rules, the police don’t have to wait for an image to be shared with thousands of people to take action. If someone uses AI to create a private image of you without your permission, the police can treat that as a crime immediately. They now have the power to catch people who make these images, not just those who post them.

Is it illegal to create deepfakes?
A deepfake is an image or video that has been changed by AI to make someone look like they are doing or saying something they didn’t actually do. Whether a deepfake breaks the law usually depends on what the image shows.
If you create (or even just ask an AI to create) a sexual image of an adult without their permission, you are breaking the law. It doesn’t matter if you “meant it as a joke” or if you kept the image to yourself – the act of making it (or asking AI to make it) is where the crime is.
Protecting children and young people
The law is even stricter for anyone under 18. It has been against the law for a long time to have or share sexual images of children, but the rules now specifically say that AI-generated images of people under 18 are “illegal pictures.” This means that even if a photo isn’t “real,” it is still treated as a very serious crime.
Having these images on your phone, or using AI to make a photo of a friend or classmate look sexual, can lead to a permanent record that follows you for life. The police are taking this very seriously to make sure every young person feels safe online.

The ban on ‘nudification’ tools
You might have seen or heard of ‘nudification’ apps or tools that claim to use AI to “see through” clothing or create fake nude pictures of real people. The government has banned these tools completely through the Crime and Policing Bill.
It is now against the law for companies to sell these tools, and it is a crime for anyone to use them. These aren’t just “silly apps” – the law sees them as tools used to bully and upset others.
Staying safe and following the law
The best rule is simple: never use AI to create an image of someone who hasn’t said yes to it. Consent is the most important thing here, and it means that a person has clearly agreed and understands what is being made.
You should also be careful with what you receive. If someone sends you a fake sexual image, don’t send it to anyone else. Sending a fake image is just as much against the law as sending a real one, and it can cause just as much harm to the person in the photo.

Need to talk?
Legal changes can feel complicated, and you might have questions about something you’ve seen online or a tool you’ve used. If you are worried or if you think someone is using AI to target or bully you, you don’t have to figure it out alone. Talk with Meic. We provide free, anonymous, and confidential support for anyone aged 0-25 in Wales. Our advisors are here to listen and help you understand your rights without judging you.
