Official Report: Minutes of Evidence
Committee for Justice, meeting on Thursday, 26 June 2025
Members present for all or part of the proceedings:
Ms Joanne Bunting (Chairperson)
Miss Deirdre Hargey (Deputy Chairperson)
Mr Doug Beattie MC
Mr Maurice Bradley
Mr Stephen Dunne
Ms Connie Egan
Mrs Ciara Ferguson
Mr Justin McNulty
Witnesses:
Mr Andrew Dawson, Department of Justice
Ms Lorraine Ferguson-Coote, Department of Justice
Justice Bill — Deepfake Amendment Consultation: Department of Justice
The Chairperson (Ms Bunting): We have with us from the Department of Justice Andrew Dawson, the director of criminal justice, police and legislation, and Lorraine Ferguson-Coote, the head of the criminal policy unit. Committee members will be familiar with them both. Folks, you are very welcome to the meeting. Thank you very much for taking the time to be here. I will hand over to you to make some opening remarks, after which members will ask questions.
Mr Andrew Dawson (Department of Justice): Thanks very much. I will be as brief as I can. We are here to talk about our consultation on sexually explicit deepfake images. As you know, on 29 April, the Minister announced her intention to table an amendment to the Justice Bill at Consideration Stage to provide for Northern Ireland-specific offences to criminalise the behaviours associated with sexually explicit deepfake images. The Minister is grateful for the Committee's agreement in principle to our approach so far. We acknowledge that the inclusion of any such provisions in the Bill will be dependent on the outcome of our public consultation, eventual Executive agreement and the drafting that would need to be done.
Sexually explicit deepfake images involve a person's image — usually a woman's image — being inserted into sexually explicit content without their consent. Examples of deepfakes include videos in which a face is placed on the body of an actor in pornographic material, real images that are digitally altered so that the body is stripped of all clothing, or images that are generated by artificial intelligence to resemble the victim depicted in sexually explicit scenarios. I mentioned that it is usually women whose images are used. We are aware of at least one research report, from 2023 — there are probably others — that shows that deepfake pornography makes up 98% of all deepfake videos that are online and that 99% of the individuals in those videos are women.
The Department has drafted a public consultation paper that seeks views on legislative proposals for how those abhorrent behaviours can be dealt with. A draft of the paper was shared with the Committee to inform today's evidence session. We hope that it was helpful and provided the Committee with some opportunity to consider the scope of the proposals and to identify any concerns that we may be able to help you with today.
I should note that the proposals relate to sexually explicit deepfake images of adults only. That is because protections are already in place for children. Under article 3 of the Protection of Children (Northern Ireland) Order 1978, it is an offence to take, make, distribute or show an indecent photograph or pseudo-photograph of a child under the age of 18.
The proposals in the consultation are aimed at criminalising a range of behaviours that are associated with sexually explicit deepfake images. These are as follows: intentionally creating or requesting the creation of the images without consent, or a reasonable belief in consent, with the intention of causing humiliation, alarm or distress to the person depicted in the image; intentionally creating or requesting the creation of the images without consent, or a reasonable belief in consent, for the purposes of sexual gratification; intentionally sharing the images without consent, or a reasonable belief in consent, with the intention of causing humiliation, alarm or distress to the person depicted in the image; intentionally sharing the images without consent, or a reasonable belief in consent, for the purposes of sexual gratification; and threatening to share the images with the intent to cause fear or distress to the person depicted in the image.
On the creation of images, it is important to note the exponential rise in so-called nudifying apps, which appear to be readily available online and essentially alter a clothed image of someone so that the person appears realistically nude. Creating deepfakes is therefore no longer limited to experts. The proposals are designed to try to take account of such technological developments.
In developing the proposals, we considered provision in other jurisdictions but were mindful that any offence structure here should fit appropriately within the existing sexual offences legislative framework in Northern Ireland.
The proposals that I outlined for deepfakes are based on the existing image-based sexual abuse offences of upskirting, downblousing and cyberflashing that are already provided for in the Sexual Offences (Northern Ireland) Order 2008. It is proposed that the behaviours should be dealt with as hybrid offences: that is, triable either in the Crown Court or Magistrates' Court. Such an approach provides the ability to reflect the varying levels of seriousness of the offending behaviour and allows for the prolific offender, or the offender who demonstrates a risk of sexual harm to others, to be dealt with appropriately by the courts.
We have included motivations in the proposals, and the inclusion of the motivation of causing humiliation, alarm or distress or the sexual gratification motivation that relates to the creation or sharing of images is pertinent to the point about hybrid offences. It is considered important to provide for and distinguish between the two motivations so as to identify those who pose a risk of further sexual offending and to ensure the effective management of that risk. Where sexual gratification is proven, the offender will be subject to sex offender notification requirements, which are more commonly known as being on the sex offenders' register, and brought within the scope of a particular civil prevention order, such as a sexual offences prevention order, to help further manage risk. It is also proposed that, where there is a risk of serious sexual harm from the commission of further offences, offenders will be brought within the scope of the public protection sentences that are available under the Criminal Justice (Northern Ireland) Order 2008.
To deal with those who threaten to share the images, the motivation of an intent to cause fear or distress is considered a more appropriate proposal. It is also proposed to include a recklessness element in the offences. That is an attempt to guard against over-criminalisation, particularly of the young or the vulnerable, who may not have thought through the consequences of their actions. That is particularly relevant because, although the offences relate to the creation and sharing of images of adults, suspects over the age of criminal responsibility, which is 10 years old, will be liable to prosecution.
In the consultation, we have aligned the proposed penalties with the existing image-based sexual abuse offences that I referred to earlier, namely, on summary conviction, up to six months' imprisonment or a fine, or both, or, on conviction on indictment, up to two years' imprisonment.
Finally, we have not proposed a definition of the term "sexually explicit deepfake images" at this stage. From looking at the legislation in other jurisdictions, we know that varying definitions have been applied, and it is important that any legal definition capture all potential aspects of those images and that it keep pace with advancements in technology. We are, however, seeking views on the definition that should be applied, and, once we have received views, we will, in liaison with our legal advisers, inform the proposals to ensure that all images of that type are captured by the proposed offences and that the offences are future-proofed.
The next steps are to publish the consultation early in July and for it to run for an extended period of 10 weeks to allow for the summer period. I appreciate that undertaking a consultation over the summer is not ideal, but it is necessary on this occasion in order to allow us as much opportunity as possible to use the Justice Bill as the vehicle to effect the proposals.
That is a brief overview of the proposals. We are happy to take questions.
Mr Bradley: Thank you very much for your presentation. It is an alarming situation that we are in, with IT and AI progressing so quickly. What cooperation have you had from social media companies such as TikTok, Facebook and X?
Every computer that is logged on to has an electronic stamp. It has an IP address. How can you force companies to hand over IP addresses to law enforcement if something were published? Moreover, what cooperation are you getting from them?
Mr Dawson: We do not deal with that specifically in the deepfake amendment that we are proposing to table to the Justice Bill. The Online Safety Act 2023 contains a number of levers to provide for online platforms to remove illegal content, however. The Act sets out a list of priority offences, and there are a few well-documented cases in which that legislation was used against a number of big online companies. Lorraine may be more familiar with that.
Ms Lorraine Ferguson-Coote (Department of Justice): That is right. As Andrew said, once the action is criminalised and we have offences created, those will be dealt with under the auspices of the Online Safety Act. There has already been a big change. Ofcom, which oversees and regulates the provisions in the Online Safety Act, has seen a trend of companies taking notice. In fact, the creation-of-deepfakes offence in England and Wales was included in the Crime and Policing Bill. Once it was announced that that would be done, a couple of the big companies blocked UK users, because they knew that there would be consequences. As Andrew said, those companies have very specific responsibilities, particularly to UK users, and they can be fined up to £18 million or 10% of their global revenue. There are therefore particular provisions in place. They are in the reserved space, so they are not in our Bill, but we are seeking to criminalise those particular behaviours so that they can be dealt with under the auspices of that particular provision and thus allow our legislation to take effect.
Mr Bradley: Chair, I know that I strayed a bit there, but I am keen to know that the big companies can be held responsible for that type of activity.
Ms Ferguson-Coote: They can.
The Chairperson (Ms Bunting): Maurice, you did not stray, because it is in the information that the Department provided us with that the US, for example, legislated for companies to have to take certain images down within 48 hours of being notified of them. Does the Online Safety Act have something similar in place, whereby there is a specific deadline by which companies must remove the content?
Ms Ferguson-Coote: I am not sure about a deadline, but the Act contains particular protections that require tech companies to take down the material. In America, it is a federal offence, but I do not think that there is any specific time frame included in the Online Safety Act. We can double-check whether there is anything specific in it. There is a requirement for companies to take down the images, however.
Mr Dawson: As far as I am aware, under the Online Safety Act, platforms have to remove any illegal content where there is an individual victim, where it is flagged to them by users or where they become aware of it by other means. There is a so-called triple shield of protection: making sure that illegal content is removed; enforcing the promises that social media platforms make to users in the terms and conditions that everyone accepts when they join; and offering users the option of filtering out content that they do not wish to see. The intention behind our deepfake provisions in the Justice Bill is to move Northern Ireland into the orbit of the 2023 Act.
Ms Ferguson-Coote: We would be criminalising the behaviour, so that would come under the Online Safety Act. At the moment, England and Wales are covered. Scotland is covered for sharing, but there is definitely a need to look at it for Northern Ireland.
Ms Egan: Thank you for coming in today. I want some clarification on what you said about those who commit offences being placed on the sex offenders' register. Is that restricted purely to sexual gratification offences, and does it not include offences of intending to cause harm or humiliation?
Ms Ferguson-Coote: Thank you for your question. Only where sexual gratification is proven would people fall within the notification requirements and the civil prevention orders that manage risk. For young people, we are proposing to attach a qualification to that, in line with the other sexual image-based offences, so that they would have to commit an offence for which the penalty is 12 months' imprisonment. That is to provide a bit of protection that recognises their age and their immaturity.
Ms Egan: Why is the intention to harm or humiliate not part of that, given that that is still very serious and has the same impact on the victim?
Ms Ferguson-Coote: It does. These are only proposals, however. We have brought them into line with the upskirting, downblousing and cyberflashing offences. That is what is provided for under those offences. A very interesting fact to note is that, in England and Wales, the creation offence is a summary-only offence, not a hybrid offence. What we are proposing is that, because of the seriousness of the creation offence, more so than the sharing offence, we should offer the opportunity for the more serious end to be dealt with through the Crown Court, if necessary. We are trying to get the balance right and to take the temperature of the public through their responses. We have garnered information from the other jurisdictions and from our current legislative framework. Doing that has given us a good basis for our proposals. The proposals are out for consultation, so people who are a bit more knowledgeable in that space and who have experience on the ground can inform the consultation.
Ms Egan: OK. This is my final question. What engagement have you had with the PSNI on how the offences will be operationalised? The online world, as you know, can be very difficult to navigate. You could have a situation in which people from Northern Ireland are affected yet the person who committed the crime is not in Northern Ireland.
Ms Ferguson-Coote: We have had early engagement with all the criminal justice partners, including the PSNI. We have not received any particular responses to our proposals as yet, however. I think that that is down to the timing of our drafting the consultation to meet the time frame for the Justice Bill. The PSNI has seen our draft proposals, and we will engage with it further, in tandem with the public consultation. I am jumping ahead, but our intention is to engage fully with the partners throughout in order to make sure that, by the time that Royal Assent is granted, we are operationally ready to commence the provision.
Ms Ferguson-Coote: On the previous Justice Committee, there was a particular interest in whether such an offence could be proven. The Public Prosecution Service (PPS) is used to proving sexual gratification through the materials used.
Ms Ferguson-Coote: It was discussed. The PPS said that that is not an issue. It is not a new concept. It was part of the original voyeurism offence, and it is part of the upskirting, downblousing and cyberflashing offences. That has never been presented as an issue.
Miss Hargey: Thanks very much for the update. I have a broader point to make about consultations. I get that we are moving at pace, but we have another consultation coming up, and the Department will be conducting, by my count, 10 consultations over the summer. I am concerned about community and voluntary organisations, on which such things take effect. The summer is a busy period for families, with kids being off school. I wanted to make that point.
I am keen to know what pre-engagement there has been with the likes of the Human Rights Commission (NIHRC). A balance of rights needs to be struck, particularly when looking at the offences. We want to protect people. What will the equality screening process look like?
In your presentation, Andrew, you touched on article 3 of the Protection of Children Order. Are you confident that everything is covered by that and that there will not be any gaps?
You mentioned children's mental capacity. In the online world, they may share information and not even realise that, by doing so, they have committed an offence.
In some cases, they may know what they are doing, but, in other cases, they may not. Even though you are saying that children are not included in this, has there been any engagement with children's organisations that work in this sphere, or even the likes of the Children's Commissioner, to look at unintended consequences in those areas?
I have another question. It is about the nature of some of these issues. One of the big things that has come up is catfishing, and there may be images generated as part of that. The GAA catfish is one example. That had an effect across the island. What cross-jurisdictional work is being done to try to close any loopholes that exist? Some images may be generated in the South of Ireland or in England, Scotland or Wales but be shared here or vice versa. Are we looking at that? I am conscious that this legislation will not come to Consideration Stage, even for us. If you do not have the answers today, I would be keen to get a follow-up response.
Mr Dawson: We will try our best to touch on some of those points today. We can then follow up on the others.
Mr Dawson: I do not think that we have engaged yet with the NIHRC, but we will do so as part of the consultation. We have an early analysis of the ECHR concerns that we have identified, and, at this stage, we consider that the creation of deepfake offences will potentially engage article 10 of the convention, which relates to freedom of expression, to the extent that the provisions restrict a person's ability to post or receive certain kinds of images or videos. Article 10 is a qualified right under the convention, as you know, so any interference with those rights must be justified. Again, our work will build up the justification for that.
Miss Hargey: If you have any more information that you can provide once you do that assessment — even the Human Rights Commission's response — it would be good if you could share that with the Committee as early as possible.
Ms Ferguson-Coote: Absolutely.
Mr Dawson: Given the nature of how we are doing this, we are happy to share anything that will help your consideration.
You asked about article 3 of the Protection of Children Order. Are we confident that it is all covered? At this stage, yes, but, again, we have time over the summer to make sure that that initial assessment is correct. What about engagement with children's groups? Again, there is nothing as yet.
Ms Ferguson-Coote: Again, not as yet, but they are on our list.
Ms Ferguson-Coote: Yes. Once we consult, there will be a broad list that takes all those groups into account. It covers all the section 75 categories and beyond, so we have quite an extensive list of groups that this will go out to.
Mr Dawson: What about the cross-jurisdictional aspect?
Ms Ferguson-Coote: Sexual offences have extraterritorial jurisdiction, which means that residents and nationals here who commit an offence outside Northern Ireland that would be an offence in Northern Ireland can be prosecuted. We have not had any particular engagement on that, but we do have advisory groups for intergovernmental agreement (IGA) projects. There is a victims' advisory group, so we can bring what we are proposing to the attention of that group. We have been engaging with Republic of Ireland officials on their experience to help inform our findings and what we might want to do here.
Miss Hargey: Or even if there is legislation or good practice in one jurisdiction that can be lifted.
Ms Ferguson-Coote: We have been looking at that. There is a consultation process, and we then go out to instruct on the drafting. All the other jurisdictions that we are aware of have to be taken into account. That is all part of it, so our drafters will be informed about all of that.
On the motivation of children, I think that your question was about how we protect children who do not think through the consequences of their actions. That came up with the upskirting and downblousing provisions, and the Minister is keen that we include the motivations for that purpose. If you had a consent-based-only offence or a strict liability offence, with the age of criminal responsibility being 10, there might be a young boy of, say, 11 who is cajoled by his peers to do something like this but has not thought through what the result might be and whose intention is not specifically for sexual gratification or to humiliate, alarm or distress. Including the motivations will help in that way. Also, we put a recklessness element into the provisions. It was attached to the humiliation, alarm and distress element of the provisions, and, for cyberflashing, it was attached to the sexual gratification element because of the nature of that offending behaviour. Including recklessness means that someone is aware that what they are doing will humiliate and cause alarm and distress but takes the risk anyway, and it is unreasonable to take that risk. We have built that safeguard into the legislation for the current image-based offences. We have included it in the consultation, and it is open to views. I hope that we have been clear in the consultation that the provisions will help to drive down over-criminalisation and protect the young who may not think through their actions, because that offence is becoming prevalent. There is a lot of participation among young people in the nudifying apps and that type of technology. As Andrew said, it is not just the experts who use those technologies: they are in the hands of our children, and they are creating those images. Peer pressure may be involved, and we do not want to bring a child into the criminal justice process who does not necessarily need to be there.
Miss Hargey: A safeguard, one way or the other, is required. If the existing safeguard has been used in other cases, have any gaps been identified in the legislation that is in place? I heard on the BBC this morning that there are children as young as six who have a phone. There will be cases in which young people have deliberately shared such images, but there are issues with mental capacity. Children have access to the technology at a very young age, so my concern is about the safeguards, because there are people who target children online and send them the image. A child may not know the implications of sending that on.
Ms Ferguson-Coote: While the offences will apply to adult victims, the perpetrators can be anyone over the age of criminal responsibility. We have proposed those safeguards because we had very in-depth conversations on the provisions of the Justice (Sexual Offences and Trafficking Victims) Act (Northern Ireland) 2022 — the SOTV Act — in relation to upskirting, downblousing and cyberflashing, as there was concern about the area of motivation. For example, there was concern that, if someone said that they had done it for a joke, as banter, they would get off with it. However, if they do it as a joke, they will still be humiliating, alarming and distressing someone, so that is already caught by the provisions. We looked at that critically, took concerns from the previous Committee on board and shaped the provisions in a way that protects young people but does not allow them to get away with committing harm. There is a balance.
Mrs Ferguson: I will try to be precise. My question is on the recklessness element of the provisions. You referenced a Spanish case study in which schoolchildren were put on probation for creating and spreading AI-created images of their female peers. They were ordered to attend classes on gender and equality awareness. What consideration has been given to the use of positive obligations, including training and education, as part of the criminal justice response? What more will the Department do on public education and awareness raising on deepfake offences? I welcome the consultation, but I am conscious that it will take place in July, which is not ideal. It is critical to have direct engagement with the consultation and to utilise it as an education piece. What are your thoughts on those two areas?
Ms Ferguson-Coote: We will engage and get views through the consultation process, and our press office will be involved in that. Using social media, we will target groups that represent the interests of young people, as well as the Department of Education. We will be targeted about where we put the consultation out in order to get views.
When we have provisions to implement, we will have a promotion or engagement strategy. We did that with the SOTV Act. We are thinking about timings, as we hope to bring the Bill through as quickly as possible and obtain Royal Assent so that there is no gap between the provisions here and those in our neighbouring jurisdictions. We want to bring this in as early as possible, so we will have to begin work on our implementation plan and how we engage with young people to let them know about it. We ran a promotional campaign on upskirting, downblousing and cyberflashing because they were considered to be the most prominent of the provisions in the SOTV Act on which promotion would be best used. That was fairly successful, if you look at the statistics and the outworkings of it. It was, broadly, a social media campaign, although we did outdoor placement as well, on billboards, buses, bus stops and various other things. It was impactful despite being fairly low-budget compared with some media campaigns. We got a lot out of that campaign, so we will consider the level of promotion again. We utilise our own press office and try to do as much as we can in-house, but there may well be opportunities, if we have some money, to do more. That will be down to the budget.
Mrs Ferguson: Is the use of positive obligations included in the consultation process? What other positive options are there, particularly for vulnerable children and young people?
Ms Ferguson-Coote: In terms of what we might do to rehabilitate or work with them?
Ms Ferguson-Coote: That would probably come under the ambit of the general criminal justice system and the work that the Youth Justice Agency, for instance, does in line with other offences. We have engaged with the Youth Justice Agency, and it is aware of this consultation proposal, so we can work with our colleagues on that. I hope that that is helpful.
Mr Beattie: This is really interesting. I am really supportive of this; we are certainly going in the right direction. I do not like the term "deepfake". The term feels like a challenge, and people will take it as such. The word "fake" is good enough: it is about what we do with that. A definition is definitely needed. Connie raised some really good points about harm, distress and humiliation. I was looking at the penalties for the offences under articles 71A, 71B and 72A of the Sexual Offences (Northern Ireland) Order 2008, and, to be honest, they are pretty minor. If we are talking about
"on conviction on indictment, to imprisonment for a term not exceeding 2 years"
what would happen if the person produced 100 fake images? Would their sentence still be no more than two years, or would there be flexibility there? I am concerned that if somebody does this on an industrial scale, their punishment will be pretty meagre.
Ms Ferguson-Coote: We have aligned our penalties for the proposals with those for upskirting, downblousing and cyber-flashing because those are our baseline. This is very much a consultation and about seeking people's views. As you quite rightly point out, there is a sliding scale, so it will be up to the independent judiciary to work through the specifics of the case and attach the relevant sentence. The maximum that we are proposing is two years but that is only a proposal. We have no other basis on which to set that higher at this stage, because we are working on the basis of the other offences. It is very much open to consultation and people's views on whether that is reasonable.
Mr Beattie: I kind of get that. The point about upskirting and downblousing, however, is that it is a physical thing that has been done and can be copied and sent out on multiple occasions. We are talking, however, about the generation of fake images. An individual could do that on an industrial scale — once those images are out there, you will not get them back, no matter how hard you try — and end up with no more than six months in jail. It may be that they are sent to jail for two years but spend only half of that behind bars. I know that there is a consultation, but are people looking at this in terms of scale? It is a crime that could harm an incredible number of people, and we therefore need to make sure that we have a bracket, with a minimum and a maximum, so that there is a bit of flexibility.
Ms Ferguson-Coote: We could make that a bit clearer in the consultation. We could say, "That would be the maximum", and spell out some of your concerns. We take the point about the variation in the type of offence: upskirting, downblousing and cyberflashing can be quite opportunistic. Voyeurism can be premeditated, if someone sets up a camera to take pictures, record or observe, but a lot of the time it is opportunistic. The creation of deepfakes is more premeditated. That is our view at the moment. It is only our view, and we are open to others. However, we could make it clearer that the penalties and the maximum would be set regardless of the variation between the different cases that you talk about.
Mr Beattie: The fake images are absolutely premeditated. They are so easy to do now because of the AI tools that everybody has to hand. Even if you go onto X, you have an AI tool there that will pretty much do whatever you want it to do. It is simple to do, and it will be difficult to fight against it. However, I really want to avoid creating a challenge by calling it "deepfake". I would rather have a proper definition so that people fully understand what it is.
That was a good answer. Thank you very much.
Ms Ferguson-Coote: I will just answer the point about the term "deepfake" and the definition. As you quite rightly say, a definition is important, but we did not put anything down by way of one: there are varying definitions, as we said in the consultation, and we want to get it right. Quite often, when you come to the point of drafting, the drafters will have a clever way of designing the definition so that it captures all your intentions. That is why we did not go out to the public with a definition.
The term "deepfake" is commonly used. We had a similar difficulty with the term "revenge pornography". A lot of organisations that support victims and represent their interests do not like that term because it suggests that there is a level of blame on the victims of the photographs. "Deepfake" is a common term that is understood and a lot of young people understand what it is. We went out to promote awareness of the offences of upskirting, downblousing and cyber-flashing, which are technical names. Upskirting and downblousing are set within the context of voyeurism, and cyber-flashing is the unsolicited sharing of an image. It is just that the common terms are quite well known across —
Ms Ferguson-Coote: — the people with whom you want to engage. That is why we use those terms. We do not particularly like some of them, but their use allows better engagement. The term "deepfake" is better known for that type of behaviour than a name that we might give it in legislation.
Mr Beattie: That is fair enough. We took the term off Reddit, though: we are taking it off social media and then using it. I do not want to turn it into a popular term.
Ms Ferguson-Coote: Yes, I understand.
Mr Beattie: However, that is only my opinion. I will not die in a ditch over it: I just thought that I would raise it.
Ms Ferguson-Coote: I understand that, thank you.
Ms Ferguson-Coote: Yes, they can. The people who set up the technologies do.
The Chairperson (Ms Bunting): Is a maximum of two years' imprisonment a sufficient deterrent if people are going to make money? Some of that stuff, you say, gets 17 billion views. If that were on something of the scale of YouTube, it would make a lot of money. Is that sentence a sufficient deterrent?
Ms Ferguson-Coote: That is a good point. As I said, we have a baseline in the proposals and we very much encourage views on the levels.
The Chairperson (Ms Bunting): I am interested in the baseline because there is a massive distinction between upskirting and downblousing and the offence in these proposals. They are very different types of offence, so I am not sure that that is the best baseline to use. I presume that that will be borne out in the consultation. We will see.
Ms Ferguson-Coote: It is, yes. It is similar to the provisions in England and Wales, which have a summary-only offence for the creation of such images without the need to prove motivation. For them, sharing carries a similar penalty.
Ms Ferguson-Coote: It is. It is similar, so it is not —.
Ms Ferguson-Coote: Not at the moment, no.
Ms Ferguson-Coote: The sharing of content, which is the hybrid offence, came in with the Online Safety Act 2023, so we could have a wee look at that.
Mr Dawson: We are entirely open to challenge on it.
Ms Ferguson-Coote: We are open-minded about it.
Mr Dawson: We felt that, of the existing offences, those were the appropriate benchmarks to use, but —
Ms Ferguson-Coote: It is about knowledge of variations.
Mr Dawson: — we could be wrong.
The Chairperson (Ms Bunting): It will be interesting to see what wider society says. A number of Members have been victims of this, and it will be interesting to hear their views on whether that sentence is sufficient for the damage that has been inflicted on them, given that, for example, their children may have seen the videos of them. Their experience is replicated in that of many people, mostly women, across the piece.
Does anybody have anything further? No. In that case, folks, thank you very much indeed. All the very best, and we look forward to seeing you again when the responses come back.
Miss Hargey: Have you agreed dates for the consultation? Your paper says, "early July", but you have no —.
Ms Ferguson-Coote: We have a bit of finessing to do, and we would like to take Mr Beattie's concerns into account. We will then produce the document for the Minister's sign-off and get it out as soon as possible to a wide distribution list.
Ms Ferguson-Coote: It will be open for 10 weeks; I said that, I think.