Thoughts on AI-Generated Photo Manipulation
The Winnipeg Free Press recently reported on an incident involving “explicitly altered” images at a Louis Riel School Division school (story embedded below or found HERE). The images in question involved the faces of underage students that were manipulated using AI.
My heart goes out to the victims of these images, and also to those who created them, likely without understanding their impact. I am seeing a lot of commentary online about the role of AI in this situation. The following are some of my initial thoughts:
Technical Education
I believe that society, students included, needs to be formally educated in what generative AI is, how it can be used, and what unethical use looks like. Deepfakes and altered media will continue to grow in use and will increasingly affect news, entertainment, and politics. Society needs to learn how to think critically about what it sees and hears, and how to report suspected deepfakes. People also need to understand the consequences of participating in this type of content creation, specifically when the content is created to cause harm or spread misinformation.
Sexual Education
Youth need a clear understanding of what safe and consensual sexual exploration looks like, especially for minors. We have seen similar boundary-crossing incidents throughout recent history, as apps like Snapchat made sharing images appear “safe” and digital cameras became more accessible. Going further back, most of us know stories of images cut out of yearbooks, pasted into pornographic magazines, photocopied, and distributed. While digital tools like social media and AI have increased the accessibility, reach, and finesse of these practices, such incidents point to a need to teach and support appropriate sexual conduct, not to a failure of the digital tools themselves.
Legislation
This type of incident calls for a stronger push for legislation regarding child sexual content on the internet. There are HUGE failings when ANY software allows these types of images to be uploaded, distributed, or (in this case) created.