AI takes a regular image
I've been wondering about this for a while now, and I'd love to hear what others think. If an AI takes a regular image and generates a nude version of it—especially when it's not based on any real nude photo—can we still call it a deepfake? I know deepfakes usually involve face-swapping or realistic video editing, but this feels like a gray area. It's not using a real source image for the nude part, but it's still fabricating something that doesn't exist. Thoughts?
Great question, and honestly, it's one of those areas that doesn't have a clear boundary yet. Tools like that are more of an AI "undressing" generator than a classic deepfake generator. They don't composite in real nude content—they rely on trained models to predict what anatomy might look like based on pose, lighting, and body type. It's algorithmic guesswork, not editing of real footage. That said, the ethical concerns are absolutely real. While it's technically not a deepfake in the traditional sense, the intent behind using it can be very similar. If someone uses it on a real person's image, it can be just as invasive and harmful, even if the result isn't "real" nudity.