From VHS to Web 1.0, pornographers have always been early adopters of technology, so it should be no surprise they’re pioneering the use of artificial intelligence to sell sex. According to a recent Rolling Stone article, enterprising smut-peddlers are using AI to create computer-generated simulacra of women in order to sell their “nudes.”
Claudia, the creation of a couple of computer science students using Stable Diffusion, has been posting all over Reddit for the past three months. She doesn’t exist, but shares naked pictures, posts about her sexual interests, and responds to comments—everything a real person might do to sell nude pictures. Judging by the thirsty comments, it seems to work for a lot of people. This particular robot-woman seems to have been created as an experiment, but her authors report they made over $100 selling her pictures before other Redditors called out the deception.
It isn’t known how many other AI creations are out there posing as real people. The technology isn’t quite advanced enough to make the ruse fully believable, but the time is fast approaching when it will be. While OnlyFans, the internet’s leading adult content subscription service, has a strict verification process that will likely weed out AIs, sites like Reddit don’t, so there’s no foolproof way to be sure the online model you’re lusting over isn’t actually a machine (not that there’s anything wrong with that). But there are some tells you can look for.
Check the comment section
Before you begin an online relationship with a nude model, check their post history for comments from other users calling them out. You might not have time to pore over a nude photo long enough to spot it as AI-generated, but someone else probably does, and the internet’s long history of contrarianism makes it likely a fake has already been called a fake.
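If you’d rather not scroll through post histories by hand, this kind of comment check can be scripted. Here’s a minimal sketch using PRAW, the Python Reddit API wrapper; the credentials, the username, and the keyword list are all placeholders, and you’d need to register your own Reddit API app for it to actually run.

```python
import praw  # Reddit API wrapper: pip install praw

# Placeholder credentials -- register your own app at reddit.com/prefs/apps.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="ai-model-checker by u/your_username",
)

# Words that often show up when other users call out an AI fake.
SUSPICION_WORDS = ("ai", "fake", "generated", "stable diffusion", "midjourney", "bot")

def flag_suspicious_comments(username, post_limit=10):
    """Scan comments on a user's recent posts for callouts."""
    for submission in reddit.redditor(username).submissions.new(limit=post_limit):
        submission.comments.replace_more(limit=0)  # expand collapsed threads
        for comment in submission.comments.list():
            text = comment.body.lower()
            if any(word in text for word in SUSPICION_WORDS):
                print(f"[{submission.title[:40]}] u/{comment.author}: {comment.body[:120]}")

flag_suspicious_comments("SomeAccountName")  # hypothetical username
```

Keyword matching like this is crude, so treat any hits as a prompt to go read the thread yourself, not as proof either way.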
Check the background
At present, the technology for generating AI images is advanced enough to create images of human faces that can fool many of the people, much of the time—you can check out Which Face is Real and test yourself to see what I mean—but it isn’t as good at filling in the details in the background of photos. The backgrounds of AI photos tend to look like a texture, or feature details that don’t make sense—posters with “text-like” writing that isn’t text, areas that look painted, etc.
To get around this, AI-generated humans are often presented against a blurred or relatively featureless background. Check out this pic of Claudia: Her room’s decoration-free walls and featureless TV scream “AI.” Note how the lines of the walls don’t actually meet, and how the furniture holding the TV seems to merge with the floor.
The “too perfect” look
There can be a glossy, “uncanny valley” look to computer-generated images. Many AI images smooth things out in unnatural, inhuman ways. Skin is totally free of blemishes. Hair looks painted instead of brushed.
This is a little tricky, because photo filters used on real people can do the same things—but there’s still an overall fake “vibe” to AI pictures that should make you think twice about whether they picture a real person.
Asymmetries, proportions, and hands
People are not 100% symmetrical, but we tend to be asymmetrical in recognizable ways. AI people are often off in ways that don’t make sense—an eye is too big or a different color than the other eye, there are too many teeth, the feet are way too large.
AI images tend to have an especially hard time with hands, often creating images with too many or too few fingers, or unnaturally long or short digits. A pair of glasses can be a dead giveaway: instead of being rendered as a tangible object, they can appear to melt into someone’s face. Earrings are the same deal—they’re often mismatched in AI images. Hair can be arranged in unnatural ways, or sprout from places it generally doesn’t. It’s all about the fine details.
Use AI-spotting software
If you want to really drill down on the reality of your internet crush before sending them money for lewds, use software to check for fakes. V7 Labs has a Chrome extension you can download that scans profile pictures for signs of AI fakery.
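If a browser extension isn’t your style, you can run a similar check on a saved image yourself. Below is a minimal sketch using the Hugging Face transformers image-classification pipeline; the model ID is a placeholder (not the V7 Labs tool mentioned above), so swap in whichever open-source AI-image detector you trust.

```python
# Minimal sketch: score a saved image with an open-source AI-image detector.
# Assumes: pip install transformers torch pillow
from transformers import pipeline

# Placeholder model ID -- substitute any image-classification model from the
# Hugging Face Hub that was trained to detect AI-generated images.
detector = pipeline("image-classification", model="your-org/ai-image-detector")

# Accepts a local file path or an image URL; returns labels with scores.
for prediction in detector("suspected_fake.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.1%}")
```

Treat the score as one more signal rather than a verdict—detectors tend to lag behind the newest image generators.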
You might have to learn to accept it
Claudia, the fake e-girl from Reddit, is fairly crude as AI imagery goes, but still good enough to have fooled at least some users. These types of fakes are only going to become more sophisticated, so in the future, there probably won’t be any way to tell who’s real and who’s been dreamed up by an AI.
The silver lining is that you’ll eventually be able to create your own personal e-girls, e-boys, and beyond, tailored exactly to your tastes. Whether knowing that they don’t actually exist will affect your “enjoyment” of them is another, more philosophical issue.