Oh, the things people will do for love. Scratch that: the things robots will do.
ABE VR, Hammerhead VR’s new “experience” (emphatically their word and not mine), is about the exact lengths to which a robot will go in search of love. That is not necessarily a good thing—particularly if you’re a human. The robot has a twisted idea of love. He—if the robot’s gendered, it’s a he—will do just about anything in the name of unconditional love.
Since this is virtual reality, the premise offers plenty of opportunities for horror. You are the person the robot is interested in, which means the blades it wields are pointed straight at you. If VR has the capacity for immersion and even to generate empathy—as some claim—it also has the capacity to horrify. That is what ABE VR aims for, and the trailer clips suggest it is on track to pull it off. Think of it as a bad first date, but with robotsplaining instead of mansplaining.
ABE VR exists in virtual reality but not a wholly virtual world
ABE VR is not really a commentary on technology or love per se; it’s mainly an excuse to freak you out. But it’s hard not to think that the robot’s concept of love, unhinged though it may well be, is not entirely detached from our world. It is everything bad and needy and violent about the worst relationships but given a different form. This is not an entirely new problem. Technology often takes on the quality of its creators. As Jack Clark noted in a recent Bloomberg Technology article, that is a particular problem when it comes to AI:
Much has been made of the tech industry’s lack of women engineers and executives. But there’s a unique problem with homogeneity in AI. To teach computers about the world, researchers have to gather massive data sets of almost everything. To learn to identify flowers, you need to feed a computer tens of thousands of photos of flowers so that when it sees a photograph of a daffodil in poor light, it can draw on its experience and work out what it’s seeing.
If these data sets aren’t sufficiently broad, then companies can create AIs with biases. Speech recognition software with a data set that only contains people speaking in proper, stilted British English will have a hard time understanding the slang and diction of someone from an inner city in America. If everyone teaching computers to act like humans is a man, then the machines will have a view of the world that’s narrow by default and, through the curation of data sets, possibly biased.
The bad behavior in ABE VR is not strictly gendered, but it does not exist in a void. It transposes a series of sadly human behaviors into another realm. The experience is just another entry in a long line of paranoid tech thrillers, but it cannot be dismissed on those grounds; ABE VR exists in virtual reality but not a wholly virtual world.