When Ravi Yekkanti puts on his headset to go to work, he never knows what his day in virtual reality will bring. Who might he meet? Will a child’s voice accost him with a racist remark? Will a cartoon avatar try to grab his genitals?
Yekkanti’s job, as he sees it, is to make sure everyone in the metaverse is safe and having a good time, and he takes pride in it. He is at the forefront of a new field: VR and metaverse content moderation.
Digital safety in the metaverse has gotten off to a somewhat rocky start, with reports of sexual assault, bullying, and child grooming — an issue made all the more urgent by Meta’s recent announcement that it is lowering the minimum age for its Horizon Worlds platform from 18 to 13.
Because traditional moderation tools, such as AI-enabled filters on certain words, don’t translate well to real-time immersive environments, moderators like Yekkanti are the primary way to ensure safety in the digital world. And their work is becoming more important every day. Read the full story.
– Tate Ryan-Mosley
The flawed logic of rushing out extreme climate solutions
Early last year, entrepreneur Luke Iseman says, he released a pair of sulfur-dioxide-filled weather balloons from Mexico’s Baja California peninsula, hoping they would burst miles above the ground.
It was a trivial act in itself, but effectively a small, DIY act of solar geoengineering — the controversial idea that the world could combat climate change by releasing particles that reflect more sunlight back into space.