YouTube is working on its own take on stories, which it’s already testing in beta with a small group of top creators. Now, the company is adding the next logical feature from Snapchat and Instagram to those stories: augmented reality green screen filters, as first spotted by Engadget.
The technology, explained in full over on Google’s research blog, actually seems pretty impressive: the company is using neural networks to identify and separate a subject from the background. The net effect is the same as a green screen, allowing creators to insert whatever they want as their new backdrop in real time, but it’s all the more impressive because it doesn’t require an actual green screen to make separating the subject easier.
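The compositing step that follows the neural network is simple to sketch: once the model outputs a per-pixel mask marking which pixels belong to the subject, each output pixel is a blend of the original frame and the replacement backdrop, the same math a traditional chroma key performs. Here's a minimal illustration of that blending (the mask values here are hypothetical stand-ins for a segmentation model's output, not Google's actual pipeline):

```python
import numpy as np

def composite(frame, mask, background):
    """Blend a video frame onto a new background using a soft
    segmentation mask (1.0 = subject, 0.0 = background) -- the same
    per-pixel blend a green screen key produces."""
    alpha = mask[..., np.newaxis]  # broadcast the mask over the RGB channels
    return alpha * frame + (1.0 - alpha) * background

# Tiny 2x2 RGB example: left column is subject, right column is background.
frame = np.full((2, 2, 3), 200.0)          # the camera frame (subject pixels)
background = np.zeros((2, 2, 3))           # the replacement backdrop
mask = np.array([[1.0, 0.0],               # hypothetical model output:
                 [1.0, 0.0]])              # 1.0 keeps the frame, 0.0 swaps it
out = composite(frame, mask, background)
```

Soft (fractional) mask values at the subject's edges feather the transition, which is also where the blurring and ghosting show up when the model's mask isn't accurate.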
The system isn’t perfect yet: much like the portrait mode features on newer smartphones, which perform essentially the same task, there’s still some blurring or ghosting around the edges where the masking isn’t quite accurate. Still, it’s impressive tech to see running on a smartphone.
For now, Google intends to use the existing limited beta of YouTube stories to further test the technology, but as it continues to improve, the green screen-like tech could show up in Google’s other AR services and apps in the future.