
Copy and paste the real world with your phone using augmented reality

Finally, a practical use for AR that isn’t just shopping

Cyril Diagne’s ‘AR Cut & Paste’ demo in action. Image composite: The Verge

Apart from burning through VC money and educating the public on how frighteningly large emperor penguins are, what is augmented reality actually good for? Here’s one answer: real-life copy and paste.

As this awesome demo from developer Cyril Diagne shows, AR can be the perfect tool to quickly grab visuals from the real world and paste them into digital documents. Just point your phone at what you want to copy, and drag it over to your desktop. No fiddling around emailing images to yourself or cutting out objects in Photoshop. Forget it: your homework / mood board / dumb meme involving your pet’s stupid face is already done.

This is only a research prototype right now, but judging by replies to Diagne’s video, it looks like a few companies are already working on similar software. You can probably expect to see tools like this on your mobile phone in the near future.

It would certainly make a nice addition to the only other AR application that seems to have much practical use: seeing what clothes, furniture, and makeup look like pasted onto your face and / or house. And it neatly reverses the usual AR paradigm. Instead of projecting digital images into the physical world, it brings the physical into the digital.

As Diagne explains in a thread on Twitter, there are a few moving parts to his AR Cut & Paste demo. One component separates the foreground object from the background with machine learning, while another detects where your phone is pointing at your computer screen. Diagne says it takes about 2.5 seconds to copy an object and four seconds to paste it, but that could easily be sped up. In fact, he’s even put his code up on GitHub for anyone who wants to improve it themselves.
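To get a feel for the first of those moving parts, here’s a rough sketch of a “cut” step in Python. It is not Diagne’s actual code (his project uses its own salient-object network plus a separate screen-tracking service); this just illustrates the general idea of masking out a photographed object using an off-the-shelf pretrained segmentation model from torchvision, with a hypothetical input file name.

```python
import numpy as np
import torch
from PIL import Image
from torchvision import models, transforms

# Pretrained semantic segmentation model as a stand-in for the
# salient-object model the real demo uses.
model = models.segmentation.deeplabv3_resnet50(weights="DEFAULT")
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def cut_object(photo_path: str) -> Image.Image:
    """Return the photo with everything but the detected object made transparent."""
    img = Image.open(photo_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)           # add batch dimension
    with torch.no_grad():
        logits = model(batch)["out"][0]            # (num_classes, H, W)
    labels = logits.argmax(0).byte().numpy()       # per-pixel class IDs
    mask = (labels != 0).astype(np.uint8) * 255    # class 0 is background
    cutout = img.convert("RGBA")
    cutout.putalpha(Image.fromarray(mask, mode="L"))
    return cutout

# Hypothetical usage: produce a transparent PNG ready to "paste" elsewhere.
cut_object("cat.jpg").save("cat_cutout.png")
```

The “paste” half of the trick is a separate problem: figuring out which window and which pixel your phone’s camera is aimed at on the monitor, which the demo handles with its own screen-detection component.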