My first reaction to Apple's WWDC announcement of iOS drag and drop was basically, "Lol welcome to three decades ago." But after watching a more in-depth technical explanation of the technology at Apple's follow-up Platforms State of the Union, I'm starting to wonder if Apple has a new “pinch to zoom” on its hands: a technology that doesn't just let multitouch devices compete with point-and-click desktop experiences but, in a way, surpass them.
iOS’s drag and drop from a user’s perspective is fairly simple. You press and hold on an object, and it's pinned to your finger. Then you drag the item to wherever you want it to go, and release. On an iPhone, your drop destination is limited to elsewhere in the same app, but on the iPad you can drag an item down to the new universal dock in iOS 11, jump to the app switcher view and switch apps, or hover over an app that's already open in split screen, all while holding on to your original selection.
I'm starting to wonder if Apple has a new pinch-to-zoom feature on its hands
What makes iOS drag and drop special is that you can grab multiple things at once, and they don't all have to be within a convenient marquee selection range. You can keep hold of the first object; navigate elsewhere in the app; grab something else, which is then added to the "stack" of stuff under your finger; and keep adding until you're satisfied. On a desktop it's possible to select non-contiguous items for drag and drop with shift-click or control-click, but I'm unaware of a system for grabbing onto multiple items from multiple views outside of fancy clipboard hacks.
When you’re dragging something around on iOS, multitouch means you can still fully interact with the rest of the OS. On a desktop, once you’re holding something with the cursor, you’re limited to what you can hover over for a drop target and what keyboard commands can marshal.
One demo Apple provided is on the familiar iOS home screen. It’s been rebuilt with the drag-and-drop API, and that means you can grab an app, swipe to another screen, grab a few more apps, and then drop them all at once into a folder. That’s a huge win over the one-at-a-time limitation that’s always been present on iOS.
According to Apple’s documentation, you can actually have multiple of these drag activities going on at once. “As many as the user’s fingers can handle,” is the official spec, which sounds bonkers.
Multi-grab makes iOS drag and drop stand out, but it's Apple's APIs that could make it even more powerful. It's all thanks to the blessing and curse of iOS: Apple's stranglehold on how apps talk to each other in the operating system. To make an item available for drag and drop, developers have to implement a specific API from Apple, which allows them to specify thumbnails, metadata, and the exact data payload. When a user hovers an item or a collection of items over another app, that app can see the metadata, but not the actual payload, for security reasons. But the ability to use metadata hints and arbitrary data formats in a drag-and-drop interaction could unlock really interesting new inter-app workflows and interactions.
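The API in question is UIKit's drag-and-drop framework in the iOS 11 SDK. The type and method names below are Apple's, but this is an illustrative sketch, not code from Apple's demos; the image name is a made-up placeholder. The key detail is that the payload lives inside an NSItemProvider, which defers loading the actual data until a drop is accepted:

```swift
import UIKit

class PhotoDragDelegate: NSObject, UIDragInteractionDelegate {
    // Called when the user begins a drag on the view this delegate serves.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        // Hypothetical asset; a real app would vend its own model object.
        guard let photo = UIImage(named: "vacation") else { return [] }

        // NSItemProvider carries the payload lazily — the receiving app sees
        // type metadata while hovering, but gets the data only after the drop.
        let provider = NSItemProvider(object: photo)
        let item = UIDragItem(itemProvider: provider)
        item.localObject = photo // shortcut visible only inside the source app
        return [item]
    }
}

// Opting a view in (drag is opt-in per view; system views on iPad
// get it automatically):
//   let interaction = UIDragInteraction(delegate: photoDragDelegate)
//   imageView.addInteraction(interaction)
```

Returning an empty array from the delegate simply means nothing is draggable at that touch point, which is how apps keep non-content chrome from being picked up.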
For instance, dragging a location from Maps includes a detailed thumbnail preview, which will show up when you drop that location into an app that supports images — like Notes, for instance. That sort of “expansion” is a common feature in apps like Messages or Slack, but now it’s baked into a very basic part of the UI.
Adobe gave a demonstration where a presenter selected a few color samples and a brush from the company’s Capture app and placed them into Photoshop Sketch, all in one drag-and-drop operation. The presenter then grabbed multiple layers from Sketch and placed them into Photoshop Mix. In Mix you can even target a location on the canvas for the assets to land, or place them in the layer stack.
Any app that implements the standard text controls of iOS gets basic drag and drop for free, and other Apple-built UI tech, like table views, will work well with drag and drop out of the gate. But if developers embrace the extended opportunities here, we could see apps with more complicated data types talk to each other in really novel ways.
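For table views, that "out of the gate" support comes down to adopting one small delegate protocol from the iOS 11 SDK, UITableViewDragDelegate. A minimal sketch, assuming a hypothetical notes list:

```swift
import UIKit

class NotesListViewController: UITableViewController, UITableViewDragDelegate {
    // Hypothetical data source for this sketch.
    let notes = ["Groceries", "WWDC ideas", "Call mom"]

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.dragDelegate = self // one line opts the whole table in
    }

    // Returning a non-empty array makes the touched row draggable; UITableView
    // handles the lift animation and the multi-item "stack" gesture itself.
    func tableView(_ tableView: UITableView,
                   itemsForBeginning session: UIDragSession,
                   at indexPath: IndexPath) -> [UIDragItem] {
        // NSString already conforms to NSItemProviderWriting, so plain-text
        // rows become draggable with no extra plumbing.
        let provider = NSItemProvider(object: notes[indexPath.row] as NSString)
        return [UIDragItem(itemProvider: provider)]
    }
}
```

Because the table view owns the gesture handling, the per-row work is only deciding what data a row represents.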
You need a lot of apps to speak a common language
Maybe you could drag a video file from an editing app into an app that only accepts images and get the video's thumbnail, or drop it into an audio app and extract the audio. Or drag a file from Dropbox and drop it as a URL in an app that doesn't support that kind of file. Apple has already shown interoperability between data-rich Calendar items and Reminders. Obviously, Adobe is using custom layer and brush data formats to make it easier to jump between its own iOS apps, but conceivably any other design app could implement support for these data types.
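That kind of graceful degradation maps onto NSItemProvider's ability to register several representations of one item, from highest to lowest fidelity; the receiving app loads the richest type it understands. The registration calls below are real iOS 11 API, but the loading closure is a placeholder rather than a working exporter, and the poster-frame image name is invented:

```swift
import UIKit
import MobileCoreServices

// One dragged item, two representations: a video-aware app gets the movie,
// while an image-only app falls back to the poster frame.
let provider = NSItemProvider()

// Highest fidelity: the full MPEG-4 movie data.
provider.registerDataRepresentation(forTypeIdentifier: kUTTypeMPEG4 as String,
                                    visibility: .all) { completion in
    // Placeholder — a real app would hand over the movie file's data here.
    completion(nil, nil)
    return nil // no Progress object for this sketch
}

// Lower fidelity fallback: a still image of the first frame.
if let poster = UIImage(named: "posterFrame") {
    provider.registerObject(poster, visibility: .all)
}

let dragItem = UIDragItem(itemProvider: provider)
```

Registration order matters: providers are expected to list representations from richest to simplest, which is what lets a drop target negotiate down to whatever it can accept.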
Maybe I’m overhyping this. So much is left up to developers that rich drag-and-drop actions could be too rare to matter. For this to work, you need a lot of apps to speak a common language, and Apple is only specifying a format for very generic assets. But if it goes well, I can now imagine a world where better drag and drop is a reason to pick iOS over macOS for complicated, multi-app workflows. Is that crazy to say out loud?