Light L16 camera review: futuristic frustration

The Light L16 camera is an engineering marvel. It takes 16 different smartphone-sized imaging modules, each carefully aligned behind a piece of glass, and uses them in concert to create images that are bigger and better-looking than what the individual cameras are capable of. It does all this in a form factor that’s two or three times thicker than, but not quite as wide as, an iPad mini, something that actually fits in a few pockets and is easy enough to stow in a bag. That’s Light’s selling point for this $2,000 camera: the L16 is ostensibly a full bag of camera gear in one body.

At its best, this is the experience of using the L16: with a scrub of your thumb across the 5-inch touchscreen, you can quickly zoom from a 28mm wide-angle perspective to a 150mm telephoto view, or anywhere in between.

Otherwise, the L16 is mostly a bother. The photos it takes are in a strange quality limbo between smartphone images and something shot with a mirrorless or DSLR camera. The camera handles slower, contemplative photography just fine, but it had trouble keeping up when I photographed more demanding scenes. The desktop editing software — which is necessary to process the full-resolution images and essential if you want to get any of the L16’s photos onto an iPhone — is fairly robust, but it’s sluggish. For such a remarkable camera, the L16 constantly left me uninspired to use it. And, more often than not, it stayed right in the bag or coat pocket that it so easily fit.

If you’re having trouble grasping how the L16’s dizzying camera array works, think of it this way: you know how dual-lens phones like the iPhone or Note let you zoom from wide angle to telephoto in the camera app? This is that, stretched to the extreme. The 16 camera modules each have their own image sensor and lens, and they cover different focal lengths. There are five 28mm wide-angle modules, five midrange 70mm modules, and six 150mm telephoto ones.

The big difference between this camera and those phones is that the L16 simulates all the focal lengths in between 28mm, 70mm, and 150mm by combining data from multiple camera modules. So instead of digitally zooming in on the 28mm image to make it look like it was shot at 40mm, it’s replicating that focal length by stitching images together on the fly. This is also why the quality of the L16’s images can be a notch better than that of a single smartphone camera. The L16’s results are slightly greater than the sum of its physical parts, all thanks to some really clever software.
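The idea can be sketched in a few lines. The snippet below is purely illustrative and is not Light’s actual pipeline; it only uses the three module focal lengths named in the review, and it assumes the simple geometric relationship that field of view scales roughly inversely with focal length. The function names are hypothetical.

```python
# Illustrative sketch (not Light's actual algorithm): which fixed-focal-length
# modules can contribute to a requested focal length, and how much of each
# wider frame survives the crop to match the target field of view.

MODULE_FOCAL_LENGTHS_MM = [28, 70, 150]  # the L16's three fixed focal lengths


def contributing_modules(target_mm: float) -> list[int]:
    """Return module focal lengths at or below the target; each of these can
    be cropped down (never interpolated upward) to simulate the target view."""
    usable = [f for f in MODULE_FOCAL_LENGTHS_MM if f <= target_mm]
    if not usable:
        raise ValueError(f"{target_mm}mm is wider than the widest module")
    return usable


def crop_fraction(module_mm: float, target_mm: float) -> float:
    """Fraction of the module's frame width kept to match the target focal
    length, assuming field of view scales inversely with focal length."""
    return module_mm / target_mm


# A simulated 40mm shot: only the 28mm modules are wide enough to contribute,
# and each keeps the central 70% of its frame.
print(contributing_modules(40))         # [28]
print(round(crop_fraction(28, 40), 2))  # 0.7
```

This is also why stitching beats plain digital zoom: several overlapping 28mm crops can be combined to recover detail that a single upscaled crop would lose.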

Considering all the software and hardware work that makes this possible, this all happens remarkably fast, within a second of taking a picture. The camera quickly coughs up the results, too, though that’s because at first, it shows a lower-resolution preview of the image you take. On playback, you can tap a button to assemble a higher-quality 13-megapixel composition with better dynamic range and detail. But to get the full-resolution 52-megapixel image that the L16 can capture, the images have to be processed through the company’s desktop editing software.

I found that 13-megapixel photo was usually good enough, though. In good light and at wide focal lengths, the L16 captures photos that can fool you into thinking they were shot with a bigger sensor. But so can smartphones, and top-tier smartphones are more consistent than the L16, which tends to oversaturate its images or throw a color cast on the image in scenes with multiple lighting sources.

The L16’s performance drops off in low light. In theory, combining data from multiple images should help it combat image noise and turn out photos that are rich and bright. Most of the time, though, I wound up with blurry, dirty-looking pictures.

Quality is especially a struggle when the L16 is zoomed all the way in. Those 150mm modules have f/2.4 apertures, so they let in less light than the wider cameras (which are f/2.0). Worse, it’s hard to keep the picture steady at such a long zoom range, and there’s no image stabilization. The L16 might save all the space that a zoom lens takes up, but smaller isn’t always better at long focal lengths.

Speed is another problem. To be fair, I didn’t notice many delays when I shot slowly, which shocked me considering all the computational work the camera is doing with each shutter click. With this in mind, I found it especially useful for street photography. How often can you zoom to 150mm without pointing a long lens at someone?

But when I tried to rattle off a burst of shots, things ground to a halt. The refresh rate of the screen, which is forgivable when you’re taking one or two photos, makes capturing multiple images a headache. I’m a professed over-shooter, but the L16 has a significant speed limit.

I also found the camera’s autofocus to be slow and sometimes unreliable, especially in low light. And since all this is managed through the touchscreen, it was hard to make sure I had it focused on exactly the right spot. There’s just way more room for error here than you’d find on a traditional digital camera.

These delays and quibbles added up quickly when I tried to push the camera. While it served me well on slow walks through my neighborhood, it was far too slow to handle a chaotic scene like a dog park.

If the L16 really is supposed to supplement or replace my DSLR or mirrorless camera and a bag of lenses, it can’t just save me space. It has to offer — or at least approach — the versatility of those cameras. That means there are times that I’m going to need it to be quick. Otherwise, that trade-off might not always be worth it.

Part of my frustration with the L16’s responsiveness came from the touchscreen-only interface. The UI is fluid and fairly easy to understand, but while I like having some touchscreen options on cameras, I hate having to do everything through them, especially in moments when I’m trying to quickly change settings so I can capture a fleeting moment. The lack of knobs or a viewfinder or any physical controls (save for the shutter button) is all meant to keep the L16 slim. Light accomplished that mission, but it comes at a price.

Things don’t get any easier when it comes time to edit the L16’s images. I struggled with the company’s desktop editing app — which, I should note, is still in beta — even on a powerful MacBook Pro.

Light built a sort of pseudo-Lightroom experience, with a filmstrip layout of all your photos and some basic editing tools like exposure, contrast, and sharpness. There are also creative tools, like simulated bokeh / depth adjustments, which are reminiscent of the original pitch for Lytro’s cameras. These depth effects work okay as long as the camera actually focused on your intended focal point, but I found that isn’t always the case. If the camera focused on a person’s ear instead of their eyes, you’re probably not going to be able to fix that in the editing software, and the bokeh effects will only make it more obvious that you missed focus.

Overall, the editing software, especially those more creative tools, is sluggish. It’s not as bad as Lytro’s desktop editing software was when it ruined my experience with the Illum a few years ago. Light has put in a lot of effort in the early months of this camera’s existence to make all the software (on the computer and on the camera) faster, lighter, and easier to understand. That’s a welcome gesture for a camera that isn’t going to sell more than a few thousand units.

Images shot with the L16 at 28mm, 104mm, and 150mm — all from the same spot on the sidewalk.

But it’s not enough. Zooming in on a photo, for example, causes the software to have to reassemble the image on the fly, leading to load times between each click of the magnifying glass. Simple edits, like adjusting the exposure, can sometimes take a half-second or more to render. Added up, little delays like these stretch editing sessions that should take 30 minutes into hours.

You can skirt the desktop editor if you have an Android phone because the L16 is capable of Bluetooth file transfers. (The camera actually runs Android. The “camera” interface is just an app that Light built.) But even then, you can only transfer the 13-megapixel JPGs, not the raw DNG files. And since there’s no accompanying mobile app, and iPhones don’t allow regular Bluetooth file transfers, there’s no quick or easy way to get the L16’s images onto your Apple device.

Instead, iPhone users have to connect the camera to a computer, download the images through the desktop software (a process that can take about a half-hour for around 300 images on its own), export them (which is also a struggle), and transfer them to the phone from there. It’s a frustratingly old problem for such a futuristic camera. And when you consider how good the cameras have gotten on smartphones like the iPhone X, Google Pixel 2, or even now the Huawei P20, as well as the fact that you can shoot and edit RAW images on these phones, it makes you wonder why anyone would drop $2,000 on the L16.

I’m not that happy with the L16 as a camera. But for Light, the real value of the L16 is not as a standalone product. Rather, it’s a demonstration of certain technologies and ideas, all of which could make a lot more sense in smartphones.

Space is at a premium in smartphones, but the demands for quality and versatility when it comes to smartphone photography have skyrocketed in recent years. Even scaled down to just a few cameras, Light’s technology could still bring big optical zoom, bigger resolution, and maybe better quality to the backs (or fronts) of our smartphones.

The problem is that in the time it took Light to get to market — I saw the first prototype in October 2015 — the biggest smartphone makers in the world started implementing similar ideas. Apple, Samsung, and now Huawei use multiple camera modules on the backs of their phones, giving customers extra versatility by offering a zoom lens. They can also combine the results of these multiple cameras into a final image that is better than what any one of them could have captured on its own.

Light told me as recently as late last year that it was working with a smartphone manufacturer to incorporate its technology. It seems like the right fit for Light’s technology, but it’s unclear what part of the market the company was talking about. Maybe Light will give midtier smartphones a chance to compete with the iPhones and Samsungs of the world. It also may have missed the boat.

Despite its flaws, I still can’t really wrap my mind around the simple, stunning fact that the camera, with all its moving parts and software jujitsu, worked any time I clicked the shutter.

I never got over the feeling that the L16 is a better demonstration of an idea than it is a product. Even then, the L16’s middling results aren’t exactly promising. After all, some people did buy this thing. Light solved lots of problems to make this camera work, and maybe now it can apply some of that knowledge to a smaller form factor like a smartphone. Until then, as replacements go for bigger, bulkier cameras, my existing smartphone will do just fine.

Vox Media has affiliate partnerships. These do not influence editorial content, though Vox Media may earn commissions for products purchased via affiliate links. For more information, see our ethics policy.


I love gadgets and this might be the gadgetiest gadget in a while. Really interesting.

Yes! It’s interesting, quirky, and flawed. I can’t bear to see the specs of another smartphone; this is at least something new to chew on.

As if I didn’t already have enough trouble getting my fingers out of the frame.

I think Light’s biggest problem is the image quality. It’s simply inconsistent, particularly for the people who are legitimately anticipating that it will be able to replace their higher-end camera kit. The high MP images have bizarre artifacts where there will be tons of detail in one area, and then weird low-res blur where there is a stitch with less overlapping image data. It’s a big problem.

And the form factor on the Light isn’t even that tiny. It’s relatively thin, compared to anything with a big sensor and a lens, but it’s still a brick. No one is putting that in their pocket. And if it’s not in your pocket, who cares if it’s slightly thinner?

I think the biggest competitor to something like this is a travel zoom camera like Panasonic’s ZS100 (or their new ZS200). Big 1" sensor, so image quality will be comparable in most circumstances, and a 25-250mm (or 24-360mm!) zoom range, all in a package that is considerably smaller in every dimension but depth (where the lens protrudes about 3/4" farther). Most importantly, the image quality is at least somewhat consistent, the ergonomics are better, usage and interface are fast and predictable, and the RAW files can be post-processed efficiently by any software suite in the world. And a price tag between $550 and $800, instead of $2,000.


Yup, in short I’m super interested in this but the combo of spotty image quality and high price make this a no go for me.

I was so excited for this product when I saw it announced, but I’m not at all surprised with how this review ended up. However, I think this could definitely be a future of photography. For "pro-sumers" this could be the one camera they have in addition to their phone, as long as it eventually does image processing better.

I disagree on form factor though – this is a big difference from even a mirrorless camera. This could much more easily fit into a jacket pocket or purse than a 4/3 or APS-C camera, because the telephoto lenses on those can get pretty big. If you can get high-quality zoom from six smaller lenses that fit into an inch-thick brick, that would be a huge innovation.

I would love to own a version of this in maybe 6-8 years when the processing on this is faster, and less post-processing work is required. Also, less than $1000 please. Keep up the good work!

I’m not comparing to a wide-ranging interchangeable lens camera, but rather a 1" sensor fixed-lens superzoom.

See the size comparison in the link above.

The Canon S100 here is about the same max thickness as the L16 (though much, much smaller in every other dimension). You can see how the ZS100 compares – it’s basically the same thickness as the L16, the only difference is that it has a small lens protrusion that sticks farther out. Again, in every other dimension the ZS100 is actually much smaller. And it even has a viewfinder so that you can see in bright light. Not to mention 4K video, yada, yada, yada…

Very fair point. I personally haven’t used high-end point and shoot cameras like the Canon S100 or Sony RX100, but maybe I should. I always thought that the quality of those was only marginally higher than a modern iPhone, while a mirrorless would be very noticeable.

For it to be worth it, it has to be significantly better than your phone. That’s a high bar nowadays.

The Canon "S" series is a fair bit older, and only has the same size sensor as the new Huawei P20, so it’s only a bit better than an iPhone. The new Canon "G#X" series, the Sony RX100 series, and many of the new Panasonic compacts, however, have 1" sensors, which are a really big upgrade from a smartphone.

Still, even with a smaller sensor compact, the minute you’re zooming in farther than 50mm, the image quality gap is really dramatic. Digital zoom on smartphones is atrocious compared to even a soft, low-quality optical zoom lens.

Anyway, it’s clear the L16 is desperately first-generation technology that is more of a tech-demo than a real product, a bit like Lytro. They can’t earn serious revenue from product sales with a debut like this, but hopefully it’s able to show the potential of their technology to a company that has much, much deeper pockets and is able to invest in R&D, simplify and scale down the tech, and integrate it into a smaller, less ambitious device, at perhaps 1/5th of the price tag.

Or, best case scenario, it ends up in a smartphone.

Google / Alphabet and Light seems like a match made in heaven…

Wouldn’t it be funny if Microsoft was like "hey – that’s the Lumia we’ve been waiting for!" and then built this tech into a Surface Phone?

I know, I know, WP is dead. This is exactly the kind of thing that would have made the hardcore fans salivate, though.

And the phone still wouldn’t have any apps. Maybe PWAs will make a comeback possible in a few years. If MS has the interest in trying.

The idea of the Light L16 is promising. But it’s clearly held back by the hardware and image processing. The lack of OIS on any of the cameras and the small sensor size (1/3.2 inch, smaller than a flagship smartphone’s) lead to lackluster low-light performance. If it struggles to outperform (or even match) a flagship smartphone in low light, the only reason to purchase this would be for the greater zoom capabilities compared to a smartphone.

With the big smartphone companies already having working implementations of image processing on data from a multitude of sensors, there is less reason for them to even consider acqui-hiring Light.

I thought this was the Huawei P2000.

I was checking my calendar because I honestly thought this was a joke.

If any review required a video element, it’s this thing. I wanna see how it works in action!

The picture of that thing, all those lenses, quite literally, makes my skin crawl.

I know. It’s deliberately organized that way, but it’s disgusting to look at for some reason. I just want to "clean" it off and put a case on that thing pronto.

This camera doesn’t do anything to me, but man… I once did an image search on Google for trypophobia… yuck. It’s weird how some of the pictures are fine, but a few I just wanted to hurl. I’ve avoided looking it up ever since.

Where is your skin crawling to, quite literally?

I want to know the math behind why they oriented the lens the way they did on this… Shoggoth-cam.

While this is very cool and I hope that Light moves forward with future products like this … looking at the camera makes me feel very uneasy, like I’m looking at some sort of minor eldritch horror.

haha, that HAS to be a phone case with a picture of multiple lenses.



I don’t know why, but somehow I thought this was a phone at first, then realized it’s a camera, at which point the all-touch interface stopped making sense.

As compute power, bandwidth, and storage increase, there’s no doubt that the future of photography and videography is computational imaging replacing bulky optics.

I wonder how the number of sensors and their distances from each other affect the error in depth mapping… will two or three be sufficient?

I’d love to get my hands on one of these.
