iOS 15 may give the iPhone’s camera an upgrade: fewer green flares

The latest version of Apple’s iOS 15 beta seems to be making subtle improvements to photos by processing out the green lens flares that can show up in outdoor pictures (via 9to5Mac). News of the feature was posted to the iOSBeta subreddit by Reddit user Doubleluckstur, and The Verge was able to see it in action by testing with an iPhone 12 Mini running the public beta.

Many iPhone users will be familiar with the green blobs, and while taking a picture it seems like nothing’s changed — the flare still shows up in the viewfinder. But when you go to view the final picture, the flare is nowhere to be seen (in some cases; we’ll get to that in a bit). It does seem to be the result of all the post-processing that’s done to the picture, as the flare shows up in the alternate (and less-processed) frames that are available if you’re using Apple’s Live Photo feature.
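Apple hasn't said how the removal actually works, but the basic job is easy to sketch: find a small, saturated green region and fill it in from its surroundings. The snippet below is a minimal illustration of that idea only, not Apple's pipeline; the function name, the thresholds, and the median-fill strategy are all assumptions for demonstration.

```python
import numpy as np

def remove_green_flare(img, green_margin=60, brightness=180):
    """Crude stand-in for green-flare removal: flag pixels where green
    strongly dominates red and blue and is bright, then fill them with
    the median color of the remaining pixels. img is an HxWx3 uint8 array."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    mask = (g > brightness) & (g - r > green_margin) & (g - b > green_margin)
    out = img.copy()
    if mask.any() and (~mask).any():
        fill = np.median(img[~mask], axis=0).astype(np.uint8)
        out[mask] = fill
    return out, mask

# Tiny synthetic example: a flat gray frame with a bright green "flare" blob.
frame = np.full((8, 8, 3), 120, dtype=np.uint8)
frame[2:4, 2:4] = [40, 250, 40]   # the fake flare
cleaned, mask = remove_green_flare(frame)
```

A real implementation would need to distinguish flares from legitimately green subjects (the tree and window-screen failure cases below hint at exactly that difficulty), which is presumably why Apple leans on extra frame data rather than simple color thresholds.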

The green dot shows up in the viewfinder and Live Photo, but not the processed still.
Left: Live Photo frame, with flare. Right: processed frame, without flare.

So far it’s a bit unclear which iPhone models the processing happens on. In the Reddit thread, one user reports also seeing the green dot removal on their iPhone XS, while another says it doesn’t work on their iPhone 8 Plus. 9to5Mac speculates that the feature could be limited to phones with an A12 Bionic processor or newer (so the XS and XR onward). The feature only being available on newer phones wouldn’t necessarily be surprising: some of iOS 15’s features, like Live Text or Portrait mode for FaceTime, already have the A12 Bionic listed as a requirement.

Of course, with this discovery came people trying to chase it down and find its limitations. Reddit user -DementedAvenger- posted examples of the flare still showing up on top of trees or mesh window screens, as well as flares that came from a bathroom light instead of the sun. I was also able to replicate the tree example and get some indoor flares of my own (though it's worth noting that they're not the same green bubble flares that so often come from the sun).

You can still get flares if you’re really trying, especially indoors.
You can see the flare on the right side of the tree.

Another limitation is video: I couldn't find any scenarios where flares showed up differently in the final video than they did in the viewfinder. Of course, removing a lens flare from 4K footage at 30 or 60 frames per second would be considerably more computationally intensive, and would have a greater chance of looking odd, so it's not necessarily surprising that the feature seems to be photo-only for now.
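Some back-of-the-envelope arithmetic shows why video is the harder case. Assuming standard 4K (3840 × 2160) and a 12-megapixel still, which are reasonable figures for recent iPhones but assumptions here:

```python
# Rough pixel-throughput comparison: one 12 MP still vs. 4K video.
still_pixels = 12_000_000        # one 12 MP photo
video_frame = 3840 * 2160        # one 4K frame, about 8.3 MP

for fps in (30, 60):
    per_second = video_frame * fps
    print(f"{fps} fps 4K pushes about {per_second / still_pixels:.0f}x "
          f"the pixels of a single 12 MP still, every second")
```

So even before worrying about temporal consistency (a flare that flickers in and out between frames would look far worse than one left alone), video demands tens of stills' worth of processing per second.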

The gif compression turns it white, but the flare is still clearly visible in this video.

As always with beta features, the functionality we’re seeing here could be different with the next release, or gone altogether. Still, it seems that Apple is at least experimenting with getting rid of the green blobs that can crop up in images from time to time, and it would be a welcome thing to see in the final version of iOS 15.


The post-processing on iOS is so confusing to me. Take a Live Photo, and after a couple of seconds the colors and exposure of the image completely change. But if you edit the image to use an earlier or later frame of the Live Photo, the colors and exposure aren't automatically adjusted.

That being said, I’ll be happy to see these green blobs disappear in post-processing if this is a launch feature.

What IS a photo? Has anyone ever argued about this?

What is a camera?

What is a phone?

What is love?

Baby don’t hurt me.

Got to do with it?

Yes. This has been argued a lot. Purists take the position that a "photo" is the raw output from the sensor or film that represents the objective capture. They wouldn't consider a photoshopped image that alters the subject, surroundings, and lighting to be a "photo." The post-processing on modern phones falls somewhere in between.

Baby don’t hurt me.

The example without the flare is terrible; I can see HDR haloing around the trees.

I wouldn't put a ton of stock in the pure image quality; these are JPEGs exported at around 80 percent quality.
Though part of the sky being blue and part of it being grey is actually what it looked like in real life — I’m surrounded by wildfires and the air has been extremely smoky.

No, those are HDR halos; I can see them from a mile away. JPEG compression has nothing to do with it. The second photo in the article shows an extreme example of it.

iPhone cameras have been prone to these dot flares for years. It's nice that Apple is at least addressing it with a software fix, even if it can't handle video.

Another limitation is video: I couldn’t find any scenarios where flares showed up any differently in the final video than they did in the viewfinder.

J.J. Abrams just let out a sigh of relief.
