The above images are examples of a rolling shutter artifact. A "rolling shutter" means that the image is recorded one row (or, in portrait orientation, one column) at a time, and not all at once. This means that if the exposure and the flash are of very short duration, then the flash will only illuminate some of the pixel rows (or columns) of the image. Here's what's going on in the first image:
This can be replicated by setting a flash to strobe and taking lots of photos or video with an iPhone:
The fact that the "beam" can go "behind" the clouds can seem confusing in a still image. But it's really just that the lightning is behind the clouds, and they are thick enough to block the light. I simulated a similar effect using the flash and a foam pool toy. If the flash is in front of the toy then it illuminates it, and you get the stripe across the toy. If the flash is moved behind the toy, then the toy is not illuminated, so the "beam" (the stripe of the image) looks like it's going behind the toy.
It looks like the flash is not actually working; however, you have to remember that for most of the frame it's actually off. The only time the flash is on is during the white bar.
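To make that concrete, here's a toy Python sketch of the simplest rolling-shutter model, in which rows are captured strictly one after another. All the numbers are invented for illustration, not taken from any real camera:

```python
# Toy rolling-shutter model: rows are captured one after another,
# so a brief flash only brightens the rows captured while it is lit.
# All timings here are invented for illustration.

ROWS = 30                          # image height in rows
ROW_TIME = 1 / 1200                # time to capture one row (assumed)
FLASH_START = 0.010                # flash fires 10 ms into the frame
FLASH_END = FLASH_START + 1 / 400  # a flash lasting a few row-times

for row in range(ROWS):
    t = row * ROW_TIME             # the moment this row is captured
    lit = FLASH_START <= t < FLASH_END
    print("#" * 60 if lit else "." * 60)  # white bar in a dark frame
```

This prints a mostly dark "frame" with a short bright stripe, just like the white bar above. It's deliberately oversimplified (each row is captured instantaneously); the effect of per-row exposure time is covered further down.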
In my image the flash gives a white light (as a camera flash is obviously intended to), but lightning often shows up as a purple or pink tinge:
Typical lightning photography with a rolling shutter looks more like half the image being lit by lightning, rather than the column we see above. Here's a more typical lightning shot at night with this problem:
http://www.productionapprentice.com/tutorials/ccd-vs-cmos/
Notice the band is horizontal. The image is scanned one row at a time, but when you take a photo in portrait orientation, those rows become the columns of the image.
Here I've repeated the flash experiment in a darker setting, and we get the half-illuminated image:
This video demonstrates what is going on with a rolling shutter, although it's a little misleading as it assumes the reset and read-out phases are happening simultaneously.
Update Aug 15 2015, a new example:
source, location
Here the illusion of it being a beam behind the clouds is particularly striking, as a band of dark clouds partially obscures the illuminated clouds behind it. You can see part of the lightning bolt just below the power lines. Remember, it's not actually a beam of light; it's just a slice of the scene as it would look when illuminated by lightning.
This can be especially confusing when you notice the reflection of the "beam" on the hood of the car. Your brain naturally interprets this as a reflection of the scene, and so you think there's actually a beam in the sky. But if you look closely, you'll see the reflection in the hood is actually of a slightly different area of the sky. The "reflection" lines up perfectly vertically, whereas the nearby traffic signals are sloped. If this actually were a reflection of a beam in the sky, then it would also be sloped.
So while it's a reflection of sky illuminated by lightning, it's not the same area of the sky. It's a little unintuitive.
So again, what we are seeing here is essentially one normal image overlaid with a slice of another image that's illuminated by lightning. (And note that what looks like the reflection of a building in the lower right is actually the reflection of the air conditioning vent inside the car.)
Warning: Complicated Explanation Ahead!
A very unintuitive topic here is the width of the band. Why does a long exposure (in a dark setting) give a wide band, but a short exposure (in sunlight) give a narrow band? You would think that if the exposure were short, then a larger proportion of it would happen during the flash, so the band would be wider.
The first thing to note is that the flash I use has a duration of only 1/1200th of a second, so if things worked the way the previous paragraph suggests, the most we could illuminate would be a band 1/40th the width of the image (in portrait orientation; at 30 frames per second, 1/1200 s is 1/40th of the 1/30 s frame). And yet we see that we can get half or more of the image illuminated. So what's going on? How can a flash lasting 1/40th of a frame illuminate half the lines in a frame? And why does the band get narrower for shorter exposures?
The answer is complicated. CMOS sensors can only be read out one line at a time, and yet the exposures can happen simultaneously (or at least overlap). So the reset of each line (the moment it begins to record) is staggered to match the read-out speed, and the exposure length is actually set on an individual-line basis.
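Here's a rough sketch of that staggering (my own illustration; the 1080 lines, the 1/30 s full-frame read-out, and the flash timing are all assumptions, with only the 1/1200 s flash duration taken from the experiment). Each line's read-out time is fixed by the sensor's read-out speed, its reset is staggered to give it the requested exposure, and the line records the flash if the flash overlaps its exposure window:

```python
# Staggered per-line exposure on a rolling shutter. Read-out times are
# fixed by the sensor's read-out speed; each line's reset is staggered
# so that every line gets the same exposure length. A line records the
# flash if the flash overlaps its exposure window. Figures are assumed.

LINES = 1080                        # assumed sensor height
LINE_TIME = (1 / 30) / LINES        # assumed full-frame read-out of 1/30 s
FLASH_START = 0.015                 # flash fires 15 ms into the frame
FLASH_END = FLASH_START + 1 / 1200  # 1/1200 s flash, as in the experiment

def lit_lines(exposure):
    """Count lines whose exposure window overlaps the flash."""
    count = 0
    for i in range(LINES):
        readout = i * LINE_TIME     # fixed by the read-out speed
        reset = readout - exposure  # staggered to match it
        if reset < FLASH_END and readout > FLASH_START:
            count += 1
    return count

for exposure in (1 / 30, 1 / 250, 1 / 2000):  # long, medium, short
    print(f"1/{1 / exposure:.0f} s exposure -> "
          f"{lit_lines(exposure)} of {LINES} lines see the flash")
```

With these assumed numbers, a 1/30 s exposure lights about 590 of the 1080 lines (over half the frame), while a 1/2000 s exposure lights only about 43: a wide band in the dark, a narrow band in daylight.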
The diagram below shows the "flash window" - i.e. the portion of a frame in which you can fire the flash and have it affect all the rows. Normally when you use the flash it's quite dark, so you have a long exposure, and hence quite a big window in which you can fire the flash. If you were to go outside of this window, you'd get only the top or the bottom of the image illuminated by the flash.
Source
In brighter conditions, the per-line exposures are shorter, so it's more like:
Notice the read-out of the lines (green) is at the same angle, as that can't get any faster. This also dictates when the reset (blue) happens. But now that the exposure time (red) is much shorter, only a few lines can be exposed during the flash. Hence the "beam" is narrower for a shorter exposure.
This gives us a minimum width of the "beam". It's the duration of the flash, divided by the read-out cycle time (i.e. the time between starting two consecutive read-outs). The faster the camera's read-out time, the smaller the potential "beam" and the sharper the edges.
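Plugging in numbers makes this concrete. The 1/1200 s flash duration is from my experiment; the 1/30 s full-frame read-out and 1080 lines are assumptions, chosen to match the 1/40th-of-a-frame figure from earlier:

```python
# Minimum "beam" width = flash duration / read-out cycle time.
# Flash duration is from the experiment; read-out figures are assumed.

flash = 1 / 1200                   # flash duration in seconds
frame_readout = 1 / 30             # assumed time to read the whole frame
lines = 1080                       # assumed number of lines
line_time = frame_readout / lines  # time between consecutive read-outs

min_band = flash / line_time
print(f"minimum beam width: {min_band:.0f} lines, "
      f"or 1/{lines / min_band:.0f} of the frame")  # 27 lines, 1/40
```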
Why the note about sharper edges? The start and end of the flash might occur during the reset or read-out phase of a line, meaning that line only gets partially exposed to the flash. This means the edges of the "beam" are not always sharp. A faster read-out means fewer lines are caught in this partial state, so the edges come out sharper.
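To see the soft edges in the same toy model (same assumed timings as above), we can compute what fraction of the flash each line near the band edge actually records:

```python
# Fraction of the flash each line records, near the edge of the band.
# Lines whose exposure window only partially overlaps the flash are
# only partially lit, which softens the edges. All timings are assumed.

FLASH_START, FLASH_END = 0.015, 0.015 + 1 / 1200
LINE_TIME = (1 / 30) / 1080  # assumed per-line read-out pace
EXPOSURE = 1 / 2000          # a short daylight exposure

for i in range(480, 541, 10):
    readout = i * LINE_TIME
    reset = readout - EXPOSURE
    overlap = max(0.0, min(FLASH_END, readout) - max(FLASH_START, reset))
    print(f"line {i}: records {overlap / (FLASH_END - FLASH_START):.0%} "
          f"of the flash")
```

The brightness ramps up from 0% to a peak and back down to 0% across the band. Note that with these numbers the exposure (1/2000 s) is shorter than the flash (1/1200 s), so no line records the whole flash; the peak is only 60%.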
Note: The above post is a summary of material from the discussion thread below, hence the subsequent discussion might seem somewhat redundant. Updated with the Florida photo.