
Exploring the Creative Possibilities of Frame Averaging on iPhone
I’ve been spending time recently exploring the iPhone app Even Longer, a sophisticated tool for creating long exposure imagery through computational photography. While I’m still early in the learning process, I’m already impressed and excited by what’s possible — especially with one of its most useful modes: Frame Averaging.
Frame Averaging blends many individual frames over time into a single final image. The result can be wonderfully expressive: softened water, streaked clouds, simplified scenes, and a sense of time passing that a single frame cannot always convey.
These photographs were all created during a recent trip to the Outer Banks of North Carolina using Even Longer.

A Different Kind of Long Exposure
Traditional long exposures often rely on neutral density filters, manual settings, and a tripod. Even Longer opens another path by using the iPhone’s camera and computational processing to build the image over time. It still requires a tripod for the best results, but that's where the similarities between traditional cameras and Even Longer end.
It’s different from doing long exposures with a traditional camera — but that’s part of what makes it interesting. It invites a different way of thinking about exposure, movement, and time.
In bright conditions, I sometimes added a 5-stop neutral density filter — as with this image of Roanoke Lighthouse in Manteo at sunrise, and the iPhone infrared photo of Rodanthe Bridge, above — to reduce light and allow for a stronger result, such as turning the water to glass.
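To see why a 5-stop filter helps, the arithmetic is simple: each stop halves the light, so 5 stops cut it by a factor of 2^5 = 32, letting each frame's shutter stay open roughly 32 times longer for the same brightness. A quick illustrative calculation (the shutter value here is hypothetical, not from the app):

```python
# Each stop of neutral density halves the light reaching the sensor.
stops = 5
light_factor = 2 ** stops            # 5 stops -> 32x less light

base_shutter = 1 / 1000              # hypothetical bright-light shutter, in seconds
nd_shutter = base_shutter * light_factor  # equivalent shutter with the ND filter

print(light_factor)                  # 32
print(nd_shutter)                    # ~0.032 s, about 1/30 s
```

That longer per-frame shutter is what lets each source frame carry more motion blur, which the averaging then smooths even further.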

Understanding the Controls
One of the biggest learning curves with Even Longer is that it uses familiar terms — ISO, shutter speed, EV, and total time — but they do not function exactly the way many photographers expect from a traditional camera.
With a traditional camera, we usually think in terms of the exposure triangle: shutter speed, aperture, and ISO working together to create a single frame.
Even Longer is different.
It uses computational photography, blending many frames over time into one final image. That means we are really working in two layers of control:
Layer 1: Per-Frame Settings
These controls affect each individual frame the app records:
- ISO
- Shutter speed
- EV compensation
They influence brightness, noise, and the amount of motion blur in each source frame.
Layer 2: Time Accumulation Settings
These controls shape the final image over time:
- Total Time
- Frame Averaging mode (or other modes)
They determine how long frames are gathered, how much motion is blended, how smooth the water becomes, and how the final image is constructed.
Once I began thinking of the app this way — not as a traditional camera, but as a computational tool working in two layers — it became much easier to predict results and work more intentionally.
What I’m Seeing So Far
What excites me most is not simply that the app can smooth water. While that's cool, many tools (SlowShutter, Reeheld, Average Cam Pro, Hydra) can do that.

What interests me is how Frame Averaging can simplify a scene and shift attention toward shape, light, mood, and design.
A busy shoreline becomes calmer. Reflections become painterly. The passage of clouds or water becomes part of the composition itself.
The photograph becomes less about freezing an instant and more about interpreting time.

Still Learning, Still Exploring
Make no mistake: this is a technically sophisticated app! The relationship between ISO, shutter speed, EV, total time, and blending modes is not always intuitive — especially if you come from a traditional camera background.
That makes it exciting to explore, but it also means there’s more to learn.
And I'm in the process of doing exactly that. I'll follow this up with some additional examples after doing more comparison tests, but I wanted to share this with you for now. Maybe you'll join me in exploring this powerful iPhone app.


