FundyBrian’s Explorations

Amazing detail on those feathers. Is this with your new phone, Brian? You’re still showing iPhone 7 Plus as your device. Either way, the editing is superb.
Oops, I had better update my device settings. Thanks for your comments. But no, it was made with my 7 Plus. I think the main things that gave it the clarity are the tripod, the DNG format, and the 20-shot focus stack that made this amount of depth possible. Well, maybe the clarity and sharpening done in Affinity, too. It also helped that the hummingbird wasn’t moving.
 
Isn’t it just HDR on steroids?:D
Yes & no. HDR uses tone mapping or exposure fusion to combine the tonal range of 3 images into one. The 3 images are exposure bracketed: something like a normal exposure, one or more stops overexposed to record detail in the shadows, and one or more stops underexposed to record detail in the highlights. Then the 3 photos are combined in a software program like Fusion, Affinity, or Photoshop into a single image having an extended tonal range. It is the tone mapping that gives the HDR “look”.
With HDR the variable is exposure.

Focus stacking uses a set of images, each at a different focus but all at the same exposure. There may be 5, 10, 20, 30 or more separate images, each focused at a different distance, to be sure of covering the entire range from closest foreground to farthest background. Focus stacking doesn’t use tone mapping the way HDR does. The combined stack of focus images is processed like a regular photo.
With focus stacking the variable is focus.
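To make that difference concrete, here is a minimal exposure-fusion sketch in Python using OpenCV’s Mertens merge - one of the exposure fusion approaches mentioned above, not the exact routine any particular app uses. The file names are placeholders for a bracketed set of three exposures.

```python
# Minimal exposure-fusion sketch: the "variable is exposure" case.
# File names are placeholders for a normal, overexposed and underexposed frame.
import cv2

bracket = [cv2.imread(p) for p in ("normal.jpg", "over.jpg", "under.jpg")]

# Mertens fusion weights each pixel by contrast, saturation and
# well-exposedness, then blends the three frames -- no tone-mapping curve.
merge = cv2.createMergeMertens()
fused = merge.process(bracket)                     # float result, roughly 0..1

cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

A focus stack would use the same kind of loop, just over 20 frames with the focus distance changing from frame to frame instead of the exposure.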

Here are 2 of the 20 images from one of the Hummingbird stacks. I selected one from the closest and farthest focus points.
[image attachment]

This is the closest, or top, focus distance. The only areas in focus in this photo are marked in red. This is a completely typical single-exposure image in terms of how much can be in focus at once.
In other words, if I photographed the hummingbird with only a single image this is as much as I could get in focus. To get more in focus at one time I need to make many more images, each at a different focus and then put them all together.
The iPhone lens has a fixed ƒ1.8 aperture, so it has limited depth of field, especially in close-up photos.

[image attachment]

This photo is at the farthest, or bottom, focus position. Once again, the only areas in focus in this photo are marked in red.
Each one of the other 18 photos has one narrow plane of the image in focus. Each one of the 20 images is a completely standard image except that in each one the focus position is incrementally adjusted back, or downward in this case. So we start from the closest focus position and gradually work to the most distant focus position. The closer we are to the subject, the narrower the total focus range becomes.
 
Here is the New Image category in Affinity that we use for focus stacking; in Affinity it is called a focus merge. Affinity is an iPad-only app.
In Photoshop the set of images one on top of the other is called an image stack, whether it is for HDR, focus stacking or panorama. This is where the focus stacking name comes from. Photoshop is a computer-only software program.
In either Affinity or Photoshop the stacking software selects the areas in sharp focus in each image in the stack and combines them into a single image. So you can appreciate there is a lot going on behind the scenes to make a single focus stacked image.

[image attachment]
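To give a rough idea of what that selection looks like behind the scenes, here is a simplified Python sketch of one common focus-merge approach: for every pixel, keep the value from whichever frame is locally sharpest. This is only an illustration under my own assumptions (placeholder file names, no frame alignment or seam blending, both of which Affinity and Photoshop handle for you).

```python
# Simplified focus-merge sketch: per pixel, pick the frame with the largest
# local sharpness (absolute Laplacian response). Real focus-merge tools also
# align the frames and feather the seams; this skips both for clarity.
import cv2
import numpy as np

paths = [f"stack_{i:02d}.png" for i in range(20)]      # placeholder file names
frames = [cv2.imread(p) for p in paths]

sharpness = []
for img in frames:
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    lap = np.abs(cv2.Laplacian(gray, cv2.CV_64F, ksize=3))
    # Smooth the sharpness map so the per-pixel choice is less noisy.
    sharpness.append(cv2.GaussianBlur(lap, (9, 9), 0))

best = np.argmax(np.stack(sharpness), axis=0)          # sharpest frame per pixel
stack = np.stack(frames)                               # shape (N, H, W, 3)
rows = np.arange(best.shape[0])[:, None]
cols = np.arange(best.shape[1])[None, :]
merged = stack[best, rows, cols]

cv2.imwrite("focus_merged.png", merged)
```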

Focus stacking relies on a precisely registered series of images so a tripod is essential.
At this time the only iOS camera app capable of making a focus stack is CameraPixels, and it can do it at full resolution and with DNG raw files. This is an iPhone-only app. In the bracketing section you can choose exposure bracketing or focus bracketing, and, I presume, white balance bracketing, but I haven’t tried that.


To help (I hope) explain focus stacking, here is a screenshot from the iOS app FocusStacker (available on both iPhone & iPad). This app is used by DSLR users to calculate and provide the necessary info to create the set of images for a focus stack.
I’ve started by setting the blur spot diameter to a value suitable for a crop-sensor camera - 18 microns, in the red square. By the red arrow on the left we can see the focal length of the chosen lens is set at 50mm.
Then we focus on the closest part of the subject in the photo and set that distance at “1”. The farthest distance, in this case infinity, is set at “2”.
The app calculates the rest. If you remember that a lens produces the sharpest image in the middle of the lens area, we trim off the fuzzy outer edge of the lens by stopping down to a middle aperture, around ƒ8. Smaller apertures can give better depth of field but at the expense of increased diffraction, which means less sharpness. In this case the app selected ƒ9 and it will take 10 photos to cover the range of distances required to completely cover the subject. The vertical grey bar, with the red 2 at the top and red 1 at the bottom, is the distance scale where the focus distances are shown next to my red dashes. Notice that the spacing between the dashes becomes smaller as the focus distance gets closer. That’s just how optics works. Depth of field is increasingly hard to get the closer we get to the subject.
[image attachment]
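For anyone curious about the arithmetic, a calculator like this is essentially dividing the distance range (measured in dioptres, i.e. 1/distance) by the slice of acceptable focus each frame contributes. Here is a rough Python sketch of that idea - not FocusStacker’s actual code. The 0.8 m nearest distance in the example is made up, since the screenshot doesn’t state it, but with the 50mm / ƒ9 / 18-micron values it lands right around the app’s 10 frames.

```python
# Rough sketch of the focus-bracketing arithmetic (not FocusStacker's code).
# Each frame keeps roughly a 2/H dioptre band acceptably sharp, where H is
# the hyperfocal distance, so the frame count is the total dioptre span of
# the subject divided by that band.
import math

def frames_needed(focal_mm, f_number, coc_mm, near_m, far_m=math.inf):
    hyperfocal_m = focal_mm ** 2 / (f_number * coc_mm) / 1000.0
    span = 1.0 / near_m - (0.0 if math.isinf(far_m) else 1.0 / far_m)
    return math.ceil(span / (2.0 / hyperfocal_m))

# 50 mm lens, f/9, 18-micron blur spot, assumed nearest distance 0.8 m:
print(frames_needed(focal_mm=50, f_number=9, coc_mm=0.018, near_m=0.8))  # ~10
```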
 
So every time you make a photo using this method you’ll have 30-odd images in your camera roll? :fearscream::eek: :barf: If that’s a yes, I already know I’ll never do it. :feet:
Yes, that’s what happens. I’ve been using 20 for my focus stack images. Even so, it adds up. It’s the sort of thing you reserve for special occasions. Then I transfer all the focus stack images to my iPad Pro so I can use Affinity to merge them. The merging process is fairly easy and from there on it is the same type of editing you do with any single DNG image.
This sort of thing wouldn’t be necessary with a regular camera with an adjustable f-stop (aperture) because you could simply select a smaller aperture, like f11, to get more depth of field. With the iPhone you are always shooting at f1.8, which in regular camera terms is a very wide-open aperture where you get the least amount of depth of field in your photos.
 
Here is another app, TrueDoF (Depth of Field), which is also used by DSLR users to calculate exactly what range will be in focus in an image. It is also a very useful educational tool about depth of field. I think I have bought every app this developer makes - and they are not inexpensive. (I also have one called OptimumCSP (Camera Settings Pro) that is another way of calculating the best possible settings to use in a given situation to get the sharpest possible image)

Here is the main window of the app.
[image attachment]

The right side vertical bar is where I have set the aperture of the iPhone lens ƒ1.8. The lens focal length is shown on the short bar on the far left. The other vertical bar shows the focus distance and how much will be in focus (the depth of field).
The red arrow is where the lens is focused - at about 2.75 metres. In this particular case it is set at the hyperfocal distance, which is the focus distance at which the back depth of field extends to infinity. At this position we are getting the most possible depth of field by not wasting any of it out beyond infinity. We can see that the depth of field extends from 1.36 metres to infinity. This means that everything between 1.36 metres and infinity will be in acceptable focus.
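As a sanity check on those numbers, the textbook hyperfocal formula gets very close. The ~4mm focal length of the iPhone 7 Plus wide camera and the 0.0033mm circle of confusion below are my own assumptions (TrueDoF uses its own values), so treat this as a sketch rather than the app’s maths:

```python
# Textbook hyperfocal / depth-of-field arithmetic (a sanity check, not
# TrueDoF's internal code). The ~4 mm focal length and 0.0033 mm circle of
# confusion are assumptions.
f_mm, n, coc_mm = 4.0, 1.8, 0.0033
s_mm = 2750.0                                   # focused at 2.75 metres

hyperfocal = f_mm ** 2 / (n * coc_mm) + f_mm    # comes out near 2.7 metres
near = s_mm * (hyperfocal - f_mm) / (hyperfocal + s_mm - 2 * f_mm)
far_text = "infinity" if s_mm >= hyperfocal else \
    f"{s_mm * (hyperfocal - f_mm) / (hyperfocal - s_mm) / 1000:.2f} m"

print(f"hyperfocal ~ {hyperfocal / 1000:.2f} m")
print(f"in focus from ~ {near / 1000:.2f} m to {far_text}")   # ~1.36 m to infinity
```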

Now watch what happens as the focus distance is brought closer.
[image attachment]

The red arrow indicates the focus being brought closer, from 2.75 metres to about 1.6 metres. The main thing to notice here is how much smaller the total amount of depth of field has become. The focus distance is the only thing being changed as we progress closer, and you can see what happens to the total amount of depth of field as we get closer.

Next we move the focus closer again.
[image attachment]

Now we are focused at about 0.65 metre and look how much smaller the depth of field has become.

Now look at the closest focus position this calculator can display.
[image attachment]

Look how small the depth of field has become. And it continues to get smaller the closer we get to the subject - for instance, when using a macro lens.

In all of these examples we can see that the larger amount of depth of field is behind the focus distance. Another thing it shows is that focusing on a very distant point in the image is not a good idea, as it wastes a lot of the available depth of field.
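To put numbers on how quickly that depth of field collapses, here is the same thin-lens arithmetic looped over roughly the focus distances in the screenshots. Again, the ~4mm focal length and 0.0033mm circle of confusion are my assumptions, so the exact figures are approximate; the trend is the point.

```python
# How fast depth of field shrinks as focus moves closer, with the same
# assumed iPhone values as above (f = 4 mm, f/1.8, CoC = 0.0033 mm).
f_mm, n, coc_mm = 4.0, 1.8, 0.0033
hyperfocal = f_mm ** 2 / (n * coc_mm) + f_mm

for s_m in (2.75, 1.6, 0.65):                    # distances from the screenshots
    s = s_m * 1000.0
    near = s * (hyperfocal - f_mm) / (hyperfocal + s - 2 * f_mm)
    if s >= hyperfocal:
        print(f"focused at {s_m} m: sharp from {near / 1000:.2f} m to infinity")
    else:
        far = s * (hyperfocal - f_mm) / (hyperfocal - s)
        print(f"focused at {s_m} m: sharp from {near / 1000:.2f} m to "
              f"{far / 1000:.2f} m (total {(far - near) / 1000:.2f} m)")
```

Under these assumptions the in-focus band drops from reaching infinity at 2.75 m to roughly a third of a metre at 0.65 m, which is exactly why a focus stack needs so many frames up close.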
 
This sort of thing wouldn’t be necessary with a regular camera with an adjustable f-stop (aperture) because you could simply select a smaller aperture, like f11, to get more depth of field.
Yeah but all these fancy camera apps I have (and never use) like ProCamera, Pure, 645 PRO have those controls. Obviously it’s not the same as a DSLR but seems way easier than this method.
 
Here is another app, TrueDoF (Depth of Field), which is also used by DSLR users to calculate exactly what range will be in focus in an image. It is also a very useful educational tool about depth of field. I think I have bought every app this developer makes - and they are not inexpensive. (I also have one called OptimumCSP (Camera Settings Pro) that is another way of calculating the best possible settings to use in a given situation to get the sharpest possible image)

Here is the main window of the app.
[image attachment]
The right side vertical bar is where I have set the aperture of the iPhone lens ƒ1.8. The lens focal length is shown on the short bar on the far left. The other vertical bar shows the focus distance and how much will be in focus (the depth of field).
The red arrow is where the lens is focused - at about 2.75 metres. In this particular case it is set at the hyperfocal distance, which is the focus distance at which the back depth of field extends to infinity. At this position we are getting the most possible depth of field by not wasting any of it out beyond infinity. We can see that the depth of field extends from 1.36 metres to infinity. This means that everything between 1.36 metres and infinity will be in acceptable focus.

Now watch what happens as the focus distance is brought closer.
[image attachment]
The red arrow indicates the focus being brought closer, from 2.75 metres to about 1.6 metres. The main thing to notice here is how much smaller the total amount of depth of field has become. The focus distance is the only thing being changed as we progress closer, and you can see what happens to the total amount of depth of field as we get closer.

Next we move the focus closer again.
[image attachment]
Now we are focused at about 0.65 metre and look how much smaller the depth of field has become.

Now look at the closest focus position this calculator can display.
[image attachment]
Look how small the depth of field has become. And it continues to get smaller the closer we get to the subject - for instance, when using a macro lens.

In all of these examples we can see that the larger amount of depth of field is behind the focus distance. Another thing it shows is that focusing on a very distant point in the image is not a good idea, as it wastes a lot of the available depth of field.
This just hurts my head..... :expressionless::confounded: I really do appreciate your knowledge, and taking the time to explain all this, and I know there will be people here who get a lot from it.

But.....it’s exactly why I stopped taking classes and trying to figure out my SLR. My eyes roll up into my head and I zone out. Photography has always been very organic and intuitive for me.

BUT..................... Your diagrams make wonderful abstract art!! I love them! :D
 
Yeah but all these fancy camera apps I have (and never use) like ProCamera, Pure, 645 PRO have those controls. Obviously it’s not the same as a DSLR but seems way easier than this method.
Actually, there are no camera apps whatsoever with aperture control for iPhone because the iPhone camera simply doesn’t have an adjustable aperture. It has a fixed f1.8 aperture and that’s it.
There are 2 main controls on any camera - the shutter speed and the aperture - and the iPhone camera only has one of them: shutter speed. No aperture control at all; it’s a fixed value. But we do need a greater range of adjustment than the shutter speed control can provide by itself, so the ISO was drafted in to provide the second adjustment.
On a film camera the film speed was a fixed value. You simply set whatever ASA rating the film had: 64, 100, 400, etc. You had to tell your light meter what ASA value your film had in order to get a proper light meter reading. In the later days of film the ASA rating (American Standards Association) was replaced by ISO (International Organization for Standardization) but the numbers were exactly the same.
In low light situations the iPhone camera relies on jacking up the ISO to maintain a usable shutter speed. Naturally this results in noisy, grainy images, just like using a 1000 ASA film on a film camera.
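If it helps, the trade-off can be written down as simple exposure bookkeeping. This is a hypothetical sketch of the arithmetic, not Apple’s metering code: with the aperture pinned at ƒ1.8, a darker scene can only be paid for with a longer shutter time (risking blur) or a higher ISO (adding noise).

```python
# Exposure bookkeeping for a fixed-aperture camera (a hypothetical sketch,
# not Apple's metering code). ev100 is the standard "light value": higher
# means a brighter scene for the given settings.
import math

def ev100(f_number, shutter_s, iso):
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100.0)

APERTURE = 1.8   # the iPhone's only choice

print(ev100(APERTURE, 1 / 500, 100))    # bright scene at base ISO (~10.7)
print(ev100(APERTURE, 1 / 30, 100))     # ~4 stops darker: slow the shutter (~6.6)
print(ev100(APERTURE, 1 / 500, 1600))   # ~4 stops darker: raise the ISO (~6.7)
```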

Lots of people use focus stacking on regular DSLR cameras even though they do have an adjustable aperture. They do it to get depth of field that would otherwise be impossible given the physical limitations of cameras. It is used most commonly in super close-ups of tiny things like insects.

Lacking any sort of adjustable aperture there is simply no other choice than focus stacking if you want to have more in focus in your image than is ordinarily possible on an iPhone.
 
Actually, there are no camera apps whatsoever with aperture control for iPhone because the iPhone camera simply doesn’t have an adjustable aperture. It has a fixed f1.8 aperture and that’s it.
There are 2 main controls on any camera - the shutter speed and the aperture - and the iPhone camera only has one of them: shutter speed. No aperture control at all; it’s a fixed value. But we do need a greater range of adjustment than the shutter speed control can provide by itself, so the ISO was drafted in to provide the second adjustment.
On a film camera the film speed was a fixed value. You simply set whatever ASA rating the film had: 64, 100, 400, etc. You had to tell your light meter what ASA value your film had in order to get a proper light meter reading. In the later days of film the ASA rating (American Standards Association) was replaced by ISO (International Organization for Standardization) but the numbers were exactly the same.
In low light situations the iPhone camera relies on jacking up the ISO to maintain a usable shutter speed. Naturally this results in noisy, grainy images, just like using a 1000 ASA film on a film camera.

Lots of people use focus stacking on regular DSLR cameras even though they do have an adjustable aperture. They do it to get depth of field that would otherwise be impossible given the physical limitations of cameras. It is used most commonly in super close-ups of tiny things like insects.

Lacking any sort of adjustable aperture there is simply no other choice than focus stacking if you want to have more in focus in your image than is ordinarily possible on an iPhone.
:flushed:

I just use the native camera app. :coffee:
 
My Morning View.
[image attachment]
Looking out my upstairs window at sunrise. This is a 4-shot panorama stitched in Affinity. Affinity handled the stitching job very well with no fussing with settings. My originals were made in PureShot on my iPhone 7 Plus. This upstairs room serves as Fabi’s craft area and occasional guest room. On the longest days of summer the sun rises to the left of West River Mountain, on the left of the photo, and on the shortest days in winter the sun rises at the far right of the photo. In the middle, in the black area, you can just make out part of Waterside Beach catching some light.
I’m very happy with the Apple Pencil, better than any other stylus I have tried. In this photo I was using the Apple Pencil with the Inpainting brush to fill in areas cut off by the perspective curving of the stitching in the corners and it did a flawless job.
You can imagine how meditative it could be to sit here and watch the sunrise.
The iPad Pro 12.9” has been a game changer for me. For one thing, I can see defects in my pictures I couldn’t see before on my iPhone. Things I thought looked pretty good on my iPhone don’t always turn out as well on the bigger screen. It has far-reaching implications for my sense of mobile photography. Affinity is another game changer. We finally have the almost-Photoshop we have been waiting for, and the ability to develop RAW files, do tone mapping, do HDR with DNG files, panoramas, focus stacking, plus all the usual photo editing, all in one app, is wonderful. It is somewhat slow and very manual feeling compared to some specialized effects apps but it offers a great deal of creative possibilities. I expect we will eventually see improvements and expansions to Affinity similar to Photoshop.
At the moment my workflow has become somewhat confused, trying to decide whether to edit on my iPhone or move directly to my iPad, and then which one has the most up-to-date version of the photos. It will take some care to keep it all straight.
Just seen this. Missed quite a lot while I was in Namibia.

I never edit on my iPhone. I AirDrop everything to the iPad.

I could never understand why you didn’t buy an iPad Pro earlier. It was a complete game changer for me too. The Apple Pencil made a massive difference for photo manipulation, as well as for all the reasons you mentioned. I have quite a few serious apps, particularly video apps such as LumaFusion and of course Affinity, that just don’t work on other Apple devices. Everything is just that much easier, faster and better on the eyes.
 
Just seen this. Missed quite a lot while I was in Namibia.

I never edit on my iPhone. I AirDrop everything to the iPad.

I could never understand why you didn’t buy an iPad Pro earlier. It was a complete game changer for me too. The Apple Pencil made a massive difference for photo manipulation, as well as for all the reasons you mentioned. I have quite a few serious apps, particularly video apps such as LumaFusion and of course Affinity, that just don’t work on other Apple devices. Everything is just that much easier, faster and better on the eyes.
I completely agree about the iPad Pro. I got the 12.9” and it’s great.
I also recently got the long-awaited Affinity Designer so that gives us a good vector app as well.
Still, it sometimes makes me feel uneasy about the “mobile” studio when it takes two devices to complete the chain. I stayed as long as I did with the iPhone alone because I was clinging to the all-in-one phone-studio idea. Affinity is what clinched it for me. To do HDR with DNG files, as well as Focus Stacking, I needed Affinity, and for that I needed the iPad.
Maybe it’s my aging eyes but I also discovered some of my iPhone edits didn’t look so good on my computer screen, mostly because I couldn’t see properly what I was doing at the small size. Sometimes I raised settings too high. They looked fine on my screen but once I saw them on my 5K 27” iMac I could see they were no good and ended up trashing most of them.
 
Just seen this. Missed quite a lot while I was in Namibia.

I never edit on my iPhone. I AirDrop everything to the iPad.

I could never understand why you didn’t buy an iPad Pro earlier. It was a complete game changer for me too. The Apple Pencil made a massive difference for photo manipulation, as well as for all the reasons you mentioned. I have quite a few serious apps, particularly video apps such as LumaFusion and of course Affinity, that just don’t work on other Apple devices. Everything is just that much easier, faster and better on the eyes.
Did you notice my last post in the State of DNG on iOS thread? I mentioned a problem with AirDropping combined RAW+jpeg photos. Only the jpegs got sent.
Lately I have seen so many cases where my jpeg images were terrible next to my DNG images that I question whether it’s worthwhile shooting jpeg at all any more. Unfortunately some apps can only save jpeg, so sometimes I just have to accept it.
 

I completely agree about the iPad Pro. I got the 12.9” and it’s great.
I also recently got the long-awaited Affinity Designer so that gives us a good vector app as well.
Still, it sometimes makes me feel uneasy about the “mobile” studio when it takes two devices to complete the chain. I stayed as long as I did with the iPhone alone because I was clinging to the all-in-one phone-studio idea. Affinity is what clinched it for me. To do HDR with DNG files, as well as Focus Stacking, I needed Affinity, and for that I needed the iPad.
Maybe it’s my aging eyes but I also discovered some of my iPhone edits didn’t look so good on my computer screen, mostly because I couldn’t see properly what I was doing at the small size. Sometimes I raised settings too high. They looked fine on my screen but once I saw them on my 5K 27” iMac I could see they were no good and ended up trashing most of them.
I have bought Affinity Designer too. However, I’m sort of apprehensive about getting into it because I know that, like Affinity Photo, there is likely to be a steep learning curve. It has one flaw in that it doesn’t trace bitmap photos into vectors. Imaengine seems to be the only app that does it in colour on the iPad. I have got used to the simple editing tools in Imaengine so I will have to make an effort to change to AD, but of course the tools will be much more powerful.
 
Did you notice my last post in the State of DNG on iOS thread? I mentioned a problem with AirDropping combined RAW+jpeg photos. Only the jpegs got sent.
Lately I have seen so many cases where my jpeg images were terrible next to my DNG images that I question whether it’s worthwhile shooting jpeg at all any more. Unfortunately some apps can only save jpeg, so sometimes I just have to accept it.
Yes, I particularly took note of that. I was unable to get a DNG out of Lightroom Mobile at all.
Edit: ah, just found the solution. The export function into the camera roll. It then totally Airdropped to my iPad as a DNG. But this of course isn’t a combined photo.

I am hugely surprised at your comment about DNG vs JPG. Is it all due to your ability to edit in Affinity?
 
ProCamera works. It saves the jpeg to the camera roll and keeps the DNG in the app. You can then choose to AirDrop either the RAW or the JPEG to your iPad and it keeps all the info and format.
 
Yes, I particularly took note of that. I was unable to get a DNG out of Lightroom Mobile at all.
Edit: ah, just found the solution. The export function into the camera roll. It then totally Airdropped to my iPad as a DNG. But this of course isn’t a combined photo.

I am hugely surprised at your comment about DNG vs JPG. Is it all due to your ability to edit in Affinity?
Not at all. Even on my iPhone I can see much better image quality and especially no compression artifacts in my DNG images compared to jpegs. I’ll have to post some comparisons.
 