Hosta la vista - the real fall colours.

FundyBrian

The leaves have gone from the trees. The familiar bright fall colours have been replaced by the colours of death.
IMG_8740.JPG

A little past its prime, lying on the ground dead.
IMG_8739.JPG

Colours seen as a range of browns
IMG_8742.JPG

Hosta leaves.

IMG_8738.JPG

Where have all the Fireweed seeds gone,
Long time passing -
IMG_8737.JPG

Hangers-on.
 
The great depth in that first image reminds me: I discovered just last week that Affinity Photo will do focus stacking (on your device). I've been experimenting with it but haven't produced anything worth showing so far, mainly due to ghosting in the background from trying to shoot handheld. The shots really need to be done from a tripod or other stable platform. I've been using CameraPixels for the shots because it can do focus bracketing, but I'm beginning to think it might be better to do individual shots selecting focus points by hand.
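The ghosting is basically a frame-registration problem: if each frame gets aligned to a reference before blending, most of the handheld drift disappears. Here's a rough sketch of that step in Python with OpenCV - my own toy example with made-up file names, not how Affinity Photo actually does it.

```python
# Rough sketch: register a handheld focus-bracket frame to a reference frame
# before stacking, to reduce ghosting. Toy example, not Affinity's method.
import cv2
import numpy as np

def align_to_reference(ref_bgr, img_bgr):
    ref = cv2.cvtColor(ref_bgr, cv2.COLOR_BGR2GRAY)
    img = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    # Match ORB features between the two frames.
    orb = cv2.ORB_create(2000)
    kp_ref, des_ref = orb.detectAndCompute(ref, None)
    kp_img, des_img = orb.detectAndCompute(img, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_ref, des_img)
    src = np.float32([kp_img[m.trainIdx].pt for m in matches])
    dst = np.float32([kp_ref[m.queryIdx].pt for m in matches])
    # Estimate a rotation + translation + scale that maps img onto ref.
    warp, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    h, w = ref.shape
    return cv2.warpAffine(img_bgr, warp, (w, h))

# usage (hypothetical file names):
# ref = cv2.imread("frame1.jpg")
# aligned = align_to_reference(ref, cv2.imread("frame2.jpg"))
```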
 

I wondered why I hadn’t heard about Affinity Photo and focus stacking - that’s because it’s an iPad-only app, and only newer iPads can run it, so it’s not on my iPad, boo hoo.
Thanks for the reminder about Camera Pixels. I knew there was some reason I got that app.

I’ve been thinking of doing a thread about focus stacking but had not yet got into gear.
I just made a series of focus stacking tests with different apps the other day and it’s true you need to be quite exacting for it to work well. I find that even when I’m being very careful my farthest focus is sometimes not quite far enough.
The one real focus stacking app we have is called StayFocused, and it used to work very well except for the online processing: their server routinely produced sub-standard results while PS handled the same set of images with no problem. They have since been rethinking their approach and have disconnected the online server; another plan is in the works. You can still use the app to make the set of focus images, but you're on your own to stack them. Photoshop (on desktop) does the job quite handily, but then it ceases to be an all-mobile solution.
MultiFocus Camera can make a stack of 5 images but it doesn’t put them together.
Focus Camera can make and assemble a 3 image stack but only at low resolution. Close but no cigar.
Live Focus makes a series of shots at different focus distances to let you pick the best focus but doesn’t stack them together.
FocusTwist also makes a series of images at different focus distances but only square, and only to select the best focus. Keep the one you like.
MultiCam makes a series of exposure and focus brackets but after that, nothing. It seems to be broken at the moment.
Prime (formerly called Focus) is a RAW camera with excellent focus peaking so it’s really easy to see where you are focused. It seems ideal to make the focus stack but once you have the set there’s no app to process them.
CameraPixels makes a speedy set of 5 images at different focus distances but doesn't assemble them. It's the only one here that doesn't let you choose the focus points.
So that’s where we are at present. We now have the onboard processing power to do the stacking but nobody has an app working yet.
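For anyone curious, the stacking step itself isn't black magic. Here's a minimal sketch in Python with OpenCV of one common approach - keep each pixel from the frame where it is sharpest, judged by a Laplacian response. This is my own toy code, not how StayFocused or Photoshop actually do it, and it assumes the frames are already aligned (i.e. shot from a tripod).

```python
# Minimal focus-stack blend: for each pixel, keep it from the frame where it is
# sharpest. Assumes the frames are already aligned (tripod). Toy sketch only.
import cv2
import numpy as np

def blend_focus_stack(frames):
    sharpness = []
    for f in frames:
        gray = cv2.cvtColor(f, cv2.COLOR_BGR2GRAY)
        lap = cv2.Laplacian(cv2.GaussianBlur(gray, (5, 5), 0), cv2.CV_64F)
        # Smooth the sharpness map so the per-pixel choice isn't speckly.
        sharpness.append(cv2.GaussianBlur(np.abs(lap), (9, 9), 0))
    best = np.argmax(np.stack(sharpness), axis=0)   # index of sharpest frame per pixel
    stack = np.stack(frames)                        # (N, H, W, 3)
    h, w = best.shape
    return stack[best, np.arange(h)[:, None], np.arange(w)[None, :]]

# usage (hypothetical file names):
# frames = [cv2.imread(f"focus_{i}.jpg") for i in range(1, 6)]
# cv2.imwrite("stacked.jpg", blend_focus_stack(frames))
```

Real stackers add alignment and smarter seam handling, but that's the gist of it.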
For fun, here is the first image of a 5-image stack. This one is focused in front.
IMG_9152.JPG

Here is a finished focus stack processed in (shhhh) PS.
Veggie Stack-2.jpg

You can imagine how useful that would be.
I had another app that made the focus stack and produced a finished result. It stopped working in iOS 11. Unfortunately it made only low resolution output.
 
Time you bought your new iPad, Brian. I want to see your results from Affinity. It’s obviously not that easy to create a focus stacking app otherwise we would have had one by now.
 

I think it has more to do with limited demand for focus stacking. Even in the DSLR world, focus stacking is practiced by very few people. However, there's more actual need for it on an iPhone, with its fixed-aperture lens.
Aligning the images and blending them must be quite similar to stitching a panorama and we certainly have successful panorama apps. AutoStitch is another example of that sort of blending.
Does Affinity also do what AutoStitch did?
Have you tried focus stacking with Affinity? I haven’t heard very much about Affinity from actual users since the initial hype when it first came out. What do you think about it?
I was disappointed to discover that the latest iPad Pro has the iPhone 7 camera, not the 8's - no dual lenses, not even the A11 processor. I figured for sure the Pro could use the processing power, and there's certainly no lack of room for the dual lenses. Maybe the next model will pick up more of the 8's specs. I know Apple considers the iPhone its main flagship and saves the best specs for it.
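The stitching side is certainly well-trodden ground; OpenCV even ships a ready-made stitcher. A tiny sketch with hypothetical file names - nothing to do with AutoStitch's actual code:

```python
# Tiny panorama example using OpenCV's built-in stitcher. Hypothetical file names.
import cv2

images = [cv2.imread(name) for name in ("left.jpg", "middle.jpg", "right.jpg")]
stitcher = cv2.Stitcher_create()
status, pano = stitcher.stitch(images)
if status == cv2.Stitcher_OK:   # 0 means success
    cv2.imwrite("pano.jpg", pano)
else:
    print("stitching failed, status", status)
```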
 
Does Affinity also do what AutoStitch did?
It does do panorama merging, but I haven't tried that. It also does HDR merging, but that produced only mediocre results in a few attempts.

It's a complex program, very Photoshop-like in that respect. I imagine that has a lot to do with why you don't hear much about it. (Photoshop was my main program when I picked up photography again in the digital age. But these days, when mobile apps have become so capable, I rarely use it, and when I do, I find it annoying. It doesn't think like a photographer.)
 

Photoshop was built on a film darkroom model.
 
Yes, that's exactly how I feel. However, I still need software like this for printing photos. I have been thinking of buying the PC version of Affinity because it's very similar to the iPad version - kill two birds with one stone. When I was working on my sister's ecommerce site, it was easier to edit the photos on the PC anyway, because I was doing all the coding there.
 
Well, maybe it’s worth waiting for the next model of iPad. That’s how I feel about the iPhone. I feel quite strongly that the next model will be a much bigger upgrade. It’s already rumoured to have a stylus.

I have a real dilemma at the moment. I'm looking at lenses for my iPhone 7+ for my upcoming trip, but I don't want to end up with lenses that quickly become obsolete. I'm now down to two 'kits': a Beastgrip with their own wide-angle/macro lens, or the Moment case for the iPhone 7+ with the new macro and wide-angle lenses. I think the Beastgrip just has the edge. If I go that route I can include the Moondog anamorphic lens that fits directly on the iPhone, for almost the same price as the Moment kit. I'd love to get the anamorphic version which fits the Beastgrip, but I need it to work with my gimbal. I just think the Beastgrip will give me more flexibility in the future. However, it would be so much more useful to have a macro lens which can easily be added to my day-to-day phone. Maybe I should also consider the Helium Core. Heck.
 

Yes, a lot of lenses aren't very specific about which phones they will fit. At least you know the new Moment lenses were made for the 7 Plus.
I just got the Moment case for my 7 Plus and the new WA - the simple case, not the battery case. I suppose eventually I'll want both. I like the neck strap attachment points.
The new WA seems really nice: a bit bigger overall than the old version, with a much bigger rear element opening and bayonet mount diameter. The Moment case doesn't fit my DJI Osmo; I think they kept the mount just that size to keep the extra weight of a case off the gimbal. I have the counterweight to offset the lens, and I'm sure I'll get it to fit by removing one rubber pad in the grip claws. One nice thing about the Moment case - my Shure stereo mic fits where it wouldn't fit my old case. But there's no way to use the mic on the gimbal.
I’m not sure why you want the anamorphic. I have one for a bigger camera. A bit of a nuisance.
I'm thinking the Beastgrip plus lens is too much weight for a gimbal. It would certainly slow down the response.
The Moment Macro is very good but somewhat limited, since it is made for just one image size (3 cm subject width) and the focusing range goes way down. I find I use a series of close-up lenses more often than the Macro.
I think the Moondog Anamorphic will fit a standard 37mm filter holder.
 
I have to say that the new Moment lenses have rave reviews. I think it will just fit with my Zhiyun gimbal, but it will be a tight fit.

When you say you use a series of close-up lenses instead of the macro, what do you mean? Are those the loose glass lenses you have mentioned before, if I remember correctly? I miss my two Olloclip macro lenses, which I had for the iPhone 6 - I got surprisingly good detail. Having said that, I have got some great shots using the 7+'s 2x zoom. It's just that it then sometimes needs cropping and doesn't always get quite close enough.

The Moondog anamorphic lens that I would get fits right onto the 7+. There is one that has the 37mm thread, but then I'd have to use it with the Beastgrip, and as I would be getting it specifically for video it really needs to work with a gimbal. Getting the 37mm thread version would make more sense in terms of longevity, though. I just love the cinematic look that the anamorphic lens enables. Reading up on it, I think it's easier to use on the iPhone than on a full DSLR, where focusing is a bit of a pain by all accounts, but I want to find more reviews and opinions.

I’m holding out for Black Friday in the hope there may be some specials somewhere.
 

A while back I bought a clip-on filter holder to use a 37mm polarizing filter. Once I had that I bought a set of 3 close-up lenses. Each one extends the focusing range a bit closer but I mostly use the closest and another lens I had which I mounted in a 37mm filter frame. These close-up lenses cover the range between the closest the camera can do on its own and the Moment Macro. You will find as you get closer that the amount of focusing effect you can get from the camera becomes less and less until the main control you have is adjusting the camera-to-subject distance.
I found +1 or +2 not much use. +4 to +10 are much more useful.
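For anyone wondering why the weak ones don't help much: the usual rule of thumb (thin-lens approximation) is that with the main lens set at infinity, a +D close-up lens focuses at roughly 1/D metres. A quick sketch of the numbers, my own illustration rather than measured values:

```python
# Rule-of-thumb close-up lens math: +D diopters -> nearest focus ~ 1/D metres
# (with the main lens at infinity). Illustration only, not measured values.
for d in (1, 2, 4, 10):
    print(f"+{d}: subject at about {100 / d:.0f} cm")
# +1 -> ~100 cm (barely closer than the phone manages by itself),
# +4 -> ~25 cm, +10 -> ~10 cm.
```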

The anamorphic lens compresses the width of the image quite a lot to squeeze the wide screen format into 16:9 space. I would be more worried about image quality with that than anything else. Then the picture aspect ratio is changed to restore the proper shape of the subject. Another loss to image quality. It’s like shooting with a wide angle lens and then cropping the height of the image. It might seem that the picture is bigger but it’s actually smaller considering the display options we have. In the same way that 16:9 is smaller than 4:3. On the phone we have a 16:9 screen, but a 4:3 camera sensor. Go figure.
Back in the film days, with slide projectors, I was always a strong advocate of having a square screen so vertical and horizontal images could be shown equally. This is an equal rights issue for me! My vertical images were made every bit as lovingly as my horizontals. To see my verticals forced to display in the same height as a horizontal image means they are seen much smaller than the horizontals. Relatively speaking, this puts verticals at such a disadvantage that it isn't worth showing them. It's an unacceptable bias against vertical format images. Does it make sense that the display technology affects the ratio of verticals/horizontals that people make?
You see what happens when people shoot vertical video on their phone. It is displayed as a narrow strip on a big horizontal space with distracting blurry areas left and right.
Try this as a slide show on a modern TV to understand what I mean. The modern display format is horizontal, and more than just horizontal 4:3 - it is 16:9. Not even the shape we usually photograph in (4:3, or 3:2 on a DSLR). Unfair, I say. Discrimination, I yell.
So you put your anamorphic video on a 16:9 screen and you have a smaller image with black bars top and bottom.
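To put rough numbers on it, here's the kind of arithmetic I mean, assuming a 1920x1080 (16:9) screen and fitting each frame inside it without cropping:

```python
# How much of a 1920x1080 (16:9) screen different frame shapes actually fill.
# Back-of-the-envelope numbers only.
def screen_fraction(img_w, img_h, screen_w=1920, screen_h=1080):
    scale = min(screen_w / img_w, screen_h / img_h)   # fit inside, keep shape
    return (img_w * scale) * (img_h * scale) / (screen_w * screen_h)

for label, w, h in [("16:9 video", 16, 9),
                    ("4:3 photo", 4, 3),
                    ("2.40:1 anamorphic (de-squeezed)", 2.4, 1),
                    ("3:4 vertical photo", 3, 4)]:
    print(f"{label}: {screen_fraction(w, h):.0%} of the screen")
# 16:9 fills 100%, 4:3 about 75%, the anamorphic frame about 74%,
# and a vertical 3:4 shot only about 42%.
```

So the wider the frame, the more of the screen you hand back as black bars, and verticals fare even worse.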
 
I understand what you're saying, but when I see comparison videos on YouTube (on my iPad Pro) I find the video made with the anamorphic lens so much more attractive, and perhaps the quality doesn't seem to suffer because you see a smaller image. It gives it that film quality which makes the video more interesting to me. I'm a massive fan of the 16:9 format for photos too. OK, I lose some quality because it has to be cropped, but if I've taken it at high res I'm prepared to sacrifice that to get a look which has an impact.
 
The close-up lenses sound like a pretty cheap option. Maybe I should give them a go.
 