FundyBrian’s Explorations

It looks like even the universal Adobe PDF Reader is creeping towards a subscription model. For now only the extra tools and features are by subscription; soon the whole app will be useless without one. It’s soon to be deleted from my phone. There are plenty of other PDF readers.
I wouldn't mind if the subscription was reasonable. I saw one app that charged £2 a year. Well heck, I'd be happy to pay that. But Enlight want £19 a year for each of their apps, or £54 as a one-off for each app. Geez, you can get Affinity Photo for less than half of that.
 
Unfortunately (or fortunately?) not for me. I can't think about numbers and calculations, etc. when I'm making photos. It's completely in the moment and organic. Even editing later I can't/don't want to think like that.... my edits are done completely by sight/feel. If I had to do photography any other way, I just wouldn't do it.
I don’t think I ever think about numbers and calculations when I’m making a photo. Yes, it’s an organic process, a thinking process, a building process.
The first step is previsualizing the final result so I know where I’m going. I’m a problem solver, but first I have to be a problem finder. I think about what I want out of this photo and what’s the best way to get it. I recognize the weak points and limitations of the photographic process, and how to get around them. Seeing photographically involves seeing not just what I see with my eyes but also how it will turn out as rendered by the deficiencies of the photographic process. Recognizing that certain important aspects of tonality exceed the dynamic range of the camera, and wanting to avoid burned-out highlights and blocked-up shadows - that might lead me to use HDR. Recognizing that unless I take certain steps the image won’t have enough depth of field to adequately cover the subject - that might lead me to focus stacking.
In most cases trying to solve these problems after the fact is a complete waste of time.
 
Too true. Down with subscriptions!
 
A couple of photos from yesterday. After a long dry spell we had a few rainy days and the mushrooms started popping up.
E370A550-246F-4A8B-9751-56B639BA04E1.jpeg

BF3833BE-F5E9-4F8F-8A3B-0E19CC7A7787.jpeg

Both were made using DNG focus stacking in CameraPixels and developed in Affinity. Without the extra depth of field provided by focus stacking, I usually just delete any photos that don’t have enough depth of field to cover the entire mushroom.
My standard mushroom gear includes a flat ground-level tripod, some small reflectors and prop sticks, some 37mm close-up lenses with a clip-on holder, a polarizing filter, a white balance card, and a small piece of plastic to kneel on. I often turn my iPhone "upside down" so the lens is closer to the ground, and I use the lowest possible ISO for the least amount of image noise. I tend to use 20 focus steps in CameraPixels and delete the frames that fall outside the useful focus range before developing.
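For what it’s worth, the reason a fixed number of focus steps works well close up can be sketched numerically. Depth of field is roughly constant per step when the focus positions are spaced evenly in diopters (1/distance), so a stack can be planned that way. This is only an illustration of the principle, not how CameraPixels actually spaces its steps, and the distances below are invented close-up values:

```python
# Sketch: splitting a focus range into N steps for focus stacking.
# Stepping uniformly in diopters (1/distance) rather than in distance
# keeps the depth of field covered by each step roughly constant.
# The step count of 20 mirrors the setting mentioned above; the
# near/far distances are made-up close-up values, not app internals.

def focus_steps(near_m: float, far_m: float, steps: int) -> list[float]:
    """Return focus distances (metres) spaced evenly in diopters."""
    d_near = 1.0 / near_m          # diopters at the near limit
    d_far = 1.0 / far_m            # diopters at the far limit
    span = d_near - d_far
    return [1.0 / (d_near - span * i / (steps - 1)) for i in range(steps)]

positions = focus_steps(near_m=0.08, far_m=0.25, steps=20)
```

Spacing in diopters naturally bunches the steps up close to the camera, where depth of field is thinnest.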
 
I just love these particularly the first.
 
You have to have a lot of knowledge to be able to make all those decisions. When I do extreme photo manipulations I can work out what my next step is going to be because I know what all the apps do and what effects can be achieved. My frustration with photography is not having that knowledge. If I’ve never used a filter before, how can I know when I need one or when one might make a difference? I have only recently learnt about diffusing light. Fascinating. I have an accumulated knowledge when it comes to software so it’s easy for me to work out new software. You are the same with photography. It’s a great position to be in.
 
Hmmmmm.... I edit almost 100% of my images after making them, to adjust any discrepancies with lighting, etc. Or to take the image in a totally different direction if I see that I can’t fix any light issues. <shrugs> Seems to work for me. :D

It may not be numbers, but that right there is way more thinking than I ever do before I make an image. :lol:
 
Yes, I’m the same way.
 
The problem with premade filters is that you don’t have any idea what makes them work. You may be able to approximately guess but the filter maker does not reveal the exact settings used to make it. Even if you achieve an effect you like you don’t know what went into the making of the filter so you haven’t learned anything except a dependence on premade filters. If that filter app goes defunct you cannot reproduce the effect on your own.
I always find when I try filter apps I go through many, even hundreds of filters, and I don’t find a single one that improves my starting image. I find lots of wacky effects but none that help me along the path I have in mind. But that’s the thing. I do have something specific in mind for the image. I’m not exploring to see what can be done with it. At the time of exposure I previsualized the final result and from then on I’m trying to achieve it. I never take pictures and try to decide later what to do with them. If I try that, I always get to a point where I discover - ah, if I had realized where this was going to end up I would have shot it differently in the first place.
I want to know how things work and be able to create the look I want using the standard tools. In the long run it’s much more satisfying to know what you’re doing.
Look at all the filter apps out there. Some regular person made each filter. Learn to make your own that do exactly what you want. More importantly, you won’t need to use filters at all anymore because you will know how to create exactly the effect you want.
For an interesting case in point, open up Max Curve and a typical photo. Before doing anything else, open each of the curves categories in turn and see the neutral positions. RGB, CMYK, Lightness, HSL, Lab. Look at every one. Now go over and try a preset. “Ocean” is a good example. So just tap the preset and without adjusting anything go back and look at each curve again. This is one of the rare cases where you can see exactly what was done to make the filter’s look. It’s quite instructive. You can see where localized spikes have been added to certain colours, lightness, etc.
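To see why that exercise is instructive: a preset is nothing more than a stored set of curves, and a curve is just a lookup from input tone to output tone, interpolated between a few control points. A minimal sketch (the control points here are invented for illustration; they are not the real “Ocean” values):

```python
# A curve maps an input tone to an output tone via a few control
# points, linearly interpolated in between. These points are invented
# for illustration; they are not any app's actual preset values.
CONTROL = [(0, 0), (64, 80), (128, 140), (192, 200), (255, 255)]

def apply_curve(value: float) -> float:
    """Look up an 8-bit tone value on the curve (linear interpolation)."""
    for (x0, y0), (x1, y1) in zip(CONTROL, CONTROL[1:]):
        if x0 <= value <= x1:
            t = (value - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("value outside 0-255")

# Shadows and midtones are lifted; the endpoints stay anchored.
out = [apply_curve(v) for v in (0, 64, 96, 255)]
```

Once you can read a curve like this, a preset stops being a mystery and becomes a recipe you could rebuild yourself.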
I find I so seldom make the same adjustments on different photos that saving presets doesn’t make much sense to me.
 
It seems to me I spend a lot of time pushing at any limitations in the mobile photography system. As soon as I find something that doesn’t work I try to find a solution.
The past few days I have been working on making an adapter to attach my 37mm accessories to my Moment Tele lens. I really want to be able to use it more for close-ups but by itself it doesn’t focus close at all - about 36” (on the 2x lens).
Sure, tape works well enough to temporarily attach filters but I worry about the alignment, etc. I also have other things in mind coming up. Like neutral density filters for slower shutter speeds, and that requires a light-tight fit.
The adapter needs to be a slip-on thing although permanently attaching an empty 37mm filter ring would be easier. So here’s my solution.
40E18E4C-4E9A-418B-A082-01BDF1CDA107.jpeg

Here’s the Moment Tele lens, some 37mm close-up lenses and a polarizer, and the adapter on the right. It started with a standard step-up ring to the desired 37mm size. I had to grind away the centre of the step-up ring until it just fit exactly outside the Moment Tele. A bit fiddly but that was the easy part. Finding a piece of tubing the exact diameter to slip over the Moment Tele was much harder. Fortunately I have lots of raw materials, leftover stuff from dismantled, salvaged, um... junk. I mean useful stuff. It needed to be rigid plastic not soft plastic in order for the glue to bond securely.
2314C2C9-1DFF-4559-8892-B5F7395F10D8.jpeg

On the left is my plastic sleeve with the filter holder glued to the front with Gorilla Glue cyanoacrylate. In actual fact, the piece of tubing I found was just a bit too tight to fit and I had to enlarge the inside diameter by about 0.5mm. That was fairly painstaking, with lots of checking along the way. Now I’m quite happy with the fit. It presses on smoothly without scratching the lens barrel.
AAF769CE-A954-46B2-A98F-447D2DA518CB.jpeg

Here it is in place on the Moment Tele with a Tiffen close-up lens neatly screwed on in front. I was fortunate enough to find a piece of tubing with a slight flange on the end which provided more surface area to glue the adapter ring.
If you look at this a minute you can see that once it is slipped on there’s nothing left to hold onto to pull it off again. So that means it has to be able to push all the way through until it slides off the other end. It works as slick as you like.

Now wouldn’t it have been easier if the lens came with filter threads in the first place?
 
Sorry I actually meant physical lens filters. :oops:
 
Brian, you do know that Moment sell a filter adapter for attaching 62mm filters/macro lenses to their lenses? It does mean that you would have to go out and buy 62mm filters/macros if you don’t already have them.
 
Yes. I plan to get one for my wide angle lens. The 62 mm size is necessary for the wide angle lens to avoid vignetting with things like polarizing filters. Close-up lenses are much less useful for the wide angle. I already have 37mm close-up lenses, which are a much handier size for an iPhone kit.
I have a fair number of filters in 49, 58, 67, and 77mm as well as square 3x3” so no matter what I do with the Moment adapter it will involve additional adapters.
Technically speaking, a “macro” lens is one that produces an image at 1:1 or larger on the film/sensor. The rest are just close-up lenses.
 
Ah, yes. Well, my rant still applies for that topic but I see what you’re saying is different.
Strictly speaking, close-up lenses should be called lenses rather than filters. Yes, they are in the same physical format as filters but a filter has no diopter effect.
For your close-up lenses you can easily establish their working range using a metre stick and a tape measure. Put your iPhone on a tripod and position the metre stick at a slight angle running directly away from the camera. Then just put on a close-up lens and read the limits of focus off the metre stick. You need a manual focus app so you can accurately force the camera focus to its minimum and maximum positions. Repeat for each close-up lens and each camera lens.
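If you’d rather estimate than measure, the thin-lens approximation gives a decent starting point: with the camera focused at infinity, a +D close-up lens focuses at 1/D metres, and at the camera’s own closest focus the two powers simply add. A rough sketch (the 0.91 m figure approximates the 36" close focus of the Moment Tele mentioned earlier):

```python
# Sketch of the thin-lens approximation behind a close-up lens's
# working range. With the camera focused at infinity, a +D diopter
# lens focuses at 1/D metres; at the camera's own closest focus the
# powers add. Real lenses will deviate a little from this, so the
# metre-stick test above is still the ground truth.

def working_range(diopters: float, camera_close_focus_m: float):
    """Return (near_m, far_m) focus limits with a close-up lens fitted."""
    far_m = 1.0 / diopters                              # camera at infinity
    near_m = 1.0 / (diopters + 1.0 / camera_close_focus_m)
    return near_m, far_m

# +4 close-up lens on a lens that natively focuses to about 0.91 m (36")
near, far = working_range(diopters=4, camera_close_focus_m=0.91)
```

So a +4 close-up lens on that lens should focus from roughly 20 cm out to 25 cm, which the metre-stick test would then confirm.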

Back in the film days the use of optical filters was much more of a topic. Just about every photographer had a stack of filters, especially for colour control. Things like vignettes had to be created on the camera at the time of shooting. Soft focus effects for portraits. Gradient neutral density filters are still popular, as are gradient colour effects. So many things are now easier to create in editing. The things that are not easy to create in editing are of more interest to me these days. Close-up lenses are in filter format but are not, technically speaking, filters. A polarizing filter cannot be done in software; the polarizer is still a very useful basic tool. The one I’m currently interested in is neutral density for slow shutter effects.
The problem on the iPhone is that you cannot manually select a shutter speed slower than 1/3 second, and that is too short for real slow-shutter effects. With the fast f1.8 aperture the shutter speed tends to be very high, so the “slow shutter” apps make multiple shots and combine them. But each of those shots still uses a very fast shutter speed, so the resulting blur is built up from discontinuous fast-shutter steps and looks nothing like real slow-shutter blur. The only way to eliminate the steps is to overexpose the area to wipe out the detail, and that is completely unsatisfactory.
I’ve had my best luck with slow-shutter effects when the light was such that my shutter speed was already down around 1/3 sec., or when I was using a polarizer that helped cut down the light level. The best approach is to ensure that the “slow shutter” steps are actually made at the slowest possible shutter speed, around 1/3 sec.; then the slow-shutter effect is much more realistic.
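The arithmetic behind this is simple enough to sketch: only the fraction of the total capture time the shutter is actually open contributes to the blur, so fast frames spread over a few seconds record only slivers of the motion. The frame counts below are invented for illustration, not anything a particular app does:

```python
# Sketch of why stacked "slow shutter" frames look stepped: only the
# fraction of time the shutter is open contributes blur. The frame
# counts and timings are illustrative, not real app internals.

def duty_cycle(frames: int, shutter_s: float, total_s: float) -> float:
    """Fraction of the total capture time the sensor was recording."""
    return frames * shutter_s / total_s

# 10 frames at 1/1000 s spread over 3 s: the blur covers well under
# 1% of the motion, so the trail breaks into visible steps.
fast = duty_cycle(frames=10, shutter_s=1 / 1000, total_s=3.0)

# 9 frames at 1/3 s back to back (3 s total): nearly continuous blur.
slow = duty_cycle(frames=9, shutter_s=1 / 3, total_s=3.0)
```

Getting each step down near 1/3 sec. pushes the duty cycle towards 100%, which is why the result then looks like a genuine long exposure.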
So the next time the weather is right I’m going back there armed with neutral density filters. I still have a few from the “old days” as well as a couple of new ones I was starting to use on my DSLRs for low shutter speed effects.
The last time I tried this I got too much noise in the images because I was using jpegs. But now that I’ve been using RAW files I think I can overcome this by manually selecting ISO 20. Any day now.
 
These are truly beautiful captures. I love the CameraPixels/Affinity combination. You have taken it to new levels :thumbs:
 
Thanks, Brian. I don’t think there is much point in me using optical filters. I don’t have enough knowledge to really know when to use them successfully.
 
I have all this stuff stored in my head and taking up space. I would be happy to share what I know if you have any specific things you want to know. There have been plenty of books written on the topic. There are probably books written on individual categories of filters.

Any use of filters in photography is easier if you have the classic RGBCMY photographic colour wheel firmly in your mind. If you have the painter’s colour wheel in mind nothing will make sense. The same is true for any type of colour correction on images. Photography involves both additive and subtractive primaries. Filters pass certain colours and absorb or block certain colours.
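A minimal way to picture the RGBCMY wheel is as three complementary pairs, with a filter passing its own colour and blocking the colour opposite it. A toy pass/block model (a deliberate simplification; real filters transmit partially across the spectrum):

```python
# The photographic (RGBCMY) colour wheel: each additive primary
# (red, green, blue) sits opposite its subtractive complement
# (cyan, magenta, yellow). A filter passes its own colour and
# absorbs its complement. This is an idealized on/off model.
COMPLEMENT = {
    "red": "cyan",
    "green": "magenta",
    "blue": "yellow",
    "cyan": "red",
    "magenta": "green",
    "yellow": "blue",
}

def passes(filter_colour: str, light_colour: str) -> bool:
    """Idealized rule: a filter blocks only its complement."""
    return light_colour != COMPLEMENT[filter_colour]
```

Keep those three pairs in mind and both optical filtering and software colour correction start to make sense; the painter’s red/yellow/blue wheel will only mislead you here.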

It is worthwhile knowing that the image sensor in a digital camera is sensitive to light well beyond the range you can see - infrared and ultraviolet. Even though there are filters in the camera narrowing the range of light hitting the sensor to the visible spectrum, it is still possible for light we can’t see to affect the image - in particular, making the sky turn out lighter than the way we see it. Yes, the UV filter actually has a purpose beyond protecting the lens.

Another thing worth knowing is that our eyes are exposed to far more “far blue” light in recent years from computer screens, TVs, LED light sources, phones and tablets, and this type of light has been found to be a causative factor in early-onset macular degeneration. So make sure your computer glasses, or whatever glasses you use for screen work, are protected against far-blue wavelengths. As of the last 3 to 5 years you won’t have any trouble getting them at any place selling glasses.

There are a few basic categories of filters used in photographic applications.

• A huge number of the old colour balancing filters are irrelevant with digital cameras now anyhow. This was one of the biggest categories of filters and professional or commercial photographers still need a working knowledge of this, especially for location work.
Proper use of white balance (and a white balance reference card) eliminates them. But knowing how to use a white balance card is very useful. Control of white balance is very poor in most camera apps, and it shows. It is much easier to set the correct white balance at the time of exposure than it is to figure out how to fix it later. The only saving grace in this dilemma is that people often want to change the white balance for dramatic effect rather than accuracy.
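Numerically, using a white balance card comes down to this: sample the card in the image, then scale each channel so the card reads neutral. A sketch with invented values (real raw converters do this with more sophistication, but the principle is the same):

```python
# Sketch of what "using a white balance card" means numerically:
# sample the card in the image, then scale each channel so the card
# comes out neutral (equal R, G, B). The sample values are invented.

def white_balance_gains(card_rgb: tuple[float, float, float]):
    """Per-channel gains that make the sampled card patch neutral."""
    r, g, b = card_rgb
    # Normalise to the green channel, the usual reference.
    return g / r, 1.0, g / b

def apply_gains(pixel, gains):
    return tuple(c * k for c, k in zip(pixel, gains))

# A grey card photographed under warm light reads reddish:
gains = white_balance_gains((200.0, 180.0, 140.0))
balanced_card = apply_gains((200.0, 180.0, 140.0), gains)
```

Applying the same gains to every pixel in the shot neutralizes the colour cast of the light, which is exactly what setting white balance from the card at exposure time does for you.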

• The polarizer is in a class of its own since it doesn’t change colours directly but can enhance colours and clarity by reducing the effects of glare, reflections, and haze. It can have a big effect or none, depending on the angle of the light. The biggest danger is overdoing the polarizing effect on skies, making them too dark. Also, the angle of the light changes the polarizing effect; with normal lenses this is no problem, but with ultra-wide lenses the angle of view relative to the angle of the light will be drastically different from one side of the image to the other, so the polarizing effect can be very different on each side of the image, which isn’t pretty. Something to watch for.
The classic use of the polarizer is to enhance the contrast between blue sky and clouds. I tend to use it more to improve colour on overcast and rainy days by removing the white of the sky that is reflected by every shiny or wet object (like leaves) in the scene. Great for woods and waterfall scenes. Also to remove glare on some surfaces in a close-up that would otherwise lead to exposure problems. For instance the glare off the shiny top of a mushroom, or leaves, etc. Perfect for water photos, especially when you want to see into the water and not the haze reflected on the water. Close-ups of frogs, etc. Eliminate reflections that appear on glass in some scenes (such as your own reflection).
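The reason rotating the polarizer swings the effect from nothing to full is Malus’s law: for light that is already polarized (glare off water or leaves, sky light at right angles to the sun), the transmitted intensity goes as the cosine squared of the angle between the light’s polarization and the filter’s axis. A quick sketch:

```python
import math

# Malus's law for already-polarized light: I = I0 * cos^2(theta),
# where theta is the angle between the light's polarization and the
# filter's transmission axis. Rotating the filter sweeps theta, which
# is why the glare-cutting effect goes from none to full.

def transmitted(i0: float, theta_deg: float) -> float:
    """Intensity of polarized light of intensity i0 after the filter."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

aligned = transmitted(1.0, 0)     # axis aligned with the glare: it passes
crossed = transmitted(1.0, 90)    # crossed: the glare is blocked
```

Unpolarized light, by contrast, loses about half its intensity regardless of rotation, which is why a polarizer also costs you roughly one to two stops of exposure.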

• Close-up lenses are not really filters in the usual sense but might as well be grouped here since they are mounted the same way. Their main purpose is to allow closer focusing than would otherwise be possible. Like reading glasses for your camera. It takes a stronger power of close-up lens the closer you want to get. Typically sold as +1, +2, +4, etc. sets.

• Neutral density filters simply reduce the amount of light coming through your lens. They don’t change any colours in your photo. Their main use is in reducing the amount of light, allowing you to use different camera settings than would otherwise be possible.
On a DSLR or video camera they are used in bright light to allow a wide-open aperture, giving shallow depth of field in order to blur the background. The iPhone always has its aperture wide open, so that doesn’t help us in that case.
The other use is to reduce the light quite a bit, allowing a much slower shutter speed for motion blurring effects, like flowing water in waterfalls. ND filters come in a variety of strengths to block different amounts of light. 2 stops, 4 stops, 6 stops, etc.

The word “stop” in photography refers to one complete step in the shutter speed or aperture setting. This has become blurred because modern cameras now offer almost continuously variable shutter speed and aperture settings, so it is important to remember what the steps used to be. Each step is a doubling, or halving, of the light admitted, whether by shutter speed or aperture.
So the traditional shutter speed scale goes, 1 second, 1/2 sec, 1/4, 1/8, 1/15, 1/30, 1/60, 1/125, 1/250, 1/500, 1/1000, 1/2000. It’s easy to see in this scale that each step is double or half the time depending on which way you are advancing in the scale. Each step is “1 stop”.
The aperture scale (also called f-stops) follows the same pattern but the numbers don’t make sense the way the shutter speed numbers do: f1, f1.4, f2, f2.8, f4, f5.6, f8, f11, f16. The weirdness of the numbers is because the f-number is based on the diameter of the aperture opening, while its light-passing effect comes from the area. Once again, each step is referred to as 1 stop and represents either a doubling or halving of the exposure. But the iPhone doesn’t have an adjustable aperture, so you can ignore that part.
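Since both scales are just repeated doublings, they can be generated, and the same arithmetic covers ND filters: a filter rated at n stops cuts the light by a factor of 2^n, so the shutter can stay open 2^n times longer for the same exposure. A small sketch:

```python
# The stop scales above can be generated: shutter stops halve the
# time, and f-stops multiply the f-number by sqrt(2), because the
# light gathered depends on the aperture's area (diameter squared).

shutter = [1 / 2 ** i for i in range(6)]              # 1, 1/2, 1/4, ... seconds
f_stops = [round(2 ** (i / 2), 1) for i in range(5)]  # 1, 1.4, 2, 2.8, 4

# An ND filter rated at n stops cuts the light by 2**n, letting the
# shutter stay open 2**n times longer for the same exposure.
def nd_exposure(base_shutter_s: float, nd_stops: int) -> float:
    """Equivalent shutter time once an n-stop ND filter is fitted."""
    return base_shutter_s * 2 ** nd_stops
```

For example, a 6-stop ND turns a 1/250 sec exposure into about 1/4 sec - the difference between frozen water and a silky blur.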

• Gradient, or graduated, neutral density filters are dark at the top and gradually lighten towards the bottom. The transition from dark to clear can be more or less gradual, and the degree of darkness might be 1 stop, 2 stops, etc.
They are most often used when the sky is very bright and the ground is dark, as at sunset, allowing a single exposure to render the sky and ground correctly at the same time. It is usually necessary to adjust the height of the filter relative to your scene, so these filters are usually rectangular rather than the round screw-on variety: they sit in a special holder that lets you slide the filter up or down to the desired position. These filters were more popular with film but are still used in high-end photography. An alternative is to combine differently exposed pictures of sky and ground in post, or use HDR.
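The software version of a grad ND is just a vertical gradient mask applied to linear image data. A rough Python sketch of the idea (the “image” here is a bare list of luminance rows, purely illustrative):

```python
def apply_grad_nd(image, nd_stops=2, transition=0.5):
    """Darken the top of an image the way a grad ND does.
    `image` is a list of rows of linear luminance values.
    The top is cut by `nd_stops` stops, fading to clear at
    `transition` (a fraction of the image height)."""
    h = len(image)
    out = []
    for y, row in enumerate(image):
        pos = y / max(h - 1, 1)                       # 0 = top, 1 = bottom
        blend = min(max(pos / transition, 0.0), 1.0)  # 0 = full ND, 1 = clear
        factor = 2.0 ** -(nd_stops * (1.0 - blend))   # each stop halves the light
        out.append([v * factor for v in row])
    return out

# Two bright sky rows over two darker ground rows:
scene = [[400.0], [400.0], [100.0], [100.0]]
balanced = apply_grad_nd(scene, nd_stops=2)
# Top row: 400 cut by 2 stops -> 100; the bottom rows pass through unchanged.
```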

• Then there are the colour filters used for black and white photography. Fairly strong yellow, orange, red, green, blue. Unless you are using film there are apps on the iPhone that internally simulate the use of these filters. So you don’t need any of those either but it helps to understand why they were used so you know how to use the B&W apps.
Even though you end up with a B&W image the film itself is still colour sensitive and the coloured filters are used to alter the relative balance of certain colours to prevent tone mergers.
For instance, you take a picture of a red flower on a green bush, but in the B&W photo the red and green come out exactly the same shade of grey, so you can’t see where the flower begins and the leaves end - the flower disappears. You have a tone merger. This is a very common problem in B&W, and it could be any colour/tone combination. The solution is to lighten or darken one colour or the other so there is a bigger difference in how red and green (in this example) are rendered in the final B&W photo. A filter passes its own colour and blocks others. It is your choice whether to have a dark flower against lighter leaves or a light flower against dark leaves, but you need one or the other in order to see the difference in tones.
A lot of people used a yellow filter for B&W if they were doing a lot of landscape photos, to make the sky closer to the tone we see with our eyes - blue skies come out lighter than expected in B&W.
Understanding how coloured filters control tonal separation in B&W photos is the single most important thing needed for successful B&W work. Without it, B&W becomes a crap shoot - sometimes you’re lucky and sometimes you’re not, and you have no idea why. You see this all the time in casually made B&W images.
The difficulty when shooting B&W film is that you can’t see beforehand the effect your filter will have in the final image. You just have to know in advance what is going to happen based on experience (and info charts).
There are 2 routes to making B&W on digital cameras. The dumb way is to simply discard the colour information at the start and manipulate the grey-scale image contrast to get the look you want. Avoid any apps that work this way. There are too many situations where nothing you try to do can make the image any good.
The best digital B&W is made from using all 3 RGB colour layers and digitally filtering the layers to manipulate the colours to give the desired tonal separation in the final B&W. This is where an understanding of the colour filters is very useful.
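That “digital filtering of the layers” amounts to a weighted mix of the R, G and B channels, with the heavy weight on the filter’s own colour. A small Python sketch using the red-flower example (the weights are made-up illustrative numbers, not any particular app’s actual mix):

```python
def bw_with_filter(pixels, weights):
    """Grey value from a weighted mix of the R, G, B channels -
    the digital stand-in for shooting B&W through a coloured filter.
    `weights` = (r, g, b) and should sum to about 1."""
    wr, wg, wb = weights
    return [r * wr + g * wg + b * wb for (r, g, b) in pixels]

red_flower, green_leaf = (200, 40, 40), (40, 160, 40)

no_filter = bw_with_filter([red_flower, green_leaf], (0.33, 0.34, 0.33))
red_filter = bw_with_filter([red_flower, green_leaf], (0.8, 0.1, 0.1))
# no_filter: the two greys land close together - a near tone merger.
# red_filter: the flower comes out much lighter than the leaves.
```

Swapping the heavy weight onto green would give the opposite choice: a dark flower against light leaves.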

• Tricolour filters: a set of red, green and blue filters in exactly the right densities. This is the earliest form of colour photography. In the early days before colour film, a special camera held 3 B&W negatives, along with mirrors or prisms, etc., and each negative sat behind one of the 3 tricolour filters. A single press of the shutter made 3 pictures representing the red, green & blue parts of the image. In the darkroom those 3 B&W negatives were each printed in turn, in register, through the associated red, green or blue filter to make a colour print. Imagine the work involved!
Those same 3 filters are the basis for the Harris Shutter apps. The Harris Shutter was first made in the days of the earliest colour images on separate B&W negatives. But much later on, if you made a triple exposure on colour film, or a digital camera, through the tricolour filters, you end up with a completely normal colour photo. Boring so far. But... what if things move between exposures... then things get interesting.
For instance, I once made a picture of fall leaves on the ground in the woods, using a film camera - no room for error! With my camera set on multiple exposure (on a tripod, of course), I made my first exposure with the first colour filter. Then I raked the leaves a bit to change their locations, and waited half an hour for the shafts of sun and shadow to move. Then I changed to the next filter and made another exposure, raked the leaves again, and waited another half an hour for the sun to move. I made my last exposure with the last filter. I only had time for the one shot. The resulting picture is quite interesting and unique. At first glance it looks normal enough: a fall scene with coloured leaves. But as you look closer you see a terrific range of colour in the leaves, and colours on the edges of trees, shadows, etc. as the sun moved. I’ve seen people study that picture for quite a while trying to figure it out, getting more and more perplexed. Why are there some blue leaves? And so on.
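The digital version of the Harris Shutter is easy to sketch: take the red channel from the first frame, the green from the second, the blue from the third. Anything that stayed put comes out in normal colour; anything that moved picks up colour fringes - which is exactly where the blue leaves come from. An illustrative Python sketch:

```python
def harris_composite(frame1, frame2, frame3):
    """Digital Harris Shutter: red channel from frame 1, green from
    frame 2, blue from frame 3. Frames are equal-length lists of
    (r, g, b) pixels. Whatever moved between frames gains colour."""
    return [(p1[0], p2[1], p3[2])
            for p1, p2, p3 in zip(frame1, frame2, frame3)]

# Two pixels; a grey leaf sits on pixel 0, then moves to pixel 1
# before the third exposure:
f1 = [(120, 120, 120), (0, 0, 0)]
f2 = [(120, 120, 120), (0, 0, 0)]
f3 = [(0, 0, 0), (120, 120, 120)]
print(harris_composite(f1, f2, f3))
# [(120, 120, 0), (0, 0, 120)] - yellow where the leaf was, blue where it went
```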

And it looks like I’m writing a book, too, so I had better stop now and have some breakfast.
 
I certainly have learnt a lot from this but will bookmark it because I’m bound to forget it until I actually come to use it.
 
One part of me is noooooo, keep writing Brian. Then reason pipes up and says that poor man has to eat, so hush Jeffrey ;)
 
Time Spiral. Hourglass.
71F7AF52-1597-48E1-94A7-D4969FACB9BC.jpeg

I posted this on Time Stamp as this week’s timepiece photo but I decided to post it here, too, since I might put more ScanCamera images here as I’m experimenting with it.
 
Preparing for a triple axel.
FB303D2B-EC8D-4C20-8BE1-FE11BC160BCC.jpeg

This is about a 2-minute exposure with a 1-pixel scan width at 4K resolution, scanning top down, using an adjustable-speed turntable for the subject. The turntable is much slower than a record player - maybe 2 or 3 RPM.
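The scan-camera idea can be sketched in a few lines: the finished image takes one row (the 1-pixel slit) from each successive frame, so time runs down the picture instead of being frozen. A toy Python illustration of the principle (not how ScanCamera is actually implemented):

```python
def slit_scan(frames):
    """Toy scan-camera: output row y is taken from row y of frame y,
    so the vertical axis of the result is really a time axis.
    `frames` is a list of images (lists of rows), one per moment,
    with at least as many frames as the image has rows."""
    return [frames[y][y] for y in range(len(frames[0]))]

# Three 3-row frames of a bright dot drifting right over time:
frames = [
    [[9, 0, 0], [9, 0, 0], [9, 0, 0]],
    [[0, 9, 0], [0, 9, 0], [0, 9, 0]],
    [[0, 0, 9], [0, 0, 9], [0, 0, 9]],
]
print(slit_scan(frames))
# [[9, 0, 0], [0, 9, 0], [0, 0, 9]] - the motion is smeared into a diagonal
```

With a rotating subject, the same mechanism turns the rotation into the stretched, spiralling shapes in these photos.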
 
Saturday afternoon it was just on the edge of raining; a light wet mist was falling. The coloured leaves had mostly fallen after a couple of days of heavy rain and then strong winds - in just a couple of days the leaves were gone. I went looking for colour on the ground: fallen leaves around brooks, etc. I came across this patch of Hay-Scented Ferns, gone beyond their fall golden colour to cinnamon after a couple of frosts.
362D5536-048B-4B62-8280-9F98C1352895.jpeg

It was fairly gloomy in the woods. My exposures were around 1/15th second in this open area.
PureShot DNG HDR, Affinity.
 