By "higher" I mean magnifications with a numerical value greater than half the objective diameter in millimetres.
This concept is such a radical departure from accepted thought that I have my own doubts and would appreciate some qualified comments.
I was out birding with a friend yesterday and recounted the resolution measurements (Resolution measurement 8x56 SLC) I had conducted lately. In view of the tiny angles (1 arcsecond is 1/3600th of 1 degree) involved, the friend asked if low arcsecond values for binoculars were of any relevance to what we see through them.
At 2,9" for the 8x56 SLC this is probably not the case, but many good binoculars with smaller objectives might only resolve 4-5", and at 10x magnification the resulting perceived acuity of 40-50" would be getting very close to the abilities of a user with 20/20 vision. The latter is defined as 1 arcminute (60").
I then realized that the scope I was carrying, a Swarovski ATM65 HD with its moderate magnification (30x W), gets even closer to the visual capabilities of most users. With a measured resolution of 1,78", this corresponds to a perceived acuity of 53".
The Dawes' limit defines the theoretical maximum resolution of an optical device and is 116 divided by the diameter of the objective in millimetres, hence 1,78" for my (diffraction-limited) 65 mm scope, 2,32" for a 50 mm scope or about 1" for a 115 mm scope.
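For anyone who wants to check the arithmetic, here is a quick Python sketch of the Dawes' limit calculation (the function name is just my own):

```python
def dawes_limit_arcsec(aperture_mm: float) -> float:
    """Dawes' limit: theoretical resolution in arcseconds for a
    diffraction-limited objective of the given diameter."""
    return 116.0 / aperture_mm

# The three apertures mentioned above:
print(round(dawes_limit_arcsec(65), 2))   # 1.78
print(round(dawes_limit_arcsec(50), 2))   # 2.32
print(round(dawes_limit_arcsec(115), 2))  # 1.01, i.e. roughly 1"
```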
Consequently, for users with 20/20 vision, magnifications of much over half the objective diameter in millimetres could be considered empty magnification. Users with a poorer VA would of course profit from higher magnifications, but someone with very high VA might be able to detect the limits of a diffraction-limited scope at said magnification on a well illuminated and high contrast target such as a test chart.
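The "half the objective diameter" rule of thumb follows directly from multiplying the Dawes' limit by the magnification and comparing the result with the eye's 60" limit. A small sketch (helper names are mine, not standard):

```python
def perceived_acuity_arcsec(resolution_arcsec: float, magnification: float) -> float:
    """Apparent angular size of the finest resolvable detail,
    as presented to the eye by the instrument."""
    return resolution_arcsec * magnification

def max_useful_magnification(aperture_mm: float, eye_limit_arcsec: float = 60.0) -> float:
    """Magnification at which the magnified Dawes' limit reaches the
    eye's acuity limit (60" = 1 arcminute for 20/20 vision)."""
    return eye_limit_arcsec * aperture_mm / 116.0

print(round(perceived_acuity_arcsec(1.78, 30), 1))  # 53.4 - the 30x 65 mm scope
print(round(max_useful_magnification(65), 1))       # 33.6 - close to half of 65
```

For a 65 mm objective the break-even point comes out at about 34x, which is why the 30x eyepiece already sits so close to the limit.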
At a somewhat advanced age my visus has deteriorated to about 0,9, but in the field I have often experienced that there were no gains at magnifications above 50x on my Kowa 883 (also a good example). This I had attributed to seeing conditions or the inevitable loss of brightness (and contrast) at exit pupils below 2 mm. At 50x the Kowa has a 1,8 mm exit pupil, but perhaps I was seeing the limits of the scope?
Unrelated to the above are factors concerning the human eye. In a Wikipedia article it is suggested that the eye's maximum acuity is at pupil openings of 3-4 mm and that, at pupil dimensions below 2 mm, acuity is degraded by diffraction effects. A 2mm exit pupil also corresponds to a magnification of half the objective diameter.
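The exit pupil relationship is the same arithmetic from the other direction; a minimal sketch (again with my own function name):

```python
def exit_pupil_mm(aperture_mm: float, magnification: float) -> float:
    """Exit pupil = objective diameter divided by magnification."""
    return aperture_mm / magnification

print(round(exit_pupil_mm(88, 50), 1))  # 1.8 - the Kowa 883 at 50x
print(exit_pupil_mm(65, 32.5))          # 2.0 - magnification of half the aperture
```

So a magnification of half the objective diameter in millimetres always yields a 2 mm exit pupil, whatever the aperture.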
Unfortunately there is now a dearth of fixed focal length wide-angle eyepieces, so we are left with the restricted AFoVs of zoom oculars at the lower magnifications and mostly useless magnifications at the higher end.
Numbers sell and the cheap offers of 70x50 binoculars on a certain auction site are often rightly ridiculed on BF, but is there really any point in 45x magnification on a 50 mm scope, or 60x on a 65 mm, let alone the ridiculous magnifications that can be achieved with the current crop of extenders?
John