Forums
Binoculars & Spotting Scopes
Spotting Scopes & tripod/heads
Audubon ranking of scopes
<blockquote data-quote="aeajr" data-source="post: 3685058" data-attributes="member: 144708"><p><strong>Reading the review, this was not an optics test; it was a user experience report, based on the perceptions of actual users with the scopes in the field. As shown in the photo, they lined 'em up and asked people to compare them. It also appears that the scopes were evaluated within price groups, so a $300 scope was rated within the $1000-or-less category and not against scopes in higher categories. A 7 in one category is not comparable to a 7 in a higher category.</strong></p><p></p><p>Photo of the test environment:</p><p><a href="http://cdn.audubon.org/cdn/farfuture/6LBmY-RFQnEwg1oR-d7BV2LUJ4uL9BnMvIUgvpfombs/mtime:1512661479/sites/default/files/scopes_camilla-cerea_-001.jpg" target="_blank">http://cdn.audubon.org/cdn/farfuture/6LBmY-RFQnEwg1oR-d7BV2LUJ4uL9BnMvIUgvpfombs/mtime:1512661479/sites/default/files/scopes_camilla-cerea_-001.jpg</a></p><p></p><p>I am neither endorsing nor condemning how the test was done, but it is important to understand how the review was conducted in order to interpret the results; I always find it important to read any sort of review in context. The method seems sensible to me, although it seems they should have subdivided the scopes by aperture and evaluated them at a uniform magnification. Perhaps they did, but the report doesn't say. </p><p></p><p>I posted a question to the Audubon site asking for clarification.</p><p></p><p>Based on my telescope experience, different users will evaluate the same eyepiece differently according to what seems most important to them. And, as we all know, the cost to gain a little extra performance can be quite high. I am always critical when someone compares a $50 eyepiece to a $500 eyepiece without calling out that price difference.</p><p></p><p>So the performance of a $250 scope and a $500 scope may not be that far apart. 
And the margin between a $1000 scope and a $2000 scope may not be that great either, but when you must have the very best, you have to be willing to pay that premium.</p><p></p><p><a href="http://www.audubon.org/news/how-we-ranked-em-0" target="_blank">http://www.audubon.org/news/how-we-ranked-em-0</a></p><p></p><p><em>For the Audubon Guide to Scopes, we divided the contenders into three categories, based on price. During testing, we took the following steps to try to obscure the make/model of the scopes: <strong>Each price group was color-coded and each scope within the group was assigned an identification letter </strong>(e.g., ‘Red A’ or ‘Blue D’), and we covered identifying marks with masking tape. As our goal was to assess scopes within price groups, we asked reviewers to select at least one color group and to test all of the models within it—rather than randomly selecting models from multiple groups.</em></p><p><em></em></p><p><em>We asked reviewers to rate the scopes on a scale of 1 to 10 for each of seven categories, with 10 being the highest score. To determine each scope’s overall score, we calculated a weighted average of its scores in different categories, because we consider some factors, such as sharpness and brightness, to be more important considerations than others, such as edge-to-edge focus. Below are the categories, along with the weight we assigned to each when analyzing the results.</em></p><p><em></em></p><p>Image Quality</p><p></p><p>Sharpness: 1</p><p></p><p>Brightness: 1</p><p></p><p>Color: .8 </p><p></p><p>Edge-to-edge focus: .7</p><p></p><p>Feel</p><p></p><p>Zoom: .9</p><p></p><p>Ease of focus: .9</p><p></p><p>Eye relief: 1</p></blockquote><p></p>
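For anyone curious how the weighted average in the quoted methodology works out in practice, here is a minimal sketch using the category weights Audubon lists above. The reviewer ratings in the example are hypothetical, and Audubon has not published its exact calculation, so this only illustrates the standard weighted-average formula they describe.

```python
# Category weights as quoted from the Audubon methodology.
WEIGHTS = {
    "sharpness": 1.0,
    "brightness": 1.0,
    "color": 0.8,
    "edge_to_edge_focus": 0.7,
    "zoom": 0.9,
    "ease_of_focus": 0.9,
    "eye_relief": 1.0,
}

def overall_score(ratings: dict) -> float:
    """Weighted average: sum(weight * rating) / sum(weights)."""
    total_weight = sum(WEIGHTS.values())
    weighted_sum = sum(WEIGHTS[cat] * ratings[cat] for cat in WEIGHTS)
    return weighted_sum / total_weight

# Hypothetical 1-10 ratings for one scope from one reviewer.
example = {
    "sharpness": 8, "brightness": 7, "color": 9,
    "edge_to_edge_focus": 6, "zoom": 7,
    "ease_of_focus": 8, "eye_relief": 7,
}
print(round(overall_score(example), 2))  # prints 7.44
```

Note how the weighting means a point lost on edge-to-edge focus (weight 0.7) costs less than a point lost on sharpness or brightness (weight 1), which matches Audubon's stated priorities.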