
Audubon ranking of scopes

Interesting that the Trailseeker got good ratings. The 65mm comes in a straight scope. I have pretty much given up using angled scopes though I still have one.
Celestron 52331 TrailSeeker 65 - Straight Spotting Scope (Black) runs $260

Are you saying that you own a Trailseeker 65? I have been considering this or the 80 mm and would welcome any comments from an owner.
 
I can't speak to the two lower categories, but the high end just confirms what I have seen over the last eight or nine years. The big Kowa simply works, and while I think the 95 Swaro makes things interesting for Kowa, it's remarkable that in all the time the 88 Kowas have been out, no one has made a significantly better scope. My opinion is that Kowa finds small improvements and simply changes the scope (or bino) without making a new model. I don't see another way they could have held their position for so long.
Steve
 
Sure sends my blood pressure up to read something like that. I think this may be the most clueless in a long line of clueless group tests from Audubon Magazine and Living Bird Quarterly stretching back to the 1980s. Why oh why won't they learn how to perform some basic optical tests?!

Reading the review, this was not an optics test; it was a user-experience report. It was based on the perceptions of actual users with the scopes in the field. As shown in the photo, they lined 'em up and asked people to compare them. It also appears that the scopes were evaluated within price groups, so a $300 scope was rated within the $1,000-or-less category and not against scopes in higher categories. A 7 in one category is not comparable to a 7 in a higher category.

Photo of test environment
http://cdn.audubon.org/cdn/farfutur...s/default/files/scopes_camilla-cerea_-001.jpg


I am not supporting or condemning how the test was done, but it is important to understand how the review was conducted in order to interpret the results. I always find it important to read any sort of review in context. The method seems reasonable to me, although it seems they should have subdivided the scopes by aperture and evaluated them at uniform magnification. Perhaps they did, but the report doesn't say.

I posted a question to the Audubon site for clarification.


Based on my telescope experience, different users will evaluate the same eyepiece differently according to what seems most important to them. And, as we all know, the cost to gain a little bit can be quite high. I am always critical when someone compares a $50 eyepiece to a $500 eyepiece without calling out that price difference.

So the performance of a $250 scope and a $500 scope may not be that far apart. And the margin between a $1000 and $2000 may not be that great, but when you must have the very best, you have to be willing to pay that premium.


http://www.audubon.org/news/how-we-ranked-em-0



For the Audubon Guide to Scopes, we divided the contenders into three categories, based on price. During testing, we took the following steps to try to obscure the make/model of the scopes: Each price group was color-coded and each scope within the group was assigned an identification letter (e.g., ‘Red A’ or ‘Blue D’), and we covered identifying marks with masking tape. As our goal was to assess scopes within price groups, we asked reviewers to select at least one color group and to test all of the models within it—rather than randomly selecting models from multiple groups.

We asked reviewers to rate the scopes on a scale of 1 to 10 for each of seven categories, with 10 being the highest score. To determine each scope’s overall score, we calculated a weighted average of its scores in different categories, because we consider some factors, such as sharpness and brightness, to be more important considerations than others, such as edge-to-edge focus. Below are the categories, along with the weight we assigned to each when analyzing the results.

Image Quality

Sharpness: 1

Brightness: 1

Color: .8

Edge-to-edge focus: .7

Feel

Zoom: .9

Ease of focus: .9

Eye relief: 1
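The weighted-average calculation the article describes can be sketched in a few lines of Python. The weights are the ones listed above; the per-category ratings in the example are made up purely for illustration, and the function name `overall_score` is my own, not Audubon's.

```python
# Weights from the Audubon methodology quoted above.
WEIGHTS = {
    "sharpness": 1.0,
    "brightness": 1.0,
    "color": 0.8,
    "edge_to_edge_focus": 0.7,
    "zoom": 0.9,
    "ease_of_focus": 0.9,
    "eye_relief": 1.0,
}

def overall_score(ratings):
    """Weighted average of one reviewer's 1-10 ratings across the seven categories."""
    total = sum(ratings[category] * weight for category, weight in WEIGHTS.items())
    return total / sum(WEIGHTS.values())

# Hypothetical ratings for a single scope (invented for illustration):
example = {
    "sharpness": 8, "brightness": 7, "color": 7,
    "edge_to_edge_focus": 6, "zoom": 7,
    "ease_of_focus": 8, "eye_relief": 7,
}
print(round(overall_score(example), 2))  # prints 7.19
```

The effect of the weighting is visible here: the scope's plain average would be 7.14, but because its weakest category (edge-to-edge focus) carries only 0.7 weight, the weighted score comes out slightly higher.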
 
I have bought some optics because of favourable reviews, but when I got them some were woefully bad.

This is because the tester did not have any real knowledge of optics.

Usually I am safe with the optics experts.

But with binoculars, the Nikon SE doesn't suit me at all because of blackouts, and some others also just don't suit me.
And many binoculars have far too much eye relief for me.

Scopes are easier, but I would still prefer an optics expert rating them.

I have seen some reviews by experts also that are just factually wrong, but nobody is perfect.
 
I'll take the evaluation of one person who knows how to test a telescope any day over the impressions of a multitude of optical innocents. ;)
 
Finally made a decision. I ordered the Celestron Trailseeker 80 20-60X with the 45 degree eyepiece.

I will let you know how it works out. This will be my only purpose built spotting scope so I won't have anything to compare it to other than my telescopes.
 