Interesting test on top tier scopes.

You know, I realize this isn't practical, so I don't blame the investigator at all, but if you wanted to be rigorous about testing, you would grab 20 or so samples of each scope, cover them all with black cloth so the tester doesn't know the brand, and then test. That would show the variability within each model, which you could then compare across models.

At the very least, though, the tester should not know which scope he is looking through.

You simply can't overcome operator bias, no matter how honest you are. That's why stuff is blinded in studies.
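For what it's worth, a blinded protocol like that is simple enough to write down as a procedure. Below is a minimal sketch in Python of the bookkeeping involved; the brand names, sample counts, and scores are all hypothetical placeholders, and the random numbers just stand in for what a blinded tester would actually record off a resolution chart. The point is only that the person doing the scoring never sees the brand until after the numbers are in.

```python
# Minimal sketch of a blinded scope comparison. Brands, sample counts,
# and scores are made-up placeholders for illustration only.
import random
from statistics import mean, stdev

# Hypothetical pool: ~20 samples per model, to be scored by a blinded tester.
samples = [("Brand A", i) for i in range(20)] + [("Brand B", i) for i in range(20)]

# Blinding: shuffle and relabel so the tester never sees the brand name.
random.shuffle(samples)
blind_labels = {f"Scope #{n + 1}": brand for n, (brand, _) in enumerate(samples)}

# Stand-in for the tester's scores; in a real test these would be the
# blinded observer's readings off a resolution chart.
scores = {label: random.gauss(7.0, 0.5) for label in blind_labels}

# Only after all scoring is done do we unblind and summarize per brand.
by_brand = {}
for label, score in scores.items():
    by_brand.setdefault(blind_labels[label], []).append(score)

for brand, vals in sorted(by_brand.items()):
    print(f"{brand}: mean {mean(vals):.2f}, within-model spread {stdev(vals):.2f}")
```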

I had the chance to compare five Swarovski Z5 5-25x52s on an optics chart at 127 yards and in low light on antlers at 131 yards. The first and the fourth were noticeably better than the second and the third. Later a friend brought his Z6 5-30x50. It was about on par with the second and third for glass and low light. The owner and I were very surprised.
 
The scope of this specific comparison is extremely narrow. I'm surprised nobody in the last 40+ posts has noted how little air there was between the scope and the target, or how limited the lighting conditions were.

It's not really a secret that high-end optics significantly outperform lower tiers in areas like mirage and poor light. A resolution chart at short range will obviously skew results in favor of lower-end optics, particularly when the metric used is performance relative to price. Even if you wanted to keep such a limited comparison, put the charts out farther and the results would definitely change.
 
As you point out, optics testing is complicated and, in practice, tedious. As narrow as it was, it's apparent to me that the OP went to a fair amount of trouble to perform this test and report results, and I'm grateful to him for doing it for this forum...
 
IMO, the author of the test was pretty explicit about the scope and rationale of his testing process. As defined, the test was probably one of the more "structured" ones of the many I've read, and it seemed well executed.
 
The thread on SH on some of today's top-tier scopes is interesting. Leupold is holding its own, as is NF, which isn't surprising. They seem to have had a few nice surprises. That said, it also looks like they may have gotten a few duds, which happens even with the best scopes and binoculars. I've seen it more times than I'd like for the kind of coin they ask for these things.


Now that's good info
 
Agree Greyfox, it's one of the better tests IMO. It may not be what some want to hear, but it's on point.
 