How We Ranked ’Em

An explanation of our methodology.

For our Guide to Scopes, we divided the contenders into three categories based on price. During testing, we took the following steps to obscure each scope’s make and model: we color-coded each price group, assigned every scope within a group an identification letter (e.g., ‘Red A’ or ‘Blue D’), and covered identifying marks with masking tape. Because our goal was to assess scopes within price groups, we asked reviewers to select at least one color group and to test all of the models within it, rather than randomly selecting models from multiple groups.
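For readers who want to see the labeling scheme spelled out, here is a minimal sketch of how blinded labels like ‘Red A’ could be generated. The group colors and scope names below are hypothetical placeholders, not the actual test roster.

```python
# Sketch of the blinding scheme: each price group gets a color,
# and each scope within a group gets a letter (e.g., "Red A").
# Colors and scope names here are hypothetical.

import string

price_groups = {
    "Red": ["Scope 1", "Scope 2", "Scope 3"],    # e.g., budget models
    "Blue": ["Scope 4", "Scope 5"],              # e.g., mid-range models
    "Green": ["Scope 6", "Scope 7", "Scope 8"],  # e.g., premium models
}

# Map each scope to its blinded label.
blinded_labels = {}
for color, scopes in price_groups.items():
    for letter, scope in zip(string.ascii_uppercase, scopes):
        blinded_labels[scope] = f"{color} {letter}"

for scope, label in blinded_labels.items():
    print(f"{label}: {scope}")
```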

We asked reviewers to rate the scopes on a scale of 1 to 10 in each of seven categories, with 10 being the highest score. To determine each scope’s overall score, we calculated a weighted average of its category scores, because we consider some factors, such as sharpness and brightness, more important than others, such as edge-to-edge focus. Below are the categories, along with the weight we assigned to each when analyzing the results; a brief sketch of the calculation follows the list.

Image Quality

Sharpness: 1

Brightness: 1

Color: .8

Edge-to-edge focus: .7

Feel

Zoom: .9

Ease of focus: .9

Eye relief: 1
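The sketch below shows one way the weighted average could be computed from these weights. How ratings are aggregated across reviewers is our assumption here (each category is averaged over reviewers first, then the category averages are combined using the listed weights); the reviewer ratings in the example are invented for illustration.

```python
# Minimal sketch of the weighted-average scoring described above.
# Assumption: per-category scores are averaged across reviewers,
# then combined with the category weights listed in the article.

from statistics import mean

# Category weights as listed above.
WEIGHTS = {
    "sharpness": 1.0,
    "brightness": 1.0,
    "color": 0.8,
    "edge_to_edge_focus": 0.7,
    "zoom": 0.9,
    "ease_of_focus": 0.9,
    "eye_relief": 1.0,
}


def overall_score(reviewer_ratings):
    """Combine per-reviewer ratings (dicts of category -> 1-10 score)
    into a single weighted-average overall score."""
    # Average each category across reviewers.
    category_means = {
        cat: mean(r[cat] for r in reviewer_ratings) for cat in WEIGHTS
    }
    # Weighted average: higher-weight categories count for more.
    total_weight = sum(WEIGHTS.values())
    return sum(WEIGHTS[cat] * category_means[cat] for cat in WEIGHTS) / total_weight


# Hypothetical example: two reviewers rating scope "Red A".
red_a = [
    {"sharpness": 8, "brightness": 7, "color": 9, "edge_to_edge_focus": 6,
     "zoom": 7, "ease_of_focus": 8, "eye_relief": 9},
    {"sharpness": 7, "brightness": 8, "color": 8, "edge_to_edge_focus": 7,
     "zoom": 8, "ease_of_focus": 7, "eye_relief": 8},
]
print(f"Red A overall score: {overall_score(red_a):.2f}")
```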