Originally Posted by birdrl
Ok, I read the sticky on altitude and barometric pressure, but I'm still confused. Here's my question: I sight in my rifle at 100 yards at 1000 feet elevation under standard atmospheric conditions and actually test my drops on my BDC reticle. Then I go to 3000 feet elevation, again with standard atmospheric conditions, and now my 100 yard zero is over 1 inch different and my reticle yardages are also different. How do I calculate that change? Maybe I don't understand the terminology in these ballistic programs, but I don't see where you can calculate what those differences will be. Thanks
You've just discovered the problem with BDC knobs and reticles: they're calibrated for exactly one trajectory. You can use your ballistics programs to see the effect of air density (the combined effect of altitude, barometric pressure, temperature, and humidity) and either decide how far you can shoot before the trajectory change matters, or make up a chart of hold-offs vs. distance. Using a lookup table rather defeats the point of having BDC knobs or reticles, but it's no slower than using standard target knobs. There is NO simple knob setting that will correct the problem.
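If you want to see the size of the effect yourself, here's a minimal point-mass sketch in Python. It is not any particular commercial solver: drag is a constant-Cd quadratic term scaled by air density, the atmosphere is a simple exponential approximation, and the muzzle velocity, drag coefficient, and bullet dimensions are made-up .308-class numbers, not data for any real load. It finds the launch angle that zeroes at 100 yd in 1000 ft air, then fires that same angle in 3000 ft air and prints the shift at each distance:

[CODE]
import math

G = 9.80665            # gravity, m/s^2
RHO0 = 1.225           # sea-level air density, kg/m^3 (ISA)

def air_density(alt_ft):
    """Exponential approximation to the standard atmosphere (~8.4 km scale height)."""
    return RHO0 * math.exp(-alt_ft * 0.3048 / 8434.0)

def height_at(range_yd, angle_rad, rho, v0=850.0, cd=0.27, mass=0.0097, diam=0.00782):
    """Euler-integrate a flat-fire 2-D trajectory; return bullet height (m) at range."""
    area = math.pi * (diam / 2.0) ** 2
    k = 0.5 * rho * cd * area / mass              # drag constant, 1/m
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle_rad), v0 * math.sin(angle_rad)
    dt = 0.0002
    target = range_yd * 0.9144                    # yards -> metres
    while x < target:
        v = math.hypot(vx, vy)
        vx -= k * v * vx * dt                     # drag opposes velocity
        vy -= (k * v * vy + G) * dt               # drag plus gravity
        x += vx * dt
        y += vy * dt
    return y

def zero_angle(range_yd, rho):
    """Bisect for the launch angle that centers the bullet at range_yd."""
    lo, hi = 0.0, 0.01                            # radians; 0.01 rad ~ 34 MOA
    for _ in range(50):
        mid = (lo + hi) / 2.0
        if height_at(range_yd, mid, rho) < 0.0:
            lo = mid                              # hit low -> need more elevation
        else:
            hi = mid
    return (lo + hi) / 2.0

rho_lo, rho_hi = air_density(1000.0), air_density(3000.0)
angle = zero_angle(100.0, rho_lo)                 # rifle zeroed at 100 yd, 1000 ft

print("range    impact @1000ft  impact @3000ft  shift")
for r in (100, 200, 300, 400, 500):
    a = height_at(r, angle, rho_lo) / 0.0254      # metres -> inches
    b = height_at(r, angle, rho_hi) / 0.0254
    print(f"{r:4d} yd {a:12.1f} in {b:12.1f} in {b - a:6.1f} in")
[/CODE]

With these made-up numbers the shift at 100 yd is tiny and grows steadily with distance, which is exactly why a one-trajectory BDC falls apart at range. Plug your own load into a real solver for numbers you can actually trust.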
At moderate ranges, crosswind variation is normally a larger concern than the drop variation from altitude changes. Wind speed usually increases with elevation in typical mountainous terrain. Reduced air density, however, reduces the amount a given crosswind deflects the bullet, and most ballistics programs show that effect too.
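For the wind side, the old lag-time rule (drift ≈ crosswind speed × (actual time of flight − vacuum time of flight)) shows why thinner air also means less drift. Here's a sketch using the same constant-drag assumption as above; the wind speed, muzzle velocity, and sea-level drag constant are illustrative numbers, not load data:

[CODE]
import math

def wind_drift_in(range_yd, rho, wind_mph=10.0, v0=850.0, k_sl=8.0e-4):
    """Crosswind drift (inches) from the lag-time approximation."""
    x = range_yd * 0.9144                       # yards -> metres
    k = k_sl * rho / 1.225                      # drag constant scales with density
    tof = (math.exp(k * x) - 1.0) / (k * v0)    # time of flight if v = v0*exp(-k*x)
    lag = tof - x / v0                          # extra time vs. a drag-free bullet
    return wind_mph * 0.44704 * lag / 0.0254    # drift = wind speed * lag, in inches

for alt_ft in (1000.0, 3000.0):
    rho = 1.225 * math.exp(-alt_ft * 0.3048 / 8434.0)
    print(f"{alt_ft:.0f} ft: 10 mph crosswind at 500 yd ~ {wind_drift_in(500, rho):.1f} in")
[/CODE]

Run it and you'll see the higher-altitude drift come out an inch or two less at 500 yd, so the density change giveth on drift and taketh away on drop.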
The solution is to stalk closer to the game, but on this forum such talk is blasphemy.