Re-zeroing at altitude

jamiebolseth

I've got a question/topic that I'd like to get people's thoughts on...

I live at about 1000 ft above sea level (ASL) and hunt at 8000 ft ASL. I zero all my rifles at 100 yds and dial elevation corrections in MOA for shots longer than 100 yds. The specific rifle I'm talking about is a super accurate 7SAUM shooting 175 gr Bergers @ 2770 fps. I'm using the Applied Ballistics (AB) app for ballistics solutions.

The question is whether my 100 yd zero will change from 1000 ft ASL to 8000 ft ASL. Common sense tells me it has to: the bullet hits more air molecules over the first hundred yards of flight in the denser air, so it has to slow more. And the software computes a velocity about 35 fps slower at 100 yds when shooting through the denser air, which seems to back up my theory. The real question is whether the shift is large enough to even observe, given that this rifle (and I) shoot roughly 0.2-0.3 MOA groups.
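Just to put a rough number on that velocity difference, here's a back-of-the-envelope sketch (Python, not the AB solver; the drag constant `k` is a made-up calibration picked so the loss over 100 yds is in the right ballpark for a 175 gr bullet, and the density model is the standard atmosphere, not measured conditions):

```python
import math

def air_density(alt_ft):
    """ISA standard-atmosphere density (kg/m^3) at an altitude in feet."""
    h_m = alt_ft * 0.3048
    return 1.225 * (1.0 - 2.25577e-5 * h_m) ** 4.2559

def velocity_at(range_yd, v0_fps, rho, k=5.1e-4):
    """Crude exponential-decay drag model: dv/dx = -k * rho * v.

    k is a made-up calibration constant, not a real G7 drag solve."""
    x_m = range_yd * 0.9144
    return v0_fps * math.exp(-k * rho * x_m)

v_dense = velocity_at(100, 2770, air_density(1000))  # zero conditions
v_thin = velocity_at(100, 2770, air_density(8000))   # hunt conditions
print(round(v_thin - v_dense, 1))  # a few dozen fps retained in the thin air
```

Even this toy model lands in the same neighborhood as the ~35 fps the software reports, so the physics of the question checks out; the open issue is only whether it moves the impact enough to see.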

When I look at the AB software there's an option to turn on "Enable Zero Atmosphere", which seems like the feature that I'm looking for. To me it seems like I would enter my zero conditions at the 1000 ft. ASL environment. And then when I travel to 8000 ft I would expect the software to account for the fact that my zero conditions are significantly different. For example, I expected to see a small correction of, say, D0.25 at 100 yds when I shoot at 8000 ft ASL since the bullet is traveling through less dense air. But this is not what I see. Instead I always just get 0.0 correction for 100 yds.

Then I thought, well, 100 yds just isn't far enough to see the difference (the bullet's still flying too fast/flat to notice). So I set my zero distance in the computer to 1500 yds, with "Enable Zero Atmosphere" turned on. I calculated a solution for 1500 yds with the exact same conditions as my zero atmosphere (1000 ft ASL). Not surprisingly, I got a correction of 0.0 at 1500 yds. Then I changed the conditions of the shot to 8000 ft ASL, thinking I'd see a correction needed at 1500 yds to account for the difference in air density. BUT - I still got a correction of 0.0 to hit at 1500 yds.
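Just to make my expectation concrete, here's a toy calculation (Python; same caveat as before - the drag constant `K` is a made-up calibration, the atmosphere is the standard model, and this is nothing like AB's actual solver). The idea: a zero is just the sight compensating for the drop at the zero range in the zero atmosphere, so if the shot atmosphere is thinner, the residual at that same range shouldn't be 0.0:

```python
import math

G = 32.174   # gravity, ft/s^2
K = 5.1e-4   # made-up drag calibration constant (not a real BC)

def rho(alt_ft):
    """ISA standard-atmosphere density, kg/m^3."""
    return 1.225 * (1.0 - 2.25577e-5 * alt_ft * 0.3048) ** 4.2559

def drop_inches(range_yd, alt_ft, v0_fps=2770):
    """Gravity drop under an exponential drag model v(x) = v0*exp(-c*x)."""
    c = K * rho(alt_ft)
    x_m = range_yd * 0.9144
    v0_ms = v0_fps * 0.3048
    t = (math.exp(c * x_m) - 1.0) / (c * v0_ms)  # time of flight, s
    return 0.5 * G * t * t * 12.0

# Sights zeroed at 1500 yds in the 1000 ft atmosphere compensate for
# drop_inches(1500, 1000); at 8000 ft the bullet only falls
# drop_inches(1500, 8000), so the shot should land high by the difference.
residual = drop_inches(1500, 1000) - drop_inches(1500, 8000)
print(round(residual, 1))  # clearly non-zero, so a correction is expected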

Either I don't understand what the "Enable Zero Atmosphere" feature does, or it doesn't work, or I'm just confused altogether.

All this started because my perfect 100 yd zero is "between clicks" on the hundred yd target at 1000 ASL, and I got to wondering if it made sense to zero slightly low based on the fact that I would be hunting at higher elevation. So I set out to prove that my thinking was correct with the software, but I can't make sense of what I'm seeing.

Can anybody enlighten me?
 
I use Shooter. I always enter my zero atmospheric information, usually 28.09-28.25 baro (not just elevation, which is about 1700' AMSL), between 60-90°F. I run a 200 yard zero on all my rifles, which increases the chance of error vs. a 100 yard zero.
With that being said, with using a ballistic calculator, I have made first round hits at elevations between 2000'-9500'+ and between 40 to 1500+ yards by simply entering correct conditions.
I would bet that, with accurate inputs during zero @ 1000', if you go up to 7000' it should correct to show approximately a .1-.2" high impact. But temps make a difference too. If you zero @ 80° but are hunting @ 30°, that will bring your zero back down. Dense air, whether from elevation or temps, is still dense air. Now if you zero @ 1000' & 30°, then go shoot at 8000' & 80°, you will see a bigger change in drops at distance: a much flatter trajectory.
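To put numbers on the "dense air is dense air" point, a quick ideal-gas sketch (dry air, no humidity term; the 28.86 and 22.22 inHg station pressures below are assumed round numbers for those elevations, not measured values):

```python
def air_density(station_inhg, temp_f):
    """Dry-air density (kg/m^3) from station pressure and temperature."""
    p_pa = station_inhg * 3386.39         # inHg -> Pa
    t_k = (temp_f - 32.0) / 1.8 + 273.15  # F -> K
    return p_pa / (287.05 * t_k)          # ideal gas law, R = 287.05 J/(kg K)

cold_low = air_density(28.86, 30)  # ~1000 ft on a 30 F day (assumed pressure)
hot_high = air_density(22.22, 80)  # ~8000 ft on an 80 F day (assumed pressure)
print(round(hot_high / cold_low, 2))  # roughly 30% thinner air up high
```

Elevation and temperature both just feed the same density number, which is why a cold valley and a hot mountaintop can partially cancel or, as above, stack up to a much flatter trajectory.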
 
I have no experience, but it's on my mind as well. I feel like the zero shift will be minimal if anything, but your dope at long range will be different based on reduced drag in the thin air.

One way to know for sure is to shoot it once you get there. Easy enough to do, then adjust if necessary. After traveling, you should at least check zero regardless, IMO.
 
Have you actually put bullets through paper at the different altitudes? That would definitely prove or disprove any theories you or others may have.
 
There has been quite a bit of discussion on this topic over on the Snipers Hide forum. The consensus is that the difference in zero is so small that it basically doesn't matter when you zero the rifle system at 100 yds.

One way to think about it is like this:
1. The bullet begins to drop as soon as it leaves the barrel: gravity accelerates it downward at 32.2 feet per second per second. But the barrel is inclined slightly upward.
2. When you zero the rifle at 100 yards, the bullet rises from the height of the muzzle and "touches" the line of sight (from the scope) before it continues dropping.
3. The time of flight from the firing position to a 100 yard target (about 0.11 second) is so short that gravity hasn't had time to pull the bullet down very far yet.
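Point 3 can be put in numbers (a sketch only; the two times of flight are rough average-velocity estimates for denser and thinner air, not solver output):

```python
G = 32.2  # gravity, ft/s^2

def drop_inches(tof_s):
    """Gravity drop from d = 0.5 * g * t^2, converted to inches."""
    return 0.5 * G * tof_s ** 2 * 12.0

# Rough average-velocity times of flight to 100 yd (300 ft); the thin-air
# bullet keeps slightly more speed, so it arrives a hair sooner (assumed).
tof_dense = 300.0 / 2695.0  # ~1000 ft ASL
tof_thin = 300.0 / 2710.0   # ~8000 ft ASL
shift = drop_inches(tof_dense) - drop_inches(tof_thin)
print(round(drop_inches(tof_dense), 2), round(shift, 3))
```

The total gravity drop at 100 yds is only a couple of inches, and the density-driven change in it works out to hundredths of an inch: far below what a 0.2-0.3 MOA rifle can resolve.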

In the ballistic program you're using, do you have a separate group of entry boxes that allows you to enter Zero Atmosphere numbers? In the program I use, those boxes are located on the ammunition entry page.
 
That is usually enabled when your zero is past 100 yards. Say you zero your rifle at 300 yds; then you would enable it, and it would account for the difference in atmospheric conditions. With a 100 yd zero there will not be a noticeable enough difference for a solution. I have zeroed my rifle at 100 yds at sea level and gone on a hunt in NM where I was camping at 9K. As soon as I arrive I check my zero; I have done this several times, with several calibers, and I have never seen a difference in zero.
 
As a matter of practicality... My question is really one of curiosity and an attempt to develop a full understanding of how the ballistics software works, and the impact of air density on my bullet.

I don't really believe that I'll miss/wound my mule deer at 600 yds because of a potential difference in my 100 yd zero due to air density. So in a practical sense the answer is likely "it doesn't matter, you'll still kill your deer". But things that don't make sense bug me.

The explanation that "at 100 yds it's too small to matter" makes perfect sense. But when I changed my zero distance to 1500 yds, it HAS to make a difference, yet the computer didn't reflect that. So that's really a head-scratcher for me.

The question "Have you shot at both elevations to observe the difference?" is another point to think about... The answer in my case is "kind of". The land I hunt belongs to our family, so I shoot there often, and I have targets set up past 1000 yds. Sure - I can go up there and hit 10" targets at 800 or 1000 yds, or shoot 1" dots at 100 yds. But just the act of relocating introduces a lot of other variables: vertical drafts in the mountains, dynamic jump from left/right winds, a change in shooting position, temperature changes, targets that might be angled up/down, mirage conditions that might be different. So if I DID see a difference, I'm not sure I'd trust that it was purely attributable to air density. This is why I wanted to do this all on the computer, where you can artificially hold all the other variables constant. But when I did that, I couldn't make sense of what I was seeing.

Anyway - Interesting thoughts on the topic...

JamieB
 
That is usually enabled when your zero is past 100 yards. Say you zero your rifle at 300 yds; then you would enable it, and it would account for the difference in atmospheric conditions. With a 100 yd zero there will not be a noticeable enough difference for a solution. I have zeroed my rifle at 100 yds at sea level and gone on a hunt in NM where I was camping at 9K. As soon as I arrive I check my zero; I have done this several times, with several calibers, and I have never seen a difference in zero.

Thanks Remmy. What you are describing is exactly what I thought I'd see (longer zero range would produce differences in the solution at different elevations). But when I play with the AB software with longer zero ranges (I went out to 1500 yds for a zero) and different shot conditions (different air densities for the proposed shot) I only ever saw a correction of 0.0 at my zero range (whether a 100 yd zero, or a longer zero). Have you actually seen this feature work in AB?
 
Okay, I just tried what you were trying to do, and came up with a correction. I didn't plug in your exact conditions, but I had about a 4000' altitude difference (plus some temperature difference I had defined that I didn't change). It showed down 0.7 mils at 1000 yds. I'm using the Shooter program.
 
Thanks Remmy. What you are describing is exactly what I thought I'd see (longer zero range would produce differences in the solution at different elevations). But when I play with the AB software with longer zero ranges (I went out to 1500 yds for a zero) and different shot conditions (different air densities for the proposed shot) I only ever saw a correction of 0.0 at my zero range (whether a 100 yd zero, or a longer zero). Have you actually seen this feature work in AB?
I would go re-check all of your inputs you have made I think there is something off with your inputs brother. I tried several different altitudes and it continues to give me a correction.
 
Just a quick note, since you guys were nice enough to help me out. I downloaded Shooter, and that app behaves exactly as I'd expect. With a long range zero (300 yds plus), it calls for a correction at that zero range if I'm shooting at a significantly different air density (altitude) than my Zero Atmosphere.

For some reason AB doesn't behave the same way for me. So it's either a software problem or a software-user problem. Either way, I feel like I now understand the effects of altitude on my zero: at 100 yds it's not noticeable, and with a 300 yd zero it starts to show up with altitude changes of greater than 5000 ft. This is really what I wanted to know, so thanks for the help and the thoughts.

JamieB
 