Please only respond to this thread if you have knowledge of advanced ballistics and/or experience with Sean Kennedy's Shooter Ballistics App.
I shoot a Christensen Arms 300 RUM with a Schmidt & Bender 4-16x42 PM II LP scope with single-turn 0.1 mrad click values. I am confident the scope is tracking correctly. I'm zeroed at 100 yards, and I have accurate zero data, including the zero atmosphere (station pressure, temperature, relative humidity, etc.).
I am not getting ballistic solutions that match my target data once I get past roughly 500 yards, and I have no idea why. My rifle profile is 100% correct; I even confirmed my sight height to the nearest hundredth of an inch. My bullet profile is 100% correct as well. I am shooting a custom load with the 180-grain Nosler AccuBond, using Bryan Litz's ballistic profile, i.e., a BC of 0.246 with the G7 drag model.
I am shooting at accurate, known distances, but once I get past 500 yards Shooter calls for come-up values that are too aggressive. For example, I shoot U1.9 mils at 500 yards, but Shooter calls for U2.0 mils. I know that seems petty, but it gets worse, obviously, at longer distances. At 800 yards Shooter calls for U4.3 mils when I actually only need to come up U4.0 mils, and that 0.3 mil difference amounts to 8.64 inches at 800 yards! At 1,000 yards the Shooter-produced data is a train wreck, and all of this gets worse when I shoot at very high elevation or when the environmental conditions change dramatically.
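For anyone who wants to sanity-check the linear error those mil discrepancies produce downrange, the arithmetic is just the mrad subtension (1/1000 of the range) converted to inches:

```python
# Linear error produced downrange by a mil discrepancy.
# 1 mrad subtends 1/1000 of the range; a yard is 36 inches.

def mil_error_inches(mil_diff: float, range_yards: float) -> float:
    """Linear error, in inches, of a mil_diff (mrad) discrepancy at range_yards."""
    return mil_diff / 1000 * range_yards * 36

print(mil_error_inches(0.3, 800))  # the 0.3 mil difference at 800 yd -> 8.64
print(mil_error_inches(0.1, 500))  # the 0.1 mil difference at 500 yd -> 1.8
```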
I have triple-checked every input, and I am accurately measuring environmental data with a Kestrel 4500. The only variable that could plausibly be causing this is muzzle velocity. Unfortunately, Shooter does not allow you to enter actual muzzle velocities for known temperatures. Instead, the program takes what it calls "velocity variation," which is a fancy way of saying you calculate the change in velocity in feet per second and divide by the change in temperature to establish a single ratio. The issue is I know this relationship is not linear, but rather logarithmic, so the ratio is not accurate - more of a best guess, really. I have logged actual muzzle velocities across a range of temperatures, but that doesn't do me much good if I can't input the data into the program itself.
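To make the problem concrete, here is a sketch of what that single "velocity variation" ratio does to logged chronograph data. The temperature/velocity pairs below are made-up placeholders, not real load data; the point is only that one endpoint-to-endpoint ratio misses every intermediate point when the true response flattens:

```python
# Hypothetical logged data: ambient temp (deg F) vs. chronographed MV (fps).
# These numbers are placeholders for illustration, not real load data.
temps_f = [20, 40, 60, 80, 100]
mv_fps = [3150, 3185, 3210, 3228, 3240]

# The single ratio Shooter asks for: total fps change over total temp change.
ratio = (mv_fps[-1] - mv_fps[0]) / (temps_f[-1] - temps_f[0])
print(f"velocity variation: {ratio:.3f} fps/degF")  # 1.125 fps/degF here

# Compare the linear prediction against each logged point to see how far
# a single ratio drifts from a curved (flattening) velocity response.
for t, v in zip(temps_f, mv_fps):
    predicted = mv_fps[0] + ratio * (t - temps_f[0])
    print(f"{t:>4} degF  logged {v}  linear {predicted:.1f}  error {v - predicted:+.1f} fps")
```

With curved data like this, the linear model is exact at the two endpoints and under-predicts everywhere in between, which is exactly the kind of error that only shows up once range (and therefore time of flight) gets long.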
So far, I have discovered two options for forcing Shooter to calculate a ballistic solution / drop chart that matches (or gets pretty close to) my actual range data. The first is to back-solve for a velocity variation that creates a drop chart matching my actual shooting data. In other words, use trial and error, inputting various velocity variations until the solutions match my data.
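That trial-and-error loop is really just root-finding on one input, so it can be automated if you can script the prediction. In the sketch below, `solution_mils` is a hypothetical stand-in for "run Shooter with this velocity variation and read the 800-yard come-up" (a toy monotone model, not a real ballistic solver); the bisection around it is the actual technique:

```python
# Back-solving an input by bisection until the predicted come-up matches
# the observed one. solution_mils is a HYPOTHETICAL stand-in for the app's
# output at 800 yd given a candidate velocity variation (fps/degF); the
# toy linear model is for illustration only.

def solution_mils(velocity_variation: float) -> float:
    """Stand-in for the app: predicted 800 yd come-up (mils), decreasing
    as velocity variation increases. Placeholder model, not real ballistics."""
    return 4.3 - 0.15 * velocity_variation

def back_solve(target_mils: float, lo: float, hi: float, tol: float = 1e-4) -> float:
    """Bisect on velocity variation until the prediction matches target_mils."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if solution_mils(mid) > target_mils:
            lo = mid  # prediction still too aggressive: raise the variation
        else:
            hi = mid
    return (lo + hi) / 2

vv = back_solve(target_mils=4.0, lo=0.0, hi=5.0)
print(f"velocity variation that yields U4.0 at 800 yd: {vv:.2f} fps/degF")
```

In practice you would do the same thing by hand: nudge the velocity variation, regenerate the drop chart, and repeat until the 800-yard line reads U4.0.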
The second option is to use the velocity calibration feature, which allows you to input actual come-up values at known distances and then lets the program compute an adjusted muzzle velocity itself. This appears to be less accurate than the first option, though, because, for example, when I enter the U4.0 mils I actually shoot at 800 yards instead of the U4.3 mils Shooter calls for, it calculates an adjusted muzzle velocity several hundred feet per second slower than what my chronograph consistently captures.
1) Am I missing something very obvious?
2) How accurate are these ballistics programs typically? Am I asking for an unreasonable level of accuracy?
3) If I use the ballistic solution I created with option #1 above (back-solving for velocity variation), can I expect my solutions to remain accurate going forward? That is, can I treat it as a baseline solution, make changes to environmentals, etc. later, and expect an accurate output?