Actually, this post was made half-jokingly. I was simply having some fun playing around with a theoretical situation: a guy going out on a Saturday afternoon to hunt large, dangerous game at distances in excess of half a mile with a .17 Remington in Death Valley (perhaps it escaped from a nearby zoo) while the temperature plummets to 200 degrees below zero and straight-line winds howl around at 200 miles per hour. Haven't you ever done that before?
Ahhh... yeah, but I generally try to keep things w/i the realm of the possible and practical: e.g. comparing a 90gr JLK @ 2850fps from a .223 Rem spacegun AR vs. a 123gr Lapua Scenar @ 3100fps from a 6.5-08 vs. a 142gr SMK @ 2750fps from a 6.5-08, looking at wind drift (the primary concern for me), total drop (for the sake of knowing whether my sights/scope have enough adjustment to get there w/o holding off), etc.
I've done similar exercises for varminting, etc. But to me the issue isn't 'danger space', cuz that's what I have a rangefinder for; for prairie dogs, I can walk the shots in if my range estimate is a little off. The wind, however, can change from shot to shot, so I either have to be able to see it and dope it, or ballistically overcome it as much as possible. Once I'm beyond point blank range, total drop isn't a huge issue as long as my sights can handle it; I have drop charts and a rangefinder. Danger space isn't something I even worry about.
I guess the concept that I personally find important in all of this is that of tolerances. How far can I afford to be off when assessing the parameters of a given situation before everything blows up? Would a reloader want to use brass that would cause a catastrophe if he was 1/1,000,000 of a grain off in measuring his powder, allowing him no tolerance for error? Would you want to use tires on your car that would explode if you put one tiny bit too much air into them?
Also, you gave me another idea. How about this? People say wind is often a more significant factor than distance at long range. This is typically true because, as we all know, the distance is usually constant while the wind isn't.
So, now what determines the measure of a particular load's suitability to long range shooting in harsh winds? Is it total wind drift? That is certainly one measure.
How about we attempt to measure it in a different way? For the purposes of this discussion, I'll call it Wind Velocity Approximation Tolerance.
Let's look at a .243 Winchester with a 95 grain Nosler Soft Point: muzzle velocity 3,100 feet per second, G1 ballistic coefficient .284. At 400 yards, it experiences 22.5 inches of wind drift in a 10 mph crosswind.
Now a 7mm Lazzeroni Firebird with a 140 grain Nosler Ballistic Tip: muzzle velocity 3,750 feet per second, G1 ballistic coefficient .540. At 400 yards, it experiences 11.3 inches of wind drift in a 10 mph crosswind.
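Since wind drift is, to a good approximation, linear in crosswind speed, the 10 mph figures above can be scaled to any other wind. A minimal sketch in Python, taking the two 400-yard drift numbers as given inputs (the load labels and the 15 mph example wind are just illustrative):

```python
# Wind drift is approximately linear in crosswind speed, so a single
# drift figure at a reference wind (here 10 mph) can be scaled to
# estimate drift at any other wind speed.

def drift_at_wind(drift_at_ref_in, ref_wind_mph, wind_mph):
    """Scale a known drift figure (inches) to a different crosswind speed."""
    return drift_at_ref_in * (wind_mph / ref_wind_mph)

# 400-yard, 10 mph figures from the post (taken as given, not computed):
loads = {
    ".243 Win, 95gr":      22.5,
    "7mm Firebird, 140gr": 11.3,
}

for name, d10 in loads.items():
    print(f"{name}: {drift_at_wind(d10, 10.0, 15.0):.1f} in at 15 mph")
```

The linearity assumption is what makes everything below workable: one drift number per load per distance tells you the whole wind story at that distance.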
Nothing unexpected. The 7mm Lazzeroni Firebird beats it. But the fact that the 7mm Lazzeroni Firebird is a better wind-bucker than the .243 Winchester isn't the real point; it's merely to demonstrate a concept. Extremes have a tendency to illustrate a point because they magnify the differences and bring them to light. I'm not always one to get bogged down in specifics when looking at things from a purely conceptual view.
Anyhow, what does all this wind drift information mean? On one hand, it means that the guy with the .243 Winchester has to make more clicks on his scope. That in and of itself isn't always a big deal. However, the unpredictability of the wind, and the difficulty of approximating its velocity, leave both shooters wondering how many clicks to make, whether they have time to make them or not.
Let's take a look at things a little bit differently. Assume a 6 inch kill zone. In order to keep shot placement within +/- 3 inches of the point of aim (to avoid shooting the animal in the ass and making it run off after the first shot), the guy with the .243 Winchester must call the wind speed accurately to within .92 mph, no matter what the actual wind speed is. He has .92 mph of Wind Velocity Approximation Tolerance at 500 yards. The guy with the 7mm Lazzeroni Firebird must call the wind to within 2.655 mph, again regardless of how fast the wind is blowing. He has 2.655 mph of Wind Velocity Approximation Tolerance at 500 yards. Their tolerance for error changes with distance, but not with wind velocity: at 500 yards, the guy with the .243 Winchester has to call the wind to within .92 mph whether it's blowing at 10 mph or 200 mph.
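The arithmetic behind that tolerance can be sketched in a few lines. Assuming, as before, that drift scales linearly with wind speed, the tolerance is just half the kill zone divided by the drift per mph. This uses the 400-yard drift figures quoted earlier as inputs (computing the 500-yard numbers would require the 500-yard drift figures, which aren't listed):

```python
# "Wind Velocity Approximation Tolerance": how far off a wind call can
# be before the shot lands outside a given kill zone. Assumes wind
# drift is linear in crosswind speed, so drift-per-mph is a single
# number for a given load at a given distance.

def wind_call_tolerance_mph(drift_at_10mph_in, kill_zone_in=6.0):
    """Largest wind-estimation error (mph) that keeps the hit
    within +/- half the kill zone of the point of aim."""
    drift_per_mph = drift_at_10mph_in / 10.0
    return (kill_zone_in / 2.0) / drift_per_mph

# 400-yard drift figures from the post (inputs, not computed here):
print(round(wind_call_tolerance_mph(22.5), 2))   # .243 Win, 95gr -> 1.33
print(round(wind_call_tolerance_mph(11.3), 2))   # 7mm Firebird, 140gr -> 2.65
```

Note the same number doubles as the "ignorable wind" in the point-blank-range analogy: with the sights set for a no-wind condition, any crosswind below the tolerance still keeps the shot inside the kill zone.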
However, there is also a situation here analogous to Maximum Point Blank Range. With the 7mm Lazzeroni Firebird load listed above, you can ignore winds up to 71.4 mph at 100 yards, 7.7 mph at 300 yards, and 3.8 mph at 500 yards and still maintain shot placement within +/- 3 inches.
Here is a comparison of Wind Velocity Approximation Tolerances in miles per hour (analogous to "Danger Space," which you could likewise call Distance Approximation Tolerance).
The short version as I see it (by no means the right or only way) is that the 'better bullet' does give the shooter somewhat more margin for error in calling the wind speed. (I really wish you would stop with the bizarre extreme examples: 52gr .223 vs. 175gr .308, .243 vs. 7mm Lazzeroni, etc. Try staying w/i the same caliber/cartridge and simply varying the bullet selection for a more reasonable comparison.) The catch is that wind speed is not something you can normally get an *exact* fix on. It varies btwn the firing point and the target depending on the lay of the land, etc., so the wind can literally be going not only different speeds at different points (and heights) along the trajectory, but completely different directions.
*That*, to me, is where the 'better' bullet earns its keep; it allows more of a fudge factor for all the stuff out there that you *can't* see, *can't* quantify, and *can't* calculate down to some decimal value. Even if you could, then the question comes up as to the validity of the readings: when was the last time any of your instruments (rangefinder, thermometer, barometer, inclinometer, etc.) were checked against a calibration standard? Never? So how do you 'know' what any of the values *really* are? That might be exactly what you've been trying to say, but no offense, you manage to say it in the most convoluted manner I've seen in recent history. Any chance you are an engineer or scientist or something similar by trade? [img]images/icons/grin.gif[/img]
Even shorter version: you are making this a lot harder than you need to for connecting on a shot at distance. Just go out and practice, man.