How would one theoretically rate the increasing difficulty of hitting a standard target as distances increase?
Leaving out the problems of wind.
This is a theoretical question, so actual rifle accuracy can be discounted!
I believe it may be expressed using something like the inverse square law, but I'm not sure.
For example: (all targets are the same size regardless of distance)
Moving from 100 to 200 yards, does it become twice as difficult or 4 times as difficult? I'd venture it's 4 times as difficult because the errors propagate in both the horizontal AND vertical directions.
So, in my thinking it can be expressed like this for distances out to 1000 yards using 100 yards as the standard.
200 yards = 4 times the difficulty
300 yards = 9 times more difficult
400 yards = 16 times more difficult
500 yards = 25 times
600 yards = 36 times
700 yards = 49 times
800 yards = 64 times
900 yards = 81 times
1000 yards = 100 times more difficult
and for the ULR guys
2000 yards = 400 times
2500 yards = 625 times
Does this seem correct?
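A minimal sketch of the idea, assuming difficulty scales with the inverse of the target's angular area (angular width and height each shrink linearly with distance, so the area shrinks with the square of the distance ratio). The baseline distance and the specific ranges listed are just the ones from the table above:

```python
# Relative difficulty of hitting a fixed-size target versus a 100-yard baseline,
# assuming difficulty goes with the square of the distance ratio (error grows
# linearly with distance in BOTH the horizontal and vertical directions).

BASELINE_YDS = 100

def relative_difficulty(distance_yds, baseline_yds=BASELINE_YDS):
    """Difficulty factor relative to the baseline distance."""
    ratio = distance_yds / baseline_yds
    return ratio ** 2  # squared because error compounds in two dimensions

for d in (200, 300, 400, 500, 600, 700, 800, 900, 1000, 2000, 2500):
    print(f"{d:>5} yards = {relative_difficulty(d):.0f}x the 100-yard difficulty")
```

Running that reproduces the table above (4x at 200, 100x at 1000, 625x at 2500), so the numbers at least follow consistently from the squared-distance assumption.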