I have developed a couple of loads for my .308 that produce awesome groups at 100 yards. I haven't had time to test them at longer ranges yet, but I did collect a ton of chronograph data from each load. When I calculate bullet drop at 1000 yards, it becomes very apparent that a 60 fps variation in muzzle velocity can easily mean the difference between a hit and a miss at that distance.

The question I have is: how much variation should we expect to live with? My 168gr SMK load has a standard deviation of 9 fps. My 190gr SMK load has a standard deviation of 16 fps. Does anybody here know what muzzle velocity variation the 1000-yard pro shooters get?

I've also read some debate about match primers and their effect on variation. One article said that the Fed. Match primers were at the bottom of the match primer scale and that RWS primers were the best. I would love to know for sure, but I really don't want to burn up a barrel with a thousand dollars' worth of ammo to prove which primers are best. Has anybody seen any definitive data to support any theories?
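For anyone who wants to play with the numbers themselves, here is a very rough back-of-the-envelope sketch of why MV spread matters so much at 1000 yards. It is NOT a real ballistics solver: it just assumes the bullet's average velocity over the flight is the mean of the muzzle velocity and the remaining velocity (the 0.46 retention factor is my own guess for a heavy .308 match bullet at 1000 yards), then applies simple gravity drop over that time of flight. A proper point-mass solver with a G1/G7 drag model would give better absolute numbers, but the sensitivity to a 60 fps swing comes out in the same ballpark.

```python
import math

G = 32.174  # gravitational acceleration, ft/s^2


def approx_drop_inches(mv_fps, range_yd=1000, retention=0.46):
    """Crude drop estimate.

    Assumes average velocity over the flight is the mean of muzzle
    velocity and remaining velocity (retention * mv), then applies
    vacuum drop 0.5*g*t^2 over that time of flight. The retention
    factor is an assumed value, not measured data.
    """
    range_ft = range_yd * 3.0
    avg_v = mv_fps * (1.0 + retention) / 2.0  # ft/s
    tof = range_ft / avg_v                    # seconds
    return 0.5 * G * tof * tof * 12.0         # inches of drop


# Compare a nominal 2600 fps load against one running 60 fps faster.
base = approx_drop_inches(2600)
fast = approx_drop_inches(2660)
print(f"drop at 2600 fps: {base:.0f} in")
print(f"vertical spread from a 60 fps swing: {base - fast:.1f} in")
```

Even this crude model shows a 60 fps swing opening up roughly 20 inches of vertical at 1000 yards, which matches the intuition that single-digit SDs are what you want for long-range work.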