Measuring OAL in precision ammo...
I've often wondered how others do this -- particularly those who are building precision ammo. I'm talking about BR quality ammo (and yes, I have a rifle that will allow me to see the variance on paper).
When I measure loaded ammo with my Starrett calipers, clamp-on comparator, and comparator base, my setup allows for very repeatable, precise readings. I use Redding competition micrometer seating dies for all bullet seating. I hold the case flat against the comparator base, and I center the bullet in the comparator before lowering it onto the ogive with consistent light pressure. Nevertheless, I often get round-to-round variance in base-to-ogive length after seating. I attribute this (perhaps incorrectly) to variance in the bullet dimensions. To deal with the variance, I wind up seating all of my rounds longer than what I'm shooting for, and then "walking" the OAL down a thousandth at a time on each round until I get the reading I'm looking for. To say that this is painstaking is an understatement.
I'd love to just set the seating die, confirm the first round, and then load 30 rounds. However, when I do that, I invariably wind up with a few that are out of spec by a couple of thousandths -- some shorter, some longer. For normal shooting, who cares. When I'm doing OAL testing, however, it matters.
How do the rest of you tackle this issue? Most importantly, what's causing the variance? Is it bullet dimensions, or is it undulations in the case head causing it to sit less than level on the comparator base? Or is it measurement error, and am I just not thinking about it correctly? I've fine-tuned my measuring technique enough that I'm pretty confident it's not me, but I've been wrong before. Those of you who nitpick like I do, what are your thoughts, and how do you deal with this issue in your ammo?
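One way I've thought about separating measurement error from real seating variance is a quick repeatability check: measure the same round ten times (that spread is your technique/tool), then measure ten different rounds once each (that spread is everything combined). Since variances add, subtracting the first from the second gives a rough estimate of the true round-to-round component. A minimal sketch with made-up readings -- substitute your own numbers:

```python
import statistics

# Hypothetical base-to-ogive readings, in inches.
# Ten repeat measurements of the SAME round (isolates measurement error):
same_round = [2.2500, 2.2500, 2.2505, 2.2500, 2.2495,
              2.2500, 2.2505, 2.2500, 2.2500, 2.2495]
# One measurement each of ten DIFFERENT rounds (measurement + seating + bullets):
diff_rounds = [2.2500, 2.2515, 2.2490, 2.2505, 2.2520,
               2.2495, 2.2500, 2.2510, 2.2485, 2.2505]

sd_measure = statistics.stdev(same_round)   # repeatability of tool + technique
sd_total = statistics.stdev(diff_rounds)    # everything combined

# Variances add, so the true round-to-round component is roughly:
var_rounds = max(sd_total**2 - sd_measure**2, 0.0)
sd_rounds = var_rounds ** 0.5

print(f"measurement SD:    {sd_measure * 1000:.2f} thou")
print(f"total SD:          {sd_total * 1000:.2f} thou")
print(f"round-to-round SD: {sd_rounds * 1000:.2f} thou")
```

If the repeat-measurement spread is nearly as big as the round-to-round spread, the comparator setup is the culprit; if it's much smaller, the variance is really in the ammo.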