Yup, me again. Here's my latest puzzle (whether real or imagined, it's still a puzzle).

Problem (assume NO WIND): Shoot a round at 45 degrees for a distance of 200 yards (this is the actual flight distance of the bullet). The "gravity" or "base" distance is 141 yards, so we correctly (or incorrectly) adjust the scope for 141 yards. We fire the shot and it hits LOW. My particular puzzle at the moment is why the round hits LOW.

I KNOW that the bullet needs to travel 200 yards, but it's on a 45-degree angle, so I must adjust for the slope. However, the bullet must also lose velocity over this 200-yard flight, and it takes time to make the trip. If I just assume that the 141-yard setting for the "gravity" or "base" distance is the correct scope setting, I cheat myself, because it takes less time for the bullet to go 141 yards and it retains more velocity over the shorter 141-yard flight. I cheat myself into thinking that a bullet traveling 200 yards on a slope can do it in the same amount of time it'd take to go 141 yards on a level trajectory. My bullet is traveling .05 to .08 seconds longer and will have an arrival velocity as much as 100 fps slower than the level 141-yard shot. WHEW!!! Glad that's out in the open...

Now, all you rocket scientists, help me figure out how to account for this additional drop due to the slower velocity/added time of flight (TOF). I thought I might just figure the additional drop as if the bullet had just left the muzzle and traveled for the "additional" TOF, adding this drop onto the 141-yard drop data (it'd be somewhere in the .5 MOA area, I guess). I also thought I might "adjust" the BC to allow the bullet to make the 141-yard trip in the 200-yard TOF, and use the drop for this "adjusted" bullet. (This doesn't sound like the correct approach, but I'm grasping at straws here.) Anyone still listening??? Thanks in advance..
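Just to put rough numbers on that first idea, here's a minimal back-of-the-envelope sketch in Python. Everything in it is an assumption for illustration: the 2700 fps muzzle velocity and the retained velocities are made up, and velocity decay over the flight is treated as linear (good enough for a rough TOF estimate, not a substitute for real drop tables).

```python
# Back-of-the-envelope check of the "add drop for the extra TOF" idea.
# ALL numbers are placeholders: 2700 fps muzzle velocity and the retained
# velocities are assumptions, and velocity decay is treated as linear.

G = 32.174  # gravitational acceleration, ft/s^2

def tof(distance_yd, v_muzzle_fps, v_arrival_fps):
    """Approximate time of flight using the average of muzzle and
    arrival velocity (assumes roughly linear velocity decay)."""
    avg_v = (v_muzzle_fps + v_arrival_fps) / 2.0
    return (distance_yd * 3.0) / avg_v  # yards -> feet, then feet / fps

def drop_inches(t_sec):
    """Free-fall drop accumulated in t seconds: 1/2 * g * t^2, in inches."""
    return 0.5 * G * t_sec * t_sec * 12.0

# Level 141-yard "base" shot vs the actual 200-yard slant flight.
# The slant shot is assumed to arrive up to 100 fps slower.
t_level = tof(141, 2700.0, 2500.0)
t_slope = tof(200, 2700.0, 2400.0)

extra_t = t_slope - t_level               # the "additional" TOF
drop_over_extra_t = drop_inches(extra_t)  # idea #1: drop over just the extra time
extra_drop = drop_inches(t_slope) - drop_inches(t_level)  # full t^2 difference

print(f"extra TOF:            {extra_t:.3f} s")
print(f"drop over extra TOF:  {drop_over_extra_t:.2f} in")
print(f"full drop difference: {extra_drop:.2f} in")
```

With these assumed numbers the extra TOF lands around .07 s (inside the .05-.08 range above), and the drop over just that extra interval comes out to roughly an inch, which at 200 yards is right around the .5 MOA guessed above. Note, though, that the full 1/2*g*t^2 difference between the two flights is quite a bit larger, since drop grows with the square of TOTAL time in the air, not with each interval separately.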