
Maybe this has been done before, but I've never seen it and it seemed like a fun exercise. The graph is intended to let you determine arrow speed from the arrow's drop over distance. To use it, shoot an arrow (or group of arrows) at 20 yards with your 20-yard pin, then back up to a longer distance and shoot another arrow (or group) with the same pin. The vertical distance between the two impact points (or groups) is the amount the arrow drops between those two distances. Take that measurement to the graph and read off the arrow's speed.
The example shown on the graph predicts that an arrow leaving the bow at 272 fps would drop 22 inches between 20 and 40 yards, which may require a pretty large target. Or (assuming your 20-yard pin is dialed in) you can skip the first part and just align your 20-yard pin with the top of the target, or with an aiming point above the target, when shooting at 40 yards.
Note: this graph is based solely on gravity calculations and ignores air drag and some other variables, so it won't be perfect, but it should be close. A quick sketch of the underlying math is below.
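For anyone who wants to sanity-check the idea, here's a minimal Python sketch of a gravity-only calculation, assuming a flat-fire model with a perfect 20-yard zero and the line of sight passing through the arrow at the bow. The function names (drop_between, speed_from_drop) and those sight assumptions are mine, not taken from the graph, so the graph's exact curve may differ a little depending on things like sight height above the arrow and how the drop is measured.

```python
import math

G = 32.174  # gravitational acceleration, ft/s^2

def drop_between(speed_fps, near_yd=20.0, far_yd=40.0):
    """Predicted drop (inches) below point of aim at far_yd when the bow
    is zeroed at near_yd, using a gravity-only flat-fire model."""
    d1 = near_yd * 3.0  # convert yards to feet
    d2 = far_yd * 3.0
    # Zeroing at d1 fixes the launch angle; the miss below the line of
    # sight at d2 then works out to g * d2 * (d2 - d1) / (2 * v^2).
    drop_ft = G * d2 * (d2 - d1) / (2.0 * speed_fps ** 2)
    return drop_ft * 12.0  # feet to inches

def speed_from_drop(drop_in, near_yd=20.0, far_yd=40.0):
    """Invert the same model: estimate speed (fps) from a measured drop (inches)."""
    d1 = near_yd * 3.0
    d2 = far_yd * 3.0
    drop_ft = drop_in / 12.0
    return math.sqrt(G * d2 * (d2 - d1) / (2.0 * drop_ft))

if __name__ == "__main__":
    print(f"predicted 20-to-40 yd drop at 272 fps: {drop_between(272):.1f} in")
    print(f"estimated speed for a 22 in drop:      {speed_from_drop(22):.0f} fps")
```

Under these simplified assumptions the numbers come out somewhat different from the graph's example, which is expected if the graph folds in extra geometry that isn't modeled here.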
Anybody with a chrono care to give this a try and let us know how well it works?