In one form or another, we’ve received that particular bit of feedback a hundred times over. I’ve responded personally, in detail, at least half a dozen times. We heard something like it again in our recent story on the Titleist TS Driver – and I responded again.
Given how frequently the assertion is made, we decided to pull my reply out of the comments section, add a bit of additional detail, clean it up a bit, and stick it on the front page. We think it’s time golfers fully understood what the USGA limits really mean and where the line is between distance reality and distance mythology.
Permit me to start by being absolutely clear: short of a proper fitting, the newness of any head isn’t going to give you anything close to 10 more yards. That ship sailed. Year-over-year, responsible manufacturers don’t promise much in quantifiable terms. From a marketing perspective, faster is much more impactful than an honest promise of one more yard. But small as the gains may be, let me also be clear when I say that driver distance is not maxed out.
One more time with feeling: DRIVER DISTANCE IS NOT MAXED OUT.
While this story focuses on distance, I want to also briefly touch on forgiveness (MOI). The USGA’s cap on MOI is 5900 g·cm². The highest-MOI model on the market right now is the PING G400 MAX, with a heel/toe MOI value of ~5700. That doesn’t leave a ton of room, but it’s quite obviously not maxed out either. The rest of the industry still has plenty of opportunity to make gains. We can haggle about the point of diminishing returns on MOI some other day. Today is about showing that there is still room within the rules for the OEMs to dabble.
With that out of the way, let’s get to what many believe is the hard stop on distance-driven innovation.
CT vs. COR
So that we’re all on the same page, let’s cover exactly what the USGA is testing and make sure everyone knows the numbers. The limit for CT is 239μs (microseconds). The USGA allows for a tolerance of another 18 microseconds, making the real-world limit 257μs. To measure CT, a golf club is secured in a pendulum testing apparatus, the pendulum is dropped on the clubface, and the measurement is taken. The process is repeated at several points on the face. The CT measurement itself (characteristic time) reflects the duration that the pendulum remains in contact with the face. More face flexure translates to more rebound (and theoretically more speed), and so the CT rule is often characterized as a limit on how much the clubface is allowed to flex.
The CT rule was implemented in 2004. Previously (beginning in 1988), the USGA rules were based on COR (coefficient of restitution). The COR limit was .822 with a tolerance of .008, which is how we get to the commonly referenced .830 limit. The definition of COR (courtesy of Wikipedia) is this: The coefficient of restitution (COR) is the ratio of the final to initial relative velocity between two objects after they collide.
As it relates to the USGA rule, it’s pretty simple. If you fire a golf ball into a clubhead at a speed of 100 MPH, the ball’s rebound velocity can’t exceed 83 MPH. Much like the USGA’s current ball tests, the COR test was rigid and exceptionally difficult to engineer around. Face tech, body tech, anything else manufacturers could dream up – the limit on rebound velocity was absolute. The downside of using COR is that the test requires a higher degree of precision in the setup. Getting everything lined up precisely as it needed to be was difficult and time-consuming, and not always 100% repeatable. And so, the CT test was born.
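To make the arithmetic concrete, here’s a minimal sketch (purely illustrative – this is the rebound math behind the rule, not the USGA’s actual test apparatus or procedure):

```python
def rebound_speed(inbound_mph: float, cor: float) -> float:
    """Rebound velocity is the inbound velocity scaled by the
    coefficient of restitution (COR)."""
    return inbound_mph * cor

# The old USGA limit: a COR of .822 plus a .008 tolerance = .830
COR_LIMIT = 0.822 + 0.008

# Fire a ball at 100 MPH into a face right at the limit:
print(round(rebound_speed(100, COR_LIMIT), 1))  # 83.0 MPH
```

Which is exactly the 100-in, 83-out relationship described above.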
Why does this matter?
I’ve been told of at least four drivers on the market today (and I’d wager there are more) that would not be legal under the COR test, but that passed the CT test. Companies don’t discuss this stuff publicly because nobody wants to ruffle feathers or raise the attention (and ire) of the USGA. It’s my understanding the USGA has reserved the right to test COR at its discretion, but the insiders I’ve spoken with are unaware of any situation where that’s happened.
The takeaway from this is that COR testing is damn near absolute, but with CT, there’s still just a little bit of wiggle room within the rules.
Now that we have a handle on exactly what’s being tested, let’s for a moment assume that everyone in the industry is right at the 257μs limit. The portion of the face where 100% of the allowable CT is maintained represents a small percentage of the total face area. Missing the sweet spot by just a few millimeters drops ball speed. As you move farther from center, ball speed declines even more. So, as manufacturers boost MOI in conjunction with face technologies that retain a greater percentage of ball speed on off-center hits, they are effectively increasing real-world distance. I’d wager that many golfers tend to think of off-center misses as something way out on the toe or a low heel clank, but in terms of what has a quantifiable impact on ball speed, off-center means literally anything that isn’t damn near dead-nuts center. You miss the sweet spot, ball speed drops. Through actual advances in driver technology, every year it’s dropping a bit less over a larger area of the face, and you’re getting more distance because of it.
Bottom line, even if every manufacturer was at 257μs in the sweet spot, most of the rest of the face isn’t close.
Distance isn’t Just About Ball Speed
It’s also important to keep in mind that the USGA testing, in practical terms, addresses ball speed only. Distance isn’t simply a matter of ball speed. While ball speed is the most significant contributor, launch and spin also play a role in distance. If a given design allows the golfer to hit the ball higher with less spin, distance will increase even if ball speed is constant. The USGA has no test that regulates the relationship between launch and spin. This is why the evolution of CG placement is so critical. If you look at where driver CG was five years ago compared to where it is today, the improvement is undeniable.
Think about this: the COR rule was put in place in 1988. It was replaced by the CT rule in 2004. Does anyone here honestly believe that a driver made in 1988, or even one made in 2004, will perform as well as one made in 2018? By the maxed-out thinking, distance gains should have ended 14 years ago. Granted, OEMs continue to oversell distance (everything is faster and more forgiving), but despite pervasive myths to the contrary, nobody has promised “10 more yards!” in the driver category in close to a decade. Gains are small, but they’re real.
All of this is before we start talking about things like aerodynamics – if the clubhead produces less drag, you can swing it faster. More head speed = more ball speed. Apart from rules governing basic shape and dimension, the USGA does not have any rules for aerodynamic properties. Gains here are also admittedly small, and they disproportionately favor higher swing speed players, but even small gains defeat any argument that distance is maxed out.
Weight is another contributing factor to distance. Some golfers swing lighter clubs faster. Again, more head speed = more ball speed.
Shafts are another area that golfers overlook when making the “distance is maxed argument”. We don’t swing clubheads, we swing golf clubs, and every one of those has a shaft. If a shaft can store more energy and deliver it to the ball at impact, you have a recipe for more distance. Manufacturers – even shaft manufacturers – are only just beginning to fully understand what is achievable by way of improved shaft designs. I believe the shaft category is the closest thing we have to a next frontier in driver distance.
Those last bits are why fitting is now a crucial part of the distance equation. Companies often design for the middle of the market, but by taking the time to get fit, and thereby leveraging the right combination of all of the above for your game, you can unlock quite a bit more distance. Maybe even 10 more yards.
The Industry’s Dirty Little CT Secret
Finally, let’s circle back to where I suggested we assume everyone hits the 257μs limit. That was absurd on my part. Only a tiny percentage of drivers manufactured today have a CT approaching 257μs.
As with anything else in the club space, drivers have CT tolerances. For any production run, actual CTs from part to part fall on a bell curve. For most brands, only a small percentage of those that make it to retail butt right up against the limit. For some companies, the bulk, if not all, of the high/max CT heads end up in the tour department, and the really hot ones end up in the special drawer.
Some aren’t going to want to hear this, but this is an area where larger brands have a pronounced advantage. Larger brands often have tighter control over the factories – and many have staff who work out of those factories to help ensure quality – CT being one of the metrics that larger (higher-volume) brands can more rigidly maintain. Smaller brands that produce in smaller volumes have less control and often greater variance from part to part. Tighter tolerances cost more, and without the volume the large OEMs have, it’s difficult to maintain tight CT tolerances and still make the requisite margin. Everybody has a production curve, but with larger brands, the spread isn’t as wide.
We’ve seen this in our testing. Not so long ago, we had a couple of heads for which the performance wasn’t quite measuring up to past experience, so we sent them off to be tested by a 3rd party. What we found was a different CG position, different MOI measurements, and slightly lower CT. This is the reality of production. You could get a hot head (possibly even over the limit), but you’re more likely to get one that’s safely under the limit, and in the worst of cases, a complete dud.
Another example: we recently had several heads CT-checked. The actual measured values ranged from 219μs to 254μs, with an average of 236.3μs. Tell me again about how everything is already at the limit.
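For anyone who wants to eyeball that kind of spread themselves, here’s a quick sketch. The sample values below are hypothetical, chosen only to fall within the 219–254μs range we reported – they are not our actual measurements:

```python
from statistics import mean

# Hypothetical CT readings (in microseconds) for a batch of retail heads.
# Illustrative only -- not the actual measured data from our testing.
ct_readings = [219, 227, 233, 238, 245, 250, 254]

print(f"min: {min(ct_readings)}us  max: {max(ct_readings)}us  "
      f"mean: {mean(ct_readings):.1f}us")

# Even the hottest head in the batch sits below the 257us real-world limit.
print(f"headroom to 257us: {257 - max(ct_readings)}us")
```

Same story as our real batch: a wide spread, an average well under the limit, and headroom even on the hottest head.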
Not to get off-track, but this is exactly why I always recommend you buy the exact club you demoed.
Improving tolerances is part of the reason you see a shift toward manufacturing consistency as part of the marketing/performance story. You saw this with Cobra’s F8 story last year. Titleist says it takes measures to ensure every TS head is on spec. Others will almost certainly follow suit in 2019, but we’re still a long way from squeezing the curve entirely. The consequence is that while the technical limit might be 257μs, actual design targets for most are much closer to 239-240μs with an expected tolerance of 10μs.
It’s a safe bet that no one is actively designing much higher than that because, while 257μs might be the limit, hitting the 250μs mark will likely raise eyebrows at the USGA. An insider I spoke with last week told me a CT of 250μs all but guarantees a warning letter from the USGA, and could kick off a full-scale investigation which would include, among other things, acquiring retail heads for testing. Nobody wants that, but at the same time, manufacturers will look to be more aggressive as improved manufacturing methods evolve to produce more consistent results that yield consistent CTs (likely just below 250μs).
As that happens, more golfers will get drivers that are ACTUALLY closer to the limit, rather than a driver that falls somewhere within a broad specification that currently includes anything from a head at (and in some cases over) the limit to another that, by comparison, is woefully slow. Manufacturing consistency itself is a means to increase the distance golfers experience.
What’s important to understand in the “distance is maxed” debate is that the USGA regulates only one piece of the distance equation. It’s not an insignificant piece, but it’s far from the only piece. There are myriad ways manufacturers are squeezing out a bit of extra distance, and while I think it’s reasonable to say that, short of some massive material breakthrough (like titanium was), evolution will be plodding (a yard, maybe a bit more, maybe less, each year), the manufacturers most definitely still have some room in which to work.