GOLF REVIEWER'S MAGIC SHOW


You probably don’t realize it yet, but whether you are a seasoned golfer or new to the game, you have had a front-row seat to the most magnificent golf review magic show you have ever witnessed.

Inadvertently or intentionally,

golf manufacturers, retail stores, and just about anyone else in the business of reviewing golf products are deceiving you, save one. I am willing to bet it is a mix of the two. Nonetheless, the result is the same: they are all trying to make you think their product is better, based on their own motivation or bias. Naturally, this is to be expected, but as consumers, who or what are we to believe?

My goal

and the sole purpose of Golfing With Rob is to bring you the unfiltered truth. I almost fell into the same trap as those mentioned above. If not for my desire to bring you that unfiltered truth, or if I were beholden to any one company, I would have ended up a co-magician on that stage.

I recently completed my review of the Titleist PRO V1.

In the process, I discovered that the flaw I found in my own testing is precisely what is happening across the golf industry. Companies are exploiting this flaw to create their desired result. The other possibility is that they are simply unaware of the defect, but I find that harder to believe, given their willingness to omit specific data points that are available on the very launch monitors they claim to use.

I was aware this flaw existed

but did not fully appreciate its impact until I reviewed a couple of high-quality golf balls and compared the average data. The unintended flaw stuck out like a hack in a Pro Golf Tournament.

First, allow me to show you how easy it is to manipulate data if you omit key data points.

If I were to take bets

on whether the PRO V1 or the Benchmark ball would come out on top, I would win either way and never have to cheat.

If the majority bet on the PRO V1, I would use my average data against the Benchmark, shown below.

Average Comparison Table

 

This table represents

a 20-ball average for both balls, using the same pitching wedge. The Callaway Superhot was used as the Benchmark ball. The Benchmark outdistances the PRO V1 by 4.85 yards. I win, show me the money! What if more people bet on the Callaway Superhot, you ask? Let’s take a look.

Raw Data from the respective spreadsheets.

  

Using the same data set

(with the ball reference), I have now substantiated that the PRO V1 outperformed the Benchmark ball by 7 yards. I win either way, simply by changing how I present the data. If I had included two other data points, it would have been impossible for me to pull this off.

The question you might be asking now is, “How did you produce two different outcomes with the exact same data? And shouldn’t averages make the data more accurate?”

The first outcome is achieved by not providing the swing speed and/or the PTI. The answer to the second question is “no.” You could say that the example above is the warm-up act, and the answer to this second question is the main event. That answer is a little challenging to explain, but I will try to keep it simple.

When I set the parameters for creating the Benchmark and for further testing, I wanted a small margin of error. I established a swing speed variance of 4 MPH and a PTI (Smash Factor) variance of 0.04. I also limited the offline distance to 15 yards for all clubs except the Driver, due to its greater distance. Since PTI and swing speed are the only two data points the golfer can alter, there was no need to set further parameters. Some will argue that you can also control the ball spin and launch angle, and they would be correct, but I am assuming the same swing plane and ball position that allow the PTI to stay within the set criteria.
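For readers who like to see the idea spelled out, here is a minimal sketch of how that kind of filter could look in code. It is only an illustration of the parameters described above; the Shot fields, the within_parameters name, and my reading of “variance” as the total spread allowed within a test set are my own assumptions, not anything tied to a specific launch monitor’s export.

    from dataclasses import dataclass

    # One recorded swing from a launch monitor export (field names are illustrative).
    @dataclass
    class Shot:
        club: str           # e.g. "PW", "5 Iron", "Driver"
        swing_speed: float  # club head speed, MPH
        pti: float          # PTI (smash factor)
        offline: float      # yards left or right of the target line
        carry: float        # carry distance, yards
        total: float        # total distance, yards

    SPEED_SPREAD = 4.0    # max swing-speed spread within a test set, MPH
    PTI_SPREAD = 0.04     # max PTI spread within a test set
    OFFLINE_LIMIT = 15.0  # max offline distance, yards (not applied to the Driver)

    def within_parameters(shots: list) -> list:
        """Drop shots outside the offline limit, then confirm the surviving
        set stays inside the swing-speed and PTI spreads."""
        kept = [s for s in shots
                if s.club == "Driver" or abs(s.offline) <= OFFLINE_LIMIT]
        if not kept:
            return []
        speeds = [s.swing_speed for s in kept]
        ptis = [s.pti for s in kept]
        if max(speeds) - min(speeds) > SPEED_SPREAD:
            raise ValueError("swing-speed spread exceeds the 4 MPH parameter")
        if max(ptis) - min(ptis) > PTI_SPREAD:
            raise ValueError("PTI spread exceeds the 0.04 parameter")
        return kept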

Looking at the compared averages, I realized that even though every individual shot falls within the set parameters, the averages of the two controllable data points can still come out higher for one ball than the other.

I know what you are thinking: “Duh, you idiot, that is what is supposed to happen.” Usually you would be correct, except in the world of golf.

If the average swing speed and PTI are too far apart, and it does not take much (as little as a 0.02 difference in PTI or a 1-2 MPH difference in swing speed), the distance averages are no longer a fair comparison between the golf balls, giving the advantage to the ball or club with the higher swing speed or PTI average. What we are genuinely interested in is the golf ball’s natural, designed performance.
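To make the arithmetic concrete, here is a tiny, made-up illustration of the effect. The numbers are invented purely to show the mechanics; they are not from my testing.

    # Invented shot data (swing speed MPH, PTI, carry yards) -- for illustration
    # only, NOT measured results. Ball A was swung about 2 MPH faster on average.
    ball_a = [(116, 1.48, 142), (118, 1.49, 146), (117, 1.50, 144)]
    ball_b = [(114, 1.48, 139), (116, 1.49, 143), (115, 1.50, 141)]

    def averages(shots):
        n = len(shots)
        return (sum(s[0] for s in shots) / n,   # average swing speed
                sum(s[1] for s in shots) / n,   # average PTI
                sum(s[2] for s in shots) / n)   # average carry

    speed_a, pti_a, carry_a = averages(ball_a)
    speed_b, pti_b, carry_b = averages(ball_b)

    # Ball A "wins" by 3 yards of carry, but it was also swung 2 MPH faster.
    # Hide the swing speed and PTI columns and the extra carry looks like ball
    # design; show them and the advantage disappears.
    print(f"Ball A: {speed_a:.1f} MPH, PTI {pti_a:.2f}, carry {carry_a:.1f} yd")
    print(f"Ball B: {speed_b:.1f} MPH, PTI {pti_b:.2f}, carry {carry_b:.1f} yd")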

If the controllable data point averages are close, then they can be used and stated for the record. The following example is a comparison table with two golf balls and the Benchmark.

Driver Average Comparison Table.

 

 

This table depicts how averaging does work when the club speed and PTI are very close or the same. The swing speed difference is less than 1 MPH, and the PTI is the same. As a result, the other data points are an accurate record of the golf ball’s natural design performance.
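If you want a quick rule-of-thumb test before trusting an averaged comparison like this one, a check along these lines works. The averages_comparable name, the 1 MPH and 0.02 cut-offs, and the example numbers are my own reading of the figures discussed above, not an industry standard or my recorded averages.

    def averages_comparable(avg_speed_a, avg_pti_a, avg_speed_b, avg_pti_b,
                            max_speed_gap=1.0, max_pti_gap=0.02):
        """True when two averaged data sets are close enough in swing speed
        and PTI for their distance averages to be compared fairly."""
        return (abs(avg_speed_a - avg_speed_b) <= max_speed_gap
                and abs(avg_pti_a - avg_pti_b) <= max_pti_gap)

    # Illustrative numbers only: a Driver-style case with a sub-1 MPH speed
    # gap and identical PTI passes the check; a wider gap does not.
    print(averages_comparable(104.6, 1.48, 104.1, 1.48))  # True
    print(averages_comparable(92.0, 1.42, 90.5, 1.40))    # False: gaps too large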

Now let’s look at the 5 Iron average.

5 Iron Average Comparison Table.

Here you can see that while the swing speed and PTI are within the 4 MPH and 0.04 tolerances, there is still enough of a difference to alter the distance averages. If you use this data, the ball’s real design performance is masked by the difference in swing speed and PTI, as slight as it is.

You could, however, compare the averages of the Volvik and the Benchmark, due to an almost equal trade-off between club swing speed and PTI. The difference between the two balls is negligible in both carry and total yards.

Using averages is okay if close attention is paid to the swing speed and PTI results. Outside of that, there are simply too many variables to use averages accurately when evaluating a golf ball or golf club’s performance.

In my opinion, the only way you can consistently and accurately measure a golf ball or golf club’s performance is by direct matching and comparison.
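For anyone curious how a direct match could be pulled out of two shot spreadsheets, here is a minimal sketch. It pairs shots whose swing speed and PTI are effectively identical and only then compares the rest of the data. The direct_matches name, the tolerances, and the dictionary keys are illustrative choices of mine, not a fixed standard, and the ball lists in the usage comment are hypothetical.

    SPEED_TOL = 0.5  # MPH gap treated as "the same" swing speed (my assumption)
    PTI_TOL = 0.01   # PTI gap treated as "the same" strike quality (my assumption)

    def direct_matches(ball_a, ball_b):
        """Pair shots from two balls whose swing speed and PTI match within
        the tolerances. Each shot is a dict such as
        {"speed": 117.2, "pti": 1.49, "carry": 144.0, "launch": 13.1},
        and each shot from ball_b is used at most once."""
        used = set()
        pairs = []
        for a in ball_a:
            for i, b in enumerate(ball_b):
                if i in used:
                    continue
                if (abs(a["speed"] - b["speed"]) <= SPEED_TOL
                        and abs(a["pti"] - b["pti"]) <= PTI_TOL):
                    pairs.append((a, b))
                    used.add(i)
                    break
        return pairs

    # With matched pairs in hand, any remaining difference in carry or launch
    # angle can be credited to the ball rather than to the swing, e.g.:
    # for a, b in direct_matches(pro_v1_shots, benchmark_shots):
    #     print(a["carry"] - b["carry"], a["launch"] - b["launch"])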

Looking at the chart below, when the swing speeds and PTI are matched, the rest of the data is far more reliable. We can now see with high confidence that the PRO V1’s launch angle is consistently higher than the Superhot’s (the ball used to establish the Benchmark), which backs up Titleist’s claim that the PRO V1 has a higher launch angle.

Direct Match Raw Data Comparison Table with Ball Reference.

 

  

In conclusion:

when looking for the best ball, club, or any other performance-related golf product, make sure that the swing speed and PTI (smash factor) are included in the results, along with the number of balls hit and the type of ball used. Including the launch angle will also help with the validity of the testing. I didn’t touch on launch angles here, but when looking at results, pay attention to that data point as well, since it also plays a part in ball distance.

As a result of my findings, I am going to change from posting averages to direct matching and comparison. Implementing this method will provide my visitors with the most accurate and reliable golf product reviews available, and it stays true to my claim that Golfing With Rob is the most accurate and honest golf review site on the internet.

 

*The reason there are more ball stats for one ball than another is that there were more matching swings for that specific parameter. To avoid showing bias, all shots matching that particular criterion will be presented.

Let us know what you think.