The skill of reading greens is an imperfect science that befuddles weekend hackers and touring pros alike. There are days when the hole appears to be covered in cellophane and others when sinking putts is like shooting fish in a barrel. The crux of the matter is that putting is entirely theoretical: everything one takes into account before hitting a putt may or may not matter, and once the ball starts rolling, nothing more can be done.
It’s a confounding situation and one for which there are numerous answers from a myriad of authors.
There are methods like Aimpoint and gurus like Dave Stockton. Everyone has a theory, a can't-fail tip or some other hackneyed system. However, what Ryan Engle has to offer is a bit different.
His app, Golf Scope (currently available for iOS users), aims to take the guesswork out of green reading. Engle is an engineer by trade and a golfer by passion. He has a wealth of experience in the small-business tech startup space and worked as the lead engineer on an app for Glasses.com, which at that time was owned by 1-800 Contacts. As a contractor, Engle didn't have an equity stake when the application was eventually sold, but contacts he made through that venture have provided much-needed capital investment for Golf Scope. Long story short, Engle left a comfortable salaried position to chase his passion. "I have enough to buy myself a year, but I also have a house, wife, and baby," he said, "so there's certainly some risk." Successful entrepreneurs often see opportunities others simply don't. By getting his product to market before others, Engle may be running a bit in front of where the market currently is, but ultimately he feels that will serve him well.
Golf Scope is an augmented reality (AR) app which uses 3D mapping and GPS technology to create overlays of greens to determine the optimal line for each putt. It’s not unlike Toptracer technology. The difference is that Golf Scope is predictive; it shows you the line before you putt.
Using the camera on your phone, you snap a picture of the ball, scan the green, and capture a photo of the hole. The app can recognize the hole so long as the user is within approximately 18 feet of it. From there, the app creates a 3D rendering of the green and displays the optimal line for the putt. The app will also present alternative lines based on how aggressively the player wants to play the shot. So long as you document each putt, Golf Scope will calculate your strokes-gained putting data based on PGA Tour putting statistics. Effectively, this allows the player to see how many strokes they gain or lose relative to a tour player during a round. Eventually, the goal is to assemble data which will allow the golfer to calculate a strokes-gained statistic based on more than just distance (e.g., speed, slope, predominant break, and direction) and disaggregate the information any number of ways, including anticipated score (putting for birdie vs. putting for par) and stated par for the hole.
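For readers curious about the math, the strokes-gained idea is simple arithmetic: compare the one stroke you just took, plus the expected putts remaining from wherever the ball ends up, against a tour baseline for your starting distance. The sketch below illustrates the calculation in Python; the baseline numbers are made up for illustration, not actual PGA Tour statistics, and this is not Golf Scope's implementation.

```python
# Illustrative strokes-gained putting calculation.
# BASELINE values are hypothetical, NOT real PGA Tour data.

# Hypothetical baseline: average putts a tour pro needs from a given distance (feet)
BASELINE = {3: 1.04, 5: 1.23, 10: 1.61, 15: 1.78, 20: 1.87, 30: 1.98}

def expected_putts(distance_ft: float) -> float:
    """Linearly interpolate the baseline table between known distances."""
    pts = sorted(BASELINE.items())
    if distance_ft <= pts[0][0]:
        return pts[0][1]
    if distance_ft >= pts[-1][0]:
        return pts[-1][1]
    for (d0, e0), (d1, e1) in zip(pts, pts[1:]):
        if d0 <= distance_ft <= d1:
            t = (distance_ft - d0) / (d1 - d0)
            return e0 + t * (e1 - e0)

def strokes_gained(start_ft: float, remaining_ft: float) -> float:
    """Strokes gained for one putt: baseline from the start, minus
    (the one stroke taken + baseline from where the ball stops).
    A holed putt has remaining_ft == 0, i.e. zero expected putts left."""
    remaining = 0.0 if remaining_ft == 0 else expected_putts(remaining_ft)
    return expected_putts(start_ft) - (1 + remaining)

# Holing a 20-footer beats the baseline handily...
print(round(strokes_gained(20, 0), 2))   # 0.87
# ...while lagging a 30-footer to 3 feet costs a fraction of a stroke.
print(round(strokes_gained(30, 3), 2))   # -0.06
```

Summed across a round, these per-putt values produce the kind of "strokes gained vs. a tour player" figure the app reports.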
A 7-day trial gives users (iOS only for now) a week to demo the full suite, but with a one-time price of $5.99, it's easily worth skipping a trip through Starbucks, particularly if you play events or tournaments without the benefit of a local caddy or years of local course knowledge. Think of it as CliffsNotes for green reading.
In beta testing, I found the app to be pleasantly accurate, though the version I used didn’t offer multiple lines for different speeds. Also, because the app doesn’t yet have machine learning capabilities, it wasn’t able to account for geographic features (lakes, mountains) which have a pronounced effect on the greens at my home course. Otherwise, the app was both easier to use and more precise than expected.
WHERE DO WE GO NOW?
Tradition and technology can make for odd bedfellows, and golf seems to be caught in a constant pillow fight between the two. But this is where golf – or at least some portion of it – is headed. In many ways, golf is still in the dark ages when it comes to the use of technology, and it's unreasonable to expect ruling bodies to proactively create room for products like Golf Scope. Rulemaking bodies are by nature reactive, and as such, advances in technology and the quandaries in which they put ruling bodies are often what dictate rule changes, not the other way around.
Arccos Caddie 2.0 is a prime example, though there are many others (18 Birdies, ShotScope) exploring various ways in which big data can be leveraged to improve performance. Augmented reality might be new to the golf space, but as a technology it dates back to the early 1990s. Moreover, a plethora of data already exists; the challenge lies in making it accessible to golfers in ways which are both meaningful and affordable.
Like every other sport, golf is subject to evolution. One can like it or hate it, but it's futile to pretend such advancements and innovations can be curbed.
I’m certain the “this isn’t the way golf was intended to be played” contingent will gripe and contend Golf Scope takes some skill out of the game. By that line of thinking, we quickly enter the circular conversation where one side feels AR apps are taking techy things a step too far and the other side points to 460cc drivers, hybrids, shafts made from exotic materials, and balls with more layers than my wife’s bean dip as evidence the industry has no problem with advancements where the sole purpose is making the game easier.
Another perspective is to consider the concept of a digital caddy. Beyond the requisite duties of carrying clubs, providing yardages and tending flags, what’s the value-added purpose of an actual human caddy?
If the answer is to give the player information that helps them shoot a lower score, I have to ask, what's the real difference? Golf Scope and Jim "Bones" Mackay (Phil Mickelson's long-time caddy) ultimately serve the same purpose, though the premise is Golf Scope can produce the same (if not more accurate) information without requiring decades of experience and volumes of handwritten notes to do so.
The essence of golf is executing the shot, not gathering information which creates the context in which it’s played.
Some people don't like too much technology mixing with golf – and as consumers, that's a choice we all get to make. No one is forced to use a rangefinder, GPS or own a smartphone. Luddite golf is entirely acceptable, and products like Golf Scope are only valuable insofar as the end user sees value in them. Seems simple, but too often critics seem to forget that the existence of any product doesn't preclude the availability of other products, nor does it require anyone to purchase it.
Engle's primary goal in launching Golf Scope is to get as many golfers as possible using the app (both the free and premium versions) in the next year. Not everyone can afford a professional caddy, so democratizing data holds some philanthropic meaning for Engle as well. While the raw number of users doesn't immediately translate to revenue, assembling a vocal and active user base first will give Engle several ways to monetize the app moving forward.
Because this is version 1.0, Engle expects future iterations to better leverage machine learning structures where the app will be able to disaggregate data based on any number of filters (course type, putt length, dominant putt break, score if the putt is made) and give players real-time feedback, much like a caddy would. Engle will make his mark on the industry in some fashion, but he's also a proxy for the next wave of engineers who will be responsible for merging traditional elements of the game with evolving AR technologies and big data.
It's commonplace to ask Alexa for the weather or to order a pizza. It was only a matter of time before our phones started reading putts too.
Will you try it? Buy it? Or just decry it?