My stats class from 1996 pays off (plus: why AI brand recommendations are basically random)


Hey Reader!

Way back in 1996 I had to take one statistics class as part of my geography degree. I never thought I'd be using it again 30 years later, but here we are!

A client wanted to know whether we could predict how many appointments they'll book based on their website traffic. The answer is yes, and yes, it did involve stats, but it really wasn't as bad as I thought it was going to be.

I started with two columns of data: website sessions per week and booked appointments per week. I then put that into a scatter plot with sessions on the x-axis and appointments on the y-axis, and there was a fairly clear pattern. More sessions generally means more appointments, which isn't a shock. But the question is whether that relationship is consistent enough to be useful.

Then I dug way back into my memory banks to use linear regression. It's certainly a lot easier now than it used to be: Google Sheets will do the hard work for you. Select your scatter plot, add a trendline, and check "show equation", which will give you something in the format y = mx + b.

The formula in this case was Appointments = (Sessions × 0.511) + 1604. The slope (0.511) means that for roughly every two additional sessions, we'd expect about one more appointment. The intercept (1604) is the baseline: the number of appointments you'd theoretically book even with zero sessions, which accounts for all the non-website drivers like repeat customers, radio ads, and phone calls.
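If you'd rather reproduce the fit outside of Sheets, numpy does the same least-squares calculation. Here's a minimal sketch; the weekly numbers below are made up for illustration, not the client's actual data:

```python
import numpy as np

# Illustrative weekly data: sessions (x) and booked appointments (y)
sessions = np.array([2800, 3100, 3500, 3900, 4200, 4600, 5000, 5400])
appointments = np.array([3000, 3250, 3300, 3600, 3700, 4050, 4100, 4400])

# Least-squares fit of y = mx + b (a degree-1 polynomial)
slope, intercept = np.polyfit(sessions, appointments, 1)

def predict(x):
    """Predicted appointments for a given number of sessions."""
    return slope * x + intercept

print(f"Appointments ≈ (Sessions × {slope:.3f}) + {intercept:.0f}")
print(f"At 4,000 sessions we'd expect about {predict(4000):.0f} appointments")
```

Same idea as the Sheets trendline: you get a slope and an intercept, and prediction is just plugging a sessions number into the equation.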

This isn't a perfect predictor since there are still variations for things like holidays, weather, the economy, the phase of the moon, and so on. To see how strong the relationship is, look at the R² value, which you can display by going into the trendline options and checking "display R squared". That tells you how much of the variation in appointments is actually explained by sessions. The closer to 1, the stronger the relationship.

In our case, the R² value was 0.347, which means there is a real relationship, but it isn't extremely strong. You can think of it like this: sessions explain about 35% of the variation in booked appointments, with the other 65% driven by other factors. So it's not a crystal ball, but it is a useful directional tool, and it's certainly good enough when it comes to pacing. We aren't betting on the exact number being right; we just want to know if we're trending in the right direction.
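You can get R² the same way outside of Sheets: for a one-variable linear regression, R² is exactly the squared Pearson correlation between the two columns. Again, a quick sketch with made-up weekly numbers:

```python
import numpy as np

# Illustrative weekly data: sessions and booked appointments
sessions = np.array([2800, 3100, 3500, 3900, 4200, 4600, 5000, 5400])
appointments = np.array([3000, 3250, 3450, 3350, 3900, 3800, 4200, 4300])

# For simple (one-variable) regression, R² equals the squared
# Pearson correlation between x and y
r = np.corrcoef(sessions, appointments)[0, 1]
r_squared = r ** 2

print(f"R² = {r_squared:.3f}")
print(f"Sessions explain about {r_squared:.0%} of the variation in appointments")
```

The "% of variation explained" reading falls straight out of this: R² is the fraction of the variance in appointments that the sessions trendline accounts for.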

The next step from here is to introduce more variables, such as seasonality, location, etc., which will lead us into multiple regression territory. I'll be sure to post another update once that analysis is done. But for right now, we needed a simple "are we on track" chart, and one variable with a moderate R² fits the bill nicely.
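As a preview of where that's headed, multiple regression is the same least-squares idea with extra columns. A toy sketch with invented data, adding a 0/1 holiday-week flag alongside sessions:

```python
import numpy as np

# Illustrative data: sessions, a 0/1 holiday-week flag, and appointments
sessions = np.array([2800, 3100, 3500, 3900, 4200, 4600, 5000, 5400], dtype=float)
holiday = np.array([0, 1, 0, 0, 1, 0, 0, 1], dtype=float)
appointments = np.array([3000, 2900, 3400, 3600, 3400, 4000, 4200, 4100], dtype=float)

# Design matrix: a column of ones (intercept), sessions, and the holiday flag
X = np.column_stack([np.ones_like(sessions), sessions, holiday])
coef, *_ = np.linalg.lstsq(X, appointments, rcond=None)
intercept, per_session, holiday_effect = coef

print(f"Baseline: {intercept:.0f}, per session: {per_session:.3f}, "
      f"holiday effect: {holiday_effect:.0f}")
```

Each coefficient now answers "holding the other variables constant, what does this one add?", which is exactly why it helps separate seasonality from the underlying sessions relationship.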

Dana DiTomaso

Founder
dana@kpplaybook.com


How Cookieless Tracking Actually Works (And What It Can't Do)

There's a lot of confusion about what "cookieless" actually means in practice, so I put together a guide that walks through how analytics tools handle tracking without cookies, what GA4's consent mode and behavioral modeling actually do, and what each approach can and can't tell you. There isn't one analytics tool that can solve everything, so the key is knowing what questions each tool can actually answer.

Read the full guide →


Articles Worth Your Time
———•

AI Brand Recommendations Are Basically Random (And That Matters for Your Tracking Budget)

Rand Fishkin published research on AI recommendation consistency that I think anyone considering investing in AI visibility tracking needs to read. Based on nearly 3,000 prompts across ChatGPT, Claude, and Google's AI, he found that if you ask the same question 100 times, there's less than a 1-in-100 chance you'll get the same list of brand recommendations twice.

This matters because companies are spending over $100 million a year on AI tracking and visibility tools, and much of that investment is built on the assumption that AI recommendations work like search rankings. They don't. Position tracking in AI responses is essentially meaningless.

That being said, while the exact lists are random, certain brands do show up consistently across many runs, appearing in 60-90% of responses for a given topic. So visibility, measured as how often your brand appears rather than where it ranks, might still tell you something useful.
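If it seems contradictory that exact lists almost never repeat while individual brands appear consistently, a toy simulation makes it intuitive. Everything here is invented (the brand names, weights, and list size are not from Rand's research); the point is just that a few heavily weighted brands can land in most runs even when the full list is near-unique every time:

```python
import random
from collections import Counter

random.seed(1)

# Toy model: 30 brands, three with much higher selection weight than the rest
brands = [f"brand_{i}" for i in range(30)]
weights = [10, 9, 8] + [1] * 27

def recommend(k=5):
    """Sample k distinct brands by weight, ignoring order."""
    picks = set()
    while len(picks) < k:
        picks.add(random.choices(brands, weights=weights)[0])
    return frozenset(picks)

runs = [recommend() for _ in range(1000)]

# Exact lists almost never repeat...
most_common_list, count = Counter(runs).most_common(1)[0]
print(f"Most common exact list appeared {count}/1000 times")

# ...but per-brand appearance rates are stable
appearances = Counter(b for run in runs for b in run)
for b in ["brand_0", "brand_1", "brand_2"]:
    print(f"{b}: appears in {appearances[b] / 10:.0f}% of runs")
```

That's the tracking takeaway in miniature: position within any single list is noise, but appearance rate over many runs is a real signal.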


The Funnel Is Dead, Long Live the Creative Asset

Juliana Jackson published an introduction to Creative Intelligence that tackles a massive problem in advertising: now that we can generate creative assets at massive scale, how do you actually know what's working and why?

Her core insight is that we've been treating creative assets as black boxes. When "Creative A" wins an A/B test, we rarely know if it was the pacing, the headline, or the audience segment. We learn nothing transferable, and we essentially reset to zero with every new campaign.

Juliana outlines the idea of the "creative space", where every ad is treated as a coordinate in a multidimensional system. That’s the kind of thinking that bridges the gap between data teams and creative teams. If you're running any kind of performance marketing, this is a must-read.


Where to Actually Invest in Search and Visibility This Year

Jason Dodge at Black Truck Media (and Analytics for Agencies member!) published a thoughtful outlook on where brands should focus their search investment, and it's less about predictions and more about the foundational shifts that are already happening.

What I find particularly valuable is the measurement section, where Jason references the AI overview traffic tracking framework we've been using at Kick Point and Analytics Playbook. His point is spot-on: tracking impressions or showing up in AI overviews is interesting, but showing that those visitors are actually engaging and converting is what matters. If you're not parsing out that traffic now, you won't have the data to answer those questions later.


Where You Can Find Me
———•

Looker Studio Workshop at SMX Munich

Just a reminder that SMX Munich is only a month away (how???). There's still room for my Looker Studio workshop, so if you are in Europe (or would like to be in Europe), please come check it out!


That's it for this edition of The Huddle. As always, if you have questions or want to share what you're working on, just hit reply!

Want more? You got it!

📈

Practical GA4

Sign up →

📘

Free Resources

Get yours today →

🛠️

GA4 Workshops

Level up in GA4 →

Was this email forwarded to you? Sign up here!

PO Box 68171 RPO Bonnie Doon Shopping Centre, Edmonton, Alberta T6C 4N6
Unsubscribe · Preferences

Analytics Playbook by Dana DiTomaso

Analytics Playbook gives you the analytics skills you need to land more clients, level up your career, or make smarter marketing decisions. Get bi-weekly insights curated by analytics expert Dana DiTomaso. Each issue includes expert tips, must-read articles, and free resources, all designed to help you take action and see real results.
