Dave has been running Netlify Analytics (server-side) and Fathom (client-side) for a few years, and the results are in: data integrity is hard. His write-up offers a solid overview of why neither dataset should be trusted, but also why neither is necessarily wrong. Still, the big takeaway for me is an old one: do not trust data blindly, and if your data comes from only one source (and therefore cannot be verified independently), trust it even less 😉
On the differences between the datasets:
The data tells me I get somewhere between 12k and 26k visitors to my site who open anywhere between 18k and 333k pages.
On data integrity (or lack thereof):
My trust in analytics data is at an all-time low. Browser-level privacy improvements, ad blockers, and bot traffic have obliterated the data integrity.
On the perils of basing business decisions on data alone:
If I, or some hypothetical manager, put too much stock into these metrics I could see it causing a firestorm of reprioritization based on bot traffic. We’d be chasing the tail of a health check bot somewhere in Germany.
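That "health check bot somewhere in Germany" problem is partly addressable by filtering obvious bot user agents out of server logs before counting pageviews. A minimal sketch, assuming a simplified tab-separated log format and a hypothetical list of bot patterns (real bot detection is much messier):

```python
import re

# Hypothetical: substrings that commonly appear in bot/health-check user agents.
BOT_PATTERNS = re.compile(r"bot|crawler|spider|health|monitor|pingdom", re.IGNORECASE)

def count_human_pageviews(log_lines):
    """Count pageviews whose user agent does not look like a bot.

    Each log line is assumed to be 'path<TAB>user_agent' -- a
    simplification of any real server log format.
    """
    human = 0
    for line in log_lines:
        try:
            _path, user_agent = line.rstrip("\n").split("\t", 1)
        except ValueError:
            continue  # skip malformed lines
        if not BOT_PATTERNS.search(user_agent):
            human += 1
    return human

sample = [
    "/index.html\tMozilla/5.0 (Macintosh; Intel Mac OS X)",
    "/index.html\tHealthCheckBot/1.0 (+https://example.com)",
    "/about\tGooglebot/2.1",
]
print(count_human_pageviews(sample))  # prints 1
```

Even this crude pass would keep that German health check from looking like a surge of engaged readers, though it says nothing about the opposite problem: ad blockers silently removing real humans from the client-side numbers.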
On what to do about it (and a nice use of the concept of a "sage"):
If your goal is to grow your product, you need a mixture of research (academic and applied), user tests, A/B tests, analytics, performance audits, accessibility audits, and a plethora of other considerations. Even if you have all those processes in place, I still think you need a human — a sage — who can read the tea leaves and interpret the data in a way that the business understands.