If they hadn't objectively penalized you, everyone would be complaining that Google gave preferential treatment to their own products.
It's a bad thing that Google's own analytics product isn't good enough to still get 4x100 in Google's own perf tool. It means that people will give up and accept lower scores because it's "impossible" to be perfect. That harms everyone who uses the web.
(Disclosure: I work for Google, speaking only for myself)
If there's an analytics platform with a closer feature match to GA, but with good page speed scores, that might be more convincing to them.
Nor does everyone have the luxury of hiring a PM to do it for them.
I do this rather than Google Analytics today, partly because I want to respect my users, and partly because I too block Google Analytics (as many do) and want my data to reflect something reasonably close to reality.
Edit: GoAccess is a favorite of mine: https://goaccess.io/
Too many people are blocking that script. I manage about 20 different sites using it in some form, but I cannot vouch for it anymore.
Fighting this fight at the moment:
marketing folk - "we're seeing more responses from old people than young people, we should focus on that market"
technical folk - "Are we allowing for the fact that older people are less likely to be blocking GA and therefore most of those untracked clicks are likely to be younger?"
marketing folk - "well, no, but we don't have any information on those, so we can't make any decisions about them."
technical folk - "but we know GA is blocked by ad-blockers. And we know that ad-blockers are used more by younger, more tech-savvy people. And we know that approx 60% of the visits to our site are not registering on GA. So... can we include that in our analysis?"
marketing folk - "...."
technical folk - "...."
marketing folk - "I don't know how to change the pretty graph that GA produces to include that."
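The adjustment the technical folk are asking for is simple arithmetic. Here is a back-of-the-envelope sketch in Python; the segment names, GA counts, and block rates are made up for illustration, not measured:

```python
# GA-reported visits per age segment, and an assumed ad-blocker rate
# per segment (both illustrative).
ga_visits = {"18-34": 600, "35-54": 900, "55+": 1200}
block_rate = {"18-34": 0.60, "35-54": 0.35, "55+": 0.10}

def estimated_true_visits(ga, rates):
    # If a fraction r of a segment blocks GA, GA only sees (1 - r) of it,
    # so the estimated true count is ga / (1 - r).
    return {seg: n / (1 - rates[seg]) for seg, n in ga.items()}

print(estimated_true_visits(ga_visits, block_rate))
```

With these (made-up) numbers, the younger segment overtakes the older one once you correct for blocking, which is exactly the point the technical folk are making.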
I'm also personally shocked at how relatively FEW people end up using ad blockers. It's a night and day difference.
And this was 2 years ago, so I assume it's even higher now. It's a lot more than I expected, for sure. They only count consumers though. Perhaps companies don't always allow it, but I always use uBlock at work.
Marketing folk - "Wait...if we're getting more responses from old people than young people and old people are more likely to run adblockers than young people, let's increase our spend on Google Ads because only people without adblockers will see them."
Technical folk - Get into woodworking.
Edit - If the technical folk push back, that's when marketing folk will say that 'the law of really big numbers' means that 40% of a big market is still worth a lot. Trust me, woodworking... :)
That's the problem: accurate visitor tracking was never one of the web's design goals.
But if you want an independent signal to verify your JS report, the server logs have an almost completely disjoint set of problems.
Did you write your own scripts to use your proxy URLs with the same API?
This assumes you self-host analytics.js and proxy GA API requests through your domain without changing path/query. If you manage to reverse engineer analytics.js and change the path/query AND proxy through your own domain, then this likely wouldn't be detected. But there's a chance that Google will make changes to analytics.js that aren't compatible with your reverse engineering it, and your setup breaks.
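The rewriting half of such a setup boils down to a path map plus query passthrough. A minimal sketch, assuming the first-party paths are your own arbitrary choices (/collect and /analytics.js are the classic Universal Analytics endpoints, which Google can change at any time):

```python
from urllib.parse import urlsplit, urlunsplit

UPSTREAM = "www.google-analytics.com"

# First-party paths (hypothetical, your choice) -> upstream GA paths.
PATH_MAP = {
    "/t/analytics.js": "/analytics.js",  # the script itself
    "/t/collect": "/collect",            # hit payloads
}

def rewrite(first_party_url):
    """Return the upstream URL for a proxied request, or None if unmapped."""
    parts = urlsplit(first_party_url)
    upstream_path = PATH_MAP.get(parts.path)
    if upstream_path is None:
        return None
    # Pass the query string through untouched so GA still understands the hit.
    return urlunsplit(("https", UPSTREAM, upstream_path, parts.query, ""))

print(rewrite("https://example.com/t/collect?v=1&t=pageview"))
```

As the comment above notes, the fragile part isn't this rewrite but keeping up with changes Google makes to analytics.js itself.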
We now record key metrics just through our backend. No tracking, no cookies, just aggregate numbers.
Consult `man 1 grep` for information on how to query it.
Or something like goaccess to get nice charts.
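If you'd rather stay in a scripting language than chain grep/awk, the same counting can be sketched in a few lines of Python. This assumes the default nginx/Apache combined log format; real logs vary:

```python
import re
from collections import Counter

# Minimal access-log parser: just enough to count page views per path.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3}) ')

def page_views(lines):
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        _ip, method, path, status = m.groups()
        # Count successful GETs of pages, skipping obvious static assets.
        if method == "GET" and status == "200" and not path.endswith(
            (".css", ".js", ".png", ".ico")
        ):
            counts[path] += 1
    return counts

sample = [
    '1.2.3.4 - - [10/Oct/2020:13:55:36 +0000] "GET /blog/post HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Oct/2020:13:55:37 +0000] "GET /style.css HTTP/1.1" 200 99',
]
print(page_views(sample))
```

Feed it the lines of your access log (or pipe the file in) and you get the aggregate numbers with no tracking and no cookies.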
I go out of my way to not use any Google services and I don't like when websites negate that choice by using Google analytics, Facebook pixels etc.
Data from the Chrome UX Report (CrUX) is going to be used in results ranking as part of the page experience update - this comes from real-world usage of Chrome
GA affecting Lighthouse scores may be a good storyline for Simple Analytics (and there are plenty of reasons not to use GA) but you can still use GA and pass all the core web vitals
It's understandable that Simple Analytics would use Lighthouse to measure performance and extrapolate from there. I'm not sure how else they could do the test — as presumably they don't have access to Google's data.
GA makes some money, and maybe helps global tracking.
Otoh, if it's an easy way to get ranking, that's going to be abused. And parts of the org do seem to want fast pages to win, so if GA means slow pages, there's a conflict.
1. PageSpeed Insights
2. Web Vitals, further subdivided into Core Web Vitals and the remaining Web Vitals.
They all work together or are sub-components of one another.
Even more, two of the three Core Web Vitals (Cumulative Layout Shift and First Input Delay) are not reliably measurable in a typical lab environment like PSI and Lighthouse: they require actual user interaction with a page. They are essentially RUM (Real User Monitoring) metrics.
Other Web Vitals like Time to First Byte are much more deterministic.
Measured by what metric(s)? Core Web Vitals.
The first is when the GA script is in the <head> section. This is the most popular placement, but it makes your CWV scores a little lower.
The second is when the GA script is anywhere else on the page, e.g. before </body> or elsewhere in <body>. This doesn't hurt your CWV scores.
It is hurting the score. The GA script is NOT in the <head>.
Because of fonts, not because of GTM.
Maybe for performance reasons?
Putting it in the body, you cannot catch all interactions, but it won't block rendering.
Everything is a compromise...
But Chrome has a much better JS engine: V8.
"No, it's not the case that we penalize for Google Analytics. We don't special-case Google products in Search, but that goes both ways. The LH score is not what we use in Search, but my ancient WP + GA site is 100 there." John Mueller
a) have the page jump around 30-60 seconds after the page loads (when the reader is probably half way through it)
b) leave the page blank for the 30-60 seconds it takes for the font to load, if it finishes loading at all.
I don't think this option exists with GA4, unfortunately.
The problem is that most of my clients don't care enough about their stats to pay $19/month. So they opt for the "free" option (Google Analytics) which is now being positioned for the high-end market.
I've used GA once on an open source focused blog, and the information was entirely "interesting" but I didn't get anything useful out of it other than a vague "hey people are visiting" picture that really changed nothing about what I was doing.
Has analytics changed to be more useful? Who cares what countries people are visiting from? I feel like people don't question their need for analytics as much as they should and just automatically do it.
Thanks for the explanation in any case. I can at least picture the first one working at a bigcorp that doesn't know any other way of measuring the impact of what they fund.
In the end - ditched Google Analytics and just use log files & sales data.
And the GA script size is about 40KB.
It's kind of like saying that buying Google Ads will boost your website's organic search engine rankings.
I really suggest that SimpleAnalytics.com update the title tag on this, as it's just wrong. Period.
That doesn't mean that using Google Analytics doesn't slow down your site (a bit), and Google should speed it up. I've had that complaint for years now, and Google hasn't done anything noticeable about it.