A 12-step upgrade procedure is insane [1] [2]. It's like they want people to ditch their product. I have an extremely basic setup that took 5 minutes to start using. Why is this not a one-click upgrade?
I find it amusing someone actually had to take the effort to classify those and find a way to differentiate what constituted "very low effort" vs. "low effort" or how many meetings were involved.
I honestly wonder if it's deliberate to shake out all the free tier customers who can't afford a migration. It's a plausible approach if they are shifting strategy to deep-pocketed enterprise customers.
Bingo. Their target market is the managers and decision makers who don't have time or knowledge to do anything other than look at the Magic Quadrant and tell their team which to implement.
I am a data analyst with 10+ years of experience. I have set up countless GA1 through GA3 implementations, eTracker, Matomo, Adobe Analytics (from s_code to AppMeasurement and via Adobe Launch), Snowplow and quite a few others.
I have also set up three implementations of Firebase and GA4. I can't for the love of god use this tool. In my eyes it is nothing but a mechanism for upselling GCP (esp. BigQuery).
Answering basic management and marketing questions about marketing goals (or other conversion goals reached) is a PITA. So is building basic dashboards with Google Data Studio.
Taken together with the EU data privacy regulators' rulings regarding GA, I nowadays just recommend that clients ignore GA. Even if the USP is "but I can see my advertising numbers in there as well".
Do they want to push people away from using their product?
A plausible alternative depends on your use case. If you just want basic numbers (visitors, visits, page views, a few interactions tracked), one can probably choose any of the myriad alternatives like Fathom, Plausible, Matomo, and so on.
While with most of them you get what I would call "vanity metrics" (so not really actionable, more or less just "numbers go up or down") at least Matomo (self hosted) provides the ability to create essentially the experience of a free GA3 implementation.
If you "just" had a code drop of basic GA3 code: just use whatever floats your boat. If you have custom dimensions, user interactions tracked as events, goals and stuff like that in the free GA3 version - go with Matomo (self hosted). Configure Matomo to have more than 5 custom dimensions (easy to do) and you can replicate (nearly) everything from your GA3 implementation.
The only things you will be missing in any case are the integrations with Search Console and Google Ads.
> While with most of them you get what I would call "vanity metrics" (so not really actionable, more or less just "numbers go up or down")
This comment couldn't have come at a better time for me. I was just trying to compare the free analytics available and plausible looked okay to me. But since I haven't done anything actionable yet I think I'll research a little more.
> so not really actionable
What would you define as actionable metrics? Are you talking about the custom page triggers like scroll depth, clicks on outbound links, etc.? Can you recommend the biggest actionable events for a SaaS site, since you have experience with this?
My source of knowledge on this is sadly a few YouTube tutorials, but if there is any resource you would recommend I'd highly appreciate the link.
Actionable depends on your needs. As always the typical answer you receive from an analyst is: It depends.
Sorry, couldn't resist the in joke of the team I am part of.
But let me try an example. Visits as a global metric don't tell you much. They go up, they go down. And even if you do different marketing things (like SEA, newsletters, social media marketing, organic social and so on) and only look at visits from any of these channels, that doesn't tell you much (except for the volume).
But if you look at what a visit from every one of these channels costs you on average we are getting somewhere. Add into the mix the conversion rate for every one of these channels (depending on your goals it could be e-commerce conversions, newsletter sign-ups, sign-ups for a platform and so on). Now you can already see cost per acquisition/visit and cost per conversion.
So now you can answer the question about what marketing efforts bring more bang for the buck.
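To make that arithmetic concrete, here is a minimal sketch (the channel names and numbers are made up purely for illustration):

    // Hypothetical channel figures for one month.
    interface Channel {
      name: string;
      spend: number;        // what the channel cost you
      visits: number;       // visits attributed to the channel
      conversions: number;  // orders, newsletter sign-ups, platform sign-ups, ...
    }

    const channels: Channel[] = [
      { name: "SEA",        spend: 5000, visits: 10000, conversions: 200 },
      { name: "Newsletter", spend:  500, visits:  2000, conversions: 120 },
    ];

    for (const c of channels) {
      const costPerVisit = c.spend / c.visits;
      const conversionRate = c.conversions / c.visits;
      const costPerConversion = c.spend / c.conversions;
      console.log(
        `${c.name}: cost/visit ${costPerVisit.toFixed(2)}, ` +
        `CR ${(conversionRate * 100).toFixed(1)}%, cost/conversion ${costPerConversion.toFixed(2)}`
      );
    }

In this made-up example the newsletter looks cheap per conversion while SEA buys volume; that is exactly the kind of trade-off the numbers let you argue about.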
That is one example of how I define actionable metrics. Metrics that help you/enable you to make better business decisions.
How I would structure and report things depends on the business. But maybe this gives you an idea already.
Wow that's such an insightful comment. Never really thought about that. Puts my whole thinking about website analytics in a new light.
Really appreciate you taking the time to write this. After some initial research I see that not using UTM tracking tags(1) in all my URLs (or redirects) is a big mistake. So duly noted. You're right about one thing though: since there is no direct integration with AdWords / GWM, you're losing out on a lot of info using anything other than GA.
In the end you are not losing this data. It is just one step removed. So you need to think about extracting it from Google Ads into Excel/Google Sheets (or an external dashboarding solution) and combining it with a data extract from your analytics solution.
And if you do more, like social media advertising, Microsoft Ads (think Bing) or Twitter, none of them integrate easily into Google Analytics. For all of them you would need to look at the marketing performance in the platform-specific tool and combine this with your on-site metrics as described above.
Additionally you would want to take the consent rate into account (if serving European customers). Many of the consent solutions provide this number so that you can use it to adjust things like CPO (cost per order) or cost per visit (CPV) as you are probably not seeing 20 to 60 percent of your traffic due to consent not given. This depends massively on your target audience and the countries you are serving. I have seen this range at multiple international clients of mine for different brands and markets.
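A rough sketch of that adjustment, assuming your consent tool reports the consent rate (numbers invented):

    // Only consented traffic shows up in analytics, so scale measured
    // visits back up before computing cost metrics.
    const measuredVisits = 8000;  // visits analytics actually saw
    const consentRate = 0.6;      // e.g. 60% of users accepted tracking
    const adSpend = 4000;

    const estimatedVisits = measuredVisits / consentRate;   // ~13,333
    const naiveCostPerVisit = adSpend / measuredVisits;      // 0.50
    const adjustedCostPerVisit = adSpend / estimatedVisits;  // ~0.30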
So GA having this integration is nice but definitely not sufficient for marketing.
Even when thinking about the integrations for the (quite expensive) paid offerings of GA and DV360 and the like.
I don't know why people classify Fathom and Plausible in the same category as Google Analytics. There's a clear distinction between analytics software and a counter.
Matomo has some features, but I can never like its cluttered interfaces that don't lead to immediate actionable insights.
There is a continuum of alternatives, from tools like Fathom and Plausible on the more 'vanity metrics' side, through GA4, Matomo and others somewhere around the middle, to the fully custom corporate actionable solutions like the offerings from Adobe (and others) or custom-built solutions.
It depends on what people need or want. A smallish blog has different needs than a global company with multiple brands in different markets doing e-commerce, brand advertising and whatnot (think VW or BMW). And a medium-sized B2B e-commerce shop has again differing needs.
Sorry for the late reply. Evening with my SO and sleep. :-)
Personally I like Snowplow. But I am a tech-loving nerd. If you have the capacity to build and maintain the tracker, servers and (for reporting/visualizing) the necessary ETL infrastructure, you can build the perfect fit in terms of what data to collect, in what way, and how to use it for your org.
The cost for building and maintaining it (with redundancy, stability and everything else) on the other hand probably would be in the same ballpark as solutions like parts of the Adobe Experience Cloud. Especially the new 'Customer Journey Analytics' offering is very intriguing. Having watched the summit I was a little bit blown away by the new capabilities for reporting they just announced.
I have to admit that I actually like working with Adobe Analytics (and CJA), and the development of the solution over the last few years has been great to watch. Still it has its problems. For example, using Audience Manager creates a tracking call with a unique identifier being sent to their data center in Virginia. Even if all your data resides in the European data centers, this is happening. This is currently a problem for clients of mine, as they explicitly chose Adobe over Google because sales told them their users' data would not leave Europe under normal circumstances.
But back to Snowplow. You need some form of visualizing and reporting infrastructure on top. So something like Apache Superset. Works well. But another thing to build, maintain and develop further.
I like both. And I like to be able to fully own the data pipeline. Especially in cases where you are in a regulated industry or have potentially sensitive data. I know of at least one bank using Snowplow to track not only the website and online banking, but also ATM usage, call center calls and visits to your personal consultant/teller. They built a real-time dashboard from that data, so that when you visit your bank the employee there can see your last touch points and interactions in real time. They also built product recommendations on top of that data which, in the cases known to me, are actually beneficial to the customer. Like telling you when your behavior of getting money from ATMs of different banks could warrant an add-on so that you would save on fees. Because they value customer happiness as a way to generate less churn and additional word-of-mouth (WOM) marketing.
So it totally depends on your needs and the business in my book.
Naive question. Sorry. I don’t do web app cloud micro service analytics sort of things.
I’m used to quite a bit of verbiage in HN posts about Google ruining the internet with commercialization and ads and analytics and that sort of thing.
But the comments here of which I’m a naive observer I guess, seem to be of a “how dare Google change what is an essential part of our tools!”
Is this just a case of different HN subcultures taking time to post to each other? Or is it just my lack of understanding of the nuances of which part of modern web apps is good and/or evil?
Not really, though at least you observed something. It is more like a silent majority. And I have been heavily downvoted for pointing this out previously.
Some time post-2016 it went all the way to "ads equal evil" on HN, along with the rest of the internet and social media. Around 2017/2018 tracking became evil. This was further fueled in 2019 when Apple went on a full PR attack (or, one could argue, added fuel to the fire) about tracking and ads. (They subsequently pulled back and stated they are not against ads after industry backlash.) Then targeted advertising (whatever that means [1]). And working in FAANG (before it was called MAMAA) became morally wrong on HN for a short period of time.
But there are legitimate uses for ads, and legitimate uses for tracking. So every once in a while you see a few people who dare to offer their contrarian view: those working in the ad industry, or running a business that requires the use of tracking, funnelling and ads for sales. And their views somehow get upvoted to the top post. Assuming the upvotes are not gamed in any way, that means there is a huge number of HN readers who rarely participate in any discussions, but upvote things that largely go against the common trend on HN.
There is also some sort of political angle to it, although I tend to think those are separate issues rather than causes.
The "evil ads" argument doesn't even look really thought through to me. The way I look at it, there's 3 options:
1. Do everything for free. Rarely tenable.
2. Charge for things. This necessarily puts the business's focus on monetization, or at the very least, it frequently puts providing and extracting value at odds.
3. Delegate monetization (a.k.a. ads). Then you can just focus on providing maximum value to your customers and money scales proportionally.
As a builder of things who's not greedy, #3 sounds like positively the best option, if it's financially viable and the product type makes it possible.
From the consumer point of view, the business model does not even enter the equation. What matters is media pollution: my display, my computing, my rules! To hell with the publisher's expectations of a social contract binding viewership with acceptance of advertisement. That is of course incoherent with the taste for free resources, but that is not enough cognitive dissonance to stop anyone from setting up ad blockers.
From my vantage point, that's perfectly fine. Block ads, change the fonts, color the site orange if you want. As you say, your display, your rules. I communicate to my users that I myself use an ad blocker on my own property. There's no dissonance there.
My mom asked me to disable the ad blocking I'd set up on her devices, as she finds the ads useful.
> That is of course incoherent with the taste for free resources
Except it's not. Most content that I consume - like your comment here - is created by individuals who do not get paid no matter how the platform the content is shared on is monetized. It's absurd to say that there would be no content without monetization; when people have content to share, there will be ways to share it even if ads were illegal or blocked by everyone. However, as long as ads are accepted by the public at large, ad-laden sites will be more profitable and thus also have more resources to crush any alternatives that provide an ad-free user experience. I'm quite fine with anything that needs ads to survive disappearing as a result of ad-blocking - there will always be something else to fill my time.
False choice. I don't know people who are "all ads are bad". It's "spyware enabling ads are bad" and "ads are too large/frequent/auto playing video/in your face" with a side of "stop eating my cycles and bandwidth".
If companies just put up ads relevant to article as a static image that took up a reasonable amount of space and scrolled with the rest of the content, people wouldn't complain.
This idea that there is some "silent majority" of pro-tracking and pro-ad HN users doesn't track with reality. Everyone on HN can upvote, and votes are how voices get magnified. If that were the case, we'd expect the odd pro-ad/tracking comment to always be near the top of the comments section.
The reality is this is adverse selection: those users probably weren't interested in a post about updates to GA, and didn't click it.
I made a new account, wrote a message, upvoted it with my current account.. and no change in points. I never see any message become black or "less grey" after upvote + refresh. It is a no-op for me.
HN likely does something to prevent vote manipulation using multiple accounts coming from the same IP. That doesn't exclude the possibility that your account had a flag to disregard votes before. I personally dislike this kind of hidden moderation, like shadow banning. I know people will defend them, saying that they are needed to deal with spammers, but that does not make them any less dishonest when they inevitably affect genuine users.
Word of advice: don't worry about the prevailing HN mood or opinion about pretty much anything except some extremely niche technical topics. (A nice tell is when a post has almost no comments.) Treat HN as an amusing diversion, not as some source of great truth, insight or foresight.
> Is this just a case of different HN subcultures taking time to post to each other?
Yeah, I think that's exactly it. It happens with a lot of other topics too. It's nice that HN does seem to have more success in keeping people with opposing opinions on the site than some other communities.
I think this is a case of people being annoyed that a product that works just fine is being replaced.
I have a handful of personal sites that I've slapped Google Analytics on, and while they're mostly using Google Analytics out of the box, I'd rather not have to spin my wheels reworking some custom events.
I guess some of us are forced to use tools and services from companies we don't like, for the sake of our livelihoods? "It is difficult to get a man to understand something when his salary depends upon his not understanding it." We all gotta draw the line somewhere, you know, and everyone has a price...?
Google is a mixed bag; they used to be a force for real good on the internet, now they're mostly mediocre with a side of evil, riding on their past benevolence. I use their tools here and there, even if I think the internet is overdue for some serious de-googlization.
Facebook is just plain evil. I don't use their stuff in my personal life, but many businesses demand that you set up Facebook ads/analytics for them. I use fake accounts for those.
Amazon is pretty evil too, maybe AWS somewhat less so. I avoid them whenever possible (freelancing, etc., where I get to choose the stack) but oftentimes AWS is baked in to existing stacks and the companies are not interested in switching to anything else.
It doesn't require massive cultural differences, necessarily, but just a degree of moral compromise that most adults have to make in order to survive. Capitalism doesn't easily make for benevolent corporations and moral purity in tech is impossible when like 90% of it is owned by a dozen huge companies.
I stopped using Google Analytics when I realized that it doesn't produce any useful actionable data for me. It's just a feel-good thingy, where I watch the page view numbers and go "oooh".
Those page view numbers do have a relationship to my business. But it's like measuring traffic on a highway when your store is next to it. The numbers will be somewhat related, but not actionable in any way.
I guess it makes sense if you get paid by page views (ad-supported sites?). But I have a B2B SaaS and I get paid by subscriptions. I rarely care about any of the metrics that GA shows, and I found that many of them are bogus anyway. I could probably wrestle GA into measuring what I need, but why bother, there are easier ways to handle that.
A nice side benefit is that I'm not participating in the global people-tracking business and I'm not subjecting my users/customers to that.
Google Analytics has 3 layers:
-the vanity metrics anyone can look at - this is what you described. How many pages did they view, how much time did they spend on the page?
-user interactions - this requires some skill. You can set up custom tracking to view specific user interactions to answer questions such as: Which page of my sales funnel gets users frustrated and makes them abandon it?
-full user journey - this might require using API and user data (instead of page data) where you get to view every single event they did during their visit. This is probably the highest level of GA skill and requires a good setup, but when you get there, you can answer pretty much every question you have about user interactions.
Not OP, but I have experience with the layers they describe. These are not separate Google Analytics products, but levels of implementation of the Google Analytics code.
The first level they mention is out-of-the-box, no setup Google Analytics. You install the site code and you get metrics. You don't need to be a developer or have any real skills to accomplish this. It is essentially low-effort, low-return.
The 2nd and 3rd layers do not require an investment in Google, but they do require an investment in people. You'll need someone who understands the Google Analytics API, the Measurement Protocol, how the script collects and sorts data, who can install, configure and implement Google Tag Manager, and who has at least an intermediate understanding of HTML/CSS/JS.
There is no nomenclature apart from the aspects of Google Analytics I described above, but you can read more about them here: https://developers.google.com/analytics/
1.: No - but you need people who understand your business and GA, as well as the ability to implement additional tracking events via Google Tag Manager (for example), to be able to answer questions like: What percentage of our users came via paid social media advertising, viewed a product, added it to the shopping cart, but abandoned the checkout? And how many of them are newsletter recipients, so that we might be able to send them a reminder that they have items in their shopping cart? (A rough sketch of such a tracking event follows below.)
2.: No official nomenclature - just deeper data analyst knowledge: ask the business people the right questions, develop an understanding of what to measure, talk to the developers to have them help provide (if necessary) additional info in a dataLayer structure, and implement the necessary tracking events in the tag manager.
After that said analyst needs to build the necessary dashboards/reports so that management/business can have the initial questions answered.
Data/Web Analytics in a nutshell (and described way reduced).
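To make that a little more tangible, this is roughly what such a tracking event looks like on the page: the developers push it into the dataLayer and a Tag Manager trigger forwards it to GA. The field names follow the common GA4 e-commerce conventions, but treat the whole thing as an illustration rather than a guaranteed schema:

    // Pushed by the shop frontend when a product is added to the cart; a Tag
    // Manager trigger listening for "add_to_cart" forwards it to GA.
    const w = window as any;
    w.dataLayer = w.dataLayer || [];
    w.dataLayer.push({
      event: "add_to_cart",
      ecommerce: {
        currency: "EUR",
        value: 29.9,
        items: [{ item_id: "SKU-123", item_name: "Example product", quantity: 1 }],
      },
      // extra context for the "newsletter recipients" question; would be
      // registered as a custom dimension in GA
      newsletter_recipient: true,
    });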
Not OP, but the simplest ways to do the deeper stuff:
#2 - custom GA events implemented via Google Tag Manager triggers. Only takes a few hours to learn (minimal example below).
#3 - once you identify a few particular funnels you want to dive deep into, just skip GA altogether and use an all-purpose recording tool like Hotjar that'll let you automatically record entire user journeys starting or ending at some page. go through those with your UX team/person and look for potentially confusing interactions. takes like 5 min to set up, several days to collect data, and then a good UX person to interpret likely pain points that your users are running into. it's really hard to do a good funnel analysis in Google Analytics. Hotjar, or to some degree Facebook Insights, make that analysis a lot easier because they're GUI-driven and monitor the DOM visually, not code-and-config based like GA
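To give a feel for #2, the smallest possible custom event looks something like this (fired here with gtag.js directly; a Tag Manager trigger produces the same kind of hit, and all event/parameter names are made up):

    // Fire a custom event when a visitor clicks the pricing call-to-action.
    // "select_plan", "plan_name" and "location" are your own names; register
    // the parameters as custom dimensions in GA if you want to report on them.
    document.querySelector("#pricing-cta")?.addEventListener("click", () => {
      (window as any).gtag("event", "select_plan", {
        plan_name: "pro",
        location: "pricing_page",
      });
    });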
I AM THE OP (felt like I needed to say that to not break the chain :D )
I feel like your questions were answered - pretty much the answer I'd give.
1) The only cost is cost of a person who has a deep understanding of Google Analytics. Google Analytics courses are hours long but they only teach you the first layer. 2nd and 3rd come with experience - I've been working with GA for about 4 years and there are still things that surprise me, caveats that we sometimes forget about.
Quick example: Beginners will quite often select 2 dates (separately), get number of users from those 2 dates, add them together and say: that's the number of users we had in those 2 dates. But if you selected those 2 dates together, you'd see a lower number. Why? Because a user who visited in both periods would count as 2 if looking at separate dates, but only as 1 if looking together.
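The same effect in a few lines, if it helps to see why the numbers differ:

    const usersDay1 = new Set(["anna", "ben", "chris"]);
    const usersDay2 = new Set(["ben", "chris", "dora"]);

    // Looking at the dates separately and adding the numbers:
    console.log(usersDay1.size + usersDay2.size);             // 6

    // Looking at both dates together (users deduplicated):
    console.log(new Set([...usersDay1, ...usersDay2]).size);  // 4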
2) There isn't anything official but there are lots of blog posts - from great people like Simo Ahava and Julius Fedorovicius. If you're just starting out, I'd recommend sitting down and writing down the questions you want answered. A default GA implementation will not give you answers to all your questions - they had to create a product that works OK for most, so it doesn't answer specific business questions. The greatest problem with people saying Google Analytics doesn't give them the data they need is that they don't start with the question. Examples of tracking I recently implemented because I taught the team to ask questions WHEN they change/implement something:
-Are users printing the pages? Tracking is set up to fire when they use the browser's print button (sketched below).
-Do we have any searches on the page that return no results? Do we have searches that return too many results?
-How accurate are our results - which position in the results do they normally click on?
Those questions you wouldn't answer with basic setup. By setting up custom event tracking, we can answer them.
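For the print question above, the custom event tracking can be as small as this (shown with gtag.js; names are illustrative, and the search part assumes your search code exposes the query and result count at that point):

    // Track use of the browser's print function (Ctrl/Cmd+P or the menu).
    window.addEventListener("beforeprint", () => {
      (window as any).gtag("event", "print_page", {
        page_path: window.location.pathname,
      });
    });

    // Track on-site searches that return nothing.
    function reportSearch(query: string, resultCount: number): void {
      if (resultCount === 0) {
        (window as any).gtag("event", "search_no_results", { search_term: query });
      }
    }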
Lastly, on the most advanced setup (keep in mind you might not actually need it) - the new version of Google Analytics offers free integration with BigQuery. This means you get to analyse the data using SQL.
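Once the export is running, the daily event tables are plain SQL away. A minimal sketch, assuming the standard GA4 export layout (daily events_YYYYMMDD tables with event_name and user_pseudo_id) and the @google-cloud/bigquery Node client; the project/dataset names are placeholders:

    import { BigQuery } from "@google-cloud/bigquery";

    // Count distinct users per event name for one exported day.
    async function eventCounts(): Promise<void> {
      const bq = new BigQuery();
      const [rows] = await bq.query({
        query: `
          SELECT event_name, COUNT(DISTINCT user_pseudo_id) AS users
          FROM \`my-project.analytics_123456.events_20220401\`
          GROUP BY event_name
          ORDER BY users DESC
        `,
      });
      for (const row of rows) console.log(row.event_name, row.users);
    }

    eventCounts().catch(console.error);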
That is what I noticed for mobile analytics (firebase analytics) as well. It's nice looking at the daily active users and such, but you don't really get any actionable data from it.
For me, a much better approach would be my own analytics, tailored to my product using simple sql statements.
For example how many of my users are tracking mainly dividends vs tracking their portfolio? To get such data I would have to invest more time into setting the right firebase analytics flags than just creating a BI Dashboard that uses one sql query under the hood.
I had similar frustrations with google analytics and decided to do the same.
Are there best practices to follow? I just have a table with: id, id of thing being accessed, user email, time, event enum, and notes.
It works well enough, and I have a python script to query the table, aggregate, and print a graph or the results. But it would be great if there was some "bring your own data" dashboard that would let me view the time series with things like filters.
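For what it's worth, the query side of this can stay tiny. A sketch against a table like the one you describe, assuming Postgres and the node-postgres client (the table and column names are guesses based on your description):

    import { Pool } from "pg";

    const pool = new Pool(); // connection settings come from the usual PG* env vars

    // Daily counts per event type, ready to feed into any dashboarding tool.
    async function dailyEventCounts(days: number) {
      const { rows } = await pool.query(
        `SELECT date_trunc('day', "time") AS day, event, COUNT(*) AS events
           FROM analytics_events
          WHERE "time" > now() - ($1::int * interval '1 day')
          GROUP BY 1, 2
          ORDER BY 1, 2`,
        [days]
      );
      return rows;
    }

    dailyEventCounts(30).then((rows) => console.table(rows)).catch(console.error);

For the "bring your own data" dashboard part, pointing something like Grafana or Metabase at that table gets you the time series with filters without building anything yourself.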
Wow, looks good. I am planning to build something similar but with risk/quant analytics that I am familiar with from Capital Markets. I have mostly worked around traders in my day job.
I am going to build a risk management app for your stock portfolio. Not a quant fund. Please do not invest your money in a quant fund; all such constructions are based on the assumptions that the market dynamics can be analysed and that they then stay the same for the lifetime of the fund, which is very rarely the case.
Besides the non-actionable data, it baffles me that Google doesn't provide any migration tools for their data. The data structure is quite different, but just telling all your customers "We found a better way, everybody, please change now" seems too unfriendly.
As the most dominant player in the market, they could do this (and to a certain extent still can). But for how long?
The lack of data migration may be related to recent EU lawsuits that essentially forced them to delete all data that has been collected without legal reason. I guess it's hard to impossible to separate this data from "clean" data collected outside the EU after several rounds of aggregation.
The lack of data migration might also stem from the fact that the underlying data model is so totally different that it just might not be possible to migrate anything.
GA3 and GA4 are so different, it isn't like the US and Russia, it is Earth and Jupiter (not in terms of impact or size, but just vastly different).
check out https://hockeystack.com: it's an end-to-end analytics platform for B2B SaaS. completely no-code, doesn't require setup, cookieless, and we store our data in EU
it allows you to understand what drives revenue and measure the exact MRR you get from a blog post, churned users you get from an ad campaign, etc.
I did check it out and it looks really good. As opposed to some other comments, I don't have a problem with pricing (my SaaS is a healthy business and I gladly pay for quality).
But using this solution would mean loading third-party JavaScript with third-party trackers. Which for many reasons is a complete non-starter. I'm quite happy to have a site with zero third-party scripts/trackers.
So, call me when I can send the events myself from my code :-)
> A nice side benefit is that I'm not participating in the global people-tracking business and I'm not subjecting my users/customers to that.
^^^ This !
If it's free, then you are the product.
I'm sure all that analytics data is a veritable goldmine for Google.
I pulled Google off all our web properties three years ago (it was originally put there by the web designers) and no amount of blog hype will ever convince me to put it back.
Same story here. And imho it mirrors the near-addiction people have to tracking their vitals and exercise data. Usage data != actionable data. What sort of action does knowing I took 15000 steps on avg this week help me take?
Really? Are you being deliberately obtuse? For someone who has not been active at all and is looking to see what kind of activity effort brings them to a desired activity target, taking notes of what you've done and what was achieved is useful info. "How often do I need to do _______ to achieve my goal of _______" If you thought you needed to plan 1 hour per day, but realize you only need 15 mins per day, that's useful. NOBODY is going to count their steps manually.
Personally, seeing that I've done far too few steps / too little exercise reminds me and motivates me to be more active. If you are always equally active no matter what, then yeah, it's just not for you - but not everyone is you.
Point-in-time data doesn't really help, as you mentioned. But if one can segment this and compare and contrast with other cohorts, there could be other items that could be worth taking some action on.
I made two changes based on Google analytics data...
I built a mobile compatible website (after previously believing my product was a strictly desktop/laptop only product).
I translated my site into Portuguese, since the majority of my users were Portuguese speakers, and many of those users were acting like they didn't understand the site (written in English).
Together, those two things were probably a 20% revenue uplift. Worth having Google Analytics.
It might not be useful to you but analytics in general can be useful - e.g. finding out which page gets more traffic or which has a low retention rate. If there's more users on mobile on one site maybe it's worth it to optimize the site for mobile more. Things like that.
I honestly don't care which page gets more traffic — I don't get paid for traffic to web pages. I get paid by customers who subscribe and pay for using my software because it brings them value (continuously).
The usual story is that I should care about traffic and bounce rates, because of short attention spans and potential customers getting bored or not finding what they needed — but seriously, if you are about to start using inventory control and production control software that will make your life easier and that you intend to use for years, attention span is not a problem.
The metrics I do care about are signups, churn, and everything else that is money-related. But those I can (and do) measure myself.
Note that I am not saying that analytics are totally useless, just that they aren't useful for me.
>I honestly don't care which page gets more traffic — I don't get paid for traffic to web pages. I get paid by customers who subscribe and pay for using my software because it brings them value (continuously).
If you apply that thinking to other areas it doesn't make as much sense e.g. 'I don't care if my site takes 2 minutes to load. I don't get paid for how fast my site loads. I get paid by customers who subscribe and pay for using my software because it brings them value.'
Like sure, that's not what you are directly paid for but it doesn't necessarily mean the data can't help you increase signups.
If you think of it in ML terms - that's extra data that would likely increase a theoretical model's accuracy as it's not fully covered by your other (more important) metrics. Except in this case the model is you and it's possible the extra work isn't worth it for the benefit which could be tiny.
It depends on the type of business and what you're measuring but it's pretty vital to a lot of companies. If you have a sales funnel or a sophisticated acquisition strategy it's mandatory to track all that or you don't know what strategies are working.
> I guess it makes sense if you get paid by page views (ad-supported sites?)
I use AdSense for that. It counts page views, which is what I need. GA seemed to me like something full of marketing buzzwords which I really, really don't care about.
What we've done before is just add events in our db, or even just aggregate log data to find what we want in a Grafana dashboard or whatever. Much more reliable than GA, given how often it gets blocked. And often I don't need the data they add; I care more about how many clicked a button versus how many saw it, and I'm in a better position to track that than Google.
I track all money-related metrics very precisely (using ProfitWell and spreadsheets).
I wrote code for traffic source attribution and used it to find out that my money spent on super-targeted LinkedIn, Facebook and Google ads resulted in precisely $0 revenue, so I stopped advertising and the code is now rusty.
I wrote code for A/B testing (this is something you really want to have integrated with your site) using the multi-armed bandit approach and used it to find out that with low traffic of a B2B SaaS it takes months to reach conclusions. That is still sometimes used, but not a whole lot, and only because the multi-armed bandit approach with Thompson sampling is nice enough to prioritize "better" options over time, so you can leave it running safely without baby-sitting it the whole time.
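For anyone wondering what that boils down to in practice, here is a bare-bones sketch of the selection step (my own illustration, not the actual site integration): each variant's conversion rate gets a Beta(conversions + 1, failures + 1) posterior, you draw one sample per variant, and you serve the variant with the highest draw. Better variants win the draw more often, which is why the approach self-prioritizes over time.

    // Box-Muller standard normal sample.
    function gaussian(): number {
      let u = 0, v = 0;
      while (u === 0) u = Math.random();
      while (v === 0) v = Math.random();
      return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
    }

    // Marsaglia & Tsang gamma sampler (shape k > 0, scale 1).
    function sampleGamma(k: number): number {
      if (k < 1) return sampleGamma(k + 1) * Math.pow(Math.random(), 1 / k);
      const d = k - 1 / 3;
      const c = 1 / Math.sqrt(9 * d);
      for (;;) {
        let x = 0, v = 0;
        do { x = gaussian(); v = 1 + c * x; } while (v <= 0);
        v = v * v * v;
        const u = Math.random();
        if (u < 1 - 0.0331 * x ** 4) return d * v;
        if (Math.log(u) < 0.5 * x * x + d * (1 - v + Math.log(v))) return d * v;
      }
    }

    function sampleBeta(a: number, b: number): number {
      const x = sampleGamma(a);
      return x / (x + sampleGamma(b));
    }

    interface Variant { name: string; conversions: number; visitors: number }

    // Thompson sampling: serve the variant whose sampled rate is highest.
    function chooseVariant(variants: Variant[]): Variant {
      let best = variants[0];
      let bestDraw = -1;
      for (const v of variants) {
        const draw = sampleBeta(v.conversions + 1, v.visitors - v.conversions + 1);
        if (draw > bestDraw) { bestDraw = draw; best = v; }
      }
      return best;
    }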
Overall the problem with a B2B SaaS is that unless you are already huge, the amount of good actionable data that you can work with is small. It's much better to spend the effort on making a product that people will want to use, e.g. talk to your customers.
But you just track visits? You should track custom events to get actionable data, right? At least that's what we do: map which events we want, send them to GA and then get the reports to support decisions.
You might serve from a CDN, the CDN might not log at that granularity (at least not visibly to you as an enduser), maybe you want to dedupe refreshes/navigation (sessions), maybe people behind a NAT matter to you (when they share an IP), maybe you want to filter out bot traffic (not trivial), maybe you don't want to have to deal with log rotation and backups or want to install a GUI parser for your marketing people... a whole lot of things that are difficult to implement yourself vs dropping in one 3rd-party script and calling it a day.
Google Analytics or its competitors take a few minutes to set up. Recreating that whole stack to feature parity would take forever.
Not everything is on a server you control (e.g. my blog is on GitHub Pages), and not everything reaches the server with caching. Also it takes less time to set up.
The way Google mentions "privacy" in the first sentence reminds me of a mining company mentioning "the environment" in the first sentence of their plan to dig a gigantic hole somewhere.
A related note: I have concerns with Google's recent nagging of website owners to upload user data to Google, under the banner of "enhanced conversions".
The idea is that Google takes the user data and matches with their user accounts. So a user logged into Google who clicks an ad, can be tracked across devices without the need for third party cookies.
They expect the upload to happen on the conversion page - such as the thanks page on a website contact form. To me it feels like Google are trying to convince business owners that to be "leaders in your industry" they must help Google fill the void created by third party cookie banning.
Site owners are now pushed to upload every contact form submission to Google for the purpose of what is effectively next-gen third party tracking. It bothers me how blatant and obnoxious the spin is around such pressure on businesses to share their customer data with Google, when it wasn't shared before for the purpose of tracking. "Don't worry, it's hashed data" is not good enough, and not the point.
Most interesting part about this for me is the name. I was never aware of there being a GA 3. It seems like every product went from numbered versions to continuous updates like when windows went to 10 and declared there would be no more numbered versions. But now the trend seems to be reversing and everyone wants that yearly numbered announcement with the hype that comes along with it. I wonder if we will start seeing more consumer software and online services hold back features to go in one big release rather than letting them drip out as they are done.
The problem with rolling releases is that there’s no way to say “I’m using the API as it was at version 3”. The last few years I’ve been increasingly thinking about software versions where the major version is part of the dependency name. So product Foo@1.2.3 is really “Foo1” at version 2.3. Foo1 is what you’re depending on, and so long as the semver guarantee holds, you should be able to incorporate rolling releases of Foo1’s versions. But because Foo2 (Foo@2.0.0) has a different API, it’s a different dependency and should be treated as if it were separate object in the dependency tree. And generally Foo1 and Foo2 should be able to sit, live and work side-by-side.
The fact that Foo1 and Foo2 are both called "Foo" is an artefact of how humans write software, but it's meaningless to the computer. Foo2 usually has all the features of Foo1, but because the API is programmatically incompatible, when it comes to the dependency tree, they should be totally separate items.
You see forms of this all over the place. Eg, HTTP APIs are usually /api/v1/ not /api/v1.2/. Debian's apt repository has packages for 'llvm-10', 'llvm-11', 'llvm-12', and so on. I used to think this was silly; but now I think it might be the only way to do that in a coherent way.
Cargo and Poetry treat different major versions of software as effectively different packages by default. However I've seen pushback against upper version bounds (which putting different major versions under different names is an extension of): https://iscinumpy.dev/post/bound-version-constraints/. I'm not sure how to feel myself.
Interesting. But python's package system makes this actively hard to do. From that article:
> For example, let’s say you support click^7. Click 8 comes out, and someone writes a library requiring click^8. Now your library can’t be installed at the same time as that other library, your requirements do not overlap.
What I'm arguing for is that click^7 and click^8 should be seen as different packages that (like any other packages with different names) can be installed side-by-side without issue. It sounds weird, but it makes this problem go away entirely.
Nodejs and (I think) cargo both support this. I've never run into errors like this in rust or node.
Another advantage of this approach: Integration tests in all previous versions of a package should pass on the current version of that package. I'd love to see test runners which actually enforce this - which should be pretty easy to do in a language like rust where integration tests are clearly separated.
Sometimes (rarely) it makes sense to allow an integration test for react16 v1.0 to fail on react16 v2.0, but that implies a violation of semver compatibility. So it should be the exception. Not the default.
The semver spec implies this, but doesn't say it outright. And that leads to lots of overcomplicated tooling!
I'm going to try being even more clear:
"Package Foo v1.2.3" should usually be spelled "Package Foo1, v2.3"
For example, the migration from react 17.x.x to react 18.x.x is actually a change from depending on package react17 to depending on react18. React17 and react18 are two different packages which have been designed by the same team, and happen to have similar APIs in order to make migration easier.
If I designed npm today I'd make "react16", "react17" and "react18" all be essentially different packages (under a namespace). They should probably only share their lists of authorized authors.
You can see this confusion all over the place. For example, when setting up peerDependencies in a nodejs project, there are suggestions[1] to use a version like `"chai": ">= 1.5.2 < 2"`. This is because chai v1 and chai v2 are essentially different packages. You want to say "chai v1 should be a peer dependency". But it's unintentionally awkward to say that clearly. And with the current nomenclature, there's no way to express a dependency on both chai1 and chai2 - despite them being different, incompatible packages.
Even though nodejs is obsessed about semver, it still makes a mess of things by pretending as if major versions and minor versions are similar. They are not.
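Incidentally, npm's alias syntax already gets you most of the way there today. A sketch (the alias names are arbitrary): declare dependencies like "chai1": "npm:chai@^1.5.2" and "chai2": "npm:chai@^2.0.0", and both majors install side by side as if they were unrelated packages:

    // chai1 and chai2 are npm aliases pointing at two major versions of chai;
    // from the dependency tree's point of view they are separate packages.
    import { expect as expect1 } from "chai1";
    import { expect as expect2 } from "chai2";

    expect1(1 + 1).to.equal(2); // resolved from chai@^1.5.2
    expect2(1 + 1).to.equal(2); // resolved from chai@^2.0.0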
Weird that nodejs.org doesn't say so, but that's usually written `"^1.5.2"` and is by far the most common way dependencies are specified in my experience. Cargo even makes the ^ implicit, and you can just write "1.5.2".
GA4 feels like a huge step backwards. Tons of functionality in the previous version is missing or replaced with smart “insights” (where you have to try to describe your query in plain text, which it will always get wrong). A good time to move on to something better.
It goes along with the philosophy they've followed for at least the last 10 years of removing useful data. Metrics like average position or search impression share in AdWords got removed, along with consistently broadening keyword match types (plus other quiet but significant shifts like moving away from second-price auctions). The biggest introductions emphasized automated bidding strategies, view-through conversions and hyper-optimistic offline conversion attribution tools. Analytics removed organic keyword data years ago and has shifted some focus towards multichannel attribution, but this is the biggest leap. The goal in PR speak is "understanding the customer journey across all channels", but in reality it is more like trying to capture more dollars from brand advertisers (see: big orgs with layers of bureaucracy, a big budget and 0 accountability) and mask the inefficiency of their platform for others (think lead gen: Google tested running their own insurance comparison service, but it was more profitable to charge insurers $xx.xx per click instead. Similar situation with OTAs/home services).
They keep adding layers of opacity to mask ever shrinking ROIs while making it easier for anyone to spend more on ads. Eventually they will get to the point where you set a budget and you get what they are willing to provide based on what their "AI" decides is best to maximize their fill rates, auction participants and their earnings per page view. No keywords or managed placements. It isn't like there is an alternative and their inertia isn't going anywhere.
I wonder if Google realized that Analytics has become unfathomably complex for non-power users and wants a clean break now.
I run a side business[1] that is lucky to not be dependent on search ads. We started using GA 10-ish years ago because it was free and took almost no effort to set up. But over the years it just got more and more complicated to use, to the point where it got hard to act on the results.
Last year, I switched to Plausible[2] for data protection reasons. The unexpected side effect was that I suddenly understand every single metric in my analytics tool. As a result, I use it a lot more for actual decision-making.
Plausible is great for my content website. The benefit of not tracking my users or pestering them with a cookie notice outweighs the inconvenience of limited insights.
I miss seeing the user flow from page to page, but that’s it.
On my personal website, I just don’t use any analytics at all. It’s not a metric I want to be exposed to.
Kudos for mentioning Plausible. I also love and use them. They actually inspired me to start a privacy-focused products analytics tool that doesn't track unique users (and is open-source). Shamelessly plugging it here: https://fugu.lol (GitHub: https://github.com/shafy/fugu)
I’m glad that there are more and more such simple, single-purpose tools now. It took me a long time to learn this, but I’m now actively staying away from anything that markets itself as “enterprise-ready”. Too much hidden cost in keeping up with all the shifting trends and mental models.
Thanks :-) Yes exactly. I personally enjoy tools that focus on simplicity and don't chase every trend and feature, so that's what I'm trying to do with Fugu, too.
Yep, I think that's the proper summary. I like the all-in move to "events", but the UI, especially for content-heavy websites, is incredibly more difficult to navigate and extract insights from.
I agree that fewer features is an upside for a better understanding of all the collected data. A simple dashboard is all I need, and I guess it is the same for most websites.
Plausible seems interesting, except this weirdness: "Made and hosted in the EU" (incorporated in Estonia), but pricing is in GBP. Immediately sets off an alarm IMHO.
Yup. Tried again from another connection and it was in €. Before posting the previous comment, I verified that there was no currency selector on the pricing page, so there are two separate issues, but not the one I initially pointed.
I suspect this is due to slow adoption. The majority of SaaS reporting tools don't currently have integrations for GA4, last I checked (admittedly it's been a while). The particular one I had in mind had it coming in the next year or so, but the lack of a clear date seemed to indicate to me it wasn't a priority because, well, it wasn't driving their bottom line. This should grease the wheels.
We’ve been creating and including both with gtag(‘config’, ‘ID’) lines for some time now in anticipation of something like this.
This seems kind of insane, so many people are using the old version of Google Analytics and have absolutely no compelling reason to upgrade other than Google forcing them. And it seems GA4 is fairly different in some ways, this is going to be a ton of churn for a ton of people. I guess Google thinks there really is no alternative, they can just force people to do extra work for no gain and they'll grumble and accept it?
Aside from cookies not working anymore and the old model becoming illegal in some countries? I think that's compelling enough for people who actually need the data. For you maybe not, and that's ok.
Nothing else in the Analytics space gives the right integration between your SEO and PPC. So for example, if I wanted to create an Analytics-based custom PPC audience based on a number of factors such as device, pages visited, events triggered and location (layered) directly into my AdWords campaigns, I'm going to have to use GA. And if I want to use Optimize for my PPC campaigns, then I'm still going to have to use GA. It's all out of the box and it's all "free", compared with the multiple vendor solutions I'd need to make the same scenario work in order to drive sales from web visitors. That's the Google lock-in right there. Meh.
I honestly don't give a fsck about the new version or why it needs to be a breaking change.
All I hear is:
> We will begin sunsetting Universal Analytics — the previous generation of Analytics — next year. All standard Universal Analytics properties will stop processing new hits on July 1, 2023
Great, plenty of time to replace GA with another analytics solution.
I ditched analytics entirely when I realized they didn't tell me anything I didn't already know. Stat dashboards are Skinner boxes that get in the way of doing the work of trying new things, getting feedback, and using it for the next thing.
While it might be a bit hyperbolic to suggest that analytics don't tell you anything new, I've seen vanity metrics touted around on more than one occasion as if they meant anything actionable or otherwise unknowable. I think this is more common than the other way around. It's like the ol' conundrum where an A/B test is performed on a design change, and the B version results in more "engagement" with the site; what they fail to openly mention is that the so-called engagement may just be the frustration of the user trying to bypass your crummy design update.
IMO, the best analytics for many businesses is customer acquisition and revenue as well as customer feedback. If you aren't adequately responding to those inherent analytics then you shouldn't yet be using The Google's form of black magic.
You've seen vanity metrics? That's throwing the baby out with the bath water. You just haven't seen the value it brings. Analytics isn't black magic and it won't run your business for you.
Some people think the existence of vanity metrics or stupid tests means that the entire concept is bullshit. That is their problem, and reveals more about their limited perspective than it does about whether or not analytics is important.
I started back in the '00s with server-side stat tools like AWStats and never learned anything new from checking them. Why would I keep wasting time? My subjective experience tells me they're useless for me, and this is a common enough experience.
That doesn't mean everyone will have the same experience, but it does mean most people probably don't need to feed into Google's ability to track everyone just to look at their stats and say "yep, that's where I'd expect a person to come from."
If knowing where your traffic comes from and what it does on your site has no value to you, then I'm not going to try and convince you otherwise. The alternative is asking your users, who either won't respond to you or will potentially give you incorrect or misleading responses - that is, if you want analytics, you know, in order to make business decisions.
Your website has GA4 installed on it, you might want to remove that.
On the other hand, the paid version is no fuss, affordable and works great. It's much cheaper than fiddling with how to self-host it. Not sure what bugs would be in the paid version that are not in the self-hosted one. It's the same thing.
Bugs aren't in their code. It's the way they've configured their server that hosts the .js. Their headers are fucked up so some browsers will never load the .js correctly, and you'll never know about it. So your stats will never be remotely true.
We're building the most comprehensive, but still easy-to-use analytics tool out there. Specifically useful for SaaS businesses that want to collect and unify marketing, product engagement, sales, and subscription revenue data.
I've seen tons of similar analytics products pop up in the last year and have avoided them all for having out-of-country data storage (and usually less functionality, etc). With how volatile data-related laws have been in the EU the past few years, IMO it feels risky to try to store anything there, let alone view it as a feature!
I think you're putting your finger on why it is a selling point. You're avoiding them because they're out of country, and that makes sense. For Europeans, the default has been "US hosted data" for a long while, and having options that store data in our jurisdiction is thus a selling point.
Fewer jurisdictions is almost certainly going to be easier, and these options offer that for the European market.
Thanks for the explanation. HN is so notoriously US-centric at times that it didn't occur to me that these services offer the same perks from the other side of the pond.
I'm not sure how you get to that conclusion. Data privacy laws are among the strongest in the world here. It is a selling point because of that, but also because there are laws against sending user data to non-EU countries.
It feels risky to have my data stored anywhere else other than the EU. Feels like my data is just up for grabs anywhere else with no clear privacy protection. Imo EU is the best place to have your data.
Your pricing is an order of magnitude too high, given there are better free solutions out there. Would be like me selling a search engine service for $99 a month when Google and Bing exist.
It's compliant if you show a cookie banner on your website. It's not really privacy-friendly, as you are still setting cookies on your users' devices.
There is one point from the Google announcement that I fully agree with, and that is that the future of analytics is cookieless by design (only Google won't be a part of this).
Edit: I am asking because I know that there are better alternatives than matomo. But there still needs to be some alternative for 'oldschool' hostings with LAMP stack.
if the primary issue here is data loss...isn't part of the solution to also find a way to take more ownership over the data, regardless of what you're using for analytics?
i.e. instrument a standard spec with something like www.Rudderstack.com and write it to your warehouse/datalake + whatever analytics tool you want.
That way if you ever want to move off your analytics tool (or a solution providing the data spec), you still own the data for whatever you replace it with.
Sending the data to the WH also lets you own any proprietary data you create - like the custom PPC model referenced in this comment: https://news.ycombinator.com/item?id=30707913
Look, not to rag on your employer's product too much - I'm sure it works for some people. But the first thing I checked was the pricing: $750/month just to get some graphs in grafana (and support, if you're a business and care about that kind of thing), and SSO is in the "contact us" enterprise tier.
Throwing in another choice: nocookieanalytics.com. Great interface and very simple. Just what I needed to see which of my blog posts were famous (none of them are).
I hate everything about GA4. It's a totally counter intuitive, massively over complex interface. Things that were easy in the old version are now either insanely hard or simply missing.
I'm moving my clients to Matomo and they're loving it.
Personal opinion, Google is catastrophically shooting themselves in the foot here, and could lose their market dominance.
GA's second-biggest moat (behind being free) is its existing install base. It's already in place on approximately every website on the Internet. And no one wants to change analytics vendors, because that breaks history, which is a no-no. If you want to know whether your visits are up or down vs last month or last year, you literally cannot answer that unless you are using the same tool this year as you were last year.
By forcing a migration to a non-compatible version, Google loses that big advantage. If you're going to lose that anyway, then there's less of a reason to stick with GA. If you're forcing people to gut and replace their existing data collection process, what incentive is there for people to stick with Google? One of the biggest reasons to stay with GA was just killed.
The timing is also exceptionally atrocious. Google is, again, forcing people to expend the effort to rebuild their analytics implementations from scratch. Meanwhile, court rulings are coming down on a monthly basis enumerating different ways in which Google Analytics violates GDPR. A lot of companies might just brush off these rulings by simply doing nothing, but now they're being forced to actively invest time and effort into Google Analytics. A lot of them are going to do an evaluation against privacy-conscious competitors that simply would not have happened if Google wasn't kicking people off their existing platform.
The GA4 data model is also completely different from the GA Universal data model, which is an impediment to their fourth-biggest moat, the fact that every piece of BI software in the universe has a GA plug-in. Many of those plug-ins work despite not being actively maintained since 2015; the GA4 ecosystem is smaller and will take some time to recover.
(The third-biggest moat is Google Ads integration. I suspect that Google imagines this to be bigger than it really is in practice.)
It’s an interesting argument, but in my view, one that sailed 2-3 years ago when they force-migrated everyone to firebase analytics, which fairly drastically changed our data views given as it was so heavily geared to revenue generating click businesses (as opposed to B2B SaaS).
At that point we instituted a big program to move to Amplitude (which we've been fairly happy with, but probably won't be perfect until we get our own data overlays through eventually getting a data lake etc etc) - and now the move BACK to GA... I would never touch GA again or consider it for a new project.
> 2-3 years ago when they force-migrated everyone to firebase analytics
No, this is the same migration. This is my point: you weren't actually forced to migrate until now. The migration has been optional up until this announcement, and many people haven't. What your team did 2-3 years ago, the majority of organizations have been putting off. But now they can't anymore.
Wow, is that right? We had some weird little web-based projects that we didn't do anything with, which we could still view through analytics.google, but I felt that the mobile stuff was GOOOONE unless we did something.
So they've basically gone from 'GA is closing unless you want to pay 120k/pa, so move to Firebase - GA is now Firebase Analytics' to 'Firebase Analytics is Google Analytics again, but that thing we said you had to do, you now definitively have to do'?
100% - I jumped right on it when they said that, did a few hours of hurried research, then discovered that it wasn't actually possible to use Firebase for what we use GA for. Figured we just had to wait until they hit feature parity so I've been checking the status of things a few times a year ever since.
...and there's been very little movement. We can still create "universal analytics" properties, we still can't do the same stuff in Firebase, and the GA4 "Measurement Protocol" is still in beta.
I think our eventual move is going to be to a non-Google product.
You’re right, there was so much feature asymmetry we couldn’t justify sticking around - we would have lost any valuable insights into user behaviour. Firebase analytics just seemed like a ‘made for ads and influencers’ analytics tool. What a monumental cockup by the team that decided that it was important to lose that pool of users. Because of that shift we actively avoided making any choices re any of the other Google cloud or firebase tools.
There’s heaps of good ones around now - our shift to amplitude was a good one but I know there are a lot of other really simple and free ones about too
It sounds like you were working on app tracking rather than web tracking, so your timeline makes more sense. The various GA SDKs for different platforms stopped working a few years ago, forcing a move to firebase-which-is-now-GA.
On the website side of things, which is GA's core user base, Universal Analytics has remained supported up until this announcement.
Glad to hear you’ve been happy with Amplitude. We’ve been doing a lot RE helping customers federate data to data lakes/warehouses. If I can ask- what data lake you using and what data from Amplitude are you looking to get into it? (Raw dump of analytics data or something else?)
Hi,
Thanks for your interest, you've got a great product there and it's been really cool seeing it evolve.
We're actually not currently using a data lake. We were initially on the startup plan, which was fantastic (the ability to do templated reports etc. is a big one for us), but we couldn't justify the cost of a full subscription at that point (and now we are just making do until we can complete our next round, at which point we'll re-look at the whole situation).
We love that you have an API, and we used Fivetran to ETL into BigQuery (for want of something cheap with low BI tool costs), where we were hoping to attach Data Studio to achieve our objectives.
This may not have been the right approach.
We couldn't spare anyone from our dev team to work out the transforms/queries required to make it useful in Data Studio, so we contracted the work out, but that fell over because the contractor was on loan from a mate's startup and I think just never got out of the blocks. Also, maybe we were using the wrong tools.
Essentially we were looking to get a fair proportion of our data out in order to create data views for our clients on their usage (... platform in bio, but hospital analytics stuff - no patient info, all operational efficiency). This was fulfilled really nicely by Templates, but in the reports we now generate (... a bit laborious, but an offshore contractor takes all the pain out of it) we are also able to include benchmarking averages against their peer hospitals, which wasn't something it was immediately obvious we could achieve with Templates. Ultimately we want those individual dashboards available for the hospitals to view, as well as to be able to return things like views per period for their content on our dashboard, which we know we can also achieve through the API.
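For context, the kind of transform we never quite got working was roughly along these lines - a sketch only, and the project/dataset/table/column names below are hypothetical placeholders rather than our actual Fivetran/Amplitude schema:

    # Sketch: materialise a per-client daily usage table in BigQuery that
    # Data Studio can point at directly. All names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

    query = """
    CREATE OR REPLACE TABLE `my_project.reporting.client_usage_daily` AS
    SELECT
      client_id,                         -- hypothetical column identifying the client/hospital
      DATE(event_time) AS day,
      event_type,
      COUNT(*)         AS events
    FROM `my_project.amplitude.events`   -- hypothetical Fivetran destination table
    WHERE event_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
    GROUP BY client_id, day, event_type
    """

    client.query(query).result()  # blocks until the reporting table is (re)built

Something like that, scheduled daily, would probably have been enough for the Data Studio views we wanted.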
Yea, as a first-time Google Analytics user I recently had to navigate the GA4 vs UA products (the blogging platform I used only supported UA, but every new guide is for GA4). When it came to instrumenting my actual product, I just used Amplitude and try very hard not to have to open my GA4 dashboard (really, only to see traffic attribution). Can't believe _this_ is supposed to be some pillar of their legendary money-making machine...
I think you’re right, and I think we should consider the possibility that Google doesn’t want to maintain market share dominance of basic web analytics anymore. Because what do they get for it, in a world where large-scale cookie data pooling is increasingly difficult and in some cases illegal?
The only reason I've ever installed GA on any website was to get higher rankings on the Google ad sales pages to get better CPCs (including Reddit). That moat might be bigger than you think.
> Google is, again, forcing people to expend the effort to rebuild their analytics implementations from scratch. Meanwhile court rulings are coming down on a monthly basis enumerating different ways in which Google Analytics violates GDPR.
This is precisely why they're doing it with urgency, though. These rulings are based on Universal Analytics. GA4, along with their Consent Mode (still in beta), is supposedly designed to work entirely without cookies and IP addresses.
As a hobbyist, this bums me out; I no longer will have the data from the day my humble blog made the front page of HN. Wish there was a way to migrate UA data to GA4, but from a bit of digging that doesn't seem possible.
Keep in mind Google can suspend and lock your account at any time for "suspicious behaviour", and there's no one you can contact. Use this shutdown of UA to make the change today and start owning your own data. There are plenty of good alternatives in the other comments.
Same. I run a hobby site, and like being able to look back over all time and see the (in my case slow and gradual) buildup of traffic. The lack of migration route for historical data is frustrating, but I also don't particularly appreciate extra non-value-adding work being dumped on me for something I do for fun.
Most of my users are blocking GA these days. I don't put it on new sites anymore. Crawlers from a certain country do execute GA js. This causes stats to vary wildly.
Not sure if it will be worth my time to go back and remove the old tags from those sites. Cannot see a reason to upgrade.
It was nice to know, but there are more important metrics.
The multivariate testing suite was fun at one time. However it isn't as useful when your core users are all blocking it.
Yes, Google finally realized that everyone hated looking at 'pageviews', but loves looking for an event called 'page_view'. Much easier to understand!
Honestly though, for a huge segment of their users this hides/obscures the number one metric they look for. I understand that with SPAs and the like things are much more event-driven, but they're absolutely going to lose some users over this.
Question being how many more do they gain?
Especially in the mobile app space, maybe - if you have a game where you upgrade your sword with gems and attack your neighbor, which of those actions would be a pageview?
That’s fair! I do think you could trigger events for that kind of thing in Universal Analytics though. Seemed to have plenty of affordances for use in apps as well. I still question whether going for a ‘clean break’ is the best idea. Obviously products have to improve, but I’m surprised to see them imperil their massive user base this way.
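For what it's worth, in GA4's model a page view and a sword upgrade are structurally the same thing: events with parameters. A rough sketch using the (still beta) GA4 Measurement Protocol - the measurement ID, API secret, and event/parameter names below are made up for illustration:

    # Rough sketch: sending events to GA4 via the Measurement Protocol (beta).
    # Measurement ID, API secret, and event/parameter names are illustrative only.
    import requests

    MEASUREMENT_ID = "G-XXXXXXX"    # hypothetical GA4 measurement ID
    API_SECRET = "your-api-secret"  # hypothetical Measurement Protocol secret

    def send_event(client_id: str, name: str, params: dict) -> None:
        # In GA4 everything, page views included, is just an event with parameters.
        requests.post(
            "https://www.google-analytics.com/mp/collect",
            params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
            json={"client_id": client_id, "events": [{"name": name, "params": params}]},
            timeout=2,
        )

    send_event("555.123", "page_view", {"page_location": "https://example.com/"})
    send_event("555.123", "sword_upgrade", {"gems_spent": 5})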
I had dozens of (client) websites that ran GA, but then Google, the internet behemoth with some of the greatest engineers in the world, couldn't even figure out how to stop spam from polluting their control panel.
It became useless to me and I haven't used it in at least five years.
If there's anyone here who still uses GA, can you tell me if they ever solved the problem of referrer spam?
Interesting -- I got quite a bit up until 3 or 4 years ago, but it's essentially vanished for me. Not sure which cases they've cleaned up and which they haven't, I guess.
Google Analytics 4 has a few changes that should improve its privacy aspects:
IP anonymization: not GDPR-proof
Server location: not changeable
Cookie banner: still needed
Deleting individual user data: improvement
Limited data storage: slight improvement
In short, they say they fix a lot of privacy issues, but they don't really address the real issues like the use of IP addresses and the location of your data. [1]
Disclosure: I'm the founder of Simple Analytics. [2]
GA4's IP anonymization is good but not sufficient. Your IP address is still exposed to Google's servers and only redacted after collection. Per GDPR, that still counts as Google processing your IP address.
It is not clear what actually works once you incorporate Consent Mode. I haven't actually found any reports that include the data it supposedly collects. I also have doubts that Consent Mode can really operate without consent (see the IP address concerns above).
No - in the article they say that alongside this announcement they're going beyond IP anonymization and will stop logging or storing IP addresses altogether.
Of course, any network request is going to send an IP address... that's how the Internet works. If that alone is enough to make a service illegal then third-party analytics is going to be just flat out impossible for everyone.
> any network request is going to send an IP address
Yup, that's my point.
> If that alone is enough to make a service illegal then third-party analytics is going to be just flat out impossible for everyone.
Yup, that's my point.
EU court rulings say it's illegal to transfer data to Google, because Google is subject to the CLOUD Act in the United States. IP Address is personal data, and is considered transferred just by having the user's client connect to Google's servers.
Google's description of what they will or will not do with the IP Address is legally invalid, because the CLOUD Act can compel them to break their promises. This ruling, while unfortunate, is correct.
Third-party analytics in general are not illegal. What's illegal is third-party analytics (or third-party anything) hosted by a US company subject to the CLOUD Act. Even then, it's possible to make it legal by running it through a self-hosted proxy which anonymizes the data before sending it over to Google. But a direct connection from the user-agent to Google servers is illegal.
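To make the proxy idea concrete, here's a minimal sketch (Python/Flask, relaying to the public Universal Analytics Measurement Protocol; the parameter whitelist is only an example, and none of this is legal advice):

    # Minimal sketch of the self-hosted proxy idea: the browser sends hits to your
    # own domain, the proxy keeps only a whitelist of fields and relays the hit
    # server-side, so Google only ever sees the proxy's IP address.
    import requests
    from flask import Flask, request

    app = Flask(__name__)
    GA_COLLECT_URL = "https://www.google-analytics.com/collect"  # UA Measurement Protocol
    ALLOWED_PARAMS = {"v", "tid", "cid", "t", "dp", "dt", "dl"}  # example whitelist

    @app.route("/collect", methods=["GET", "POST"])
    def relay():
        # The visitor's IP is never forwarded (no `uip` parameter), so the hit is
        # attributed to this server rather than to the visitor.
        params = {k: v for k, v in request.values.items() if k in ALLOWED_PARAMS}
        requests.post(GA_COLLECT_URL, data=params, timeout=2)
        return "", 204

    if __name__ == "__main__":
        app.run(port=8080)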
Fathom managed to do it. They built separate European infrastructure and created a European company with European employees to manage it. The original Canadian company has no access to the data in Europe.
I always thought Universal Analytics was the new product and GA4 was the old one. That's how naming usually works ... you go from numbered releases to continuous rolling releases. It wasn't until a few months ago that I realized it was the other way around.
I thought that was just me; I only learned it from this article. I was actually avoiding tools that said they supported GA4 because I thought GA4 was the legacy product. Sounds like a branding fail to replace "Universal" with a random number.
It looks like Google is finally reacting to some of the criticism [0] and changing a few things. I think it's still best to switch out your analytics, or abandon it entirely if you don't rely on it. There are plenty of alternatives now [1]. GA is still a complex tool for people who work with it full time.
Would be more interested in sending it bogus data.
Strangling it to death with ad blockers isn't working (or at least it's doing less than the self-inflicted damage being caused by mismanagement over at Google).
How can we make its output completely wrong, instead of just incomplete?
I'll happily stick a "do not track" header (or the new one) on the bogus requests. In my book, they've been warned to back off at that point, so any bad outcomes from tracking the requests are on them.
FYI, GA4 was released over a year ago, but it doesn't seem to have 'stuck': not many people have upgraded, at least among the people I've worked with. It is the default version when you set up a new GA property, though.
I see many people don't like this change, so I'm shamelessly plugging my self-hosted analytics platform if you want to move away from GA: https://www.uxwizz.com/
It makes sense that Google is attempting to adapt to new laws, but they move so slowly that it's quite concerning for the future. Also, the market has become much more competitive and there are plenty of alternatives to choose from at very fair prices.
Seems like a (short-term) opportunity for someone to build a product that pulls existing Google Analytics data via their API and stores it in a database, so you can keep viewing your historical analytics data.
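Roughly, the core of such a tool would just be a loop over the Reporting API v4 that persists the rows somewhere. A minimal sketch, assuming a service account with read access, a made-up view ID, and ignoring pagination:

    # Sketch: pull daily sessions/pageviews for a Universal Analytics view via the
    # Reporting API v4 and store them in SQLite. View ID and key file are made up;
    # pagination and error handling omitted.
    import sqlite3
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    VIEW_ID = "123456789"  # hypothetical UA view ID
    creds = service_account.Credentials.from_service_account_file(
        "ga-reader.json",
        scopes=["https://www.googleapis.com/auth/analytics.readonly"],
    )
    analytics = build("analyticsreporting", "v4", credentials=creds)

    response = analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "2015-01-01", "endDate": "today"}],
            "metrics": [{"expression": "ga:sessions"}, {"expression": "ga:pageviews"}],
            "dimensions": [{"name": "ga:date"}],
        }]
    }).execute()

    db = sqlite3.connect("ua_history.db")
    db.execute("CREATE TABLE IF NOT EXISTS daily (date TEXT, sessions INT, pageviews INT)")
    for row in response["reports"][0]["data"].get("rows", []):
        sessions, pageviews = row["metrics"][0]["values"]
        db.execute("INSERT INTO daily VALUES (?, ?, ?)",
                   (row["dimensions"][0], int(sessions), int(pageviews)))
    db.commit()

Note this only gets you aggregated report data, not the raw hits, so it's a snapshot rather than a true migration.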
Are there any competitors out there that can import historical data from Google Analytics? I've not checked - can you even export your historical data from GA?
You might want to check out Simple Analytics as well. From a privacy perspective this is actually a good shout. We classified it as follows:
Matomo looks more like Google Analytics, and the data is yours. Plausible and Simple Analytics are more similar to one another in the sense that they only collect privacy-insensitive data.
The main difference is that Plausible still operates in a grey area by anonymizing IP addresses on a rolling 24-hour basis. This can be considered fingerprinting.
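For clarity, the technique being referred to is roughly a daily-rotating salted hash; a sketch of the general idea (Plausible's actual implementation details may differ):

    # Sketch of a daily-rotating salted hash: the raw IP is never stored, but for
    # ~24 hours the same visitor maps to the same ID, which is why some people
    # describe it as short-lived fingerprinting.
    import hashlib
    import secrets
    from datetime import date

    _salts = {}  # one salt per day; a real system would discard old salts

    def daily_visitor_id(ip: str, user_agent: str, domain: str) -> str:
        salt = _salts.setdefault(date.today(), secrets.token_bytes(16))
        raw = salt + f"{domain}|{ip}|{user_agent}".encode()
        return hashlib.sha256(raw).hexdigest()

    # Same inputs on the same day -> same ID; tomorrow the salt changes and the
    # linkage is broken, so no long-term profile can be built.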
The thing is that "GA alternative" is a broad question: GA has A LOT of features and is also lacking many, so it depends on what you're looking for and what your use case is.
> All standard Universal Analytics properties will stop processing new hits on July 1, 2023, and 360 Universal Analytics properties will stop processing new hits on October 1, 2023. After that, you’ll be able to access your previously processed data in Universal Analytics for at least six months.
I am a Google Analytics consultant. Let me make this very clear: these are different products, and everything will be lost.
This is not an "upgrade." This is "replace product with a different product in the same space." Old GA data is incompatible with GA4 data, and there is no way to bring data from one into the other. Google is discontinuing Google Analytics, and applying the Google Analytics brand to Firebase Analytics.
Nothing carries over. Configuration, implementation, goals, user permissions, historical data, nothing. It all has to be re-built from scratch.
Seriously? They're shutting down Google Analytics? Deleting all data six months later? The replacement is a completely different product with the same name but no migration path, and wasn't even the default option less than two years ago?
The Google deprecation team has really outdone themselves this time. I really would have thought they'd deprecate Gmail before Google Analytics! I've got to imagine there are legal reasons behind this, because no product team could sincerely believe that this is a good idea for the product, right?
> Seriously? They're shutting down Google Analytics? Deleting all data six months later? The replacement is a completely different product with the same name but no migration path, and wasn't even the default option less than two years ago?
Yes. That's what the original post and the entire discussion here is about.
I'm confused: you say that no data will be lost, then someone replies to you and says all data will be lost, and then you agree with them? What am I missing?
How are you reading "All standard Universal Analytics properties will stop processing new hits" as "everything will keep working fine" and "you’ll be able to access your previously processed data [for six months]" as a guarantee no data will be lost?
Because everyone's going to update their properties so analytics keep working. It's not like when a new chat product comes out and splinters your contacts into multiple chat products.
It's like when, e.g., Heroku sends me an email saying I need to move apps off of a stack they're retiring. They're not killing off my app; I just need to run the command or agree to a maintenance window so it keeps working after a certain date. If I for whatever reason refuse to do anything, I can't seriously claim they're shutting me down. Especially with a long notice period.
"Doing so will allow you to build the necessary historical data and usage in the new experience, preparing you for continuity once Universal Analytics is no longer available."
That doesn't seem to indicate your old data comes with it.
This has been coming for a couple years already. I agree there are some organizations that probably didn't take GA4 seriously until this week, but I don't know how many of them were really delving into their historical analytics to begin with.
Anyway, I don't mean to take away from your moment to hate on Google. :) It's just not a classic Google discontinuation like when they eventually shut down Stadia or whatever.
This is the first announcement of the date. We all knew it was coming; I figured it would be many years in the future. I still think so, actually... I think Google is bluffing, because nobody is taking up GA4. Some are implementing it, in tandem with UA, but very few are using it.
Nobody wants to move to GA4. It has nothing UA doesn't, and UA has lots of features GA4 doesn't. If they're sunsetting UA next year, there's no way it'll have improved enough by then to be a fully functioning replacement. It has a cleaner structure than UA, because it's dropping a decade of cruft, but that doesn't really help when it just doesn't do as much.
The people that are moving to GA4 are running both UA and GA4 in parallel, so that they'll have a history set when UA shuts off. Those are companies that are paying attention. I would be surprised, by the 2023 date, if even 25% of the UA installations have switched to GA4. (Of course, all the larger companies will have.)
Side note... I looked into Shopify a month or two back. They have a drop-in UA implementation that is almost as easy as clicking a checkbox. Not only is there no wizard for implementing GA4, they say it's not even possible to add it manually. No due date was given. A lot of sites have UA built in as the assumed forever default.
[1] https://i.imgur.com/uZmfUfJ.png
[2] https://support.google.com/analytics/answer/10759417