Bringing the best of open source to Google Cloud customers (cloud.google.com)
75 points by espeed on April 12, 2019 | 16 comments



Great move on Google's part, marketing-wise, with all the discussion about Amazon using other tools' API layers and then doing whatever they please underneath without upstreaming. Sure, yes yes, they might just be solving very specific hardware/network issues mostly relevant only to their own data centers, but still.

Google, please don't let this announcement be marketing fluff alone!

That said, I hope Google/GCP partners with the Postgres team sometime in the near future as well!

And lastly, a side note (kind of unrelated): with Microsoft and their Edge wrapper, there are enough big players in Chromium now that, just like V8 today, Chromium et al. (the whole "bundle") needs to try to remain as democratic as possible moving forward. Thinking of [1].

[1]: https://bugs.chromium.org/p/chromium/issues/detail?id=896897...


> That said, I hope Google/GCP partners with the Postgres team sometime in the near future as well!

I had the same thought. Cloud SQL Postgres still does not offer point-in-time-recovery, which makes it a non-starter for many use-cases.

I'm not sure "the Postgres team" is a commercial entity you can do business with. Probably one of the best commercial entities, Citus Data, was recently acquired by Microsoft. There are others, I'm sure, but maybe none as close to the core team as the companies they announced today.


Some other companies that specialize in Postgres support/hosting/commercial tooling:

- https://www.crunchydata.com/

- https://www.enterprisedb.com/


Definitely interesting that Google is taking a friendly approach to partners while AWS is simply forking their code and selling it themselves. It'll be interesting to see which strategy pays off. There's still a decent gap between the number of AWS and GCP customers, but this could help them catch up.

Also, shameless plug (about as shameless as it gets), but my company is offering free 90-day database migrations to Google Cloud for customers actively looking to move their ops to GCP.

https://www.striim.com/google-cloud-database-migration-servi...


As I see it, we have these open-source / open-core products, and the companies maintaining them are looking to provide certain value-added features (particularly critical to the enterprise crowd) in exchange for some vendor lock-in.

- Amazon's tactic is to either offer a proprietary alternative to those enterprise features (ElastiCache, DocumentDB), or to vend a competing open-source alternative with the aforementioned features (Elasticsearch, Corretto).

- Google's approach looks to be partnering with the maintainer to offer a shared walled garden (so while you're still locked-in, you can easily migrate between blessed partners).

I'm not yet convinced GCP's tactic is good for open source as a whole, since they are setting themselves up as de facto benevolent gatekeepers; although, admittedly, they will allow these projects to thrive under their current stewardship in the medium term.

Edit: I'm also interested in seeing what happens here when there are multiple competing vendors with equivalent claims to stewardship of the underlying project, e.g. Hortonworks and Cloudera pre-merger, or Cloudera vs. MapR. Also, how vendors with vast proprietary ecosystems fare in this "native fully managed" setting, e.g. Databricks, which has managed to get by on the Azure and AWS marketplaces.


Somewhat bold, because it now seems harder for Google to reverse their stance than for AWS to reverse theirs.


But it now gives Google Cloud integrated services that will always have the newest features, whereas the other cloud providers will have to pay to maintain their in-house solutions themselves due to the recent changes in many popular open source licenses.

That leapfrogs AWS and Azure in feature parity on those services while also funding said services further, because they get a cut.

Brilliant move IMO, and I think it'll be a boon for improving open source software and increasing competition between the clouds.


Sure, I wasn't saying it was a bad idea, just a bold one. Hard to pull back with the strong statements in the post..."equal collaborators, and not simply a resource to be mined"


It looks like this is only for the paid/enterprise versions of the open-source software, which is a nice start. However, AWS has had its marketplace as well, with additional hourly charges.

I understand Google has made its own contributions to certain projects, but I wonder if Google will do anything to differentiate from that and from AWS. For example, adding MariaDB to their SQL product and contributing some of the income back to the MariaDB Foundation.


This is such a great move on Google's part. It gets them good PR and helps the companies behind these open source projects. I hope Azure also follows through and does something like this so it forces AWS's hand eventually.


I am glad they are partnering with these open source projects and companies, but they mention Mongo in their list of open source partners. It is my understanding that MongoDB is no longer open source, and therefore no longer qualifies to be in such a list of open-source-centric partners.


Seems like the license brouhaha spurred some changes and new relationships. Probably not a bad thing.


I still don't like that Google benefits from all these open-source projects while at the same time having an extremely non-transparent pricing model that can change anytime. Plus, the fact that they have shady ways of pulling your business into their vendor lock-in trickery is reason enough that I would never use GCP for our businesses.


I’ve never found Google’s pricing non-transparent. Complex, perhaps, but all the prices that I’ve seen are public.

It’s better than the old enterprise pricing model of “come play golf with us while we figure out how much we can milk you for.”


I wish these cloud providers would "open source" their pricing models. I know some people were probably abusing the hell out of the Matrix API, but the price hike is quite insane tbh. Maybe this sounds dumb, but it's for this reason that I go with DigitalOcean for my personal stuff.

For example, something like: compute time + availability + hardware + ?
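
To make that concrete, a toy sketch of the kind of breakdown I mean (the rate names and numbers are made up for illustration, not any provider's actual pricing):

    # Purely hypothetical cost model: the rates and the breakdown itself
    # are invented for illustration, not any provider's actual pricing.
    RATES = {
        "vcpu_hour": 0.02,      # compute time
        "ha_surcharge": 0.10,   # availability (fraction added to compute)
        "gb_disk_month": 0.04,  # hardware: persistent disk
        "gb_egress": 0.08,      # the "?": network egress, etc.
    }

    def estimate_monthly_cost(vcpu_hours, disk_gb, egress_gb, high_availability=False):
        compute = vcpu_hours * RATES["vcpu_hour"]
        if high_availability:
            compute *= 1 + RATES["ha_surcharge"]
        storage = disk_gb * RATES["gb_disk_month"]
        network = egress_gb * RATES["gb_egress"]
        return compute + storage + network

    # e.g. a small always-on instance: 2 vCPUs for 730 h, 50 GB disk, 100 GB egress
    print(estimate_monthly_cost(2 * 730, 50, 100, high_availability=True))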

At least with lambdas you can average out time across individual function blocks and scope it to hopefully know which pieces of the user's code are the most intensive. Maybe lambdas get us closer to this transparency model by showing us, down to the functional block, what is costing what.
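
Something like this is what I mean by timing individual blocks inside a handler (just a sketch; the helpers and block names are placeholders, not real code from any project):

    import json
    import time

    def parse_input(event):
        # placeholder "block": pretend parsing is one unit of work
        return json.loads(event.get("body", "{}"))

    def run_business_logic(data):
        # placeholder "block": pretend the computation is another unit of work
        return {"items": len(data)}

    def handler(event, context):
        timings = {}

        start = time.perf_counter()
        data = parse_input(event)
        timings["parse"] = time.perf_counter() - start

        start = time.perf_counter()
        result = run_business_logic(data)
        timings["logic"] = time.perf_counter() - start

        # prints end up in CloudWatch Logs, so you can average per-block time later
        print({"block_timings_ms": {k: round(v * 1000, 2) for k, v in timings.items()}})
        return result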


The Cloud Billing Catalog API offers programmatic access to all GCP pricing.

https://cloud.google.com/blog/products/gcp/introducing-cloud...
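
A rough sketch of pulling the price list from it (the API key is a placeholder, the field names are from memory of the Catalog API docs, so double-check against the reference, and pagination is omitted):

    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder
    BASE = "https://cloudbilling.googleapis.com/v1"

    # List billable services (Compute Engine, Cloud SQL, ...)
    services = requests.get(f"{BASE}/services", params={"key": API_KEY}).json()["services"]
    for svc in services[:5]:
        print(svc["serviceId"], svc["displayName"])

    # Pull the public SKUs (with their prices) for the first service
    skus = requests.get(f"{BASE}/{services[0]['name']}/skus", params={"key": API_KEY}).json()["skus"]
    for sku in skus[:5]:
        print(sku["description"])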

Re: pricing changes, I presume you're referring to the Distance Matrix API under Maps? That's rather separate from Cloud, but at the end of the day, doing a lookup of all routes between M places and N places is going to cost Google M*N times as much as a single lookup.
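
Concretely, the element count is just M*N (per-element prices are whatever the current Maps rate card says, so I'll leave those out):

    origins, destinations = 10, 25
    elements = origins * destinations   # each origin-destination pair is one routed lookup
    print(elements)                     # 250 billed elements, vs. 1 for a single A-to-B request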

Disclaimer: I work at GCP and used to work at Maps (but before the pricing change, which I was not involved in).



