Amazon announces 'Bedrock' AI platform to take on OpenAI (businessinsider.com)
110 points by bundie on April 13, 2023 | hide | past | favorite | 47 comments


In case it's not obvious, this is what "landing page" customer discovery looks like for $trillion companies who have nothing to show but smoke and a whole lot of stick rubbing. The signup form is one massive "we don't have a clue what we're doing, here are dozens of options, tell us absolutely everything about how you plan to use our hypothetical cough excuse me awesome and totally real bedrock service".

AWS will play hard in this space but as someone else in this thread eloquently put it: this is what the sound of executive butt clenching looks like writ large. Microsoft can only be laughing.


I don't think you're wrong per se, but all of AWS's landing pages look exactly like this sort of enterprise sales "call us for pricing" pitch. Of course AWS will offer a service in this area -- all of the hyperscalers are AI factories now -- but it wouldn't be a surprise to discover that Bedrock is a repackaging of the SageMaker ecosystem.


That's every AWS landing page.


This actually exists, though I haven't bothered to try it yet: https://aws.amazon.com/codewhisperer/


I've used this and GitHub Copilot. I turned CodeWhisperer off even though it's free, because sifting through all its bad recommendations added extra mental load.


It's just grasping at straws here!


I used to be an AWS true believer. Now I find it increasingly difficult to be enthused about anything AWS. It's all so expensive, complex and locked in. I recently shut down everything I had on AWS except Route 53 and WorkMail, both of which I really like.

The craziest AWS thing is that they moved all GPU instances to a quota system where you have to request access, specifying the number, type, and location of the instances you want to run. It's like Azure, which has an equally terrible quota system.

Anyhow, I needed to run some performance tests on GPU instances. In the past I would have spun up a bunch of different instance types, run my tests and moved on. In this case I applied for a quota of one machine type, and 24 hours later AWS got back to me to approve my request to run one instance. At that point I gave up on the idea of AWS being the heart of any GPU-based infrastructure.
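The quota workflow described above goes through the AWS Service Quotas API. A minimal sketch of what that request looks like, assuming boto3 and valid credentials; the quota code here is a placeholder, not a real EC2 quota code:

```python
# Sketch: requesting an EC2 GPU-instance quota increase via the
# Service Quotas API. "L-EXAMPLE" is a placeholder quota code; the
# real code for a given GPU instance family must be looked up first.

def build_quota_request(quota_code: str, desired_value: float) -> dict:
    """Build the parameters for a service-quota increase request."""
    return {
        "ServiceCode": "ec2",
        "QuotaCode": quota_code,        # quota for the GPU instance family
        "DesiredValue": desired_value,  # EC2 GPU quotas count vCPUs, not instances
    }

params = build_quota_request("L-EXAMPLE", 8)
print(params)

# To actually submit it (requires boto3 and AWS credentials):
# import boto3
# client = boto3.client("service-quotas")
# client.request_service_quota_increase(**params)
```

Even with the API, approval is asynchronous and can take the kind of 24-hour turnaround described above, which is the commenter's complaint.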


Everything you say seems equally applicable to any cloud provider.

They're all expensive and they all want to lock you in, because they're businesses. GPUs are in extremely high demand now, so that's just how it is. And they're complex because different customers have different needs, so that's inherent.

I don't see how any of your complaints are anything a for-profit enterprise cloud provider could change and still survive as a business.


GPUs in the cloud are just a bad business, unfortunately.

There is only one vendor (for anything scientific/ML anyway) and you have to buy specific (overpriced) models from them to be compliant with the license. So zero negotiating power, and everything is proprietary, so you can't build any software/support edge.

You can't oversell a GPU like you can CPU/memory, and most of the computation happens on the chip, so no one cares how good the rest of your stack is.

And finally you are in a never-ending battle with crypto miners, who would never be profitable in the cloud, so they have no intention of paying you.


Aren't interconnect bandwidth and latency important for training workloads?


Yes, but to move the needle you need to use another single vendor, Mellanox (which Nvidia owns).

So the (slight) differentiations are:

AWS - GPUDirect and high bandwidth (but also high latency)

GCP - 16 GPUs on one machine

Azure/OCI - Mellanox


They behave this way because they are so tight on GPU capacity given incredible demand. Getting a GPU instance on GCP is just as miserable an experience.

Try Lambda Labs, there's usually more availability, and more importantly you get some visibility into currently available capacity.


GPU instances are also a favorite option for crypto miners using compromised AWS accounts. They run up $20k in bills in a short period, which AWS typically nulls out if you beg.


They all are; what did you switch to? Suppose it's kind of the point, really: it's a cloud with a bunch of "tools".

You are welcome to install MySQL or Elasticsearch on a VPS outside of AWS and use it. But that's not what customers want.


It's not like "Open" AI is any less locked in or opaque though...


OpenAI is a very limited API surface and there are already projects out there creating interfaces on top of N GPT implementations spanning private & public, giving you choice.

The comparison here is apples and oranges.


We told you this would happen, but no, we were luddites.


I know Character.ai prefers Oracle Cloud for GPU compute workloads because OCI can actually deliver significant capacity in a reasonable amount of time (often < 24 hours, almost always within a few days for hundreds [or more] of GPU instances, especially if you aren't picky about the region).

I'm all for bagging on Oracle in general, yet it's undeniable this is one area where they're leagues ahead and winning.


It sounds like they temporarily have excess capacity? I don't think you can count on that to last.


Is that because they are 10x the scale... or because they have 1/10th the customer demand...


Besides GPU instances, have you tried Lightsail? I find it's quite competitive with DO, Linode, etc., with AWS infra backing it.


I’m waiting on the AI that is an expert at all things AWS. I’d like to learn about AWS services and then be able to have it generate SAM or CloudFormation templates.

I realized this is probably (hopefully) being worked on by some Amazon engineers. It’s so obvious when I started imagining what AI tools are around the corner.

Honestly, I wouldn’t be surprised if programming languages each had their own AI that could (try to) serve as an expert in the language.


I just read somewhere today that their new code completion tool, CodeWhisperer, is better at Amazon APIs than GitHub Copilot (makes sense).


It was on HN front page today: https://news.ycombinator.com/item?id=35554460



Can't ChatGPT [mostly] do this already? It was trained on all AWS documentation up until the training cut off


Kind of. I've been using it, and it gets stuff wrong (hallucinations) or it just doesn't know about things (e.g. the new additions to GameLift).

Granted GPT is what I use to do this now. But I’m also ready to see what focused, tailored AI would be able to do.

P.S. If any Amazon engineers see this and there is indeed a focused AI in the works, I would love to be a pre-alpha/alpha tester. I am willing to provide useful, kind feedback.


Try phind.com; its expert mode uses GPT-4 agents to look up fresh stuff on the web and digest it for you.


Did you try GPT-3.5 or 4?


Bedrock isn't a rival. It's a layer of abstraction on top of other second-tier foundation models that should allow developers to seamlessly switch between models. "Titan" is their horse in the LLM race and it's not scheduled to arrive at the starting gate for several more months.


The timing of this seems more like an attempt to stay relevant rather than some concrete milestone


I don't see it. They'll sell you GPU time to run and create models and access to data sets to train them on. This looks more like a proof of concept for using AWS to do that and a way to juice the market for it.

It's in their interest to have a competitive offering so they can use it in their own stuff without depending on others. Amazon.com's search is garbage right now, for example. Productizing their efforts gets other people to pay for it.


They’ve seen what Microsoft has in the pipeline and the TAM created in the past couple months out of thin air and I can almost hear butts clenching in their HQ.


This reminds me of when one week, Webcrawler was the go-to search engine. Then it was Ask Jeeves, then Yahoo, then Google. This is one of those times where companies that used to be on top can slip very far behind.


Jumping on the blockchain bandwagon didn't cost new entrants that much, but this AI fever is going to seriously hurt some of these companies financially.


This doesn't seem like a ChatGPT rival... it doesn't even seem like a GPT rival.


Amazon Titan would be the GPT rival, based on what the article claims. Bedrock is the product name for the suite of services that allow you to spin up text-gen, image-gen, image classification, text summary, search, and chat bots.


It just seems like a framework to use other released models but on AWS hardware.
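If Bedrock really is an abstraction layer for swapping between foundation models, as the comments above suggest, the shape of it might look something like this sketch. The model names and backends here are stand-ins, not real Bedrock identifiers or APIs:

```python
# Hypothetical sketch of a model-switching abstraction: one interface
# over several foundation models, so changing providers is a one-line
# change. The registered backends below are stand-ins; a real system
# would call each provider's API inside them.
from typing import Callable, Dict

class ModelRouter:
    """Route a prompt to whichever registered backend is selected."""

    def __init__(self) -> None:
        self._backends: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, backend: Callable[[str], str]) -> None:
        self._backends[name] = backend

    def complete(self, model: str, prompt: str) -> str:
        if model not in self._backends:
            raise KeyError(f"unknown model: {model}")
        return self._backends[model](prompt)

router = ModelRouter()
router.register("titan-text", lambda p: f"[titan] {p}")
router.register("claude", lambda p: f"[claude] {p}")

print(router.complete("titan-text", "hello"))  # → [titan] hello
print(router.complete("claude", "hello"))      # → [claude] hello
```

The value proposition, if it works, is that the caller never touches a provider-specific SDK, which is roughly what "seamlessly switch between models" would have to mean in practice.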


At first, I thought AWS was launching their own SQLite hosted database.

BedrockDB is a SQLite based database with MySQL compatible drivers.

https://bedrockdb.com


Much better headline than the other few articles posted about this. Amazon's docs even fail to mention what Bedrock and Titan actually are!


This was prior to this announcement, but...

> Watch Amazon turn off AWS services and then use ai to replicate all the customer's software products and then reroute to their own services.

https://twitter.com/PaulYacoubian/status/1641560232869462017


It's easier to learn about this at https://aws.amazon.com/bedrock/ than from the Business Insider article.


LLM from AmazonBasics


powered by Mechanical Turk from third-world countries


GPT Chat at home


My buddy, whose livelihood depends on his Amazon stock vesting, has been extremely anti-GPT.

Wonder if it was due to Amazon being slow to release something.


I wonder how John Digweed feels about the name. ^_^



