The argument that this plan offers more compute may be true, but this is also a pricing tactic known as the decoy effect (a form of anchoring). Here's how it works:
1. A company introduces a high-priced option (the "decoy"), often not intended to be the best value for most customers.
2. This premium option makes the other plans seem like better deals in comparison, nudging customers toward the one the company actually wants to sell.
In ChatGPT's case:
Option A: Basic Plan - Free
Option B: Plus Plan - $20/month
Option C: Pro Plan - $200/month
Even if the company has no intention of selling the Pro Plan, its presence makes the Plus Plan seem more reasonably priced and valuable.
While not inherently unethical, the decoy effect can be seen as manipulative if it exploits customers’ biases or lacks transparency about the true value of each plan.
Of course this breaks down once you have a competitor like Anthropic, serving similarly-priced Plan A and B for their equivalently powerful models; adding a more expensive decoy plan C doesn't help OpenAI when their plan B pricing is primarily compared against Anthropic's plan B.
Leadership at this crop of tech companies is more like followership. Whether it's 'no politics', or sudden layoffs, or 'founder mode', or 'work from home'... one CEO has an idea and three dozen other CEOs unthinkingly adopt it.
Several comments in this thread have used Anthropic's lower pricing as a criticism, but it's probably moot: a month from now Anthropic will release its own $200 model.
As Nvidia's CEO likes to say, the price is set by the second best.
From an API standpoint, it seems like enterprises are currently split between Anthropic and ChatGPT, and most are willing to use substitutes. For the consumer, ChatGPT is the clear favorite (better branding, better iPhone app).
An example of this is something I learned from a former employee who went to work for Encyclopedia Britannica 'back in the day'. I actually invited him back to our office so I could understand and learn exactly what he had been taught (noting of course this was before the internet, when info like that was not as available...)
So they charged (as I recall from what he told me; I could be off) something like $450 for shipping the books, which seemed high at the time.
So the salesman is taught to start the pitch with a set of encyclopedias costing, at the time, let's say $40,000: some 'gold plated version'.
The potential buyer laughs, and the salesman then says 'plus $450 for shipping!!!'.
They then move on to the more reasonable versions costing let's say $1000 or whatever.
As a result of that first high-priced example, the customer (in addition to the positioning you are talking about) is set up to accept the relatively high shipping charge.
That’s a really basic sales technique much older than the 1975 study. I wonder if it went under a different name or this was a case of studying and then publishing something that was already well-known outside of academia.
I use GPT-4 because 4o is inferior. I keep trying 4o but it consistently underperforms. GPT-4 is not working as hard anymore compared to a few months ago. If this release said it allows GPT-4 more processing time to find more answers and filter them, I’d then see transparency of service and happily pay the money. As it is I’ll still give it a try and figure it out, but I’d like to live in a world where companies can be honest about their missteps. As it is I have to live in this constructed reality that makes sense to me given the evidence despite what people claim. Am I fooling/gaslighting myself?? Who knows?
Glad I'm not the only one. I see 4o as a lot more of a sidegrade. At this point I mix them up and I legitimately can't tell, sometimes I get bad responses from 4, sometimes 4o.
Responses from gpt-4 sound more like AI, but I haven't had seemingly as many issues as with 4o.
Also the feature of 4o where it just spits out a ton of information, or rewrites the entire code is frustrating
Yes, the looping. They should sell a squishy mascot, something in the style of Clippy, so that when it loops I could pluck it off my monitor and punch it in the face.