

Open source software only comes in one edition: awesome. - fogus
http://www.codinghorror.com/blog/archives/001283.html

======
CodeMage
As usual, Jeff mixes different concepts, does some hand-waving and comes to a
stunningly weird conclusion. What Microsoft did with artificial memory
limitation is ridiculous, unless someone comes up with a reasonable
technical explanation. However, I find it equally ridiculous to jump from
there to the conclusion that open source is better because it does no market
segmentation.

Let's keep one thing clear: if you're not selling your product, then you won't
do any market segmentation for it. You might, however, do market segmentation
when it comes to offering support for that product. Does that make you as evil
as Microsoft? Are 37signals evil because they offer different plans whose
prices probably don't scale linearly with the costs behind them (in other
words, they don't get the same profit across plans)?

Of course, Jeff didn't actually use the word "evil", but the implication of
moral inferiority is almost palpable. Open source has many advantages (and
disadvantages, too), but not segmenting the market is not one of those;
rather, it's a consequence of not selling the software itself.

~~~
codinghorror
Do you really believe that the 37signals approach is not _effectively_ the
same thing?

Consider the basecamp $99 plan vs the basecamp $49 plan:

100 projects / 35 projects vs. 20 GB storage / 10 GB storage

Do you _really_ believe those extra 10 GB and 65 projects cost 37signals $50 a
month to deliver to you?

Similarly, do you _really_ believe that the cost of supporting 48 GB of memory
versus 32 GB cost Microsoft $1000 per customer to build?

Wolf in sheep's clothing, exact same concept with Web 2.0 patina. Rich
customers pay more.

Anyway, my argument isn't really about the money, but the mental friction. I'm
sick of dealing with marketing weasel feature matrices.

~~~
CodeMage
_Consider the basecamp $99 plan vs the basecamp $49 plan:

100 projects / 35 projects vs. 20 GB storage / 10 GB storage

Do you really believe those extra 10 GB and 65 projects cost 37signals $50 a
month to deliver to you?_

No, I don't. What I said was "they offer different plans whose prices probably
don't scale linearly with the costs behind them (in other words, they don't
get the same profit across plans)". I acknowledged that the price is not a
linear function of the costs and that they get more profit from more expensive
plans. My question was "why is that evil?"

 _Similarly, do you really believe that the cost of supporting 48 GB of memory
versus 32 GB cost Microsoft $1000 per customer to build?_

I'm sorry if I sound hostile, Jeff, but I read your post before commenting on
it -- did you read my comment before replying? I said that the artificial
memory limitation is completely ridiculous (unless there's a valid technical
reason for it). As I explicitly state in my other comments, I believe that's
unethical.

Here's the summary of what I believe, a Cliffs Notes edition of my comments:
Having rich customers pay more is not unethical; crippling the features of
your product to extort money is.

------
10ren
A company needing over 32GB of server RAM is getting _more value_ from it than
if they only needed 1-4GB RAM. It's a measure of customer value created.

I suspect Jeff is upset mainly because he _didn't know_. 32GB is an
arbitrary-seeming boundary that wasn't signaled clearly enough (to him). He
breathed life into a plan to live his ideal of RAM-cheap, coder-dear. He went
to the trouble to buy RAM. And to pay for it. And to install it... All the
while _expecting_ that it would just work.

He feels ripped off, tripped up, cheated, deceived and abused. He didn't get
what he paid for (specifically: what he _thought_ he was paying for). This is
a problem of expectation management. And Jeff is quite right to blame marketing.
And he's right to punish them. As his article has. Go Jeff!

~~~
10ren
_Clarification_ I think my comment seems a bit sarcastic, but it wasn't meant
to be. I think marketing should be as up front and clear as possible about
what customers are getting, instead of hiding it. Publicity (like Jeff's
response) is one of the few pressures on them to do so.

~~~
10ren
_Clarification 2_ : I, like Jeff, also have a pricing matrix for my product.
Sometimes people don't read it properly and are confused - although I'm nice
about it, I tend to feel unsympathetic. This is because it's hard for me to
really see from someone else's point of view, since we only actually ever see
from our own...

It reminds me of the founder/maintainer of a very successful programmer's
website who said, "People will not read what you write [for a web-app GUI],
no matter what you write, or how you write it". His frustration was maxed
out. It occurred to me that one solution is to only provide buttons (i.e.
actions) for what you actually want users to do. This constrains their choices
to only valid ones. This amounts to a kind of "wizard" - but wizards have been
very successful, so this isn't a bad thing. It also obeys the "don't make me
think!" imperative, provided the GUI is well designed... It helps if the
pricing options are also designed well. In fact, I believe it is worth
sacrificing profit/benefit to the customer (or other efficiencies) if by doing
so, you can make the choices clearer and simpler for the customer (e.g. clarity
can come from obvious patterns in the pricing, even if they totally don't fit
the true demographics).

To apply this to successfully communicating the limits of each pricing option,
where "successfully" means that the customer hears what you say, you could
have a button for each limit. By consciously selecting the limit, the user
would know what it is. Unfortunately, this can lead to frustrating
backtracking when one choice constrains other choices in unexpected ways. One
can muck around with different orderings of the choices, but I think that
ultimately, the only real solution is to _make_ the options match what the
user would expect.

That is, change your pricing model to accord with what the user expects. Not
vice versa. This is a kind of "pricing positioning", where the goal isn't to
sell more, but to communicate better, by associating your product with a
category, and then following the standard pricing approaches for that category
- whether they suit you or not. The purpose is to communicate, and doing it
well makes everyone happier.

For me, it's intrinsically valuable: successful communication is a joy in
itself; miscommunication is suffering.

------
aneesh
> "Already, I'm confused. Which one of these versions allows me to use all 48
> GB of my server's memory? ... Just try to make sense of it all. I dare you.
> No, I double dog dare you! "

Dare accepted. It's actually right there, 1 click away from the page Jeff
linked to: [http://www.microsoft.com/windowsserver2008/en/us/compare-
spe...](http://www.microsoft.com/windowsserver2008/en/us/compare-specs.aspx)

The answer to his question is Enterprise, Datacenter, and HPC.

------
MicahWedemeyer
Heck, thanks to github, OSS spawns a new "edition" as fast as someone can
click the fork button.

Not that I'm complaining! It's nice to be able to find a project abandoned by
the original dev but picked up by a half-dozen other people.

------
nopassrecover
This is like complaining that your student version has the artificial
limitation that you can't use it for commercial purposes. If you need to use
more than 32GB of RAM, something only heavy-duty commercial servers would
need, then you buy the heavy-duty commercial version of Windows Server.

You have paid a price before that was in essence subsidised by the higher-end
commercial users - now you are one of them (Congratulations!) and get to
"subsidise" standard versions for the rest of us.

It's the same theory as scaled taxation (which is a different argument, as
government has different responsibilities than business) - think of yourself
as being in the highest earning bracket now.

------
sutro
Maybe the awesome foundation will give Jeff an awesome grant to upgrade his
windows server to awesome edition. Isn't it an awesome day on HN?

------
sunir
That's hardly the case. In Jeff's words, I dare you, double dog dare you to
count the number of Linux distributions. Each version satisfies a different
customer target, same as each version of Basecamp satisfies a different type
of company.

~~~
whughes
Let's just look at the list of Ubuntu editions, since that's comparable
directly to Windows or Mac OS X.

Ubuntu

Kubuntu

Edubuntu

Xubuntu

Lubuntu

Fluxbuntu

Mythbuntu

Ubuntu JeOS

Ubuntu MID

Ubuntu Netbook Remix

Ubuntu Studio

~~~
mikeryan
But AFAIK they're all the same price?

~~~
statictype
Well, I think the point of the article was not that segmenting makes the
product more expensive, but rather that you now have to sit down and spend
some time carefully deciding which version you need.

------
TravisLS
This post does a very effective job of highlighting rule #1 of market
segmentation: every customer should be almost immediately sure which segment
they fit in.

For example: is this computer for business or personal use? do you need the
"super-duper full-length CGI movie" version, the "regular everyday use"
version, or the "i only browse the internet" version? You just can't push it
much further than that.

Vista failed miserably on this front, which surely accounted for more than a
trivial decline in upgrade sales.

------
j_baker
> If I choose open source, I don't have to think about licensing, feature
> matrices, or recurring billing. I know, I know, we don't use software that
> costs money here, but I'd almost be willing to pay for the privilege of not
> having to think about that stuff ever again.

The grass is always greener on the other side isn't it? Don't get me wrong,
FOSS has upsides, but segmentation isn't one of them. If time really is the
cost rather than the cost of the software, FOSS isn't any better. The only
difference is that nobody is artificially segmenting the market. The market
artificially segments itself.

Think about it: do you use Debian? Fedora? CentOS? Or do you go with someone
you can buy a support contract for like Red Hat or Novell or SuSE? Or would
you rather go for a BSD flavor of *nix? OpenSolaris also seems to be gaining
popularity; do you choose it?

And once you get that sorted out, what webserver do you choose? Apache?
Lighttpd? Nginx? Comanche? Are you doing Python? Then you also should consider
CherryPy.

So, to summarize... sometimes I'd be willing to pay money to know that no
matter _what_ edition I choose, I'm still buying Windows.

------
yason
Segmentation is a good thing, even from the consumer/buyer viewpoint, but
"limitations" like this maximum O/S memory limit are blatant, spineless
rip-offs, with no correlation to the actual work required to build those
features.

The same goes for the stupid restriction of the maximum of ten simultaneous
network connections (can't remember whether it was inbound or both?) imposed
in some basic Windows (Home? Professional? vs. Server?) editions.

An operating system shouldn't limit the capabilities of available hardware.
They could reasonably decide to not support some heavy-duty server hardware at
all except in the server edition. Or they could give the server edition a
better I/O scheduler (that can handle large loads) or something to make it a
must if you want to use Windows on a server.

But if I have a network adapter in my box I expect to be able to max it out in
all imaginable ways regardless of the O/S "edition". Or use as much memory as
I can physically fit in my box.

------
anigbrowl
So why is Jeff Atwood using Windows instead of Linux to run his database
server? Not complaining, but presumably it delivers something that the open-
source alternative doesn't.

~~~
mattmcknight
He uses SQL Server, of course. The root cause is that Joel used to work for
Microsoft. Awesome.

~~~
dmpayton
Not just that, but Jeff is a .Net programmer (IIRC, S.O. is written in C#).

~~~
smokinn
And he spent a good chunk of his career as an evangelist for a Microsoft shop.
(Not Microsoft itself but a "Microsoft partner")

------
samuel
That's nothing. When you want extra processor resources on an IBM iSeries, you
only need to enter some code (after paying IBM big €€), and voilà, your server
will use some extra cores that were always there. I have heard Sun does the
same, and probably others too.

All in all, it isn't that bad. At least there's an easy upgrade path. You
probably can't do that with Windows Server.

------
raintrees
Applying the Googlenomics concept I just read about in Wired, aren't auctions
the next step in pricing's natural selection process?

~~~
sfg
Can auctions function for goods that are not (practically) scarce?

~~~
raintrees
I was thinking of an auction over a specific time period to set price levels.
I think it would also be worthwhile to re-run/revisit it later, to see if the
same price levels still apply (see the earlier iPhone app price-change
articles).

------
drcode
can we please stop using the word "awesome" already?

:)

~~~
mmc
I say, pick your battles. "fail" as a noun rankles far more. Killing "(epic)
fail" would be truly awesome. I'd even be willing to bring back "tubular" in
trade.

~~~
swolchok
What if we change the slang to "flail", which is at least both a noun and a
verb?

------
sho
For someone who started computing on his father's Osborne 1, hearing people
casually mention that they upgraded their server to 48GB induces something
like vertigo.

A 768,000-fold increase in main memory. Christ, we've come a long way.

~~~
neilc
And to think, hard drive seek latency has only improved by 2x or 3x.

~~~
sho
The Osborne 1 didn't even have an HDD! You think HDs are slow, try 5 1/4"
floppies for latency.

Anyway, SSDs soon. I will not be sorry to see the back of spinning magnetic
media of any stripe.

~~~
10ren
x3,000,000 for me (ZX81 with 16K RAM pack - but if I'd stuck with the
standard 1K, it would be... x48,000,000).

~~~
10ren
One version of Moore's Law predicts x1000 increase in 15 years (x2 every 1.5
years; 2^10 = 1024).

48GB isn't standard today. The ordinary amount in an average desktop PC is
about 2GB. Therefore, in 15 years (2024), we can expect an ordinary, standard
desktop PC to have 2TB of RAM.
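That projection can be sanity-checked with a few lines of Python (a quick sketch; the function name and the 2GB-in-2009 baseline are just the figures from the comment, not anything official):

```python
# One version of Moore's Law: capacity doubles every 1.5 years,
# so over 15 years that's 2^10 = 1024, roughly a x1000 increase.
def projected_ram_gb(baseline_gb, years, doubling_period=1.5):
    """Project RAM capacity forward under a fixed doubling period (years)."""
    return baseline_gb * 2 ** (years / doubling_period)

# 2 GB in an average 2009 desktop, projected 15 years out to 2024:
print(projected_ram_gb(2, 15))  # 2048.0 GB, i.e. about 2 TB
```

The same function reproduces the x1000-per-15-years rule of thumb: a baseline of 1 GB comes out at 1024 GB after 15 years.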

