
Mac Pros, Ara, and Modularity - jjcm
http://jjcm.org/blog/mac_pros_and_modularity/
======
toufka
It really startled me that the comparison was the Pro vs. the Ara, and not the
Pro _and_ the Ara. The Pro is very much in the spirit of modularity. True,
everything written demonstrates the different levels of granularity of
modularity between the Ara and the Pro. But to the consumer, the forward-
facing arguments are exactly the same: "Buy what you need, each part
separately, and link them all as needed, rather than a huge one-size-fits-all
AIO".

The Mac Pro requires you to bring your own HDs, bring your own expansion
cards, bring your own network gear. Which centers the Pro itself at the core
of a modular computing setup.

~~~
verbin217
This seems to be a common misunderstanding about the new Mac Pro. Apple has
actually improved modularity by liberating expansion from a fixed physical
space. Additionally, users aren't required to install scary-looking computer
hardware into PCI slots. They can just use the familiar UX pattern of plugging
a cable into a port.

~~~
wmf
Except the Thunderbolt device is twice the price of the equivalent PCIe card
and now you have cables everywhere.

~~~
coldtea
You already had cables everywhere if you were a professional. I don't know any
professional photographer, video editor or audio professional that doesn't use
tons of external storage.

It's mainly non-pros who argue that a few internal 2TB bays would be enough,
which is laughable for pro use.

~~~
brusch
But if they were professionals, wouldn't they use a SAN (or at least a NAS)
for this?

Don't USB hard disks get old pretty quickly? (And the performance with
Gigabit LAN is much better.) So it's just a single LAN cable?

~~~
simonh
I suppose it depends whether you consider Pro and Enterprise to be synonymous.
I don't think they are.

Anyway, what kind of high end Mac user ever bothered with USB drives, when you
could use Firewire and have double the throughput with no CPU overhead?

------
hershel
Clayton Christensen, the researcher behind "The Innovator's Dilemma", says
there are natural cycles in industries between integration and modularity.
Basically, when the performance of a product isn't good enough, there's a
need for tight integration, so the winners usually have tightly integrated
products.

At some point people don't need more performance; the products are "good
enough". Then a modular architecture becomes very useful because it increases
the rate of innovation, increases competition and reduces prices. In that
phase the winners are usually companies that control modular products.

Now if we look at the mobile market we see signs of a possible shift: for many
consumers, mid-level phones are good enough. And Jolla unveiled in 5/2013 a
modular phone[1] that got good reviews.

Incidentally, it seems that a few months after that announcement, one year
ago, Motorola began working on a modular phone.

So assuming we can extract plenty of value from a modular phone (and it seems
to me that we can), I believe we will see a large shift towards modularity.

[1] The Jolla phone is composed of two parts, a front and a back. The front is
fixed, while you can change backs.
[http://en.wikipedia.org/wiki/Jolla#Products](http://en.wikipedia.org/wiki/Jolla#Products)

~~~
revelation
That seems... wrong.

Back when PCs weren't very good and you had to make real choices about what
processor, hard drive and so on to get, Dell's customization and modularity
business flourished.

Now that performance is "good enough" for practically any computer you can buy
today, the market is moving to highly integrated tablet solutions, the only
choice left being disk space, and that's getting less important every day.
Meanwhile, Dell is going private.

~~~
hershel
>> Back when PCs weren't very good and you had to make real choices about what
processor, hard drive and so on to get, Dell's customization and modularity
business flourished.

The CPU/memory/chipset remained integrated for performance reasons. But
external bus performance was good enough to separate some functionality into
external cards.

~~~
revelation
CPU, memory and chipset have always been very much modular parts. The chipset
is determined by which motherboard you buy, memory comes as sticks you put
into the motherboard, and even the CPU plugs into a socket to keep it modular.

I mean, just imagine the trouble of keeping the CPU separate from the
motherboard, as is still done today! It is very, very difficult to build a
socket (and matching CPU) for the 1000+ pins of a modern CPU, where each pin
may carry signals at multi-GHz frequencies.

~~~
hershel
Yes, you're right, they are modular.

The change Christensen refers to is the shift from companies that built whole
computers, like IBM, to the IBM PC, which was an assembly of parts from
different companies.

My guess is that at that time, the PC/XT came with memory chips that weren't
soldered, but socketed. In that context that didn't mean much loss of
performance, and the habit of pluggable memory stuck. I'm not sure you lose
much performance because of it.

And if we're talking about integrated memory, Intel does have caches. Those
are probably the best way to deeply integrate memory.

And regarding the CPU and board: it would be quite hard to integrate the chips
and the board. There was one attempt I know of, but it failed as far as I
know. It's complex and not economical, though it does offer great performance.

------
nine_k
Obviously Ara and iPhone fill different niches and take different sets of
compromises. Neither of these is better than the other by _all_ parameters,
but each offers _unique_ advantages.

Possibly the niche of Ara will be narrower, because the connectors are going
to add noticeable cost. OTOH the ability to add an entirely different
functional block right into the phone can be very valuable in some
circumstances.

Of course, CPU and RAM need tight integration; I suppose these will come as
one module. But the periphery, like radios, storage, cameras, etc., needs much
less tight coupling and can use physically narrow high-frequency serial
connections. If a connector is only 4 contacts wide (like USB 2.0) or even 9
contacts (like USB 3.0), it can be reasonably cheap and compact.

Most probably a real Ara phone will have far fewer detachable blocks than
currently pictured: a CPU/GPU/RAM block, a radio block, a camera block,
probably an extra extension block for new devices (finger scanner? second
camera for a stereo pair? projector?), and, of course, traditional detachable
flash storage and battery.

------
roc
The bigger problem with Ara-style modularity is that as computing power
shrinks (in both size and power draw) it becomes pointless to think of modules
of "a" computer.

There's simply no point to try to separate processing power _out_ of any
component that has other physical constraints. e.g. displays, lenses,
antennae, etc.

So you won't want to slap a 'better' camera onto your smartphone to leave your
DSLR at home. But not because the interconnect between lens and mobile "base"
will become too wasteful or inefficient. Simply because that future lens will
_be_ a stand-alone camera and it will operate within a network of things, in
which your mobile phone won't be a _necessary_ component.

Similarly with any other useful components. You won't have to make any trade-
off of space other than "what can you physically carry".

Need more storage? Put a storage pod in your pocket. Want a rangefinder? Grab
a 'network of things'-capable device. Need more battery? Grab a power brick
that can charge _anything_, with a capacity limited only by your willingness
to carry it, rather than being locked into a form factor unrelated to (and
possibly in conflict with) your power needs.

~~~
Zak
People want cameras in their phones not because their cameras lack
connectivity but so that they don't have to carry a separate camera.

Of course, there will always be a niche for cameras with lots of manual
controls, possibly with big lenses, and they'll work as you say. DSLRs with
wireless networking are already on the market, such as the Nikon D5300.

~~~
daliusd
I own a mirrorless camera with wifi. I really would love to have a camera
component with interchangeable lenses on my smartphone, just to save some
clicks and own a single device instead of two.

------
minikites
It's my understanding that most major appliances (washer, fridge, etc) are
less repairable than they were 50 years ago. I think it's because of several
contributing factors:

* Cheaper manufacturing with simultaneous increase in quality control and reliability.

* More complicated objects lead to more costly repairs, mechanics have to be more skilled, so wages go up.

Because it's simultaneously cheaper to replace something and more expensive to
fix something, it makes much more sense to replace than to repair. These
effects feed off of each other, too: to make something more reliable and
cheaper, manufacturers seal off more and more parts, making repairs more and
more expensive, and so on.

~~~
aidenn0
When my fridge broke and the repair person came, it was basically: either the
circuit board or the compressor needs replacement.

~~~
bluedino
Yea, and it's a $400 circuit board instead of a $5 switch.

------
scragg
> We're already seeing workarounds - graphics cards are using two, or even
> three PCI-E slots to get that precious bandwidth they need. With 4k displays
> on the horizon, and textures in games being updated accordingly, we're going
> to need more bandwidth to graphics cards. At some point, we're either going
> to have to switch to two dimensional connectors like processors are using
> for GPUs (which will still only delay our issues), or we're going to have to
> move away from the 1mm build fabrication for our interconnects. If the
> latter happens, we simply can't rely on consumers to properly line up lanes
> in
> components.

They aren't actually using the extra slots, just potentially covering them,
right? One of the fastest desktop cards out, the R9 290X, only uses one PCI-E
slot.

~~~
oakwhiz
Yes, every graphics card that I have seen merely occupies additional slots to
make space for the cooling devices. However, I do believe that Nvidia's SLI
and AMD's CrossfireX multi-GPU interconnects are a proprietary variation on
PCI-E, so in some respects that could be considered an additional slot, but it
only provides bandwidth between GPUs.

~~~
jjcm
oakwhiz is correct, I was wrong in my statements about cards taking up more
than one slot. The blog has been updated accordingly. Sorry for the
misinformation all, I could have sworn there were cards that did, but googling
turned up nothing.

------
beloch
I sincerely doubt interconnects are going to be the problem the OP thinks they
will. First of all, individual connectors have not changed much in size, but
the serial data-rate we can push through copper connections continues to rise.
Fiber optical connections are already used to implement PCIe in many
applications (Thunderbolt is, in part, a fiber optical PCIe link).

One huge advantage of fiber optics over copper is that you can send many data
streams down a single fiber using different wavelengths and/or optical modes.
This is why backbone fiber bandwidth keeps growing even over fiber links that
haven't been upgraded. It's a function of what you hook up to the ends of the
fiber, not the fiber itself. There is _tremendous_ room for bandwidth growth
in optical fiber.
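
A rough back-of-the-envelope sketch of the point above: one fiber's aggregate
capacity is channel count times per-channel rate, so it grows by upgrading the
endpoints, not the glass. The channel counts and rates below are hypothetical
illustrations, not figures from this thread.

```python
# Illustrative sketch: with wavelength-division multiplexing (WDM), one
# fiber's aggregate capacity is (number of wavelength channels) x
# (per-channel data rate). Capacity grows by swapping the transceivers at
# the ends of the link while the fiber itself stays untouched.
# All numbers below are hypothetical examples, not specs of a real system.

def aggregate_gbps(channels: int, per_channel_gbps: float) -> float:
    """Total capacity of one fiber carrying `channels` wavelengths."""
    return channels * per_channel_gbps

# Same strand of glass, three hypothetical generations of endpoint gear:
print(aggregate_gbps(8, 2.5))   # 20.0
print(aggregate_gbps(40, 10))   # 400
print(aggregate_gbps(80, 100))  # 8000
```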

Apple definitely doesn't need to use non-standard interconnects for the new
Mac Pros, just like they didn't need to use custom connectors for SSDs in
their laptops. They just wanted to.

~~~
gji
I'm not sure fiber optics are particularly viable in the next 5-10 years as
interconnects between hardware components. Adding photonics to a hardware
component increases size and cost fairly significantly, which is why you don't
see many fiber optic interconnects yet, even for applications where cable size
is important. Moreover, to get the kind of miniaturization you would need for
a cell phone, you're talking on-chip photonics (diodes and photodetectors
integrated into the IC itself), which still looks like it's in the early R&D
phase.

Of course, most of this is probably because copper is still doing just fine in
terms of bandwidth. That said, some big issues with forcing huge bandwidth
over a few traces are latency and the additional circuitry needed to translate
the signal into the actual signals needed to drive RAM chips or a CPU.

~~~
beloch
[http://www.technologyreview.com/news/518941/intels-laser-chips-could-make-data-centers-run-better/](http://www.technologyreview.com/news/518941/intels-laser-chips-could-make-data-centers-run-better/)

It's out of R&D and entering production.

------
ajaimk
The Mac Pro is a very modular computer. The only things fixed are the cooling,
power and case. Everything else is modular:

* Graphics cards

* Storage

* RAM

* CPU - in the form of the entire motherboard, but that has always been Apple's approach

Everything else lives outside the box and is super modular.

~~~
wmf
Those "modules" are proprietary form factors that you cannot buy and Apple
won't support any modification other than adding RAM.

------
dragontamer
Modularity is going away, not only in Mac Pros, but even in desktop space.

AMD APUs are integrating the CPU and GPU together. Intel is releasing
mainstream chips that cannot be physically separated from the motherboard (see
the i7-4770R: it needs to be soldered on).

GDDR5 RAM, the mainstay super-fast graphics RAM, is assumed to be soldered
onto a board. DDR4 will only support one DIMM per channel.

Modularity is almost the opposite of market forces right now... as unfortunate
as it sounds.

~~~
djjaxe
AMD's "APU" is simply a way to join two different products and jack up the
price of the one joined product, so these are still modular parts. As for
Intel, it's the same idea: they can force you to buy not only their CPU but
also the entire motherboard, made by them. Modularity is not the opposite of
market forces right now. The big companies are just too lazy to spend the
money to 1. go smaller and 2. get more complex.

~~~
dragontamer
You can build an AMD APU system extremely cheaply. A ZBox Nano A4-5000 (using
the "Kabini" APU) is only $300 or so for a complete computer (RAM, Hard drive,
etc. etc. included).

On the contrary, the AMD APUs are extreme value buys. Intel has superior CPUs,
but you save money with AMD APUs because you don't need to buy a graphics
card anymore. AMD is beginning to do crazy stuff, like cache-to-cache direct
transfers between CPU and GPU, because both are on the same die.

Intel has been improving its integrated graphics in response of course... but
they aren't at the level of AMD's APU integration yet.

But to call this "price jacking" means that you completely misunderstand the
marketplace right now. AMD's Kabini and Temash chips are among the cheapest in
the entire marketplace, and AMD's higher-end APUs (codename Richland) are sold
at significant discounts compared to Intel chips.

------
thrush
I'm afraid you may have missed the point of Apple's "walled garden" approach
to its products, and also what it means that Motorola is releasing a phone
with modularity. Apple provides its customers focus. Rather than worrying
about how they can modify their machine to improve performance and
functionality, Apple's customers can be fairly confident that their machine
will be reasonably fast, feature-rich, relatively safe, and aesthetically
pleasing, at the cost of a higher price and of taking it on faith that those
qualities will be delivered. You are very right that modularity on a mobile
phone is going to involve sacrifices due to space constraints, and this is
what we will see with the Ara. Interestingly, though, unlike almost all mobile
phones I've been exposed to in the past, this one will have swappable options.
Basically, I think this says that mobile phone technology is at the point
where we can make some of those sacrifices and still end up with a useful
device with options. It will be really cool to see how this plays out, and
whether 3rd-party add-ons will cause Motorola to pull back from this offering.

I'd like to point out that you highlight the benefits of modularity as
improvements in speed, power consumption, size, and cost, but the use case you
give is that you are writing an article on this custom machine that you've
built. What else does your custom-made computer let you do that you couldn't
do otherwise? And why do you think the majority of consumers don't go down the
custom route?

On a side note, I feel that I have to refer to this article from the Harvard
Law Review. It is a bit of a philosophical discussion of the "generative"
aspect of computers compared to other devices/appliances. It's titled "The
Generative Internet", by Jonathan L. Zittrain.

[http://dash.harvard.edu/bitstream/handle/1/9385626/Zittrain_...](http://dash.harvard.edu/bitstream/handle/1/9385626/Zittrain_Generative%20Internet.pdf?sequence=1)

------
snoonan
The article hits the nail on the head. Ever-shrinking electronics and
tolerances will squeeze the benefits out of engineering for modularity at some
point. It's not some evil conspiracy. Hardware at that scale is going to be
the domain of robots and high-res 3D printers, because we just won't be able
to see or touch things that small.

------
Aqueous
"I'd rather make improvements to speed, power consumption, size, and cost"

I think those are optimizations that _you_ care about - but not the average
consumer. We're getting to the point where phones of the same price pretty
much universally operate within a standard of performance, give or take a
margin of error, and can run a day or longer without recharge. At a certain
point users stop caring about further optimizations and start caring more
about features. And that's where modularity gives a huge pay-off.

~~~
xerophtye
With the whole debate about "who wants more power/speed? They are already fast
enough", I feel we are missing one crucial point:

Our perception of hardware performance is driven by the software running on
it.

The only time you start feeling your tech isn't fast enough is when new
software comes out that is even more resource-hungry. The PCs running Tomb
Raider 1 seemed fast enough at the time, but can you imagine them running even
today's apps? Even with smartphones themselves: the iPhone 3 was great, until
people started making apps that needed iPhone 4 levels of resources and you
started feeling your iPhone 3 was too weak...

So I think we're always gonna want more power. What's the point of a modular
phone if it can't run any of the latest apps?

PS: I still think a modular phone would be insanely awesome! But just giving
my 2 cents

------
taeric
It is always funny to see arguments for modularity age. Consider the idea that
a car is this somewhat perfect modular thing. While there is certainly
something to it, since cars have some obvious discrete parts (typically
divided along the lines of what wears at a different rate than other items),
the idea that a car is modular for 99% of consumers is laughable. And I say
this as someone who changes my own brake pads. Speaking of which, the
"components" of most computers are more universal than the average car part,
unless you are simply talking about things which plug into what used to be the
cigarette lighter.

------
chimeracoder
> A screen has to put out enough photons that a pupil two millimeters wide can
> capture enough light from it half a meter away. That fundamental principle
> isn't going to change anytime soon, and since that's the largest piece of
> energy consumption in phones these days, power requirements aren't going to
> drop significantly in the foreseeable future

The technology isn't there yet, but I would love to see the day when we can
make laptop screens (and the like) out of e-ink-like displays.

I don't _like_ staring at backlit screens all day. A computer screen has about
the same luminous intensity as a 40W bulb[0], which is not a pleasant thing to
stare at all day.

If only we could figure out a way to make e-ink remotely usable with high
refresh rates, etc.[1], we could rely on external lighting (which is usually
available) and get dramatically better battery life out of our laptops and/or
phones.

It could even fall back on "frontlighting" (what the Kindle Paperwhite does)
for nighttime usage.

[0] Preempting any physicists' objections: yes, I know that watts do not
measure luminous intensity (that would be the candela), but a 40W bulb is a
familiar reference point, and this comparison is roughly in the right
ballpark.

[1] Which is a big hurdle, and why I admit that the technology isn't there
yet.

------
aeturnum
I'm not sure I buy the narrative about 1mm connectors. You don't currently
line up the contacts on your graphics card - the card only fits into the slot
one way. What is preventing us from taking the current design and doubling the
number of contacts? Manufacturing tolerances would have to get smaller, but
they're certainly going to keep getting smaller even if we print the same
number of connections.

~~~
chiph
It's a little hard to make out, but if you look at the photo of the Ara, those
aren't 100-pin connectors -- they're more like USB connectors. Power + data =
4 pins. So not all that hard to line up, especially if you have guides in the
screen portion to slide your modules into. And _unlike_ USB, there'd only be
one way to orient the pluggable module. :)

My chief concern would be pocket lint getting in there.

~~~
sirkneeland
and dust, and sweat, and humidity...

------
quink
Modularity is going to remain very very much alive, at least for mobile
phones.

Here's why.

A mobile phone needs to be a certain size to hold it. There's always going to
be a minimum size, beyond which it won't be sensible, it won't be usable.

See this?
[https://www.google.com.au/search?q=panasonic+gd55&tbm=isch](https://www.google.com.au/search?q=panasonic+gd55&tbm=isch)

When did you last see one of those being used? People want their 4 or 5 inch
screen, it's as simple as that. The most frequently sold Android devices used
to be fairly dinky 320 by 240 affairs, but with the price difference becoming
fairly tiny, they've now given way to devices with bigger screens, higher
resolutions and more processing power.

The Google Nexus 4, for instance, features a base frame that's a fairly thick
chunk of aluminium. Its bezel features large black areas of more than a
centimetre each at the top and the bottom. It features triumphs of
miniaturisation within, sure, but the product itself isn't one. It's designed
to be used by human hands.

The only things that really require the space are the battery and the antenna.
The antenna can be built into the base frame. It could even be a sheet that
sits underneath the components, one that could also be replaced. Or it could
be a bezel, similar to the one introduced on the original iPhone 4, that might
also double as a bumper frame for the thing. Have a bumper frame with
connectors and you're there. Or maybe at some point those extra 3 or 6 dB
won't matter much any longer for the majority of the market, especially if a
lot of the heavy lifting is done over WiFi.

When it comes to the speed a phone can achieve, as far as it matters to the
vast majority of the market, the size of the components needed to make that
happen is not that much of a constraining factor.

At some point, the diminishing returns of a modular phone vs. a non-modular
phone will bring it to a point where the performance difference, as it matters
to the average consumer, will be about, let's say, 20% or maybe even 30%.

But when you can target the one aspect of performance that matters most to you
and swap that out, if you can prioritise there, then you have the ability to
make up for that.

If you then also have the components, on average, each last twice as long for
your needs than would otherwise have been the case, it's just become twice as
valuable an investment.

I like the size of my Nexus 4, and any performance penalties imposed by
modularisation are outweighed by being able to swap out components and the
phone being a better investment. There are diminishing returns to making
mobile phones smaller. Desktop computers reached a certain size and then
didn't really get any smaller, because they had no reason to and
modularisation outweighed it. Why not the same with mobile phones?

~~~
quink
"But longer battery life"

~~~
quink
I don't care. Make the processor more efficient, make the screen more
efficient, and it really becomes a non-issue. LG could have put more battery
in the Nexus 4 had they really wanted to, probably 20% more. Remember Nokias
with four- or five-week battery lives? Well, with rethinking on the other
sides of this, like ubiquitous micro-USB, battery life beyond two days became
a bit of a non-issue. Completely. Sure, some phones might be constrained by
battery life, but they're not in the majority. It's just a standard component
of a rectangular shape and thickness they put in these days, and it's good
enough.

What am I really going to do with, what, 20% more battery life, if it already
lasts 2 days? And how about the possibility of each of these components adding
more than one function? Who is to say that a GPU component can't also add a
battery? I wouldn't mind having a phone with two batteries. And no one is
saying that everything in the phone should be modularised out. How about a
module just called 'mainboard' that contains everything but the battery and
the antenna, and of course the screen? You could then override or enhance
functions on this mainboard with additional modules. And it would be easy to
replace. The idea of these connectors and all wasting space becomes a bit of a
weird thought if you actually hold it in your hand and realise that it's a
good size, it has good performance, it has good battery life, it has a good
screen, and there might only be two or three modules in the eight module slots
available. And that's doable. Saying that there's wasted space in a phone like
that becomes, instead of a criticism, a justification for buying new modules.

All in all, the wasted space in this is not going to outweigh the "wasted
space" (if you can call it that) in the majority of smart phones out there by
a substantial amount.

In fact, it gives manufacturers an incentive to cram every single cubic
millimetre they're given with as much functionality as possible, or at least
sensible, because in a freer market their revenue will depend on it utterly.
Has that been the case so far? I've repaired my Nexus 4. There are plenty of
cubic millimetres of space being used that, strictly speaking, don't really
need to be. Most of what's happening here is that these structural cubic
millimetres are being shifted into a different and, sure, a bit less efficient
shape.

The only outcome that matters coming out of all of this will be that it might
cost $15 more. Or $5. Or some small integer dollars to pay for all the
connectors. But if I can use at least half of it twice as long, that's an
amount that doesn't much matter.

Here's a secret: The Nexus 4 is already full of spring loaded connectors
anyway. I counted. It has about 5 going to the back cover. And one for the
speaker. Add the SIM and that's another one. Then add the two connectors for
antennas... and arrive at about 9. If you count the battery, which uses sprung
pins in addition to two screws, that's 10. This thing has eight visible at the
back, add a SIM and you arrive at 9, and add maybe another one or two at the
front. Ten, maybe eleven. So, really, the only big difference is that these
connectors are now more easily user accessible and the aluminium frame is a
bit of a different shape. There's already plenty of internal plastic to
surround individual components in a Nexus 4 anyway, so just make it a bit
thicker on the outside. Not a big leap, people.

"But everything is moving towards integration"

Well, that hardly precludes it from moving in this direction as well. The CPU
being on the same silicon as some measure of GPU is hardly going to be
fundamentally affected by this, is it now?

~~~
scott_karana
The Nokia phones with four or five week battery lives used a single backlight
for a 120x120 or so pixel monochromatic LCD screen.

Don't get me wrong, I love the one that I have _and still use as an alarm
clock_ , but it'll never replace a high pixel density colour screen in
anyone's books.

~~~
pekk
Plenty of people don't really care about watching movies on their phones, so I
wouldn't say "anyone".

~~~
kayoone
In my world, almost everyone cares about consuming content (maybe not movies)
on their smartphones, heck they use their smartphone more than their PC/TV.

------
WiseWeasel
Seeing that computer running from inside the mobo box is like seeing pictures
of frightened, malnourished animals in those Humane Society ads. Get a case,
you cheap bastard! One coffee spill away from disaster...

------
ricardobeat
Hadn't heard of Project Ara before, sounds an awful lot like
[http://www.phonebloks.org/](http://www.phonebloks.org/)

~~~
resetmp
It was on the front page yesterday, and if you searched about it, you would
quickly learn that Motorola is collaborating with Phonebloks.

------
aschampion
What graphics cards are using 2 or 3 PCI-E 3.0 x16 slots?

------
alebairos
Can't wait to see the first prototype for an organic based cell phone. No
modules, but software controlled morphing parts.

------
ksrm
I'd much rather see modular, standardised laptops. Mine doesn't even have a
removable battery.

~~~
dshefchik
But do you really need a removable battery if it lasts 14 hours per charge?

------
kylec
128 Gb, not 128 GB. The iPhone 5s that iFixit tore down had 16 GB of storage.
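
The correction above is just the bits-vs-bytes factor of eight; a quick sketch
of the arithmetic (the helper function is purely illustrative):

```python
# Flash dies are labeled in gigabits (Gb); storage capacity is quoted in
# gigabytes (GB). One byte is 8 bits, so a 128 Gb part is 16 GB of storage.

def gigabits_to_gigabytes(gigabits: float) -> float:
    """Convert a capacity in gigabits to gigabytes."""
    return gigabits / 8

print(gigabits_to_gigabytes(128))  # 16.0
```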

------
namuol
I don't see how the two are mutually-exclusive.

------
pauletienney
Kudos for the jjcm.org website.

------
djjaxe
It is sad that people seriously believe this person, who is a designer, not an
engineer: "I do art and code and sometimes other things. This is my homepage,
which as a rule of thumb contains nothing useful. This homepage is where I
conduct experiments, which at any given time are usually broken."
[http://jjcm.org/](http://jjcm.org/)

I am sorry, but the only things that are non-modular are computers like the
Xbox 360 and products of that nature, where the companies do not want the user
to change/upgrade the computer at nearly any cost. It is ridiculous to state
that it is "inevitable" to become non-modular. Scientists have been breaking
the barriers of nearly every science there is for the past 50 years, and as
such you can NOT conclude that this is "inevitable". It is more likely that by
the time we run into a space problem at the level you are implying (which, by
the way, is ridiculous, as other people stated: what graphics card uses 2+ PCI
slots? LOL), we will finally figure out how to make the first bio-matter
computers, i.e. a computer with a CPU of nearly jello-like matter that will,
at first, most likely process data at a slower but much cooler rate than the
CPUs we use at the moment.

~~~
jjcm
I'm actually an engineer at Microsoft. Design is just a hobby of mine.

As for the graphics cards, I was wrong on that account. My memory betrayed me
(I could have sworn I'd seen cards that used two slots somewhere along the
line, but googling found nothing). The blog has been updated.

~~~
revelation
There are graphics cards that use two slots, just not two PCIE interfaces.
Humongous cooling solutions on contemporary graphics cards usually extend over
two slots:

[http://images17.newegg.com/is/image/newegg/14-127-768-Z02?$S...](http://images17.newegg.com/is/image/newegg/14-127-768-Z02?$S300$)

~~~
djjaxe
"what graphics card uses 2+ PCI slots" USES PCI SLOTS. not covers.

