

Crucial's MX100 solid-state drive reviewed - nkurz
http://techreport.com/review/26532/crucial-mx100-solid-state-drive-reviewed

======
keehun
[http://www.anandtech.com/show/8066/crucial-mx100-256gb-512gb...](http://www.anandtech.com/show/8066/crucial-mx100-256gb-512gb-review)

------
tdicola
Wow, nice looking drive for not a lot of money. I've been living with a 128GB
SSD but find it's just a little too small. After 9-12 months or so I generally
need to flatten and reinstall because a lot of junk has accumulated and I'm
running out of space. A 256GB drive should be perfect, and at $100 I'm not
going to agonize over the cost.

------
ajtaylor
I've been thinking long and hard about upgrading my MacBook Pro from a hard
drive to an SSD. With this kind of performance and value, it's looking to be
a no-brainer. The main thing holding me back was going from a 500 GB HD => 256
GB SSD (based on price). With the MX100, I can have it all!

~~~
stcredzero
Make sure you have Trim Enabler and a good backup system! With an SSD,
performance goes up and reliability goes down. SSD manufacturers have been
improving firmware in order to be able to sell ever-crappier MLC memory. You
will need Trim Enabler in order to preserve the performance gain. This also
means that whenever a software update requires a restart, you have to restart
once more afterward in order to restore TRIM.
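
For anyone following this advice, here is a quick way to check whether OS X
currently reports TRIM as active (the device section name is Apple's; the
`trimforce` command only exists on OS X 10.10.4 and later, where it makes a
third-party tool like Trim Enabler unnecessary):

```shell
# Check whether OS X reports TRIM as enabled for the SSD.
# Look for a "TRIM Support: Yes" line in the output.
system_profiler SPSerialATADataType | grep "TRIM Support"

# On OS X 10.10.4+, Apple's built-in trimforce can enable TRIM for
# third-party SSDs (prompts for confirmation and reboots the machine):
sudo trimforce enable
```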

~~~
wtallis
Essentially _none_ of the unreliability of consumer SSDs is directly
attributable to MLC flash cells wearing out. They're large enough, have enough
spare area, and the wear leveling _works_. When a consumer SATA SSD fails,
it's almost always a catastrophic failure of the controller or its firmware,
and the rest of the failures are from things like losing an entire NAND
package or some other critical component on the PCB.

Now, it can be said that the need for good wear leveling and latency hiding is
what causes the controllers and their firmware to be so complex and thus
failure-prone. But the myth of SSDs wearing out needs to die. When an SSD does
exhaust its write endurance, it's a gradual process: lots of SMART warnings, a
decline in performance as the controller no longer has enough slack space to
work with, and some sporadic data corruption. It also doesn't start happening
until the drive has been subjected to a lot of heavy writing - it takes
_months_ of 24/7 torture testing:
[http://techreport.com/review/26058/the-ssd-endurance-experim...](http://techreport.com/review/26058/the-ssd-endurance-experiment-data-retention-after-600tb)
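
Those SMART warnings are easy to watch for yourself with smartmontools; a
sketch, assuming a Linux box where the SSD shows up as /dev/sda (the attribute
names and numbers below are typical for Crucial/Micron drives but vary by
vendor and firmware):

```shell
# Dump the full SMART report for the drive (device path is an assumption):
sudo smartctl -a /dev/sda

# Crucial/Micron consumer drives typically expose remaining endurance as
# attribute 202 ("Percent_Lifetime_Remain"); filter for wear-related lines:
sudo smartctl -A /dev/sda | grep -i -E "wear|lifetime|total_lbas"
```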

~~~
stcredzero
I had a Crucial drive fail on me during a reinstall at about a year and a
half. Spolsky has also written about the reduced reliability of SSDs. It
doesn't matter if the root cause is more complicated firmware compensating
for less stable MLC -- a crashed drive is a crashed drive. It wouldn't happen
so often with SLC flash drives or spinning platters.

And yes, I took steps to reduce wear on my SSD, including installing a
secondary drive, noatime, disabling local TimeMachine, and using external
drives for torrenting. My SSD still died.

~~~
dougabug
Drive reliability is far more important to me than minor differences in
performance. Unfortunately, reliable information _about_ reliability is
typically absent from most SSD reviews, making them not terribly useful for
purchasing decisions.

------
zura
Btw, do SSDs make any significant difference for desktops nowadays? I mean, I
power up my PC once a day. Also, as far as I'm aware, it doesn't really help
much with compilation performance. I'm asking this from a dev-centric PoV.

~~~
MaulingMonkey
> Btw, do SSDs make any significant difference for desktops nowadays? I mean,
> I power up my PC once a day.

Depends on your workload.

Mine now frequently involves accessing far more data than will fit into RAM,
and thus into the filesystem cache. That leaves only two options: the data
either flushes useful entries out of the cache, or never enters the cache in
the first place. Either way, I'm going to be hitting the drive - slow-to-seek
spinning platters, or a fast-random-access SSD - even with a perfect oracle of
an OS caching algorithm. Sadly, current OSes seem to lack even that.
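
The seek-penalty argument above can be sketched with a tiny microbenchmark:
read the same file sequentially and then at shuffled offsets. The file size,
chunk size, and variable names here are arbitrary choices; with a small file
and a warm page cache both passes will be fast, but run it against a large
cold file and the gap between a spinning disk and an SSD shows up in the
random pass.

```python
import os
import random
import tempfile
import time

CHUNK = 4096   # one 4 KiB read per access
COUNT = 1024   # 4 MiB file total

# Create a scratch file full of random bytes.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(CHUNK * COUNT))
    path = f.name

def read_at(f, offsets):
    """Read CHUNK bytes at each offset; return total bytes read."""
    total = 0
    for off in offsets:
        f.seek(off)
        total += len(f.read(CHUNK))
    return total

offsets = [i * CHUNK for i in range(COUNT)]

with open(path, "rb") as f:
    # Sequential pass: offsets in increasing order.
    t0 = time.perf_counter()
    seq_bytes = read_at(f, offsets)
    t_seq = time.perf_counter() - t0

    # Random pass: same offsets, shuffled.
    shuffled = offsets[:]
    random.shuffle(shuffled)
    t0 = time.perf_counter()
    rnd_bytes = read_at(f, shuffled)
    t_rnd = time.perf_counter() - t0

os.unlink(path)
print(f"sequential: {t_seq:.4f}s  random: {t_rnd:.4f}s")
```

Both passes read exactly the same bytes; only the access pattern differs,
which is the whole point of the spinning-platters-vs-SSD comparison.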

> Also, as I'm aware it doesn't really help much with compilation performance.

Friends have seen compilation times approximately _halve_, even without a
RAM-busting working set (yes, they timed it). I've personally seen similarly
extreme performance gains in compilation, again without a RAM-busting working
set. I've seen even better gains for working sets that do bust RAM, especially
when the build is extremely IO-bound.

YMMV, but I no longer buy or build a computer unless I can put a sufficiently
large SSD into it. I define "sufficiently large" as fitting my OS, my tools
(IDEs, SDKs, etc.), and a couple of active projects. Everything stays snappy.
I bought an 80GB back when they were $500, about twice the cost of any other
component in any of my computers then or since (ignoring more SSDs), and still
consider it money well spent.

I recently bought a laptop - priorities, in order, were: Must have an SSD,
should have extremely high resolution (ended up with 1800p, not 1080p), and
only after that did I even bother to check if the machine had at least a
decent amount of RAM, and a not entirely terrible CPU. It doesn't even have a
dedicated GPU, just the little Intel that can do D3D11 at horrifically low
framerates. (This despite using Direct3D SDKs both professionally and for
play.)

~~~
sdrothrock
What laptop did you find that had 1800p?

~~~
pedrocr
The Lenovo Yoga 2 Pro has a 3200x1800 13.3'' panel and Intel graphics, so it
fits the OP's specs.

[http://shop.lenovo.com/us/en/laptops/lenovo/yoga-laptop-seri...](http://shop.lenovo.com/us/en/laptops/lenovo/yoga-laptop-series/yoga-laptop-2-pro/#techspecs)

------
Turing_Machine
Amazon has the 128 GB marked down to US$79.99, and the 512 GB for US$224.99.

