16GB DDR4 memory is slowly creeping up to the $200 mark. While prices for CPUs and other components are falling steadily, or at least staying flat, the overall cost of PCs is rising just because memory is so damn expensive these days.
At least you can buy RAM if you really want to.
For a few years in the late 1990s, you couldn't buy RAM even if you had the money. Companies like Dell and Apple were unable to build computers to sell because RAM modules were unavailable. The price of RAM modules fluctuated day to day like gold or a cryptocurrency.
There were robberies targeting computer-parts distributors by thieves who wanted RAM modules to resell on the black market.
And this is how Samsung got big. They had just started investing in RAM manufacturing, and Dell, Apple, and the like began signing long-term contracts with Samsung for a steady supply of RAM modules. Samsung Electronics' revenue grew 4x year over year around that time.
This is about the time Dell signed a 10-year contract with Samsung for computer parts. I learned much later that, as a term of the contract, Samsung stopped selling Samsung-branded laptops in North America. It was weird: Samsung had started making some good laptops, but they suddenly disappeared from shelves in the US. Reviewers who covered them in the EU praised them, but they weren't sold in the US.
I also remember hearing about that Dell contract that prevented Samsung from selling Samsung-branded laptops in North America. I believe it was partly because Samsung designed and manufactured Dell's laptops in those days -- work that I believe is now outsourced to Quanta.
The parts contract between Samsung and Dell was about a billion dollars over 10 years. It covered CD/DVD drives, RAM, screens, etc. A billion dollars doesn't sound like much in the tech industry today, but for the late 1990s that was a big deal. It wasn't reported much in the US, if I recall correctly.
I also distinctly remember reading a piece in which a reporter quoted a Dell representative talking about a small, light Samsung laptop that Michael Dell used as his everyday machine. He really liked its small form factor. I think it may have been the Samsung Q20, which was rebadged as the Latitude X300 in the early 2000s.
Also, I heard Tim Cook first caught Steve Jobs' attention around this time. He joined Apple to smooth out the supply of parts for Apple computers, and he was instrumental in working with Samsung to accomplish that. I just checked: he joined Apple in 1998, so the timing does make sense to me.
Btw, as I was trying to look up Dell's old computers, I realized there is no complete list of them anywhere -- nothing like the lists available for Apple computers. Sad how the latest thing at some point in time, like 2002, is nowhere to be found on the internet...
I am so happy that I added 16GB of DDR3 a couple of years ago, when it was 70 euros in the bargain bin. It's also crazy with our recent server purchases: a 768GB Dell server was quite a bit more expensive 6 months ago than it was two years ago (same specs).
yea the shitcoin miners have really driven up the price of graphics cards in the last few months. It's rare to find new cards for sale at or below MSRP.
People have been saying this since mid-2017, yet it still hasn't happened... They also said the shortage would end in Q4 2018; let's wait and see...
It's hilarious that my ~6-year-old Lenovo X230 (16GB) has as much memory as, if not more than, many 'top of the line' laptops today. It seems the high price of RAM over the last few years has really stagnated the growth of RAM in systems -- or OEMs decided they didn't really have to add more.
It probably doesn't help that Intel decided not to officially support 8Gbit DDR3, and that not all Skylake laptops used DDR4. It's also funny that DDR3 ended up lasting more than a decade.
My personal MBP has been chugging along. It's a late-2011 15" MBP; I swapped out the RAM and put in 16GB, and I also have a 512GB SSD. It keeps up with several of my co-workers' machines. The only real issue I have with it is the graphics card, but it's not like I'm gaming all day or anything. The one real problem I've had was when I had to send it to Apple for a logic board issue: my MBP was a 2.8GHz i7, but when they returned it after repairs it was a 2.2GHz. When you go to purchase a Mac, that extra processing speed is hundreds of dollars. After complaining, the credit card charges were dropped and my repairs were free. (Even if I'm sure the repairs cost them next to nothing.)
But I'm more than ready for an upgrade. I've bought a few refurbished Macs in the past, and they've always been a good price for exactly what I needed.
I wish there was some kind of GoFundMe page generating funds to sue Apple / Jony Ive / Dan Riccio. Not for damages or anything, just to get in a room with one of them and beg and plead for a "developer" MacBook.
yes! They should call it the MacBook WOZ (expandable, geek- and dev-friendly -- which would actually be the real PRO version), with lots of ports and a beefy battery (charge it once at night)
I concur. I would love to purchase a MacBook WOZ. While we're on the Woz theme, I'm imagining such a laptop to have a design reminiscent of the Apple IIgs, with the beige color, the stripes, and the rainbow Apple logo, but thin enough to fit into one's backpack (about the same thickness and weight as the pre-Retina MacBook Pro). Bonus points if it had a traditional non-chiclet keyboard, and even more points if it had Woz's signature somewhere to the side of the trackpad.
To clarify: MacBooks and similar machines currently use LPDDR3, which allows for significantly better battery life than DDR4. Intel never expected to need LPDDR4 support for their 14nm processors, and their 10nm processors, whose updated DRAM controllers support LPDDR4, have been repeatedly delayed. Intel's current processors can only use 16GB of LPDDR3, given the number of channels on Intel's memory controllers and the available density of LPDDR3 chips. LPDDR4 is available in higher densities and is catching on in other product segments, so it's entirely possible to buy an ultrabook whose CPU uses LPDDR3 but whose SSD contains LPDDR4.
It's more that lazyweb has been infecting HN. Nobody should be surprised by basic concepts that are trivially Googleable (e.g. available on Wikipedia). Posting bare links with no explanation doesn't add anything to the discussion. I hope this trend ends soon.
> Nobody should be surprised by basic concepts that are trivially Googleable (e.g. available on Wikipedia).
Abbreviations are terrible for searchability. For example, a DDG search just now told me that ECC could mean Erie Community College, Elgin Community College, El Camino College, or Essex Community College. Of course it's easy to find out what "error-correcting code" means once I know that that's for what "ECC" stands, but that latter fact might not be so easy to discover. (In this particular case, it's the 5th DDG result, but I've definitely run into abbreviations with no plausible expansion, or, which is perhaps worse, multiple plausible expansions.)
Try any of the other keywords in the comment or title, such as 'memory', '32GB', 'RAM', 'DDR4', 'SO-DIMM' -- searching those with ECC gives the correct result every time. How are we supposed to get to hacking if we can't even do simple research based on context?
While I agree with you that searching for unknown terms is easy, many people browse on their phones, where searching is not as convenient.
More importantly however, if a phrase is uncommon, then one person asking and one person answering can save hundreds of people from searching and thousands of people from not searching and not knowing.
A phone these days is even simpler than a desktop. A $20 Android phone can understand "Hey Google, what is ECC memory?".
HN is not like other sites. It is for informed discussion that adds depth to the topic. If I want a linkfarm of relevant topics, I can easily go to Google or Wikipedia. If I want to see karma whoring for copy+paste links, I can go to Reddit. HN is not going to stay more informative than other sites if we are satisfied with discussing basic topics.
Your comment sounds pretentious. Why not just assume that all kinds of people browse HN? Even IT is a diverse field.
edit: What are you people downvoting for? I'm not insulting anyone. I just frankly pointed out how I would interpret the attitude of the person asking as above.
The LPDDR3E currently used in the MacBook and MacBook Pro draws around 1.8W active for 16GB. These newer DDR4 parts might actually fit the MacBook's use case: 64GB of memory using only 4.5W is very attractive.
That is assuming Apple brings back the old battery capacity. To quote Ars:
> Compared to 2015’s models, the Touch Bar MacBook Pros lose quite a bit of battery capacity. The 13-inch model drops from 74.9 WHr to 49.2 WHr and the 15-inch model falls from 99.5 WHr to 76 WHr. That’s a 34 percent and 24 percent reduction in capacity, respectively.
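If the 1.8W / 16GB and 4.5W / 64GB figures above hold, the DDR4 actually comes out well ahead per gigabyte. A quick back-of-the-envelope check (just arithmetic on the quoted numbers):

```python
# Active power per GB, using the figures quoted above:
# ~1.8 W for 16 GB of LPDDR3E vs. ~4.5 W for 64 GB of the new DDR4 parts.
lpddr3e_w_per_gb = 1.8 / 16   # W/GB for the current MacBook LPDDR3E
ddr4_w_per_gb = 4.5 / 64      # W/GB for the new 64GB DDR4 configuration

print(f"LPDDR3E: {lpddr3e_w_per_gb:.4f} W/GB")  # 0.1125 W/GB
print(f"DDR4:    {ddr4_w_per_gb:.4f} W/GB")     # 0.0703 W/GB
```

So even quadrupling capacity to 64GB only costs about 2.7W more in active draw, under these (vendor-quoted, best-case) numbers.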
This is what, the third or fourth time in the past 20 years the big DRAM makers have been found colluding? Numerous convictions. Some execs even did jail time. So far it doesn't seem to be helping. Perhaps such consolidation shouldn't be allowed.
Unlike the 2002 lawsuits, where memory makers admitted to participating in meetings, conversations, and communications in the United States and elsewhere with competitors to discuss the prices of DRAM to be sold to certain customers, this time around they are accused of "carrying out a conspiracy" by announcing their capacity plans, capital expenditures, etc. in public press releases -- which I find a bit unconvincing.
I think we are all forgetting that there was a huge memory supply glut (and historically low memory prices) that starved memory manufacturers of investment in new capacity back in 2015 and 2016. Many industry observers predicted a future price hike between 2017 and 2020 as a result of many money-losing quarters and the lack of investment.
I don't see any evidence of price-fixing this time around.
I'm awaiting delivery of my Dell XPS 9570 with the i9-8950HK processor. The CPU says it supports up to 64GB memory [0]. The laptop comes with an Intel CM246 chipset [1]. How would I figure out if I can stick two of these sticks into my laptop?
Your biggest worry is whether your laptop has two SO-DIMM slots and whether the memory is soldered down (as in some (all newer?) XPS 13 systems). The CPU and chipset will likely handle it just fine; it's the OEMs you have to worry about, who make crappy design decisions that prevent upgrading.
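One way to check on Linux is to read the SMBIOS tables with `sudo dmidecode -t memory`, which reports the slot count, what's populated, and the board's stated maximum. A sketch that parses its output -- the sample text here is invented for illustration, since the real command needs root; on an actual machine, feed in the real command output instead:

```python
# Parse (a sample of) `sudo dmidecode -t memory` output to count memory
# slots, see how many are free, and read the reported maximum capacity.
# The sample values below are made up for illustration.
import re

sample = """\
Physical Memory Array
	Maximum Capacity: 64 GB
	Number Of Devices: 2
Memory Device
	Size: 16 GB
Memory Device
	Size: No Module Installed
"""

max_cap = re.search(r"Maximum Capacity: (.+)", sample).group(1)
slots = int(re.search(r"Number Of Devices: (\d+)", sample).group(1))
free = sample.count("Size: No Module Installed")
print(f"{slots} slots, {free} free, max {max_cap}")
```

Note that the "Maximum Capacity" field reflects what the OEM programmed into the firmware, which is sometimes lower (or just wrong) compared to what the CPU's memory controller can actually address.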
I don't think so, unless you own an IBM mainframe that allows you to hot swap even CPUs (as long as you disable them first and allow the OS to migrate threads to others).
Yes, although if we're talking about improvements in RAM density (lithography improvements) rather than just putting more equivalent RAM in a machine, then that tends to mean lower power consumption per bit.
It looks like you are correct. I wasn't truly sure of that statement when I made it; I was misremembering something from the lecture here [1], and somehow I formed the idea that RAM is a big consumer in standby mode (even though the lecturer made no such claim).
I have a Lenovo with 64GB RAM, which is really handy for consulting: separate VMware images for all clients, and the ability to do SharePoint dev work (separate VMs for SQL Server, SharePoint, Visual Studio) with enough memory left over for Chrome...
Worth mentioning that you can't populate a DIMM with just one chip (the module's 64-bit bus needs several chips in parallel), though soldering down only one or two chips is common in embedded applications.