Hacker News | WhiteDawn's comments

In theory the chipset max is 2GB, but the physical motherboard does not have enough address lines traced to support more than 256MB. Adding more memory would require a PCB redesign.
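For intuition, the ceiling follows directly from the number of address lines: each extra line doubles the addressable range. A quick sketch of the generic arithmetic (nothing here is board-specific):

```python
# Minimum number of address lines needed to address a given capacity:
# n lines can select 2**n distinct byte addresses.
def addr_lines(num_bytes):
    n = 0
    while (1 << n) < num_bytes:
        n += 1
    return n

print(addr_lines(256 * 2**20))  # 28 lines reach 256 MB
print(addr_lines(2 * 2**30))    # 31 lines reach 2 GB
```

So going from 256MB to the chipset's 2GB ceiling means three more traced lines, hence the redesign.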


... or one crazy rework.


Citation needed? The study concluded in Sept 2023 but no results were provided. Are results on these studies usually so delayed? Perhaps they didn’t get the expected result?


The study did run and conclude: https://clinicaltrials.gov/study/NCT05155696


Could you tell us what the conclusion was? Because this is the very point of this discussion.


IIUC, the data have been gathered but the results aren't yet published: https://clinicaltrials.gov/study/NCT05155696?tab=results

I think we're confused because it's a strange stage for bringing the study to our attention.


A few months isn't a long delay for a paper to be published.


Calling it here: mslt, to apologize, you should say "I'm sorry, I didn't read far enough to justify my own snark. That's who screwed up here. I apologize to the poster I lashed out at"


So last minute that the AMD engineers found out Microsoft went with Intel at the Xbox launch party [1]

[1] https://kotaku.com/report-xboxs-last-second-intel-switcheroo...


Also, the security system assumes that the system has a general protection fault when the program counter wraps over the 4GB boundary. That only happens on AMD, not Intel.


Wow! Good find!


Great post! I find the quote from Bunnie in the blog pretty relevant:

“The JTAG boundary scan approach was rejected on the grounds that the TRST# pin, used to hold the JTAG chain in reset, was tied active in a manner that was difficult to modify without removing the processor.”

Gives me flashbacks to simpler times when disk-based systems lacked any real form of DRM because of the assumption that a consumer couldn't afford to press their own CD-ROMs.

Maybe it's still not as easy as burning a CD-R, but BGA rework stations have come down enough in price, and up enough in utility, that they are practical for the semi-serious tinkerer. Most modern designs account for this, but I wonder if other techniques, like decapping or some future unknown, will start to open new, simple attack vectors on today's hardware.

I don't really have a point to make here I guess, just that most assumptions made today tend to not quite work out as expected, and that's kinda neat.


Decapping is well within the realm of hobbyists. I have thought about putting together an open-source equivalent of the JetEtch, which can sometimes be found on eBay fairly cheap. My main concern is that the chemicals involved are really dangerous.

Things like laser glitching are feasible in the $thousands range and power or RF glitching is in the $hundreds range.

With low cost fiber laser cutters and etchers, lots of techniques can be applied. I imagine with the high resolution X-ray stuff starting to come onto the used market, lots of things for hardware hacking will become extremely affordable.


Yes. Decapping and laser glitching are both exploited today in amateur settings. Here are some folks working on old arcade machines: http://caps0ff.blogspot.com (in fact, the most recent post is about laser glitching :)


It's not the cost of BGA rework hardware alone that was the main barrier back then; it was knowledge and experience. BGA was seen as something so difficult it wasn't worth even attempting. In the early 2000s even basic SMD was an insurmountable barrier to most amateurs; today it's trivial despite using the same equipment. The OP's author, Markus Gaasedelen, identifies himself as a software researcher by trade, but casually replaces BGA CPUs on the side :) all thanks to easier access to information. Merely knowing something is possible pushes humans forward, and thousands of available tutorial pages/videos help too :)


I am curious how this will all settle out in the end. I think the majority of users don’t really care about the API or subreddits going private as they primarily just lurk.

However, the people that do care are the ones that moderate and contribute the vast majority of the content that the larger group enjoys.

I am pessimistic that the minority here will win out in the end, but the majority may begin to lose interest if the quality of new content drops.

At least for myself, the blackout gave me enough space away from the site to consider if my time on Reddit was valuable/enjoyable and basically I concluded it is not worth the time. I’ve uninstalled the app and I haven’t really missed a thing.


> I think the majority of users don’t really care about the API or subreddits going private as they primarily just lurk.

I agree. These protests have missed the point. There is a (very) loud minority raising hell right now, but spez is right, it's just noise. The silent majority is still hanging around.


They're a loud minority because they've invested more into the platform. It is the 1–9–90 rule in action.

When the 1% leaves the platform's quality will go down.


My bet is that quality will go up. I'm not really interested in reading what the small number of people who spend 8+ hours per day on Reddit think, about any topic. Hopefully they'll take their silly Reddit mannerisms and inside jokes with themselves on the way out.


Unfortunately I believe the ones with the Reddit mannerisms and overused jokes are the ones that stayed as they are too addicted to leave.

The actual creators of content are different from the drones.


We'll see. Reddit will not die in 2 weeks, that's for sure. But some people will leave, and maybe a viable alternative will surface as a result of this shifty behavior.


Personally I think this is the biggest selling feature of FPGA based emulation.

The reality is that both software and FPGA emulation can be done very well and with very low latency; however, to achieve this in software you generally require high-end, power-hungry hardware.

A Steam Deck can run a highly accurate Sega Genesis emulator with run-ahead, screen scaling, shaders, and all the fixings no problem, but in theory the Pocket can provide the exact same experience with an order of magnitude less power.

It's not quite apples to oranges of course, but the comfortable battery life does make the pocket much more practical.


Being nitpicky about latency is where FPGAs truly shine. You lose a good bit of it by connecting over HDMI (I think the Pocket docked is 1/4 of a frame, and MiSTer has a similar mode; EDIT: MiSTer can do 4 scanlines, but that's not compatible with some displays), but when we're talking about analog display methods or inputs, you can achieve accurate timings with much less effort than on a modern-day computer.

For a full computer like the Steam Deck, you have to deal with preemption, display buffers, and more, which _will_ add latency. Now if you went bare metal, you could definitely drive a display with super low latency, hardware accurate emulation, but obviously that's not what most people are doing.
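Those latency figures can be sanity-checked with quick arithmetic, assuming NTSC-ish timing of 60 Hz and roughly 262 total scanlines per frame (those timing numbers are my assumption, not from the comments above):

```python
frame_hz = 60
lines_per_frame = 262          # rough NTSC total (visible + blanking)
frame_ms = 1000 / frame_hz     # ~16.7 ms per frame

quarter_frame_ms = frame_ms / 4                      # Pocket docked over HDMI
four_scanlines_ms = 4 * frame_ms / lines_per_frame   # MiSTer low-latency mode
print(round(quarter_frame_ms, 2))   # ~4.17 ms
print(round(four_scanlines_ms, 2))  # ~0.25 ms
```

That is the gap being argued about: a few milliseconds of HDMI buffering versus a quarter-millisecond when chasing the beam.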


Counter-Strike was a paid product through the ages. After a ban your whole account was locked down, so in a practical sense there was always a $20-50 "escrow" fee to join the pool of legitimate players. Even today, CS:GO "prime" status does a similar thing.

It doesn’t seem to affect the number of cheaters in any way; if anything, it creates incentives for account stealing and underground exchanges of Steam accounts/keys.

Personally I feel the cheating issue is more of a side effect of games moving away from dedicated servers with communities and towards global matchmaking. There used to be well run servers that would quickly kick-ban cheating players and have a social construct that incentivizes playing nice to keep access to the good servers.

That's not as practical today with all the battle royales, and as with any "government" there was abuse and corruption. It wasn't perfect, but I do miss the days of servers that always had an admin online to shut down cheaters, with rules around minimum pings and bare-minimum sportsmanship in the voice chat.


I think the community-server solution wouldn't have scaled up to current markets. Back in the day, with smaller userbases, you'd have fewer casual players and more people interested in administrating a server and willing to work for free. Now that gaming is mainstream and everyone does it, not everyone cares as much as people used to back then.

Counter-Strike itself still has community servers as an option. And given the choice between those servers and regular matchmaking, most players prefer the latter. This map hasn't been updated in 4 years, but I doubt that has changed: https://teamwork.tf/worldmap/csgo/live


Setting a $500 entry fee would quickly shrink the global pool to about the size of a dedicated server. Not that a local server wouldn't still be an option if someone (or a group) wanted to run one.

For people that just walk into the game though, I wouldn't mind paying to instantly have a well-curated global community without having to vet each server trying to find the good ones. Again though, for kids with more time I think there is room for both.


The results of this study actually show that PLA fumes are more toxic than ABS fumes.

In practice though, the study shows PLA is safer because of significantly less fume creation with the lower printing temperature.

So the relative risk for PLA is significantly lower than for ABS, but it is incorrect to state that PLA fumes are less toxic.
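The distinction being drawn is overall risk as emission rate times per-unit toxicity. A toy sketch with invented numbers (nothing here is taken from the study; the values only exist to show the shape of the argument):

```python
# Toy illustration: overall risk = emission rate * per-unit toxicity.
# All four numbers below are made up for the sketch.
abs_emission = 100.0   # ABS: high particle emission at higher print temps
pla_emission = 5.0     # PLA: far lower emission at lower print temps
abs_toxicity = 1.0     # per-unit toxicity (baseline)
pla_toxicity = 2.0     # assume PLA particles are individually more toxic

abs_risk = abs_emission * abs_toxicity   # 100.0
pla_risk = pla_emission * pla_toxicity   # 10.0
print(pla_risk < abs_risk)  # True: lower emission dominates higher toxicity
```

With these (hypothetical) numbers PLA can be "more toxic" per particle and still the safer material overall.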


You can smell the ABS fumes so they must be worse for you. The nose knows! /s


The human nose is an extremely sensitive chemical detector and should not be dismissed. For certain chemicals, it is effective down to fractions of a part per trillion.

https://en.wikipedia.org/wiki/Odor_detection_threshold#Odor_...


But for others, like carbon monoxide, it is useless and leaves you to die.


You can't count on your nose to warn you, but when your nose does warn you it pays to heed that warning. If your nose is screaming at you to GTFO, listen to it.

You listen to your other senses, right? When you touch something hot, you listen to your sense of touch telling you that you're about to get burned. That doesn't mean your sense of touch can warn you of all hazards, but you certainly shouldn't ignore it when it does.


That's my usual line when people talk about smell being indicative of how safe something is.

Farts smell. Carbon monoxide doesn't.


I'm always conflicted with this.

My gut says any new student should start with an interpreted language like Python or JS/TypeScript. As that gets you to running code, and core concepts like variables, loops and if statements in little to no time.

However, there is value in learning some of the under the hood concepts such as pointers, structs, memory layout, endianess, pass by reference, compilers etc.

I don't think schools need to teach employable C/C++ skills, but C/C++ is a great language to play with and experience these core concepts.

However, I'm not sure if the value in learning these concepts is real, or if it's just my own interests/nostalgia. You can have a successful career in this industry without managing a single byte of memory, and it arguably makes sense to accept abstractions at face value so you can focus on what builds your skills/product.
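To give a taste of one of those under-the-hood concepts: endianness can be poked at even from a high-level language. Python's standard `struct` module makes the byte order explicit:

```python
import struct

value = 0x01020304

# "<" = little-endian (least significant byte first, like x86),
# ">" = big-endian (most significant byte first, "network order").
little = struct.pack("<I", value)
big = struct.pack(">I", value)

print(little.hex())  # 04030201
print(big.hex())     # 01020304
```

The same 32-bit value, two different byte layouts in memory; that is the entire concept in six lines, no C required.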


> pointers, structs, memory layout, endianess, pass by reference, compilers etc.

C++ is a bad language for teaching any of these concepts. Sure, people will be exposed to them, but they are presented in a rather esoteric fashion. Not to mention, actually leveraging those concepts is considered bad practice nowadays, e.g., using pointer arithmetic to loop over arrays instead of iterators or the like.

I didn't grok a lot of those concepts until I took computer architecture, which was taught in assembly language. And we weren't taught x86, but a toy assembly language designed for teaching.

Another big pain point I had in school is that every professor / TA had different opinions on what was a right and wrong way of doing things in C++. And sometimes their opinions would conflict with the damn documentation too. There's nothing like having to relearn core language concepts every year at the whims of professors. This is probably where most of my disdain for the language has come from.


The chance of you being taught correctly by your so-called "professors" (they are not professors unless they have been appointed to a chair) is vanishingly remote, but this has zero to do with the language.


C and C++ are still horrible languages even if you want to teach those concepts, because of how many footguns they have. That's why Pascal was so popular as a teaching language, historically speaking - it still has pointers and other stuff you need for manual memory management, but it's much simpler and more regular both syntactically and semantically.


No, it has exactly the same issues as C and C++, and some of its own, such as arrays of different sizes being different types. Guess why it isn't used anymore.


Arrays of different sizes are different types in C, as well - this is obvious when you are dealing with pointer-to-array types.

That aside, Pascal removes a lot of UB and other footguns by forcing explicitness for e.g. casts and pointer arithmetic, or providing (verifiably safe) byref argument separately from raw pointers. Strings and arrays are a mess in standard Pascal, which is why everybody used dialects that solved them - most notably Borland's, of course, which was used for a lot of DOS and early Win32 software.

Anyway, I'm not suggesting Pascal specifically today. The point is that C and C++ were never good teaching languages, which is why something else was usually used as one whether we look at 80s, 00s, or today.


>However, there is value in learning some of the under the hood concepts such as

>[...]

>I don't think schools need to teach employable

My university taught the intro CS class in Scheme; years after I graduated they switched to Java, and last I saw it was Python (based on visits back to campus and wandering through the bookstore to see what textbooks were for sale). I just checked and it is still Python, based on the course description ("how to design and implement algorithmic solutions in Python"). I see a few 2xx-level classes are in Java, and after that it stops mentioning specific languages.

Anyway, it's tough since there is pressure to teach the concepts, which argues for certain languages, yet also produce employable graduates, which argues for certain other languages.

Finding overlap is tricky... teaching theory in Haskell, under-the-hood concepts in assembly, and software development (gluing libraries together) in JavaScript/C++ may in fact be the superior approach... but there is fatigue associated with learning languages just to learn more languages, when maybe a nice general language that serves many educational needs is a better way.

Python might be the sweet spot to start out with, and indeed it looks like the 3 intro classes at my alma mater are taught in Python. I'd like to think the driving forces behind this are that 1) Python works well, and 2) using one language for first-year students (well, 2nd-semester 1st-years or perhaps 1st-semester 2nd-years) lowers the mental overhead on the students.

Going heavy on C/C++ early essentially selects for people who already come in with a programming background. Some folks don't get that, or not much of it, in high school and want to enter the field anyway. And I think it is fair for them to reasonably expect, as with every other academic field, that they can do so via the starting curriculum.


I have written maybe one binary search that went into production code in 20 years of software development work. It is absolutely an under the hood concept.

But knowing how it works so that I can leverage the concept efficiently is super important. Knowing what the performance characteristics will be when you put a sorted or unsorted list/array/tuple/whatever-linear-thing together with the functions that search it is not something that can easily be googled.

I agree it doesn't need to be a first year thing, but it does need to be part of a robust computer science education.
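The sorted-plus-binary-search pairing can be sketched in a few lines using Python's stdlib `bisect` rather than a hand-rolled search (the function name here is just for illustration):

```python
import bisect

def contains_sorted(items, target):
    """O(log n) membership test; `items` must already be sorted."""
    i = bisect.bisect_left(items, target)
    return i < len(items) and items[i] == target

data = sorted([42, 7, 19, 3, 88])   # sorting once costs O(n log n)...
print(contains_sorted(data, 19))    # True   ...then each lookup is O(log n)
print(contains_sorted(data, 5))     # False  (a linear scan would be O(n))
```

The leverage is exactly the point above: you rarely write the search itself, but knowing that repeated lookups amortize the one-time sort is what tells you when to reach for it.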


Exactly. And understanding WHY you should prefer using a Struct (or class) instead of a dictionary / hash-map. Can you feel the PAIN of all that additional cost??

The Python / JS world is all dictionaries. Such a developer might never understand why their language runs slower.


I would suggest Pascal or a compiled BASIC dialect to learn those things.

The nice thing with these languages is that they are relatively small but still have these concepts, which makes them perfect for learning.


You should take a look at their upcoming digital logic analyzers too. Sipeed is packaging up some very interesting hardware at cheap prices. There's barely any documentation though, so prepare to get your hands dirty!

