I’m sure a lot of this stuff is basic to people here, but as a software guy I’ve found his channel invaluable for understanding how computers work at the electronics level, and it's also just wildly fascinating.
YouTube has been my _sole_ video source for years and I'm still discovering incredibly high quality creators for very specific topics.
Smarter Every Day
... In no particular order.
YouTube is great.
What fascinates me is that, even with a full design under my belt which works in simulation, computation remains somewhat magical to me.
There is a much later Zuse design built from discrete transistors, the Z23, in the collection of the Computer History Museum.
I followed along with the clock module, and when I was done I had a Matrix-style "I know kung fu" moment: it was all crystal clear. And the series doesn't gloss over important details (like dealing with flaky inputs that would cause extra clock pulses), so even though I still don't know all the math needed for electronics, I'm far more confident in making my own stuff!
It’s such an accessible hobby thanks to the abundance of online information/ideas, cheap components, and simple tools. Plus you learn so much even with the most basic of projects, and considering how much our world now depends on electronics and technology, I think it’s practically useful as well as simply entertaining to learn this stuff.
Great Scott, https://www.youtube.com/user/greatscottlab
Andreas Spiess, https://www.youtube.com/channel/UCu7_D0o48KbfhpEohoP7YSQ
Edit: asking because there are so many to choose from
IMO you should try to find a really simple computer to do first. The advantage is that you have other people's software to run, which makes for a nice integration test. My first time I did the original Apple and had it run the monitor program (which is nice because you can just keep adding opcodes, going up from the beginning of the program, until it works). For the Z80, I think maybe the ZX80 is the one that was pretty much just a Z80, a ROM, a RAM chip, and some glue logic?
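For what it's worth, the same trick carries over to emulation. Here's a purely illustrative sketch (Python, with a made-up placeholder handler table rather than any real instruction set) of a loop that runs until it hits an opcode it doesn't know and reports it, which is exactly the "what do I implement next" signal you want:

```python
# Illustrative bring-up loop: execute until an unimplemented opcode is reached,
# then report it -- that is the next instruction to add. The handler table here
# is a made-up placeholder, not a real instruction set.
def run_until_unimplemented(memory, handlers, pc=0):
    while True:
        opcode = memory[pc]
        if opcode not in handlers:
            print(f"unimplemented opcode 0x{opcode:02X} at 0x{pc:04X}")
            return pc
        pc = handlers[opcode](memory, pc)

handlers = {
    0xEA: lambda mem, pc: pc + 1,  # pretend one-byte NOP; add real handlers as you go
}
run_until_unimplemented([0xEA, 0xEA, 0xA9, 0x00], handlers)
# prints: unimplemented opcode 0xA9 at 0x0002
```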
I am still catching up with it because I haven't put much time into it yet, but so far I have completed up to and including day 5 of AoC 2019.
For me, this is the first time I have implemented an emulator, and what I like about this imaginary computer is that it is so simple that I was able to build what I have built so far without relying on any external resources.
The fun part is working through it yourself, of course, so even though I am tempted to link to my implementation on my GitHub, I will instead only leave a link for AoC 2019.
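For anyone who just wants a feel for how small the core of that machine is, here is a minimal, generic sketch of the publicly documented day 2 subset (opcodes 1 = add, 2 = multiply, 99 = halt, position mode only); day 5 layers input/output and parameter modes on top of this. It is deliberately not anyone's actual solution:

```python
# Minimal intcode core: opcode 1 = add, 2 = multiply, 99 = halt (day 2 subset).
def run_intcode(program):
    mem = list(program)
    ip = 0
    while mem[ip] != 99:
        op, a, b, dst = mem[ip:ip + 4]
        if op == 1:
            mem[dst] = mem[a] + mem[b]
        elif op == 2:
            mem[dst] = mem[a] * mem[b]
        else:
            raise ValueError(f"unknown opcode {op} at position {ip}")
        ip += 4
    return mem

print(run_intcode([1, 0, 0, 0, 99]))  # -> [2, 0, 0, 0, 99]
```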
As a software guy currently doing the Embedded Systems course on edX, Ben's videos have really helped solidify what I've read in the course.
A big example was tri-state drivers: after reading about them in the course they hadn't quite clicked, so I searched YouTube and came across Ben's video. He explained the concept so incredibly well that it clicked immediately, and I've found his channel a good companion for the course.
It differed from my classes in that Eater actually deals with the minutiae of working with real electricity — debouncing, clock edge detection, and so forth — because he's building his CPU in the real world on breadboards. That was interesting for me to see.
Here's the first video: https://www.youtube.com/watch?v=LnzuMJLZRdU
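For a software analogue of the kind of thing he has to deal with there (my own rough sketch, not something from the videos): treat the input as debounced only after it has been stable for a few consecutive samples, and count a clock pulse only on a clean low-to-high transition of that debounced value.

```python
# Rough analogue of debouncing + rising-edge detection: a pulse is counted only
# when the raw input has been stably high for `stable_for` samples after having
# been stably low, so contact bounce can't generate extra clock pulses.
def count_clock_pulses(samples, stable_for=3):
    pulses = 0
    debounced = samples[0]
    run_value, run_length = samples[0], 0
    for s in samples:
        if s == run_value:
            run_length += 1
        else:
            run_value, run_length = s, 1
        if run_length >= stable_for:
            if run_value == 1 and debounced == 0:
                pulses += 1  # clean rising edge
            debounced = run_value
    return pulses

# One bouncy button press (0 -> 1 with chatter) registers exactly one pulse:
print(count_clock_pulses([0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 0, 0]))  # -> 1
```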
Related from a few months ago:
> I built a programmable 8-bit computer from scratch on breadboards using only simple logic gates.
Sorry, but this is slightly incorrect. First, he is using ready-made adder ICs in the ALU, which are not "simple logic gates"; second, he seems to use a ROM in the control logic, which replaces the tens or hundreds of logic gates one would have to use if they didn't have a ROM and a programmer.
If I were building a CPU, I would make a 1-bit ALU and shift the data through it bit by bit to save ICs, because a 1-bit ALU requires 4-5 simple ICs while an 8-bit ALU would require eight times as many, and that's too much (a rough sketch of the bit-serial idea follows below).
Next, I would use 4-bit registers instead of 8-bit ones, because an 8-bit register requires two ICs while a 4-bit register can be implemented with one. It also reduces the number of wires and the amount of work to connect them. This makes the CPU slower, though, because it has to access memory twice as often.
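To make the bit-serial idea concrete, here is a small illustrative sketch (in software, so it says nothing about IC counts, only about the dataflow): a single 1-bit full adder reused for every bit position, with the operands shifted through it one bit per step and the carry fed back around.

```python
# Bit-serial addition: one 1-bit full adder reused for each bit position,
# operands shifted through it LSB-first, carry fed back between steps.
def full_adder(a, b, carry_in):
    total = a + b + carry_in
    return total & 1, total >> 1  # (sum bit, carry out)

def add_bit_serial(x, y, width=8):
    result, carry = 0, 0
    for i in range(width):  # one step per bit instead of one wide adder
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result & ((1 << width) - 1)  # wraps like an 8-bit register

print(add_bit_serial(100, 55))   # -> 155
print(add_bit_serial(200, 100))  # -> 44 (300 wraps modulo 256)
```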
Also, I was surprised to read in the comments that at some universities students design CPUs as a practical project. Must be interesting.
But mostly, I like to crap on other people's accomplishments.
Perhaps I should look to see if any of my old books are still at my Dad's house. This one is going for nearly $1k on Amazon. Not sure how a book from 1981 can be sold as "new" but wow.
In my experience, it's unusual to see this sort of anti-accessibility from someone who knows the low level well enough to build CPUs successfully, but it's unfortunately getting more prevalent. The others I'm aware of, http://www.homebrewcpu.com/ and https://www.bigmessowires.com/bmow1/ , have far more accessible sites.
Edit: looks like others here think accessibility is not a concern anymore. That's really disappointing, especially for low-level stuff.
What about the site violates accessibility? JS-generated sites should be usable with screen readers. You can violate accessibility, but you can do that without JS too.
Lamenting that it is used when there is no need for it seems reasonable to me. The grandparent even made an effort not just to offer destructive criticism but also to suggest constructive alternative approaches.
* The header image, saying “Ben Eater”, has no alt-text. That would be fine, except the letters are individual SVG paths.
* Likewise, the social media sharing buttons also have no alt-text. A screenreader will just see four images with links at the beginning of the page.
* The YouTube videos are constructed out of multiple links without text each, including SVG images.
* The YouTube video links open into the same window, harming usability. The "target" attribute should only really be used in sites built out of frames, and this is not one of the exceptions.
* My screenreader-type program, in "just read the proper text" mode, misses half of the headings. Outside of this mode, I have to sit through over a minute of drivel before the content, because there's no skip-nav link, and then again for the video links. (This one doesn't quite count, because it's 'cause my screenreader-type program sucks and ignores aria-hidden when I set it to "all". But the lack of skip-nav is an issue.)
I got bored at this point, but I'll list some other issues I spotted while looking through the dynamically-constructed DOM:
* Identical SVG images are copied-and-pasted throughout the file, but with different CSS styles – some browsers might waste time re-rendering.
* The DOM contains, no joke, eight consecutive </div>s. The removal of some of these wrapping <div>s makes no perceptible difference to the page.
* Mixing and matching semantic and non-semantic HTML tags, confusing certain "reader mode" tools.
And it turns out that this page is actually multiple pages, with no machine-accessible links between them, bundled up into one file stretched across multiple URIs. This is not the proper way to handle caching the next page. I'm struggling to articulate how bad this is.
When I click on the "kits" page, random loading animations partially obscure the top of some of the paragraphs as it tries, and fails, to add some kind of inline purchase widget. There's significant DOM bloat, and displaying the page takes up an entire core of my laptop.
This is the only computing device I own capable of rendering the page in less than five seconds, even with my very fast network connection. Normal websites with an order of magnitude more writing, plus images and links, take an order of magnitude less time to load than this.
This website is not good. But it's nowhere near the worst out there – in fact, this would probably be in the top 60% of pages I use regularly, if I did.
I fixed a number of the low-hanging a11y issues that you identified.
As for using modern web tooling, keep in mind there are many business and technical tradeoffs that go into any engineering decision. In this case, dynamically rendering the content makes it easier in the future to require aspiring hardware engineers to pass a test demonstrating knowledge of WCAG 2.0 standards before unlocking any educational hardware content.
… Wait, those changes applied to all of the pages‽ I'm starting to finally see the appeal of those web frameworks.
I know that I have stopped caring about "accessibility" almost entirely beyond the most obvious things. Most of these complaints are either totally inane and unnoticeable, or will affect so few people over the site's lifetime that I can count them on one hand. I think it's good for people to try to make their websites as accessible as possible, but there's a limit to how hard you can indignantly demand people bend over backwards for you.
The list contains “inane and unnoticeable” issues because I was checking it manually, only spotting what came to mind. This is because external accessibility tools cannot check this website for accessibility issues, because they cannot access it.
I did not pick up on, for example, the issue of content flashing up on the screen in distinct parts, hitting the "three flicker limit" heuristic for seizure safety. Sure, that's only going to affect very few people, and it's fairly low contrast compared to other sites, and on a high-end gaming PC this would all occur within a single frame (making the point moot), but it's still something to consider and attempt to minimise.
Personally, I'm more concerned with the bandwidth; the fact that the page makes requests to Google's CDN for images and fonts that should be hosted on the same domain (and ideally provided on the same connection); and the ridiculous amount of processing my browser has to perform to draw the page. There are many more issues with the page than just a11y.
I'm building a 68000 system now for fun. I know I'll never really understand my desktop, so I'd like to have at least one computer in the house that I really fundamentally understand.