When I worked on an early SoC with a display on the dev-board, our test software didn't do anything with the display. Depending on the last state of the system, the display usually had a bunch of ugly LCD noise on it.
Eventually I got around to implementing a rudimentary framebuffer to send to the display. Wondering what to display, I figured I'd compute a Mandelbrot fractal!
People liked it, and I was happy to have made our dev systems a little less ugly.
I went to work on other things for a few months. When I came back, they had replaced my fractal with the company logo :\
(I suspect, though I don't know the codec, that the JPEG used to store the company logo ended up using more space and more cycles per pixel than my fractal.)
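For anyone tempted to try the same trick, the escape-time algorithm behind that fractal fits in a few lines. Here's a minimal TypeScript sketch; the resolution, the view window, and the flat `Uint8Array` standing in for a framebuffer are assumptions made for illustration, not the original code.

```typescript
// Minimal Mandelbrot escape-time render into a flat grayscale buffer.
// Resolution, view window, and iteration cap are arbitrary choices for this sketch.
const WIDTH = 256;
const HEIGHT = 192;
const MAX_ITER = 64;

function renderMandelbrot(): Uint8Array {
  const fb = new Uint8Array(WIDTH * HEIGHT); // one byte per pixel, 0..255
  for (let py = 0; py < HEIGHT; py++) {
    for (let px = 0; px < WIDTH; px++) {
      // Map pixel coordinates onto the complex plane.
      const cr = -2.5 + (px / WIDTH) * 3.5;
      const ci = -1.25 + (py / HEIGHT) * 2.5;
      let zr = 0, zi = 0, iter = 0;
      while (zr * zr + zi * zi <= 4 && iter < MAX_ITER) {
        const tmp = zr * zr - zi * zi + cr;
        zi = 2 * zr * zi + ci;
        zr = tmp;
        iter++;
      }
      // Points that never escape stay black; the rest are shaded by escape time.
      fb[py * WIDTH + px] = iter === MAX_ITER ? 0 : Math.floor((iter / MAX_ITER) * 255);
    }
  }
  return fb;
}

const framebuffer = renderMandelbrot();
console.log(`rendered ${framebuffer.length} pixels`);
```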
I put googly eyes on the men and women figures on the bathroom signs once.
HR sent out a series of painful-to-read HR-speak emails where they chastised whoever did it, implied just about everything you could imagine ... all while saying nothing at all.
We do, but it's easy for whimsy to go too far. Googly eyes on a poster is cute. Graffiti everywhere isn't.
When one person does it, it's a clever surprise. When many people do it, it quickly becomes tedious and repetitious. It's interesting only when it's done judiciously. It's even worse when it escalates, such as with vandalism, or cruelty, or shock. And one person's "whimsy" can easily become "what, can't you take a joke?"
So I can see why an HR department would want to say, "Yes, that was very cute, the first time. Please don't turn it into a problem."
> HR sent out a series of painful-to-read HR-speak emails where they chastised whoever did it, implied just about everything you could imagine ... all while saying nothing at all.
If HR cares and has time to do that, it's a sign there's fat to trim in the organization. I'm sure your investors would be thrilled to learn that.
Although frustrating and disappointing to hear, that certainly sounds like the perfect sort of opportunity to surreptitiously upload slightly altered versions of the company logo.
Slightly mismatched colors.
Bassic mispelings.
Logo rotated by 4 to 5 degrees.
Overlay a semi-transparent competitor logo for a single frame.
I love whimsy and Easter eggs in our industry, and I add them whenever possible. There is always a reason to take them out (performance, potential user confusion, grumpy devs, whatever), but prioritizing them says a lot of good about the team and project.
In most cases, in my experience, I can tell what generation an engineer comes from based on attitude to whimsy. Younger devs almost always don't like them. Too much effort, too much potential for trouble. Older devs, especially past retirement, almost always like them. This industry started with a sense of humor and fun.
"Newsweek reports on the zombie invasion of New York".
I buried a Konami code in the ad integration JavaScript (no one wants to code review that) which swapped all the stories on the homepage for zombie invasion stories. The authors were all members of the dev and project manager teams. You can still Google it.
I did not get fired, for a couple of reasons. One, I disabled all the ads, so there was no negative brand association. Two, we'd just launched a redesign and our homepage traffic skyrocketed from 45,000 views a day to 750,000 views a day. My personal highlight was receiving a mention on NPR's Morning Edition.
My general manager "scolded" me in public, but privately was thrilled at the several hundred thousand dollars worth of free marketing. I figured I'd hang my hat up there.
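For reference, a Konami-code trigger of that sort takes only a handful of lines of front-end code. Here's a hedged TypeScript sketch; `swapStoriesForZombies` is a hypothetical stand-in for whatever the real homepage swap did.

```typescript
// Hypothetical hook: in the real prank this presumably swapped the homepage
// stories for the zombie ones; here it just logs.
function swapStoriesForZombies(): void {
  console.log("The dead walk among us!");
}

// The classic Konami sequence, expressed as KeyboardEvent.key values.
const KONAMI = [
  "ArrowUp", "ArrowUp", "ArrowDown", "ArrowDown",
  "ArrowLeft", "ArrowRight", "ArrowLeft", "ArrowRight",
  "b", "a",
];

let progress = 0;
document.addEventListener("keydown", (e: KeyboardEvent) => {
  if (e.key === KONAMI[progress]) {
    progress++; // advance through the sequence on a match
  } else {
    // allow an immediate restart if the mismatched key begins the sequence
    progress = e.key === KONAMI[0] ? 1 : 0;
  }
  if (progress === KONAMI.length) {
    progress = 0;
    swapStoriesForZombies();
  }
});
```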
That’s the one! A fun two days of my life tracking all the places it popped up. There was a lot of speculation at the time that it was a marketing ploy by the company, which is definitely not the case. Newsweek was a very old school company run by people not good at or willing to understand the Internet. A shame, because the people who reported to them were full of great ideas.
My first programming-related job was for a company that handled instructor evaluations for colleges. The UI we showed to the instructors when setting up their evaluation questions for their students included a demo table showing some fake courses. Rather than something boring like "Calc 101", "Bio 201", etc, I decided to go with themes/settings from the Alien movie franchise, so there was a table like this:
> Instructor, Course Name, etc
> Ripley, Interstellar Ore Hauling
> Ripley, Terraforming 101
> Ripley, Criminal Justice Reform
> Ripley, Military Bio-Weapons Projects
It stayed up for a few years until my boss made me change it because one of the instructors got confused, complaining that they didn't teach any of these courses.
Exactly... and even worse, the 404 "bot" laden with whatever garbage my teenage brain came up with at 3am back in the 90s: https://glenneroo.org/404.html
The argument that everything got more complex and bugs are harder to find now, so fun is avoided, does seem convincing, but I believe it does not account for everything.
For a philosophical take: the kind of positive effects that easter eggs have, such as joy at work, team cohesion, flimflam - they are not physically measurable. They are immaterial. And the managers who are interested in making a profitable product will be focused on material measures, since at the end of the day that seems to be what's important in the market.
To put it briefly, I believe that the materialistic world-view inspired by natural sciences and inscribed into our economies is not able to deal with more abstract benefits such as fun.
But also black-hat hackers, spammers, and all sorts of other nefarious actors made a lot of the openness that was possible in the 80s and early 90s much less practical.
Things were also a lot more innocent in the old days. I created one of the first (if not first) comprehensive TeX archives on the internet by pointing the FTP server on a math department VAX to the actual live application tree that was in use. I never asked permission to do that. Deploying software generally involved just putting stuff in the system directories. And I had open telnet access to all kinds of machines at various institutions. I can't imagine having things that open in the contemporary world.
Microsoft eventually had to remove Easter eggs entirely, because of government contracts that specified there could be no undocumented behavior in their software.
As I recall, Brian Valentine sent an email to the Windows division and basically said that easter eggs made the product look bad. Customers don't want to be patching their systems constantly (I was there circa 2000, in the NT4/2000/XP era, back when Windows Update was a website you visited and the original NT4/2000 bug-fix delivery mechanism was the service pack and various rollups) and then also hear it contained easter eggs for funsies.
Microsoft also licensed the source out to companies, governments, universities, etc., and it wasn't a professional look. Easter eggs in games - that's fine. Easter eggs in your operating system, that licensees could see - not a good look when the previous product (NT4) didn't have a great stability reputation and Microsoft wanted to pursue businesses with new SKUs like Windows Enterprise and Datacenter, with features like clustering.
BV recognized programmers wanted to insert a little bit of credit so for Win2000 the easter egg replacement was either a link to a webpage (that listed the names of the Windows team members) or a rolling display of those members. Sorry I can't quite remember which it was... I think it was a link to a webpage to keep as much out of the OS as possible.
Remember this was around the time that Excel 97 had an easter egg flight simulator, Word 97 had a pinball game, and Windows NT 4 had one or two easter eggs, maybe more (one was that the OpenGL screensaver would, 1% of the time, draw a teapot at an intersection; the other was a scrolling list of Windows team members, though I don't remember how it was invoked. So due to this I'm 98% sure what he let into Win2000 was a webpage link). By now, the URL is probably long decayed.
> One of the aspects of Trustworthy Computing is that you can trust what's on your computer. Part of that means that there's absolutely NOTHING on your computer that isn't planned.
The closest thing I get to Easter eggs in professional work is leaving references to people or things I like in testing data. Not anything worthy of a spot on The Easter Egg Archive [1], but more plentiful and certainly better than nothing.
I've done this. A long time ago I did QA for a layout product and a lot of the issues were along the lines of "place this square next to this text in this certain way, see the attached screenshot for what it looks like".
The square could have been anything but I had a very cute picture of my toddler so I used a photo of them instead. So now somewhere in a database there are hundreds of reports with a picture of my kid in them :)
I had a coworker who named all the StringBuffer variables "buffy" (as in the vampire slayer). While being a Buffy fan myself, I found this quite annoying.
Systems were simpler, side-effects were better understood and contained.
Now I work on systems that have more complexity in their I/O controllers than the devices I had started on, and no-one understands the full stack. When something goes wrong, the whimsy is the first thing that goes out the window in trying to find the root cause.
Amusingly, an easter egg helped in the investigation of a tricky bug recently. The launch path of Twitter's iOS app logs a "just setting up my twttr" (a reference to the first tweet, of course), and it was quite useful when trying to find the root cause of a particular crash, because the system had silently changed the launch process for apps in an OS update and we could use its presence in the logs to figure out how far along we were in the startup code.
(To round out the anecdote, I’m a performance engineer on the younger side, so we’re not all bad :P)
I think that this is the problem. No one understands what is going on. So instead of trying to find the root cause of anything, they add another tool or another layer. And then we have 45 micro services running on a lot of expensive hardware (somewhere) doing less and with worse performance than what we used to do 15 years ago in a monolith on a single server.
> And then we have 45 micro services running on a lot of expensive hardware (somewhere) doing less and with worse performance than what we used to do 15 years ago in a monolith on a single server.
And yet, the "expensive" hardware of now is cheaper than your single server from over a decade ago if you look at the whole picture... a lot of development trends are based on capitalist incentives:
- Outsourcing stuff like server hardware management to AWS is 1-3 fewer FTEs on your payroll, plus savings on datacenter-related costs (climate control, UPS maintenance, Internet uplink, redundant equipment/spare parts)
- "DevOps" aka "let people who have never heard of Unix cobble together Docker containers and CI pipelines until it works Good Enough" is yet another saving of specialized expert staff (SysOp / SRE)
- "microservices" got popular because it's easier to onboard developers and treat them like disposable cogs if your work packages can be really small, with clearly defined interfaces
> "DevOps" aka "let people who have never heard of Unix cobble together Docker containers and CI pipelines until it works Good Enough" is yet another saving of specialized expert staff (SysOp / SRE)
There's plenty of us old UNIX greybeards working in DevOps. And frankly in most fields of IT you're going to see your fair share of younger engineers because that's literally how life works: people get old and retire while younger people look for work. Moaning that kids don't know the tech they don't need to know is a little like trying to piss in the wind: you might get some short term relief but ultimately you're only going to soil yourself.
edit: It's also funny that you moan about AWS as outsourcing while saying "people don't remember Unix", yet Unix itself comes from a heritage of time-sharing services, which rest on the same basic principles as cloud computing.
If you're old enough you'll eventually see all trends in computing repeat themselves.
I know people who were running commercial sites from an average PC in a spare bedroom over ADSL in the early 2000s.
A bit graphic design, a bit of PHP, a bit of email management, MySQL, a "pay now" page imported from a payment provider, and they were making good money for a minimal startup cost. All the code written by one or two people, often some external help with the graphics, and sometimes the business concept was someone else's idea.
Obviously the services weren't scalable, they didn't have hundreds of millions of users, and they didn't operate globally.
But they didn't need to. And that's still true of many startups today.
We are currently paying quite a bit more than I'm used to, and our throughput is a small fraction. I don't mind managed servers; it's just that we didn't use to need so much of it.
Unix isn't that hard. And Docker containers and CI/CD pipelines open other cans of worms, and since no one seems to understand what is going on under the surface (because no one wants to touch Unix), they just add more monitoring tools and scaling.
But we suddenly need five times as many developers because 80% of the code is just dealing with interfaces and communication and handling race conditions and recovering from failed network calls.
If this web server was running on AWS EC2 with attached EBS volumes then zeroing out a new partition was actually AWS' recommended practice to initialize the disk for performance reasons. EBS no longer requires this.
This is a HN trope that gets trotted out every time a subject like this comes up. It's gone from amusing cliché to just boring and false.
Yes, the hardware was simpler. But the knowledge wasn't.
For example, when I first started with ray tracing in the 1990s, you used something like Turbo Pascal and had to actually know and understand all of the math behind what you were doing. Today it's just #include some random other person's library and you're off to the races.
Today if a developer wants to display a new font, they just add it to the massive tire fire pile of other abstractions they've copied from the internet. Back in the supposedly "simpler" days, you plotted out a font on graph paper, then actually did the math of converting those pixels into bytes for storage and display. And did the extra math to find ways to do it in the most efficient way possible.
The knowledge has changed, but the amount of knowledge hasn't.
Things are much simpler now than they were then. That's why programmers today think it's OK to waste so much of their computing resources. With the supercomputer power we have in our pockets, this should be a golden age of computing, but instead we use that power to feed people's vanity and addictions.
> side-effects were better understood and contained.
If that was true, then retro computer enthusiasts wouldn't still be discovering features and capabilities today.
The first web pages I made I had to type into the OG Notepad. And there were no tools to help, and you had to support 3 resolutions + at least 2 browsers. This was before responsive design (or indeed any kind of HTML/CSS design practices) or media queries, so for each page of content I had to write six different pages. There were no guides to help with understanding Netscape vs. IE, no 24-hr tech updates/phones (so you'd take a vacation for a week, IE would update, and you'd come back to sad emails about how the site is broken), etc. Maintenance and upgrading was a nightmare.
Luckily, we quickly stopped doing that, but the idea that things were easier back then is because people learned from what we were doing. In 20 years, people will talk about how easy it was to make things in 2022.
You're arguing different points on the same broader topic.
The GP was talking about how much of the stack we don't know. I.e. if something fails in an abstraction underneath what we develop in, then we're often fscked. And there are so many layers to the stack now that debugging the entire stack has gotten much harder -- this is a true statement.
However, you are talking about the barrier to entry in software development. It has gotten easier. This is also a true statement, but it doesn't make the former statement untrue.
By making it easier to write higher level code we end up obfuscating the lower layers. Which makes it harder to inspect the lower layers of the stack. So it's literally both simpler and more complex, depending on the problem.
> > side-effects were better understood and contained.
> If that was true, then retro computer enthusiasts wouldn't still be discovering features and capabilities today.
This is a grossly unfair statement because you make a claim for one side of the argument and, without comparing it to the other side (i.e. are we still discovering features and capabilities of modern systems?), draw the conclusion that the original statement is false.
So let's look at the other side of the argument: in fact, people are routinely finding optimizations in modern systems. For example, you often see submissions on HN where hashing algorithms, JSON serialization, and the like have been sped up using ASM tricks on newer processors.
Another example is some of the recent Rust code released that outperforms their GNU counterparts.
It is also worth noting that modern hardware is fast, so people generally optimize code for developer efficiency (a point you made yourself) rather than CPU efficiency. So fewer people are inclined to look for optimizations in the capabilities of the hardware. However, once the current gen becomes "retro", you might start seeing a shift where people try to squeeze more out of less.
I think what you say is true in a way, but it also gets to the difference between simple and easy. Pulling in a library that does a lot of heavy lifting is definitely easier. The resulting system most likely isn't gonna be simpler, though. You are now taking on a lot of code you don't understand, and there is probably also lots of code that you aren't even using. This has very little downside until something goes wrong. Is it going wrong because you are using the library wrong or because the library has a bug? For the majority of cases this isn't gonna be a big deal and the answer is gonna be obvious, but the weird edge cases are gonna be what wakes you up at night.
> Systems were simpler, side-effects were better understood and contained.
I'm going to disagree, here. Things were as difficult as they ever have been since. Systems and side-effects are better understood only in hindsight.
Coding in assembly because higher-level languages weren't invented yet. Coding in a text editor because IDEs were not a thing. Searching linked lists because what's a database, not to mention SQL. Needing to keep things lean because using 32 KB was an ungodly waste.
In that light, things are easier than ever.
One can be both professional and whimsical. In fact, I'd argue that true mastery will only come with a sense of fun and interest in your craft.
Applying whimsy can be an engineering decision. Using 418 when any error code will do, for example, if it affects nothing else.
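As a concrete illustration of that point, here's a minimal TypeScript sketch of a Node HTTP handler that answers one route with 418 instead of a generic 4xx; the route and the message are made up for the example.

```typescript
import * as http from "node:http";

// A toy server: requests to /coffee get refused with 418 instead of a generic
// 4xx, purely because it affects nothing else and amuses the team.
const server = http.createServer((req, res) => {
  if (req.url === "/coffee") {
    res.writeHead(418, { "Content-Type": "text/plain" });
    res.end("I'm a teapot: this endpoint refuses to brew coffee.\n");
    return;
  }
  res.writeHead(404, { "Content-Type": "text/plain" });
  res.end("Not found.\n");
});

server.listen(8080, () => console.log("listening on :8080"));
```

Clients that only check for a 4xx still behave correctly, which is the point: the whimsy lives entirely in a status code nobody depends on.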
So what? If your tech stack is worth its salt, those layers will be mostly removed at compilation. Reductions in performance are often due to developers mixing up their layers of abstraction and repeating work among several of them, rather than to having several layers adding more powerful abstractions.
It's good that a developer has cursory knowledge of how those layers work (e.g. to avoid re-adding checks at the upper layers for something that is already guaranteed by the lower ones), but there's no need to be an expert at all of them.
(tl;dr: the problem is not the abstractions; it's the current standard engineering practice of bringing in the full layer for each abstraction instead of just the few parts you need to solve your problem.)
I'd say something like Docker containers, running a Wasm engine, targeted by a lean library with a sane high-level programming language - either a standard-looking imperative language like Haxe, or some esoteric opinionated functional thingie like Julia or Elixir. (It could also be something like Vue, Angular or React, but those aren't exactly 'lean', being specialized in working with the full browser DOM and web servers.)
Each layer abstracts away the lower levels (virtualization & compartmentalization to run on any hardware, bytecode to run on any OS or browser engine...), allowing you to potentially use it without being tied to one specific implementation of that layer.
Higher layers provide increasingly more complex and powerful abstractions, with standardized code that's been created by experts in the field with efficiency in mind, and debugged through hundreds or thousands of client projects; making them likely more performant and robust than anything a single development team could ever build on their own (except for extremely limited domains run on dedicated hardware).
And ideally they have the plus side of working side-by-side with other applications or libraries, running in the same system without being engineered to work only with a single "official stack", allowing you to mix-and-match the best abstraction for each problem being solved, instead of forcing you into a single solution for all your software. That level of flexibility (plus the simplification of the layers below) is worth the runtime penalty imposed by several piled-up abstraction layers. That's why we don't code everything in assembly anymore.
These were all solved problems in the 90s already - we had plenty of high-level languages, including those with integrated IDEs and even databases (dBase comes to mind!). SQL was already around. 32 KB was something from another era.
> Systems were simpler, side-effects were better understood and contained.
But debuggability was far harder, you could not do it remotely, and there was no Stack Overflow etcetera to help you.
Example: Installed in another country is an embedded controller talking RS232 to devices. The controller is resetting, probably due to a watchdog timeout. The controller software uses a custom RTOS with no spare hardware timers, and it is hard to get logs back, which are restricted to a few kilobytes in size. You make educated guesses as to what is happening, but they keep failing. Many weeks of work later, you design your own process-profiling technique using a hardware logic analyser, and from that you find the root cause. Easily over a man-month of work.
> In most cases, in my experience, I can tell what generation an engineer comes from based on attitude to whimsy. Younger devs almost always don't like them. Too much effort, too much potential for trouble. Older devs, especially past retirement, almost always like them. This industry started with a sense of humor and fun.
Yea man, because most of us are just trying to get through the career part of our life to make money and retire. Save your whimsy for your hobby projects. Nobody wants to debug production code with subtle easter egg behaviors.
You don’t need to invest inordinate time and energy to create cute little things at work. But this is your life! Don’t just “try to get through” parts you don’t like. Enjoy it if you can. You’re not some corporate koolaid-drinking drone if you try to have fun at work.
Sooner or later, one person's subtle easter egg becomes another's mission-critical behavior. Speaking solely and strictly for myself, I do not enjoy having to support niche undefined behavior because someone thought it was cute.
Unpopular opinion, but I'm very much against Easter Eggs, at least in SeriousProjects™. Hear me out. When I was younger, I used to find them funny and lighthearted, but after years of having to debug other people's unnecessarily clever code, undocumented code paths, and corners of applications that had absolutely no reason to exist, I've changed my mind. Even though none of what I had to debug were technically Easter Eggs, they were unnecessary, and thus time spent working on/around them was wasted.
The best reason to not add Easter Eggs is simply: they add unnecessary (albeit tiny) risk to a project. Technical risk and schedule risk. They are not part of the software's requirements, and they are likely going to be undocumented. Every line of code we add increases risk. Part of our jobs as developers is ensuring every line we add contributes more value than risk. Easter Eggs don't do this. Sorry to be Mr. Serious Raincloud, but please leave Easter Eggs for personal, non business-critical software.
In well-architected code, easter eggs are free. Removing them is trivial, but you don't need to because they don't cause problems. They add value, not only because they're great, but because they force you to make your system more extensible than it otherwise would be – but if they break, no harm done. (You might even get bug reports from users, if they stop working! But the users won't mind, because the easter egg isn't why they were using the program.)
> Unpopular opinion, but I'm very much against Easter Eggs, at least in SeriousProjects™.
OP here. I don't completely disagree with you. As I said, there are always reasons to take them out, and almost always, those reasons are excellent.
A well-executed Easter egg says something good about the team and the project, though. A team that has the time, resources, and capability to build something neat or cool or (pleasantly) surprising is a team that is unstressed and competent.
To be clear, I'm not discussing something like adding a clown giggle to flight control systems. I'm talking about when there are two equivalent paths, neither objectively worse than the other, I prefer when the team follows the path that adds whimsy and fun to the project over when they decide boring is better.
A team that will at least consider adding it will inevitably be more efficient than a team that absolutely will not consider it because they fear extra work.
We all hate debugging unnecessary and pointless code.
But over the very LONG TERM, projects I see with Easter Eggs tend to have had developers who cared more about the project and were having fun, which leads to an overall higher code quality. (Your own experience may be different.)
Fun! In one of my early roles, there was a script passed around named `cat-art` that included some images that are presumably references to this - see https://pastebin.com/hhv0nXsm
Yes, in fact their first frame looks identical to the second-to-last one there, down to the specific placement of the characters and the characters themselves.
At risk of stating the obvious, drawing a flowchart, which is ostensibly just drawing boxes, is a whole different level of difficulty from drawing pictures.
I will grant you that there are tools that will generate ASCII art using letter shading (based on the number of pixels each letter lights up), but that is very different from using the characters themselves to draw the shape of the image (like with emoticons). That still relies on raw human creativity… well, maybe AI could do it now, but that wasn't around 15 years ago when Win 8 was in development, let alone back when those images were created.
That particular site has some quite interactive 'painting/drawing' features (select the 'freeform' option, press a key to 'paint' with that letter using the mouse), which is the type of thing I remember from similar ASCII art programs 'back in the day' :).
No matter how useful the tool, it's still the artist that makes the art, of course. But the tools can make a real difference in how creative someone can be - I doubt much of the really creative ASCII art is/was made with Notepad.
My point was simply that there are tools out there that really can approach that experience of 'doodling' with ASCII, and I thought some people might not have been exposed to that before.
TIL: Performance team are a bunch of heartless cat haters. /s
These sorts of articles are awesome though. I love hearing all the stories that go on behind the scenes of huge projects like this that otherwise never see the light of day.
He references the same blog, and goes into some pretty incredible detail about the history of what happened to pinball coming pre-installed on Windows.
Younglings may not remember, but Mozilla browser used to include a kitchen sink[1], as a joke on the number of functions that were added to it as an internet suite beyond being just a browser[2].
There is still about:robots though, which has been maintained and updated in the new style over the years.
And of course, about:mozilla is still present.[0]
[0]: The Beast continued its studies with renewed Focus, building great Reference works and contemplating new Realities. The Beast brought forth its followers and acolytes to create a renewed smaller form of itself and, through Mischievous means, sent it out across the world.
Huh, reading the first half I almost expected them to have a debugging system where the cats on the fence were composited by each layer drawing only part of the scene on top of things below (fence, then one cat, then the other etc.) so you'd see different things if different parts failed.
AFAIK SAGE computers had diagnostic programs that were written in such a way that they drew a "pinup girl" on the console (there was supposedly another, more risque animated image too).
If there was a malfunction, the graphics would be malformed.
This sort of strategy feels to me* like it would be ideal in a gamedev sort of environment - considering the nontrivial practical distance between each of the layers in this scenario, I can totally see each layer occasionally showing up out of alignment, or rendered in the wrong font, etc. My (very abstract and non-practical) impression is that the rendering pipeline in a game is typically much tighter/cohesive than an entire desktop shell.
* I could be wrong; I'm just spouting naive hot air with no practical experience at either end of this spectrum :)
I can only aspire to this level of beautiful whimsy - it's unfortunate that it triggered a codepath that could be measured, because that is by far the easiest excuse to remove something beautiful and rarely seen. The closest I've managed is:
$ grep -o --text U+1F4A9 vmlinux
U+1F4A9
and while there may be an argument that this is whimsical, I don't think there's an argument that it's beautiful.
(U+1F4A9 should be the literal Unicode PILE OF POO glyph, but HN strips it, so)
The fact that I've since encoded the "UUID" that this is contained within into other codebases, such that Linux can't remove it without breaking some deployed u-boot builds, is definitely something I'm proud of.
Your mention of UUIDs reminded me that the GUIDs for Microsoft Office products in the Windows Registry all end with '0FF1CE' - that's a nice Easter egg, I think, that doesn't impact anything.
I use ABAD1DEA somewhere in UUIDs in experiments/examples & proof-of-concept code/data. And for a while one was floating about in production, where it would appear in URLs for anyone who cared to look¹, because someone else took a bit of my PoC work and shoved it where it didn't yet belong without review…
[1] not just in the middle of a sea of otherwise arbitrary hex either, something like 00000000-0000-ABAD-IDEA-000000000000 so it was easily visible as a “test” value.
Oh absolutely - as far as the code is concerned they're just bytes, the fact that interpreting them as UTF-8 is entertaining is not something the kernel cares about at all.
Somewhere there is a Windows shell programmer who debugged one too many "not even a crash" shell issues. He's got a 1000 yard stare and all he can say about it is he "saw the cats". Seeing a cat on a fence to this day stops him cold.
Microsoft likely didn’t create that art. So I suspect a bigger reason for removing it was licensing, with “optimisation” being a strong excuse for not spending resource on creating original assets.
I *think* this was part of the shell layer. Which is why it's fascinating how the performance team wanted to prune it out; presumably the fonts were mapped into memory elsewhere and thus deduplic...oh, maybe Windows doesn't have mmap page dedupe.
NT's VFS cache does a decent enough job at sharing pages to actual data for the same file, even between kernel space and user space; the problems start around FS metadata caching.
AFAIK stuff like .NET's WPF and the GPU-accelerated DirectWrite do font rendering in userspace. It's ye olde GDI that is in the kernel, because NT didn't run well on a 486[1]. Yes, Windows has about 5 different ways to render fonts, and no, they don't seem to render fonts the same way.
(In the case of this kernel text mode / BSOD code, I do wonder does it use the GDI font rendering, or does it have its own separate mini renderer just for those screens?)
I don't know anything about DirectWrite, but there seems to be a DirectWrite kernel component due to CVE-2020-1577 (DirectWrite leaking kernel memory "by convincing a user to visit an untrusted webpage", which I'm not sure whether the exploit route would be in TTF/OTF processing, e.g. due to webfonts, or the text rendering itself).
I mean, in recent versions of Windows it's now in user space, but because Microsoft retains bug-for-bug compatibility, Windows 8 will be stuck with a kernel font rasteriser*
* DirectWrite and WPF have always been user-space.
I thought that was a behavior cats just exhibit in real life. In my apartment, there's a gap between the top of the wall and the ceiling. My cat frequently climbs up there. Pretty easy to imagine he'd do the same thing on fences were he outside.
You can make ASCII art with a variable-pitch font; you only have to do it once and it's static after that. Use a font that was going to be rendered anyway.
I'll bet if you rendered those screens as 2-color GIFs, the compression would make them similar in size to the ASCII text. Then you wouldn't need fonts.
ahaha, excellent. I wrote a similar script, displaying on my terminal a cat looking over a fence, eyes flicking from side to side while waiting for a certain web service to become available, at which point the cat would be gone.
the waiting game is real for most developers/coders.
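In the same spirit, here's a rough TypeScript sketch of such a waiting-cat loop; the service URL, the two ASCII frames, and the one-second poll interval are placeholders, and it assumes a runtime with a global `fetch` (Node 18+ or a browser).

```typescript
// Poll a service and redraw a cat peeking over a fence until it answers.
// The URL, frames, and interval are placeholder values for this sketch.
const SERVICE_URL = "http://localhost:8080/health";

const FRAMES = [
  "  /\\_/\\\n ( o.o )   waiting for the service...\n=========",
  "  /\\_/\\\n ( -.o )   waiting for the service...\n=========",
];

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function waitForService(): Promise<void> {
  for (let tick = 0; ; tick++) {
    try {
      const res = await fetch(SERVICE_URL);
      if (res.ok) break; // the service is up: the cat can go
    } catch {
      // not reachable yet; keep waiting
    }
    console.clear();
    console.log(FRAMES[tick % FRAMES.length]); // alternate frames so the eyes "flick"
    await sleep(1000);
  }
  console.clear();
  console.log("Service is up; the cat has wandered off.");
}

void waitForService();
```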
It's great to see that real human beings with passion and interests are developing Windows at MS. Whereas Google has no human face. People have their accounts closed and don't get any response from a human being on a support forum.
It's more likely to be a reference to Goodnight Moon. The book has cats at the end saying 'Goodnight, Moon!'
Also there is no point in the series where Luna and Artemis actually stare at the moon together in this fashion, and the cat on the left is wearing a collar. Neither Luna nor Artemis is collared.
...
I was a six year old girl when Sailor Moon premiered in America. I know WAY too much about that show.
> You see, ASCII art uses a monospace font, and the cats screen was the only part of the startup sequence that used a monospace font. Drawing the backstop window was forcing the rasterization of an entirely new font, which was costing time and memory for something that shouldn’t ever be seen in the first place.
Wow, I never thought an entire operating system could be crippled by font load times. Must be a problem I'm too linux & vector font to understand.
It's bitterness from having your dreams crushed. I used to read Raymond's blog and share his passion for correctness and efficiency, but it seems like mockery now.