
The Thirty Million Line Problem (2018) [video] - breadandcrumbel
https://www.youtube.com/watch?v=kZRE7HIO3vk&feature=youtu.be
======
zamalek
It takes "17 million lines of code in Thing X to do Activity Y" is patently
wrong. Many of the lines of code in Linux is dedicated to drivers, drivers
which are likely not even loaded. Many more lines of code may be unused this
specific scenario, if you are simply reading on WordPress, you aren't using
the admin code at all.

The reality is that a minuscule fraction of those lines of code is involved in
the activity. You can write an OS and stack that achieves one very specific
task (IncludeOS), but then it can only perform that one task. Once there is
'no wasteful code', how would he record his weekly screencast, or edit it,
or transcode it, or upload it, or view it? Everyone would only be able to view
text files.

These systems are this complex because there is demand for them to be. This
really seems like tilting at windmills.

~~~
smadge
I think that's exactly the point. There are millions of untestable lines of
driver code which may or may not be run depending on your users' hardware
configurations. There could be bugs or vulnerabilities sitting in those lines,
and there is no way to test all the cases. I didn't watch the whole thing, but
I think the talk proposes a standard ISA which eliminates the need for
drivers, so your code literally runs on bare metal.

~~~
zamalek
Smells like a leaky abstraction - it's just kicking the can down the road.
What if you have novel hardware that is unsupported by this ISA? Say, an ML
accelerator or a quantum coprocessor.

~~~
kortex
Exactly this. The speaker's claim to fame is building a game from scratch.
Based on his episode guide titles, he relies on Windows, C++, DirectSound, and
OpenGL, just to name a few, not to mention human interface devices, to achieve
this in a remotely reasonable timeframe.

Yes, he's got a point about over-relying on layers of abstraction, but that
abstraction was written with untold millions of dev-hours across thousands of
organizational structures and independent individuals. The only way to get
pieces of such disparate pedigree to operate together is abstraction and APIs.
You have to bottleneck and standardize complexity to logical boundaries or
else everyone ends up rewriting their OS to ship with their game. A
standardized ISA wouldn't fix this because you still need standardized OS
ABIs, hardware wire protocols, network protocols, media write protocols.

His argument boils down to "get all engineers to agree upon an ISA, OS, SoC,
hardware interface". As a start, he suggests "Simplify the NIC, SATA, USB and
video interface". Yeah, ok, I'll believe that can happen when engineers agree
upon bike shed colors reliably.

When you're basically the only person in the stack, it's easy to naively say
"just standardize everything!" but as anyone who has been to any
organizational meetings knows, getting humans to agree on ANYTHING is an
involved process.

~~~
AnIdiotOnTheNet
> The speaker's claim to fame is building a game from scratch.

Handmade Hero may be what he's known for in the general population, but Casey
Muratori has been in game development since games were written directly to
ABIs. He worked on WinG, he worked at RAD Game Tools, he's probably got code
in more computer games than you can even name.

In other words, he has a pretty damned good idea what he's saying.

~~~
zamalek
Being famous and having credible experience does not automatically make you
correct on all matters.

~~~
detaro
Nobody is claiming that.

~~~
zamalek
You're right. No claim was made that he was _correct_ on all matters. However,
that comment provided no substance beyond a list of credentials.

The implied claim is that he is an expert on all matters related to
programming, yet none of those credentials have anything to do with OS, ISA,
or hardware development. It is certainly a claim that he is more of an expert
than anyone here, which is absurd, because HN is not a uniform sample of human
beings, much less the comments section. People who care to comment on posts
like this are passionate about what they do, meaning that they are most likely
very good at it.

He might be able to design the best computer system for a narrow scope - which
has already happened in some respects: we have Vulkan, which he very curiously
avoids, instead opting for the heavily abstracted OpenGL.

His opinions are valuable and worth listening to. Others' opinions are not
worthless merely because of his CV and social network status.

~~~
detaro
The comment it responded to can be read as trying to use his credentials/"claim
to fame" to discredit the argument as coming from someone who doesn't
understand what they are building on. Additional credentials suggesting that
this person does know what they are talking about seem like substance to me.

~~~
zamalek
Once again, you're right.

------
otakucode
So it's not just me. I've considered developing a toy OS as a fun project for
a long time, but I was always mystified about what to do about graphics. The
whole reason I wanted to do an OS was to do a sort of 'start fresh' thing
where I looked at the hardware we have now and designed for THAT, instead of
designing to support 1970s-era software. The first step: if it's a 4K
monitor, the first graphics mode is 4K, not 80x25 text. Fonts are vector by
default, not raster. Graphical compositing of the display, even the
initial 'text mode' terminal, would be rendered on a 3D isometric projection
surface using a hardware-accelerated graphics card. And how do you do that,
exactly? Even if you are willing to limit yourself to supporting a single
nVidia or ATi card to start? Ah...

~~~
phaedrus
A similar situation applies to 3D graphics for game-engine programming. You'd
think that if a piece of hardware can run a AAA game with impressive graphical
effects at 60 FPS, it should be able to run your simple indie game engine with
simple graphics for proportionally less development effort. But no, it turns
out that unlike CPUs, where we got so many "speed-ups for free", most of the
things that make the AAA game able to run so fast with so many effects are
bespoke, opt-in features.

(The following is based on my experience learning from the Irrlicht engine.
YMMV.) You find a random N-year-old basic OpenGL tutorial and it doesn't run
N/1.5 iterations of Moore's law faster now; it still runs inexplicably slowly
(compared to what you know that hardware should theoretically be able to do,
given modern games). You have to figure out: oh, it's not using GPU memory for
the meshes, how do I tell the driver to do that? Fixed that, but wait, I'm
using immediate mode, retained mode would be faster. Wait, is retained mode
going away, what's replacing it? So I gotta learn shaders now too?
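
To make that concrete, here's a minimal sketch of the kind of rewrite involved
(assuming a desktop OpenGL context and a GLEW-style loader are already set up;
everything except the standard GL calls is made up for illustration):

    #include <GL/glew.h>

    /* Legacy immediate mode: every vertex is pushed through the driver
       on every frame, so GPU memory and parallelism go mostly unused. */
    void draw_immediate(const float *verts, int vertex_count) {
        glBegin(GL_TRIANGLES);
        for (int i = 0; i < vertex_count; ++i)
            glVertex3fv(&verts[i * 3]);
        glEnd();
    }

    /* Retained mode: upload the mesh into a GPU-side buffer once... */
    GLuint upload_mesh(const float *verts, int vertex_count) {
        GLuint vbo;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, vertex_count * 3 * sizeof(float),
                     verts, GL_STATIC_DRAW);
        return vbo;
    }

    /* ...then issue one draw call per frame against that buffer. */
    void draw_retained(GLuint vbo, int vertex_count) {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, (const void *)0);
        glDrawArrays(GL_TRIANGLES, 0, vertex_count);
        glDisableClientState(GL_VERTEX_ARRAY);
    }

Same triangles either way, but the tutorial-era version leaves most of the
hardware idle, and nothing in the API warns you about it.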

The problem is that the APIs are terrible and the drivers are buggy. The levels
of abstraction keep being chosen all wrong: too low-level for the library or
drivers to give user code "speed-ups for free", and too high-level for a power
user of the API to tell directly what's happening and profile what's making
things slow.

A better model would be to break the high-level and low-level into two layers,
like the Clang / LLVM split in compilers. The high-level API would be defined
by an organization like the OpenGL working group, which could also provide a
reference implementation of the front-end (corresponding to Clang or another
compiler front-end in this analogy). The hardware vendors would just provide
the low-level implementation (corresponding to LLVM).
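
In code terms, the split might look roughly like this (a sketch only; every
name here is invented for illustration and is not any real API):

    #include <stddef.h>

    /* Layer 1: the high-level API, defined by a working group and shipped
       as a reference front-end (the "Clang" of the analogy). */
    typedef struct gfx_mesh gfx_mesh;
    gfx_mesh *gfx_mesh_create(const float *verts, int vertex_count);
    void      gfx_mesh_draw(const gfx_mesh *m);

    /* Layer 2: a narrow contract that each hardware vendor implements
       (the "LLVM" of the analogy). The front-end is written entirely in
       terms of these hooks, so you can see and profile what it does. */
    struct gfx_backend {
        void *(*alloc_gpu_buffer)(size_t bytes);
        void  (*upload)(void *gpu_buffer, const void *src, size_t bytes);
        void  (*submit_draw)(void *gpu_buffer, int vertex_count);
    };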

------
sova
This is real talk! But how would this appeal for a unified architecture
interact with the marketplace? I'd love for more programmers to watch this
video and give their thoughts!

~~~
mac01021
I'm no authority on this stuff, but I don't think it has to be one single
universal ISA for everything. In the same way that I can compile my C program
to run on x86 or ARM, I can design it to interface with two different possible
storage device architectures if I want to.
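
Roughly what that looks like in C -- a minimal sketch, with both device
"architectures" stubbed out since the real driver calls would be hypothetical
anyway:

    #include <stddef.h>
    #include <string.h>

    /* One storage interface, two possible device architectures behind it.
       Which one is compiled in is chosen at build time, the way -march
       chooses x86 vs ARM. Both backends below are stand-in stubs. */
    struct block_dev {
        int (*read)(size_t lba, void *buf, size_t bytes);
        int (*write)(size_t lba, const void *buf, size_t bytes);
    };

    #if defined(STORAGE_ARCH_A)
    /* Stub for hypothetical device architecture A (say, an NVMe-style queue). */
    static int dev_read(size_t lba, void *buf, size_t n)        { (void)lba; memset(buf, 0, n); return 0; }
    static int dev_write(size_t lba, const void *buf, size_t n) { (void)lba; (void)buf; (void)n; return 0; }
    #else
    /* Stub for hypothetical device architecture B (say, a virtio-style ring). */
    static int dev_read(size_t lba, void *buf, size_t n)        { (void)lba; memset(buf, 0, n); return 0; }
    static int dev_write(size_t lba, const void *buf, size_t n) { (void)lba; (void)buf; (void)n; return 0; }
    #endif

    /* Everything below this line never has to know which device it got. */
    static const struct block_dev root_disk = { dev_read, dev_write };

    int save_state(const void *state, size_t len) {
        return root_disk.write(0, state, len);
    }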

------
columboslastq
I would be very curious to know the thoughts of HN readers: is it a good idea
(defining a system-wide spec to remove the need for drivers, etc.)?

Seems to make sense to me, but I know nothing. If someone defined one, an
emulator could be written. It might even be possible to prove some gain that
would make it attractive for a manufacturer to implement it.

~~~
sp332
I like the idea, yeah. Virtual machines running in Xen hypervisors can be very
small because they can make a lot of assumptions about the "hardware" they are
running on. And as a negative example, every Android ROM has to be compiled
and distributed separately for each model of phone it supports, because ARM
platforms don't have a BIOS layer so they can't generalize.

------
ncmncm
Remember when SDI (the "Strategic Defense Initiative", "Star Wars", Reagan's
fraudulent ballistic-missile defense project) was considered impossible to
implement because (as reported by David Parnas) it would take a _million
lines_ of code and, more importantly, would have to work right on the first
try?

The program was believed in by the Soviet military, which spent so much on
countermeasures that it crashed the Soviet economy, bringing down Gorbachev
and, ultimately, the USSR itself. That seemed OK at the time (people talked
about a "peace dividend" -- the joke was on us!), but now we have Putin. And
Trump.

But I got a free trip to Hungary, in 1987, to report on research into
semiconductor point defects, paid for by a grant from the US Air Force out of
their Star Wars budget.

~~~
im3w1l
Maybe this is a stupid question but how does the Star Wars initiative relate
to the movies if at all?

~~~
MarkSweep
Senator Ted Kennedy is credited with creating the nickname "Star Wars" to mock
the project:

[http://web.archive.org/web/20090227050446/https://www.smdc.a...](http://web.archive.org/web/20090227050446/https://www.smdc.army.mil/2008/Historical/Eagle/WheredowegetStarWars.pdf)

The nickname does indeed refer to the movies, which were released a few years
earlier.

