Wow. Preemptive multitasking, graphical window management, POSIX-like, in a programming language that, to my knowledge, it hasn't been done in before... and in just several months of work.
Thanks! It originally started as an experiment to see how far I could go making an OS in a high-level (higher than C, at least) language; I was only planning to build a monotasking DOS-like system. Around that time, Andreas Kling started showing us his SerenityOS, which was one of the factors propelling me to reach this point. So shout-outs to AK!
Yeah! It's pretty fun to develop an OS, albeit time-consuming. You should check out the OSDev wiki and the Intel/Seagate manuals; they should hopefully get you to the point where your OS can read/write/execute files. You should also look at how other kernels do things as a reference, and implement them in your OS in a way that fits.
Sorry that your account was being rate-limited. Software filters do that, based on past activity by trolls. Unfortunately they also sometimes prevent project creators from showing up to discuss their work. I hate that!
We've marked your account legit so this won't happen again.
Oh, I have dabbled in OS dev a tad already actually, although not in a long time. I think if I went for OS dev nowadays, I’d prefer to try and start as an EFI binary because I believe that way you get to start directly in long mode, and have some filesystem drivers to bootstrap with at least. Still, I don’t know if I will ever have the drive (and spare time) to push it as far as you have gotten Lilith.
Right after the global descriptor table is set up and paging is enabled, the GC is enabled. A GC cycle is performed on every allocation, and whenever the OS has no task available to switch to.
Not really; the kernel and any process written in Crystal has a garbage collector, but any other userspace process can manage memory however it wants.
Maybe it's just because I come from Ruby, but Crystal seems like an amazing language. I'm interested why it seems to be getting little attention compared to Go, Rust, & Julia. It seems in the same league to me.
Kotlin and Swift have more obvious reasons for their adoption.
I wouldn't say it's getting "little attention," rather it's relatively new (for a language) and not backed by a major player who can instantly give it a huge spotlight / platform.
For being a 5-year old indie language out of South America, I think it's doing quite well, and projects like this are good evidence of why.
Ruby is an outstandingly productive language. Copying the expressive syntax and ergonomics of Ruby into a fast, statically typed, compiled language with excellent C bindings is a very compelling idea.
This. I recall ages ago when I had to write a networked tool and had lots of freedom in choosing tools and languages (and time), so I picked this new thing called Ruby just because it looked so easy to understand. That was years before Ruby on Rails was introduced, so the language was pretty much unknown. After some time to grasp the basics I started to be productive; making changes, adding functionality, or debugging became really fast, and in the end I recall thinking "if only this thing were nearly as fast as C, that would be wonderful".
I believe that has just happened. OK, maybe not 100% identical to Ruby or 100% on par with C speed-wise, but just being close to that is a huge achievement. I only wish it were available for small embedded ARM boards, like other interesting languages such as Nim. Last time I checked, Crystal was bootstrapped, i.e. it requires itself to compile itself, which IMO doesn't help portability to new iron.
I've been a huge fan of Elixir ever since I found it a few years ago. The main web framework (Phoenix) is doing some really cool stuff right now with LiveView, too, and I'm pretty darn excited about it.
I've always thought that the best system for a web server would be Erlang/OTP, because of its design around uptime and system recovery. It has been proven in production by a few companies in web dev now, and Erlang has been used for years by telecom companies for their own infrastructure.
Elixir, IMO, is awesome because of the Ruby-like syntax, but I would probably be using Erlang for my personal projects regardless, because of what I mentioned earlier.
Elixir gives you a lot of 21st-century development tools: async tasks, compilation environments (dev vs prod vs test out of the box), first-class documentation, and a first-class test suite. Sure, you could implement these in Erlang, but Elixir is opinionated, and everyone is basically on board with the same set of tools.
Some of these tools are making their way back to Erlang, like telemetry and some aspects of docgen.
There are also some under-the-hood features, too. I can run async tests where the test is given an id, the database gets a sandbox with that id, and I can escape the VM (via chromedriver, e.g.) passing the id along; when the request hits the HTTP server component, it and all child tasks are aware of the test they're in, so all of the database calls are routed to the correct sandbox, resulting in concurrent and idempotent end-to-end tests. Again, you could do this from top to bottom in Erlang, but Elixir supports it out of the box, and the anointed and third-party libraries (Mox, Ecto, Hound) use these Elixir features.
It's pretty nice to be able to run a full suite of unit and E2E tests in about 10 seconds.
I found Crystal about a year ago and absolutely fell in love with it. It just “clicks” for me.
My sense from reading most peanut-gallery commentary is that the biggest reasons it hasn't taken off more (yet) are:
1. Real multi-threading (it's now in alpha; previously there was just single-threaded concurrency, and sorry, I might be getting that terminology wrong).
2. Windows support (I personally don't care, but I haven't found a single thread about Crystal where someone doesn't chime in and complain about this).
I have a feeling once it reaches 1.0 and those issues are addressed it’ll get a bit more attention. It deserves it!
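On the terminology in point 1: "single-threaded concurrency" means cooperative scheduling on one OS thread, which is the model Crystal's fibers used before multi-threading landed. Ruby's built-in Fiber class is a rough sketch of the same idea (this is illustrative Ruby, not Crystal):

```ruby
# Two cooperative tasks interleaving on a single OS thread.
# Nothing runs in parallel; each fiber runs until it yields control.
log = []

a = Fiber.new do
  log << "a1"
  Fiber.yield
  log << "a2"
end

b = Fiber.new do
  log << "b1"
  Fiber.yield
  log << "b2"
end

a.resume  # runs until the first Fiber.yield
b.resume
a.resume  # resumes after the yield
b.resume

p log  # => ["a1", "b1", "a2", "b2"]
```

The interleaved order shows concurrency; the lack of any locking shows why it's simpler than real multi-threading.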
The problem with WSL is that it's a window, like a small island extending 1000m into the air, and the moment you step off... splat, you're back in Windows with the registry and such. I bought a Surface a few weeks back; WSL is nothing but an improved Cygwin, of no use to Linux/Mac users, and Windows has become repulsively spammy. I've installed Arch.
While I don’t recommend WSL for serious production workloads, yes, you can X11-forward applications from WSL, and there are a few implementations that are really good, all things considered.
[Off Topic] Deus Ex had a profound effect on me growing up, and probably shaped my views which last to this day. Unfortunately, very few games like that are made nowadays.
> Games have minimal dependencies re OS APIs wise.
Sure, if you exclude "let's use every little quirk and undocumented feature in the graphics and audio and input stacks we can find because gotta go fast sanic noises".
Both games and office software (ab)use OS APIs; the difference is the specific subset of those APIs said programs (ab)use.
So yeah, you're right that it ain't "bullshit", but don't write off games as somehow "easy" compared to office programs; a substantial amount of effort has been poured into reimplementing Windows' multimedia stack in Wine and continues to be poured even now.
DirectX 9 is not easy at all; it's almost a Glide/Vulkan-like, low-level-ish API. But hey, that's "bullshit". My balls. The above user should try running a pure DX9 game with Wine and GL 2.1 on an Intel Mobile 4 series with no noticeable frame drops. I tried. Hard.
But also, having DX7/8 be almost 100% compatible back in the day was a huge step up in stability vs Windows 98, even if the RAM and CPU usage were higher. DX ran fast, and so did Max Payne.
Lots of the new crop of languages with cleaner designs have huge problems with Windows. V doesn't work on Windows, and Zig's compiler fails if you use carriage-return newlines. They even built in an error message for it instead of just making the parser handle it. It's kind of ridiculous how these languages shoot themselves in the foot.
I think Zig is probably just being opinionated there; many of the new languages seem to be overly so. Last I checked, both the Zig and Nim compilers refused (on principle, not for any technical reason) to compile code containing tabs (among other things). I don't (usually) want my compiler to do linting, and I certainly don't want language authors to dictate my code style from on high.
Thank you for this. The passive-aggressive style of erroring on an unused variable while staying absolutely silent when I accidentally print an &int variable with the %d format has always seemed like an inconsistency to me.
It really is a sad state of affairs when disabling the linter requires patching the compiler. I wonder if any of the LLVM-based Go compilers will adopt a saner policy?
Making a compiler break on pretty much any file saved from a Windows text editor is a very bold opinion; I can't say it would be my choice if I wanted to grow my user base.
What about Go? It seems like their approach is that if they define the style guide and build it into tooling, your code style (for Go) can only be the 'approved' style.
Or, put another way: people only have code styles because languages leave ambiguity. By removing that ambiguity you force your user base to be consistent, thereby increasing communication and productivity.
I tend to agree. It really grinds my gears when precious time is wasted on styling issues when there is no technical reason for either side. Granted it can be tough if you use three different languages with three different sets of rules.
I think it's worth reading more of that thread for some pros and cons of being very strict with styling rules. Note that being too strict can lead to code being less readable in specific edge cases. This may or may not be considered a fair trade off.
To summarize the linked post (which I completely agree with), just provide officially recommended sane defaults and then get out of the user's way. In my view, that's how nearly all software ought to be written.
It's a fundamentally flawed approach in my opinion; viewing the users of a language as a single unified user base is a mistake. Various projects and groups will inevitably have vastly different use cases, and thus needs and considerations.
For example, I liberally mix C, C++, D, and occasionally other languages within a single code base on my personal projects. For sanity, I choose to observe my own consistent style across all languages when doing so. This obviously doesn't match common practice for any of the languages in question, but thankfully none of them attempt to impose the opinions of their designers on me.
I view languages being concerned with code formatting as a form of scope creep. A language should facilitate good code and communication by thoughtfully designing the syntax and providing useful constructs to the programmer. That's already an incredibly difficult problem - trying to tackle additional interpersonal or organizational issues is far too much, can't possibly accommodate everyone, and is bound to have unexpected negative consequences.
Put another way, core infrastructure should nearly always be as unopinionated as possible. Stick to as narrow a design goal as is reasonably possible and execute on it as well as possible. For a language, that means faithfully translating whatever I throw at it into machine code unless there is a legitimate technical barrier in the way of doing so.
> It really grinds my gears when precious time is wasted on styling issues when there is no technical reason for either side.
If this is happening it's indicative of an organizational or managerial problem. Any given project should have a clearly defined (and consistently enforced) code style, ranging from "use gofmt" to "follow PEP 8" to "adhere to our internal style guide".
Across a specific project, team, or org, sure. That's what style guides, formatters, linters, and pre-commit hooks are for. The core language is the wrong level of abstraction for this stuff.
Python works just fine with tabs, but the language-endorsed style guide (PEP 8) recommends against them. (However, the usual plugin for Vim requires adding "let g:python_recommended_style = 0" to work correctly.)
"zig fmt" ostensibly cleans up carriage returns and tabs, for what it's worth.
Granted, I agree that it's silly that Zig's compiler can't just "do what I mean" and just accept that there will be tabs and carriage returns.
However, I'd say programming languages in general always seem to have trouble with Windows at first, in particular because Windows is pretty alien to those more accustomed to Unix-like environments (which, I reckon, is where most people making new programming languages are actually making said programming languages).
Crystal still doesn't have full Windows support, either, on that note (you can cross-compile a "Hello World" program, but there's a lot of pretty critical stuff missing).
I wrote a benchmark for V too and it’s fast but there are other issues internally and externally.
Does anyone know who Alex is?
He's going to rebuild the compiler around an AST, but as long as it's useful for your work, that's fine. It's too early to tell whether it's reliable for businesses; I'm more likely to trust Go, as we know it's battle-tested.
It will definitely take more years to stabilise and gain traction, while other languages will have matured and established themselves.
What is the "latest build"? V doesn't release any stable builds, and right now you can get a "large error" on any platform when trying a random "latest build", without putting in any effort to investigate the issue.
The StackOverflow 2019 survey suggests 47% of developers are on Windows (https://insights.stackoverflow.com/survey/2019#technology-_-...). This is a big chunk of developers you can't reach well, if you are limited to Linux and/or macOS. Windows devs might desire to have Crystal because they want to build native apps, but the reality is many people use the OS for other reasons that are beyond their control.
I don't find it surprising to see this question in a discussion involving a language related to Ruby. It's a bit sad it still comes up, but there is overwhelming demand for better Windows support of this and any other good or promising software running on Linux or macOS.
Historically, Python has had reasonable Windows support. It got to a point where it was OK and stopped improving; in recent years there has been more attention and improvement. This investment has made the language reasonably viable for a lot of tasks on Windows. That doesn't mean there hasn't at times been an influx of folks with little knowledge of or care for Windows, but lots of packages work reasonably well.
Ruby is a different story. Rails was the big growth driver, and the focus was narrow. The pattern emerged of developing on macOS and deploying on Linux. I personally credit the tropes about only seeing Macs at dev conferences in large part to Rails. The result is that it's infeasible to use Windows directly, and you're best off going with a VM, or now WSL 2. It didn't have to be this way when a big chunk of developers are on Windows. Rails could have taken an even bigger chunk of the market if Ruby had better Windows support.
Strategically, Crystal and other languages that want real adoption and the things that go with that (more recognition, more libraries, more real world use, more contributors, etc.) need to work out a good plan for Windows support.
I have had a huge number of issues trying to use both Ruby and Rust on Windows for server/client use cases. It's not just that it "works" on Windows, but that it works easily out of the box for common use cases.
Meanwhile Python and Golang work great on Windows with no extra effort. It's like night and day.
It's been a while, but it was the common certificate issue that Ruby on Windows also has. I just couldn't get it sorted, for whatever reason. Maybe it's not an issue anymore, though? Does the Rust HTTP client work OOTB on Windows?
I never thought there would come a time when I would want Windows support. But the fact is, Windows as it is today is quite inevitable. Hence if you want Crystal, you probably want Windows support.
That was the biggest problem for anyone jumping onto Ruby on Rails in regions that have little to zero Mac market share and all Windows PCs. With WSL 2 this was (?) solved, but that assumes everyone is on Windows 10 and the latest version.
This has been hurting RoR adoption for long, although at this point it probably no longer matters.
Yes. Although I'd rather use Linux, most company IT departments will distribute Windows computers and will not give you a Linux laptop. That means a lot of your work will be done on Windows.
> I'm interested why it seems to be getting little attention compared to Go, Rust, & Julia.
Rust essentially competes with C and C++ (and maybe D and Nim without their GCs), and the reason it gets hype is because it brings some unique features for that niche.
Go brought some unique features as well (goroutines done right), and it is pushed by a major player.
In Julia's niche, you only really have Julia and Fortran - sure there are many other languages used in that niche, but there are very very few languages exclusively designed for that niche.
C#, Kotlin and Swift are the system API languages of major platforms (MS, Apple, Android). Javascript is the language for the web.
> No, the system API of those systems is exposed in C, ObjC and C respectively.
This is incorrect. Microsoft doesn't have a C compiler, only a C++ compiler, so the lower level language on Windows is C++. Yet for application development, the application language is C#.
On Apple platforms, the lowest-level language is ObjC, and the application language is Swift. And on Android, it's C + Kotlin.
Doing application development on, e.g., Android, using the C NDK, is pretty much unsupported, since the NDK is mostly undocumented, and changes often. The only stable API for application development is Java, or Kotlin.
It's still C all the way down, even in 2019. Amazing how far crappy, badly thought out, ubiquitous, low entry barrier languages can go. More than 40 years of memory errors, out of bounds errors, broken metaprogramming, undefined behavior, lack of error handling, type safety, abstractions and intelligent design in general, all served under the clunky syntax sauce. And it still powers our world. Worse is better, indeed.
Crystal _could be_ for game developers; there are large green pastures left almost entirely untouched, waiting for a language to disrupt them. You see a few niche languages make inroads, like Haxe, but I expect something could really effect change.
The key is to be as fast as possible, as memory-light as possible, and able to produce and speak the C ABI.
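On the "speak the C ABI" point, the Ruby family already makes this cheap. As a rough illustration (in plain Ruby, using the stdlib Fiddle binding rather than Crystal's `lib` declarations), calling straight into libc looks like this:

```ruby
require "fiddle"

# Open the current process's symbol table and bind C's strlen(const char *).
libc = Fiddle.dlopen(nil)
strlen = Fiddle::Function.new(
  libc["strlen"],
  [Fiddle::TYPE_VOIDP],  # argument: char *
  Fiddle::TYPE_SIZE_T    # return: size_t
)

puts strlen.call("hello")  # => 5
```

In Crystal the equivalent is a compile-time `lib LibC` declaration, so there's no runtime binding cost at all, which is part of why it's attractive for performance-sensitive niches like games.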
Well, I hadn't heard about Crystal until... just now. So projects like this are great marketing.
I'm reading up on it and liking what I'm seeing so far. This could be a great Golang (which I detest) replacement for me when I need to write a small, native program. Also, the Ruby-esque syntax is a win. I've always thought Ruby was a nicer language than Python. Too bad Ruby never grew much of a dev community outside of Rails.
Didn't Crystal run into a major development roadblock a couple(?) years ago? Something about a fatal design flaw in the type system which led to exponential blowup in larger programs. I haven't been able to surface the relevant GitHub threads again, but whatever it was, I guess they worked past it eventually.
All that stuff is still optional. I don't think there are any cases where removing type annotations from a working Crystal program will change behavior.
Go still lacks Windows support for certain features, e.g. plugins.
And wanting to be part of a Google-backed project drew many community contributions. Easy to compare with how many community contributions were made to Inferno and Limbo.
Fair, I had forgotten about plugins. On the other hand, plugins have other nuances[0]. Plugins don't seem like a properly released feature, given their constraints.
Have been playing with Crystal for several months and still find it amazing what this language can deliver given that it's still relatively new.
Asked my friends to try it out and they love it too. Hopefully we can grow a bigger ecosystem for Crystal and make it a mainstream programming language one day.
What's Crystal's unique selling point? It's older than Rust, Kotlin, Swift, Nim, or even OCaml, and doesn't really offer anything they don't. More Ruby-like syntax? Fine, but most people don't particularly like Ruby syntax.
The "even OCaml" is what threw me. Still not quite sure if they meant older or younger from it.
> But, yeah, besides the syntax familiarity for Ruby programmers it doesn't seem offer much that the others do.
A familiar syntax is a huge benefit in a lot of contexts. There's plenty of Ruby-only shops around the world. Making a faster and statically-typed language accessible to them is a decent niche.
Crystal isn't quite Ruby. It has type guarantees that Ruby doesn't, which makes it easier to maintain. The async and parallel stories are also immensely better in Crystal than Ruby.
The "Nil is a compile-time error" is also a feature I highly appreciate.
Okay: port your existing Ruby code to a compiled language, giving it several times the speed? Maybe there is a slight business opportunity here.
Also, give your developers a more productive language that can go where Ruby can't (like OS development) and is 90% instantly recognizable; that may also be worth something?
Is it really so hard to port Ruby code to any of the languages I mentioned? Even with Crystal you will still have to make changes to your code, is having to use a slightly different syntax such a big difference? If you've got Ruby code and you haven't already ported it, how badly do you need the speed?
Well, it seems most developers here would rather use Ruby plus several memory-optimised VM clusters to make their services go faster and destroy their balance sheet than use a faster language that is memory-efficient and also blazing fast.
A rewrite from Ruby in (some other fast language) sounds ridiculous as soon as you include writing an extra FFI wrapper you'd have to maintain, so that would be out of the picture; it sounds a lot harder than a semi-1:1 port of the source code.
The trouble is that Ruby itself is becoming a sunk cost as a language, surviving because of Rails; that also explains the countless memory issues that happen when developers use it. So as a stopgap toward something sort of better, I would rather port to Crystal or Elixir so I can run on those low-cost VMs and save cash.
C bindings, strongly-typed, a great concurrency model, very fast, and Ruby-like syntax. That makes it a very unique and appealing language to me (and others).
If you / most people don’t particularly like the Ruby-like syntax, then Crystal might not appeal to you. But I can assure you to those of us who like the syntax, the language is wonderful and super fun to work with.
> C bindings, strongly-typed, a great concurrency model, very fast, and Ruby-like syntax.
Ruby’s syntax is a great fit for Ruby’s semantics but—and Ruby is my personal favorite language—I don't see much point for it divorced from that. It's the one thing I like least about Elixir.
OTOH, Crystal looks like it may be close enough to Ruby semantics that Rubyish syntax is a plus rather than a misdirection and distraction.
In what way is Elixir's syntax like Ruby's? It has "end" on blocks and string interpolation with "#{whatever}", but everything else is different.
It's about as close to Ruby as Ruby is to Python (which is to say, not much).
I think it really helps that you can write mostly functional code in Ruby by just chaining higher-order array functions, and you can use Pipes in Elixir to do this almost identically -- when you consider the fact that all the higher-order array functions have identical or very similar names and parameters in the two languages.
My Ruby code is like 95% chained higher-order array functions. My classes almost never have state. I'm basically writing functional code, and it seems like this is a pretty common paradigm in Ruby. I will say this was HEAVILY influenced by learning Elixir.
When not taking external libraries into account, my Elixir code (at least to me) seems almost identical to my Ruby code.
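To make the parallel concrete, here is the kind of chained higher-order pipeline being described, in Ruby, with a rough Elixir pipe equivalent shown in a comment for comparison:

```ruby
# Chained higher-order array functions -- no mutable state anywhere.
result = (1..10)
  .select(&:even?)    # keep the even numbers
  .map { |n| n * n }  # square them
  .reduce(0, :+)      # sum the squares

p result  # => 220

# Roughly the same shape in Elixir (for comparison only):
#   1..10
#   |> Enum.filter(&(rem(&1, 2) == 0))
#   |> Enum.map(&(&1 * &1))
#   |> Enum.reduce(0, &+/2)
```

The operations line up almost one-to-one (select/filter, map/map, reduce/reduce); only the chaining mechanism differs, with dots in Ruby and pipes in Elixir.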
I think Python and Ruby are very close. But I personally think list comprehensions are un-intuitive and you can't easily chain higher-order array functions in Python.
So although I feel the spirit of my Ruby and Python code is identical, in no way do they look and feel identical. In the same way, I think my Elixir and Python are quite different.
Thanks for sharing!
I think, though, that what you are referring to is not syntax but semantics. In both places you are chaining higher-order array functions, so the algorithm is pretty much the same, but the syntax you use differs (pipes vs. dots).
Having done lots of development in both Ruby and Elixir (both professionally and on my free time), I'd say the only substantial differences boil down to the following:
- Elixir has |> instead of method chaining
- Elixir uses "do [...] end" consistently for blocks (as opposed to Ruby omitting the "do" in front of function/module/class/etc. bodies)
- Elixir supports Erlang-style tuples on a syntax level while Ruby (AFAICT) does not
- Elixir supports Erlang-style pattern matching / destructuring while Ruby (last I checked) does not
- Elixir's a fair bit stricter about differentiating zero-argument function calls from variables (i.e. if it's ambiguous it'll complain)
That is to say, Elixir's syntax descends directly from Erlang and Ruby, just as Ruby's syntax descends directly from Eiffel and Perl, and it shows rather strongly. That should be unsurprising, given that Elixir is the creation of a (former?) Rails maintainer.
Kotlin - it's on the JVM, which is a good thing for some and a bad thing for others. It's easier to distribute a standalone Crystal binary, for example.
Swift - Crystal throws exceptions, Swift explodes...
Ocaml - the functional approach is not for everyone.
Rust - I like Rust a lot, but sometimes I want to write something quick, and I'm happy to sacrifice some memory and speed to not have to deal with ownership issues.
Nim - the most comparable; I guess it's syntax preference at that point.
> Ocaml - the functional approach is not for everyone.
OCaml is pretty general-purpose though; it has loops, objects, and mutable state. libguestfs [1] and some other codebases (the compiler itself, btw) are written in pretty imperative OCaml.
OCaml somewhat tries to be familiar to OOP programmers, but it _is_ first and foremost a functional language.
Personally, I am learning it now, and I think that rather than being a stepping stone between OOP and FP, it might as well be its own separate, hybrid paradigm; I find purer FP languages like Haskell much more intuitive than OCaml's mix of modules, objects, types, and classes.
Though do not get me wrong, OCaml is an amazing language. It just stands rather by itself.
Unique? There are a hell of a lot of computer languages out there. The only unique ones are those that are so bad that no-one ever did that again.
The selling point for Crystal is that it has a clean syntax that's easy for Ruby or Python developers to pick up (almost the same as Ruby), but compiles to standalone binaries that run really, really fast.
I mostly work in Python because the libraries are a lot more mature, but I use Crystal when I need to produce a standalone binary.
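That pitch is easy to demo, since plenty of small programs are valid in both languages character-for-character. This snippet, for instance, runs under the Ruby interpreter as-is, and (to my understanding) should compile unchanged with `crystal build` into a native binary:

```ruby
# Valid Ruby *and* valid Crystal: same syntax, two execution models.
def fib(n)
  n < 2 ? n : fib(n - 1) + fib(n - 2)
end

puts fib(20)  # => 6765
```

In Crystal the compiler infers `n : Int32` on its own, so no annotations are needed for code this simple; the difference only shows up in the runtime characteristics.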
Its unique selling point is union types as part of the type system, with whole-program type inference and flow typing. This has never been done before (as far as I am aware), and it is incredibly powerful and interesting. It's compile-time duck typing.
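For anyone unfamiliar with flow typing, here is a runtime sketch of the idea in plain Ruby; my understanding is that Crystal performs the equivalent narrowing at compile time, so inside an `is_a?` branch the compiler knows which member of the union you hold, and calls that only exist on that member still type-check:

```ruby
# x effectively has the "union type" Integer | String;
# each branch narrows it to one member.
def describe(x)
  if x.is_a?(Integer)
    "int: #{x + 1}"     # safe: x is known to be an Integer here
  else
    "str: #{x.upcase}"  # safe: only a String reaches this branch
  end
end

p describe(41)    # => "int: 42"
p describe("ok")  # => "str: OK"
```

The "compile-time duck typing" framing fits: the code reads like dynamic Ruby, but an invalid call in the wrong branch would be rejected before the program ever runs.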
It's very expressive and pleasant (and I never used ruby), while being fast and statically typed. That's a great combination of features. If it were stable, it'd be a serious contender to Go imho.
>I'm interested why it seems to be getting little attention compared to Go, Rust, & Julia
I've noticed the slogan was changed to "A language for humans and computers". What does it really mean? Is it related to machine learning?
My impression is that it was designed to be as friendly as Ruby and as fast as C. It would help a lot to have an FAQ, if the Crystal team could provide one.
>I'm interested why it seems to be getting little attention compared to Go, Rust, & Julia
As I see it, Go gets 70% of the attention, Rust 25% and Julia 5%. Crystal which doesn't have major backing (no Google, no Mozilla, and Julia has some MIT hackers behind it IIRC, plus the whole "data science" angle which is fashionable today), gets even less.
Yes. Fork was a clever hack for one point in time, back when processes were simple, single-threaded, and lightweight. It's always had the misfeature of inheriting resources you didn't ask for. Not including it makes it mildly annoying to port existing programs, but 99% of the time programmers want spawn and are forced to build it from fork(), not the other way around.
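Ruby happens to expose both primitives, which makes the contrast easy to see. `Process.spawn` starts a fresh program directly; building the same thing out of `fork` means first duplicating the parent (inherited descriptors and all) and then immediately throwing that copy away with `exec`. A sketch:

```ruby
# The spawn-style API: no intermediate copy of the parent process.
pid = Process.spawn("ruby", "-e", "exit 7")
Process.wait(pid)
status1 = $?.exitstatus

# The same thing hand-built from fork + exec, as POSIX forces you to.
pid = fork do
  exec("ruby", "-e", "exit 7")  # replaces the forked copy immediately
end
Process.wait(pid)
status2 = $?.exitstatus

p [status1, status2]  # => [7, 7]
```

Both reach the same result, but the fork version briefly carries the entire parent's state into the child for no reason, which is exactly the misfeature being described. (The `fork` half is Unix-only.)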
POSIX-like, not entirely POSIX! There are basic I/O syscalls and some process management calls, but overall I try to take Unix as a guideline rather than reimplementing one entirely. I also take inspiration from other kernels, like fattr (equivalent to GetFileAttributes from WinNT) and spawnv (from DOS/WinNT).
That's fair. And to be clear, reading "POSIX-like" makes me expect it to act like a unix, but that hardly means that you should be bound to follow that exactly:) If you wanted to set expectations a bit, maybe "partially POSIX-inspired, with other influences from $FOO and $BAR" would be clearer?
Oh wow, someone actually posted this! Thanks a lot for the comments and criticisms, guys, it really means a lot to me. Before I go I'll answer some questions, so ask them away!
I’ll echo what everyone else said - SUPER IMPRESSIVE and awe-inspiring work. Thanks for sharing it with the world.
I have one question: While I can read and understand each individual line of code you wrote (go Crystal!) and what each method does, etc., I’m at a total loss for how you knew what and how to build - big picture wise. Can you share a bit about your process for building the mental model of this? How do you know where to start, what modules to build, what’s needed / what’s already there, etc? What roadmap (if any) are you using?
I’m just amazed at how something like this comes together and would love to learn a bit more about the process a single dev uses to build it.
You can check out what I replied to jchw. But tl;dr: wanted to do DOS => got ambitious => made it multitasking => got more ambitious => added graphics and GUIs.
Currently it's pretty barren driver-wise: I have a FAT16 driver, a keyboard/mouse driver, some basic code to handle core architecture hardware, and that's it. I really want to have CD-ROM support, a network driver (ne2k or e1000), an audio driver (Sound Blaster?), and an actual file system.
My roadmap is to go with the flow. If I want a text editor, I'll focus on building a text editor. If I want to play Doom, I'll port Doom (but I don't, so I won't :^)
Thanks, but the code isn't that clean; there are hacks here and there, and it isn't organized as clearly as I wanted. But feel free to look through it.
Precise GC information: the Crystal compiler prepends a dword/qword value indicating a class's typeid upon allocation. I decided to patch the compiler to generate two functions: one that outputs a class's size based on its typeid, and another that outputs the offsets of each pointer in that class as a bitmap, based on its typeid.
The compiler also generates a null-terminated array of pointers pointing to the offsets of global variables.
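To illustrate the shape of that metadata (this is a hypothetical Ruby mock-up of the idea, not the actual generated code): the patched compiler effectively gives the GC two lookups keyed by typeid, one for the object's size and one a bitmap marking which machine words inside it are pointers, so the marker can scan an object precisely instead of conservatively treating every word as a potential pointer.

```ruby
WORD = 8  # bytes per machine word on x86-64

# Hypothetical per-typeid tables, standing in for the two generated functions:
SIZEOF     = { 1 => 24, 2 => 40 }       # typeid => instance size in bytes
PTR_BITMAP = { 1 => 0b010, 2 => 0b1001 } # typeid => bit i set: word i is a pointer

# Byte offsets the marker should scan for a given typeid.
def pointer_offsets(typeid)
  bits = PTR_BITMAP.fetch(typeid)
  (0...bits.bit_length).select { |i| bits[i] == 1 }.map { |i| i * WORD }
end

p pointer_offsets(1)  # => [8]
p pointer_offsets(2)  # => [0, 24]
```

All the typeids, sizes, and bitmaps here are invented for illustration; the point is only that a bitmap per type is enough for the marker to find every heap pointer exactly.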
No iso images yet, I haven't implemented iso9660 filesystem support, but it's coming soon?
If so, it is definitely mind-bending. Makes me wonder what people mean about not being over-ambitious, "avoiding NIH", "worse is better", "minimizing technical risk", etc. Somehow one is made to feel (by no particular person, but by the composite emergent vibe) that there are a hundred and one reasons to be more conservative. I wonder how many people have the audacity to re-imagine their software ground-up, and the technical chops to actually implement it. The verve is refreshing. Kudos to the author, and also thanks.
The Soul of a New Machine has an amazing passage on a software developer from Data General being too inexperienced to realize that what he was doing couldn't be done, and finishing it in record time.
It's safe to say that people who are unaware of limits, or who ignore them, put out better work on average than people who heed them.
Indeed, reminds me of:
One day in 1939, George Bernard Dantzig, a doctoral candidate at the University of California, Berkeley, arrived late for a graduate-level statistics class and found two problems written on the board. Not knowing they were examples of “unsolved” statistics problems, he mistook them for part of a homework assignment, jotted them down, and solved them. (The equations Dantzig tackled are more accurately described not as unsolvable problems, but rather as unproven statistical theorems for which he worked out proofs.)
Six weeks later, Dantzig’s statistics professor notified him that he had prepared one of his two “homework” proofs for publication, and Dantzig was given co-author credit on another paper several years later when another mathematician independently worked out the same solution to the second problem.
Not sure if it's the same story but there is also the Ivan Sutherland one reported by Alan Kay.
> When asked, “How could you possibly have done the first interactive graphics program, the first non-procedural programming language, the first object oriented software system, all in one year?” Ivan replied: “Well, I didn’t know it was hard.”
That seems decidedly unsafe to say, but even if you’re right, the second paragraph doesn’t follow from the first. “Everyone generalizes from one example! At least, I do.”
It's been shown repeatedly. Arthur Whitney is another example. In fact, basically every major technological innovation of the past two centuries has been because of people doing this.
The first was just to share an example. The second paragraph is common knowledge.
We do depend in a profound way on people trying the impossible for our civilization to progress. But at the same time, most attempts to do the impossible end in failure, and I think it would be a mistake to say that people who try the impossible do better on average than those who don't. Rather, we should encourage play, tilting at windmills, etc., knowing full well that we're accepting the danger of failure now while making an investment in the future.
It's survivorship bias - you never hear about people who ignore limits (or don't know them) and fail. Some limits earn you trouble or legal penalties when broken, so you might get to know them in a less pleasant context.
Apparently they've also written a language, a static site generator in that language, a brainf*ck interpreter, and a markdown parser, among other things. Super impressive stuff. https://github.com/ffwff?tab=repositories
Two things that I immediately respected about Crystal's landing page [1]:
- It shows numerous code snippets covering various functionalities. Not just "Hello world" but code samples that give you a very solid flavor of the language.
- Its spinning logo can be manipulated with the mouse.
I've been following Crystal for a long time (shameless plug: I've been running a geolocation service in production for years now, https://gitlab.com/maxpert/crlocator, and it works flawlessly). Despite the setbacks in the past, Crystal has continued to evolve, and I am waiting for a fully baked version 1. I am not surprised that Crystal can pull off something like running a complete operating system. The modern language landscape is getting populated and brings novel ideas (Rust, Zig, Golang, Julia); and Ruby's nice syntax with the efficiency of a compiled language IMHO still remains desirable for a lot of us.
Cool coincidence, but nah. Maybe it's a reference to the first angel in Evangelion? Maybe it's a reference to Adam's wife? Maybe it's best girl from Machikado Mazoku?
Crystal is a solid alternative to Go. I gave it a serious look when figuring out which language I should learn. The only reason Crystal didn't make it is that it's missing some core features. The one that sticks out: there was no way to get the length of a channel, last time I looked.
But this is impressive and I can't wait to see what's next
I have the same feeling. I wish Crystal's web server were more customisable, and the lack of gRPC, HTTP/2 (only as a separate repo), and QUIC is going to take more time to address.
I can only see 1,209 repos after using Crystal for close to a year, and some are no longer maintained.
https://github.com/topics/crystal
There's a lack of stories about cloud computing or large businesses having success with Crystal. They need more of those to gain support if they don't want to remain in the “uncharted territory” mentioned in their interview.
I’m aware of this nice site, but I think you are aware Kemal threw an unfriendly message error when you reach to the last page? 100’s shards on each page is rather slow and limited options to search, I would prefer to stick to Github or self-host with Github API that supported some language.
Out of the box, Crystal does garbage collection through the boehmgc library; however, if you pass in some compiler flags you can get your application to not use the GC.
As for my OS, I wasn't gonna port libgc, and since I have prior experience building a GC in Rust (very badly!), I decided to make a tri-color, concurrent, mostly precise (through compiler patches) garbage collector in Crystal.
I've read up on how LuaJIT does garbage collection (in fact the current gc/allocator with allocation/marking bitmaps is inspired by its tri-color gc). Apart from specifying when the GC should run, I won't be changing anything about the core gc algorithm used in the project anytime soon.
Interestingly, apart from the Ruby-like syntax, they seem to be quite different languages from my limited exposure: in the domains they target (Elixir: distributed systems; Crystal: whatever Go/Rust are known for, minus distributed systems?), the type system (dynamic vs. static), the deployment story (Elixir: mix-based; Crystal: binary deployment), the standard libraries they leverage, etc.
It seems the sum total of them covers a lot of ground from writing quick CLI apps to large distributed systems for developers familiar with Ruby.
I think Ruby developers are a lot more likely to switch to Crystal than Elixir. Elixir encourages a much more functional style than I think Ruby developers are used to.
There are definitely a lot of Elixir developers who come from Ruby. I mean, José Valim himself was a core RoR team member.
I'm trying to find an Elixir job right now, and pretty much all of them come with a RoR requirement. (Also, it seems some companies are moving from Elixir to Go.)
As I'm mostly a PHP/Symfony guy I feel the disconnect, and at my current place people really don't care about Elixir at all. Basically, Go and Node would be much better prospects for me.
So how would this compare performance-wise to a lightweight Linux with LXDE? Can you run this on a Raspberry Pi and get good performance? What are the minimal memory requirements?
This guy (a high school student) writes an OS from scratch, and you made an account to say that his choice of wallpapers "undermines" his project.
I guess you're one of the people who support things like that bullshit code of conduct that is popping up pretty much everywhere.
RIP the hacker spirit of the 70s. May it rest in peace. Welcome to this brave new world where people like you are welcomed instead of being mocked all the way to the middle-management meeting rooms, where they belong.
And the windows are carefully positioned so as to put her at the focus of the screenshot. And the project is named after a succubus? The overt objectification is unnecessary and unprofessional.
Basically anyone non-American is OK with this, I think (knowing the author is probably also underage). I hope the US scare of anything barely related to nudity won't become the new norm.
Hell, I'm American and I don't really see a problem with it either. Wallpapers are meant to be seen, and most importantly are a reflection of what the chooser of said wallpaper likes to see; if people like looking at half-naked anime characters every time they use their personal computers, who am I to judge?
Emphasis on personal computer, of course; if this was in a workplace or otherwise-professional setting, then yeah, I'd be concerned, too. That's obviously not the case here, though (unless you're really building your business on top of some high school kid's pet OS project, in which case - no offense to said high school kid - you're completely and irrefutably bonkers).
If you don't like it, fork it and replace all the wallpapers with puppies or Jesus or Insane Clown Posse or whatever floats your boat. I won't judge your taste in desktop backgrounds.
You seem very interested in imposing your cultural values on teens from other countries, who seem to have produced in a few months more than what the average HN commenter will achieve in years.
As admirable an accomplishment as this is, and as with many other projects in a highly saturated arena, I feel the README should start with a quick "Why?" section. I'm truly not trying to dismiss the programmer's great deed.
There's no smoke and mirrors as far as I can tell. I dug back to the initial commit: https://github.com/ffwff/lilith/commit/c9fa1053dc6a22d630ee6... - and it's just bootstrapping to VGA.
Very well done, and inspiring. Makes me want to try OS dev again.