Another interesting addition would be to warn drivers if a nearby car/driver also had a bad driving history. Maybe also track whether the nearby car started off from a bar/restaurant and weight the probability of the driver ending up in a crash with other cars in the vicinity.
I have done something similar using https://pybowler.io/. It wraps the LibCST library and abstracts away some of the refactoring operations into nice wrapper functions.
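For example, a simple rename can be expressed as a fluent query (a minimal sketch along the lines of Bowler's documented Query API; the path and function names here are placeholders):

    from bowler import Query

    # Find every definition of / reference to `old_name` under the given
    # path, rename it to `new_name`, and show an interactive diff before
    # anything is written back to disk.
    (
        Query("path/to/project")
        .select_function("old_name")
        .rename("new_name")
        .diff(interactive=True)
    )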
I am curious: why do you think it helps to do things on the stack? As I understand it, this provides more generic type-level checks on the size/shape of values. Perhaps I am oversimplifying?
Basically, the const-generics feature gives the programmer the ability to create fixed-size custom types that don't allocate on the heap. For example, with current Rust you can't make your own generic fixed-size hashmap (as in FixedSizeHashMap<K, V, N: uint>, where N is the maximum number of keys you can use) that doesn't use the heap, since you can't use size-parameterized arrays in structs. Although this isn't a big deal for most general programming purposes, it still matters a lot in domains like embedded/HFT/gamedev/simulation, where minimal latency is incredibly important. It's also important when you want to create any sort of linear algebra library (such as Eigen). For me, the lack of const generics is an important dealbreaker when choosing between Rust and C++ (which has had the ability to do this for decades).
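To make that concrete, here is a minimal sketch of the kind of type const generics enables (the name and API are made up for illustration; const generics have since stabilized, in Rust 1.51): a fixed-capacity map whose storage is a plain inline array, so the whole value can live on the stack.

    /// Hypothetical fixed-capacity map: N slots in an inline array, no heap.
    struct FixedSizeMap<K, V, const N: usize> {
        entries: [Option<(K, V)>; N],
    }

    impl<K: Eq, V, const N: usize> FixedSizeMap<K, V, N> {
        fn new() -> Self {
            // `[(); N].map(...)` builds the array without requiring (K, V): Copy.
            Self { entries: [(); N].map(|_| None) }
        }

        /// Linear-scan insert; fails once all N slots are occupied.
        fn insert(&mut self, key: K, value: V) -> Result<(), (K, V)> {
            for slot in self.entries.iter_mut() {
                if slot.is_none() {
                    *slot = Some((key, value));
                    return Ok(());
                }
            }
            Err((key, value))
        }

        fn get(&self, key: &K) -> Option<&V> {
            self.entries
                .iter()
                .flatten()
                .find(|(k, _)| k == key)
                .map(|(_, v)| v)
        }
    }

    // Usage: the capacity is part of the type, and the value is stack-allocated.
    // let mut m: FixedSizeMap<u8, u32, 16> = FixedSizeMap::new();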
Can you elaborate on what kind of projects you work on that have made the move from Python to OCaml easy? While I love OCaml, I have not heard positive things about the library ecosystem in terms of variety and maintenance. Perhaps you could shed some light on that based on your experience, especially after giving up the ecosystem Python provides you.
This is a sincere question, and I would love to find good excuses to adopt OCaml as my primary language of choice.
> I have not heard positive things about the library ecosystem
OCaml has quite a neat ecosystem, though, especially in terms of quality. There are a few major groups and a host of minor ones that write libraries for different purposes, such as:
* MirageOS -- the people behind a bunch of high-quality system libraries: TLS (in pure OCaml) and various cryptography primitives, filesystem drivers, tar, zlib, a full TCP/IP stack, an HTTP server, git (in pure OCaml, not a mere binding), irmin (a git-based key-value store with concurrent access), DNS, and some others [1].
* Ocsigen -- web devs, authors of Lwt (a monadic concurrency library), js_of_ocaml (an OCaml-to-JavaScript compiler), and related stuff [2]
* Jane Street, Facebook, Bloomberg -- the Reason syntax, BuckleScript (an alternative, npm-ecosystem-friendly JavaScript compiler), various libraries for serialization, containers, etc.
* Others
Opam [3] has quite a comprehensive set of libraries for pretty much every purpose: data serialization, interaction with other languages (Python, R, BEAM Erlang/Elixir), servers, DB bindings, stuff for system or web programming. Some things are absent, but it's definitely not a scorched field, and most of the time the benefits are worth writing a missing binding or two.
Have you considered using ipython as your REPL? I use ipython, and in my profile I have the following setting. It enables auto-reload, and it works almost all the time.
Here is the option I have in my profile file. You can run this command inside an ipython session as well (after loading the extension with %load_ext autoreload):
%autoreload 2
The times I have seen auto-reload fail are when there is a lot of class-creation magic going on, or when dealing with global connection objects that SQLAlchemy might be creating. Other than that, it works very nicely.
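For reference, one way to wire this up in an IPython profile (a minimal sketch, assuming the default profile location; the exact profile setup may differ from the one described above):

    # ~/.ipython/profile_default/ipython_config.py
    c = get_config()

    # Load the autoreload extension and enable mode 2 (reload all modules
    # before executing code) in every new session.
    c.InteractiveShellApp.extensions = ["autoreload"]
    c.InteractiveShellApp.exec_lines = ["%autoreload 2"]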
Best not to mess with someone who knows where you live and has your credit card. If she's in a good mood, you might be hit with just a pizza prank. Hope she doesn't realize she can swat you without lifting a finger.
Actually, there are three parts. The course uses Standard ML, Racket, and Ruby as vehicles for teaching the concepts. The intent is to make you a more effective programmer in any language.
> It changed the way I learn any new programming language.
I can say the same, and I can offer my reasons: until this course I saw every language as a little island; after this course I understood that languages are just collections of features: various typing systems, static/dynamic scoping, lazy/eager evaluation, etc. It's a ton easier to learn a new language by identifying these features than by looking at the language as one big blob. This also made me realize that languages are not little disjoint islands - they overlap a lot.
The course was how I got into Racket and other Lisps, and that allowed me to read SICP. Since then I've been writing all sorts of toy interpreters/transpilers for fun, and it has given me an idea of what's happening behind the scenes in real languages. For example, I used to think closures were magical, but after implementing them as part of the course they were a piece of cake. You get a profound satisfaction when you implement call/cc yourself and suddenly understand how try/catch or generators work.
I took the same path and went back to reading SICP. But this time around it was very easy. I had the same experience implementing closures and the embedded language.
Interesting. Did you feel like you needed a strong understanding of compilers or automata to really grok what was going on (I think automata relate to programming languages, but could be mistaken)?
None at all. Automata are used to turn a program from its textual form into some manageable data structure that something else (the actual interpreter/optimizer/compiler) will consume. At some point in the course (in the Racket part) you will be asked to implement an interpreter for MUPL (a made-up programming language), but the programs are written directly as a data structure, so there is no need to parse; in Racket both data and code look exactly the same, so it'll be a breeze.
I think the only requirement for this course is some experience with a plain procedural language (C/Pascal).
I already see some good responses to your question.
As for me, before this course, learning a language was a mechanical process: I learned the syntax, learned some idioms, and went with it. But after this course, as the other commenter put it, I started learning every language as a set of features. That opens up a whole new world. When learning a new language, you seek out the features you are interested in and then figure out how that language lets you use them. For example: does the language support abstract data types, what paradigms of programming does it support, is it imperative or functional, lazy or strict, is it meant to be used as a bunch of statements or as expressions, can common idioms be implemented as plain functions or does the language need to support them internally, does it support lambdas, does it do lexical or dynamic binding, and so on. The course also takes you through ML, Racket, and Ruby, gradually exposes you to these concepts, and in parallel explains what the trade-offs are as you give up one paradigm for another.
So after the course, the next time you open up a beginner's guide to any language, you will be seeking out answers to these high-level questions. The syntax will come automatically as you use those 'concepts'.
Dan Grossman is an excellent teacher. His passion for programming languages comes through in his teaching. The homework assignments are very relevant and help you solidify the concepts. I am thankful to him for offering this course.
It dispels all the magic around programming by helping you build a knowledge of programming that is agnostic to any particular language. Dan Grossman is a superb teacher, and the way he ties concepts together is awesome. I'll be forever glad for this MOOC.
Do you always need some lazy evaluation to arrive at the Y combinator? It seems that even in the strict example a lambda was used to keep the recursive call from evaluating too early.
Well, anyway, these are very interesting ideas to study, and they opened up some of the reasoning behind Haskell's laziness for me.
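Concretely, that extra lambda is an eta-expansion that delays the self-application until the function is actually called; in a strict language the result is usually called the Z combinator. A minimal sketch in Python (chosen here just because it evaluates eagerly):

    # The naive Y combinator, Y = lambda f: (lambda x: f(x(x)))(lambda x: f(x(x))),
    # recurses forever under eager evaluation because x(x) is evaluated immediately.
    # Wrapping the self-application in `lambda v: ...(v)` delays it until the
    # resulting function is called - exactly the extra lambda mentioned above.
    Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

    # Factorial written without any explicit self-reference.
    fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))

    print(fact(5))  # 120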
> All in all, I guess this is the result of a company that has more money than they possibly know what to do with. I wonder how long this utopian "do no evil" culture can last. Wealth creates power, and power corrupts. And boy, have I seen a lot of power this last week.
> "May you live in interesting times."
Now, juxtapose that with the author's other recent post: https://social.clawhammer.net/blog/posts/2024-01-10-GoogleEx...
Interesting times indeed!