
Yukihiro Matsumoto: “Ruby is designed for humans, not machines” - Evrone
https://evrone.com/yukihiro-matsumoto-interview
======
dlkf
There are a lot of comments of the form "I tried getting into Ruby and it never
clicked" that I want to respond to.

I think that if you're just writing a script to solve a math problem or
automate a task, there isn't really a strong argument for whether you should
use Ruby, Python, bash, or any other high-level language. You should probably
use whatever you're most comfortable with.

But if you're making an application with a database, RoR is an extremely
powerful tool. And this isn't just a case of Ruby having the best library (in
the same way that Python has the best scientific computing library). It's a
case of the library functionality and the language playing together really
well.

Ruby statements look more like natural language statements than those in other
languages. This is a really helpful design choice when you're building a crud
application. In that context, you're not thinking about X, Y, -Log(X + Y) or
whatever.

You're thinking about a _new customer_ , or whether some old customer is a
_member?_ of some mailing list, etc

In a weird way, it's similar to how Haskell users love the fact that
statements in Haskell are mathematical truths. In the case of logic
programming this is a handy feature to have. In the case of writing the
business logic of a CRUD app, it's handy if your statements look like natural
language statements about the underlying real world objects whose
representation you are reasoning about.
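To make that concrete, here's a minimal sketch; `Customer` and the mailing list are hypothetical stand-ins, not from any real codebase:

```ruby
require "set"

# Hypothetical domain objects standing in for real CRUD models.
Customer = Struct.new(:name, :email)

mailing_list = Set.new
alice = Customer.new("Alice", "alice@example.com")
mailing_list << alice

# The business logic reads close to the English sentence it encodes:
puts "#{alice.name} is subscribed" if mailing_list.member?(alice)
```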

~~~
fantasticsid
I think focusing on the syntax level is a bit superficial. While Ruby
(especially in the context of RoR) lends itself very well to making DSLs and
making abstractions, and thus writing superficially good looking, concise,
easy to follow (on a syntax level) application code, it really breaks down
when the abstraction does not work perfectly the way you want it.

If you have debugged a Rails app and tried to find where in the 10 level call
stack a side effect is introduced, knowing that at each level a
'method_missing' could've changed things, and thus you're really looking at a
non-linear call stack since each function call can branch out, you'll know the
beautiful looking syntax is not free and in fact very costly.
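A tiny sketch of the mechanism being described (`Settings` and its fields are made up): a call like `s.timeout` resolves through `method_missing`, so there is no `def timeout` anywhere to grep for or for an IDE to jump to.

```ruby
# Hypothetical class: responds to any key present in its hash via
# method_missing, so the "methods" have no definitions to find.
class Settings
  def initialize(values)
    @values = values
  end

  def method_missing(name, *args)
    @values.key?(name) ? @values[name] : super
  end

  def respond_to_missing?(name, include_private = false)
    @values.key?(name) || super
  end
end

s = Settings.new(timeout: 30)
s.timeout  # works, but grepping for "def timeout" finds nothing
```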

And don't get me started on poorly documented (if documented at all) CoC
(black magic)...

~~~
alanh
CoC means many things. What do you use it to mean here?

~~~
fantasticsid
Yeah, sorry for the abbreviation. I meant 'Convention over Configuration'

------
pdimitar
Strange takeaway in the title.

Mine are:

- He regrets introducing threads. He now prefers other
parallelism/concurrency paradigms and is possibly looking into introducing
them in Ruby 3 (that part wasn't clear in the interview).

- He regrets adding global variables.

- He regrets not focusing on immutability primitives earlier.

- He and the maintainer team look at Python, JavaScript, and Elixir for
inspiration.

- He seems a bit disappointed that Ruby is mostly used for web development
only and would be happy if it were used in research and AI.

- He regrets not investing even a little effort in static typing earlier but
is now working on an optional typing system where you choose to describe types
in a separate file. (Reminds me of C/C++ and OCaml header files.)
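That separate-file approach shipped in Ruby 3 as RBS; a sketch of what such a signature file looks like (the `User` class here is a made-up example):

```rbs
# user.rbs -- type signatures live here; the implementation stays in user.rb
class User
  attr_reader name: String

  def initialize: (name: String) -> void

  def admin?: () -> bool
end
```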

I hope I’m not subconsciously emulating a sensationalist journalist here, but
it doesn’t seem he’s very pleased with the current state of affairs. A lot of
changes, indeed inspired by other languages, seem to be on their way to Ruby
3.

---

I’m pretty indifferent to Ruby these days. I worked with Rails for 6 years, to
moderate financial success back then, but the problems were obvious almost from
the start: too much magic and implied behaviour, too much global stuff
changing under your feet, a way too slow DB layer (it’s known that your DB will
respond in 2 ms and ActiveRecord will proceed to waste 100+ ms on serialisation
before and after), managing bigger projects becomes a huge pain, and a number
of others.

People just keep buying the promise of “but you get to make an MVP really
quickly!”. Yeah well, but I want to maintain it past that stage too.

Where I was going with this is: Ruby is a very okay language and is very nice
for sketching. It's readable, has libraries for a ton of stuff, and performs
alright (especially without Rails). But I never found a good use for it outside
of small projects. Any serious business work becomes very painful to maintain
and stay on top of in bigger projects. I’m aware there are practices that
make it possible (Basecamp is a huge inspiration!) but I personally found them
only half-working.

YMMV, of course.

~~~
norswap
Good points.

Personally, it's a bit puzzling to me why the ML community ran with Python and
why there isn't a popular alternative for Ruby. It's very capable in terms of
making DSLs.

My hunch is that the number-crunching libraries were better in Python (numpy
etc).

~~~
pdimitar
I spoke with several data scientists and students in the last 2 years and the
consensus was basically two central points:

1. NumPy is really fast and has a lot of features. Even if they don't need
all of them, they feel more at ease knowing that they can use them should the
need arise.

2. Python is very familiar to them: their professors know it, they know it,
and it's easy to teach to new people on the team, so everybody just goes with
the flow.

As much as I think the bigger IT area is in desperate need of innovation...
I can actually agree with those two points. Tinkering should be left to us
programmers and sysadmins. Everybody else has a job to do, and they cannot and
won't spend a lot of time evaluating all tech possibilities.

That's okay. It's our job to tempt them with something irresistible. :)

~~~
Gravityloss
But those are reasons that are valid _afterwards_. Why weren't the good
numerical libraries developed for Ruby?

Maybe the watershed moment was when MIT switched from Scheme to Python? I
don't know.

~~~
mdasen
Ultimately, it's probably that Python is older and its use in the English-
speaking world is substantially older. Python is around 5 years older than
Ruby, and the first book about Ruby in English came 10 years after Python
existed.

It's 1995. Python has been around for 5 years. Guido is working with the
scientific community and adding better complex number, array, and matrix
support to Python. Ruby has just started and everything is in Japanese.
Scientists and Engineers at US universities and FFRDCs (Federally Funded R&D
Centers) keep plugging away at Python for scientific purposes. Finally in late
2000, the first book about Ruby is published in English. Ruby is still pretty
unknown. In 2005 Rails hits the scene and the English-speaking world notices
Ruby. But this is 10 years after the scientific community starts investing in
Python and 15 years after the Python language came out. By this time, the
community had Numeric, numarray, and finally NumPy to unify their efforts.

Ultimately, Python's BDFL started working with the scientific/numeric
community before Ruby existed. It would be 10-15 years after Python that Ruby
would start making inroads in the English-speaking world and by that time
there were already good numeric/scientific tools in Python. Scientific
communities weren't looking to learn new languages for the sake of new
languages. Many still just use MATLAB.

MIT hadn't switched to Python when a lot of this happened. I think it's more
that Ruby wasn't widely known to exist until 2005 and was "brand new" to the
English-speaking world then. Python had been plugging away at numerical
libraries for a decade by then and NumPy (which had learned from previous
numeric/scientific libraries) came out basically when Rails came out.

Once all these numeric libraries exist for Python, are you getting grad
students excited to work on an open-source numeric system for Ruby? Are you
getting FFRDCs interested in creating libraries for a language that's quite
similar to Python without any significant advantage? How do you justify that
work? How do you get people excited about it when the reaction of most users
is going to be "we're already happy with what we've got"?

Rails burst onto the scene at a time when people were often unhappy with their
tools. Rails was a breath of fresh air with clean separation of concerns and a
lot of helpful stuff. This is back when Java meant XML sit-ups, C# was
closed-source, most of what the community knows today didn't exist yet, Scala
had just appeared, Python was still in the Plone/Zope era (remember when an
object DB in Python seemed good?), and PHP was pretty dominant but often ended
up a mess without the kind of framework Rails was offering. Rails
offered something compelling and new which meant attention. Many other
communities followed Rails - Django, .NET MVC, JAX-RS, etc. Still, Rails was
there first and got a lot of mind-share.

Similarly, Python was there first in scientific/numeric computing and people
haven't provided enough to compel people to move off it.

~~~
save_ferris
I love reading historical context around language use like this. Thanks for
sharing.

------
vincent-toups
I don't understand languages like Ruby and Perl and their supposed emphasis on
"human design." My sense of what a human designed language would be is a lot
more like Scheme: extremely simple, fundamental abstractions which, when
composed, enable a variety of different strategies for solving problems.

Ruby, in contrast, seems like an absurdly complicated concatenation of weird
features and strange syntax. When I look at Ruby my eyes glaze over with all
the unnecessary syntactic doodads and peculiar abstractions. Why anyone would
want to complicate their lives with all that silliness is beyond me.

And yet Scheme languishes and Ruby, for a time, anyway, burned brightly. I'm
clearly the weird one.

~~~
mapgrep
>Ruby, in contrast, seems like an absurdly complicated concatenation of weird
features and strange syntax

My experience is the precise opposite. Ruby seems more consistent and
straightforward than any other non lisp language I’ve used. Ubiquitous blocks
and “everything is an object” gets you very far.

Compare nested list comprehensions in python to chained map/select calls in
Ruby:

[lesson for lesson in [school.Lesson(downloaded) for downloaded in
api.downloads()] if want(lesson)]

vs

api.downloads.map { |download| School::Lesson.new(download) }.select {
|lesson| want(lesson) }

Or compare the many varied required ways to do something with an array in
python vs Ruby:

len(array)

";".join(array)

array.append(x)

vs

array.length

array.join(";")

array.append(x)

Genuinely confused what you refer to with “unnecessary syntactic doodads.”
Examples?

~~~
0x445442
You answered your own question with your qualifier... "Ruby seems more
consistent and straightforward than any other non lisp language I’ve used."

~~~
hartator
He has replied with actual code samples. Which is valuable.

------
retrobox
I’ve been trying to get into Ruby for a new job and am finding it hard.

Not being designed for machines is right - a lot of the IDEs seem bad and
can’t even figure out where your methods come from. I don’t know how the
interpreter manages to run it. Multiply the problem by 10 once you add rails
and all the methods it generates.

It makes for a good, rapidly developed monolith but most places I’ve worked in
recent times are moving to services and micro services and it doesn’t seem cut
out for that.

~~~
horsawlarway
I'm in the same boat. I'm coming up to speed on a very large, 5 year old
monolith RoR codebase.

I fucking hate the thing.

I've done quite a bit of web dev across a pretty large variety of languages
(c/c++/c#/php/js/golang/other).

They all have rough edges, but never in my career have I been more frustrated
than when dealing with Ruby on Rails.

The tooling is... the best I can call it is bad. Simple things like jump to
definition just don't work, basically anywhere, autocomplete is non-existent.
RubyMine is the closest to functional, but it's still MILES away from good.

None of that alone is a deal-breaker - I've written sites in plain old notepad
or gedit, but when languages have bad tooling, I expect good documentation.
Rails is off in la-la magic land though. Docs are hard to find and
inconsistent. I'll frequently find documentation that contradicts itself (ex:
Are middleware "Set-Cookie" headers delimited by a comma, or returned as an
array? Ans - Trick question, the documentation says both in different places,
but it's actually a newline, which isn't documented fucking anywhere.)

Trying to just figure out where a method definition lives is needlessly
tiring. I asked a long time dev at the company how he finds the source for
methods that are magically pulled in by rails. His answer: "I do a global
search for "def [name]" and if that doesn't find it, I give up and ask the
development slack channel". Personally, I find that mind-boggling.

And honestly, I guess most of my frustration isn't really Ruby. It's Rails. I
don't love that Ruby seems to encourage the "make it a dsl!" type programmer,
but by itself there are good times/places for that. Rails though? It just
doesn't scale for humans.

Rails feels like it started off with a solid idea - Don't worry about the
boilerplate, just write productive code, we'll handle the magic. And that
works great... up to a point. But that point was years back. Now the whole
ecosystem has grown to the point where I HAVE to understand at least some of
the magic, but because they didn't want you to see it, it's really, REALLY
hard to get good answers about how/where all the various bits play together.
Instead you chase outdated docs, bad stack overflow answers, and shitty forum
posts.

I came on board pretty excited to pick up Ruby on Rails. A year in and I'm
pretty convinced that I'll recommend basically ANY other framework and
toolset.

~~~
retrobox
All so incredibly relatable... thank you for sharing your experience.

I have pretty much 1:1 the same struggle and it was (is?) starting to knock my
confidence - wondering if I’m just a bad developer for not “getting” it.

I often put off development tasks because I know it will all move at a snail's
pace: trying to drudge through endless magic, having no idea where any of it
comes from, getting very little back from Google, and giving up.

------
pwdisswordfish2
“Ruby is designed for programmers, not end users”

It's not machines who care about performance. It's the person who has to use
the program written. It may sound gentle and altruistic, but when you actually
think about it, putting “humans” first is actually an incredibly self-serving
sentiment.

~~~
darkerside
There is already an ideal programming language for computers. It's bytecode.
All high-level languages are built for people, not computers, to make it
reasonable for us to collaborate on and maintain our own code. The computer
itself doesn't care for anything beyond 1s and 0s.

~~~
pwdisswordfish2
And computers are built for people, to make it reasonable for us to
collaborate and maintain our own (non-programming) work.

Dismissing performance concerns because you're designing a language “for
humans” is actually glossing over a quite large number of other “humans” (i.e.
non-programmers) who do have a stake in the choice, but not so much say in it
besides “why is my computer so slow lately”. Well, it's slow because you
thought type checking is for know-it-all ivory-tower academics and pedantic
asshole compilers, so you chose JavaScript (or whatever “forgiving” language
is in vogue these days; the OP is about Ruby, but it doesn't really matter),
and so the user has to wait until the JS engine finishes JITting your code
over and over again to infer type information that should have been available
at compile time in the first place. Rinse and repeat for each app, and now
they have to replace their laptop every few years to catch up with ever-
growing (for no good reason!) RAM requirements.

Claiming that a programming language is “designed for humans” much too often
amounts to saying “my (programmer's) convenience and satisfaction is more
important than yours (user's)”. And that is an _incredibly_ arrogant thing to
say, as superficially gentle as it may seem at first glance.

I'm not arguing against high-level languages; I'm arguing against disregarding
performance in the name of programmer convenience. It's quite possible to
design a high-level language in a manner that doesn't stand in the way of
making its implementations performant. There are some recent developments on
that front.

But if you ignore them, then it's not the “machine's” resources that you're
wasting. It's the _user 's_ resources.

------
emiliovesprini
Reminds me of _"Programs are meant to be read by humans and only
incidentally for computers to execute"_ from Structure and Interpretation of
Computer Programs.

What always gets me about that quote is that it's obviously false. There are
exceptions (such as space shuttle controls), but most software that's
"important" is important because of how much it's run -- Firefox is run a
whole lot more than it is read -- and so this Incidental Execution stuff is
nonsense.

Performance and readability are both important goals. Most of the time they're
not even so opposed as to justify pitting them against one another like this.

------
Evrone
We’re thrilled that our good friend Yukihiro Matsumoto, creator of the Ruby
programming language, has been able to join us at RubyRussia 2019 as a speaker
for the second time, having previously spoken three years ago at RubyRussia
2016.

In the time that we’ve been holding the conference — now more than ten years —
Ruby has undergone a great deal of evolution, and Evrone has grown and
developed alongside it.

Grigory Petrov, Developer Relations at Evrone, sat down with Mr Matsumoto to
hear from him first-hand about being a star, the philosophy behind Ruby’s
design and evolution, and a little about Japanese life and culture.

------
torgian
Honestly, I never could get into the Ruby language. I always kept making
mistakes with the syntax and ended up gravitating towards Python and
relearning JavaScript.

~~~
Hamuko
Same. I always felt that Python was actually designed for humans and that Ruby
had an incomprehensible syntax.

~~~
thaumasiotes
I never did much with Ruby beyond toy problems, but I was impressed by the
lengths the language goes to to make it convenient to express what you want to
do.

There's one syntax for half-open ranges and another syntax for closed ranges.

You can index an array any number of different ways: "get me the 5th element".
"Get me the element three back from the end". "Get me elements 5 through 8
(inclusive)". "Get me 0 through 6 (exclusive)". "Get me the 3-length subarray
starting at index 2".

A hash table returns nil by default when you access a key that isn't stored.
But you can change the default value as a property of the table itself. Users
now don't need to remember to supply their own default value when they're
looking in the hash table. Even more, you can specify a function that gets
called when a nonexistent key is accessed, to do whatever is appropriate.
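The conveniences described above, sketched in a few lines (the values are arbitrary illustrations):

```ruby
a = [10, 20, 30, 40, 50, 60]

a[4]      # 5th element (zero-based index 4)       => 50
a[-3]     # three back from the end                => 40
a[4..7]   # indices 4 through 7, inclusive         => [50, 60]
a[0...6]  # 0 up to but excluding 6                => [10, 20, 30, 40, 50, 60]
a[2, 3]   # 3-length subarray starting at index 2  => [30, 40, 50]

counts = Hash.new(0)  # default value instead of nil for missing keys
counts[:missing]      # => 0

memo = Hash.new { |h, k| h[k] = k.to_s * 2 }  # block runs on missing keys
memo[:ab]             # => "abab"
```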

All of those are great things. It's easy to translate your algorithm that says
"take a stretch of three elements starting from i" into an access that looks
like array[i..i+2]. But the semantics of your algorithm start slipping away as
you make adjustments like that. If what you're doing is taking substretches of
3 elements each, there's a gain from being able to express it as array[i, 3],
as semantically distinct from taking a substretch between two independent
bounds that happen to be i and i+2.

But the flip side of this is that if you're not regularly immersed in the
language, it's really easy to lose track of precisely which bit of syntax does
what.

------
sparker72678
Dear everyone crapping on Ruby (and esp. Rails) in this thread (and every
other time this comes up),

If you don't like Ruby or Rails, that's okay!

No, really. You can use something else. You won't hurt our feelings!
Programming languages and their frameworks can actually be a preference, not a
moral issue.

We're here because we like it, and it's totally cool that you don't! Please
use what you love; we are, and we know how powerful that can be.

Cheers, — Every Ruby Developer

------
monadic2
Shit, he should have considered the human cost of method_missing.

Yes, I am serious, it’s a massive productivity drain. You encounter methods
that you can’t google for, so your grep game better be good to get work done
(grep for a portion of the name, and then method_missing itself if that
fails). An IDE is useless in this case and your performance goes out the
window. It’s much better to dynamically generate a finite number of methods
you can see when inspecting objects.... I suspect those would jit better, too.
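The alternative being suggested, sketched with made-up `Record`/`FIELDS` names: `define_method` generates a fixed set of real methods that show up when you inspect the class.

```ruby
# Hypothetical model: generate one reader per known field up front,
# instead of intercepting everything in method_missing.
class Record
  FIELDS = [:name, :email].freeze

  FIELDS.each do |field|
    define_method(field) { @data[field] }
  end

  def initialize(data)
    @data = data
  end
end

r = Record.new(name: "Ada", email: "ada@example.com")
r.name                          # => "Ada"
Record.instance_methods(false)  # includes :name and :email -- inspectable
```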

Oh, you want to access the call stack on your server to figure out where that
frame is? Joke's on you: Ruby has no remote debugging mechanism aside from a
core dump or building your own heap-inspection access.

Dang, why was this flagged?

------
runeks
And that’s why it’s relatively slow compared to e.g. C — which is designed for
machines, not humans.

~~~
jbverschoor
Replace ruby with crystal, and this comment can be removed.

~~~
ryanpetrich
Crystal is not Ruby, nor is it anywhere near as fast as C.

~~~
phaedryx
Really? Benchmarks I've seen put it in the same ballpark

------
tomerbd
Ruby is truly nice. I just wish I had the option of static typing built into
the language; as a human I prefer it because IDE support is then much better
for refactoring, and it helps when maintaining other people's code.

~~~
elliotlarson
You're not alone. Stripe has been working on adding types to Ruby with their
static type checker, Sorbet: [https://sorbet.org/](https://sorbet.org/). This
seems to have a bit of momentum at this point. Assuming it continues in
popularity I'm excited to see how this helps to improve Ruby work in an IDE.

~~~
tomerbd
Looks interesting! will have a look.

------
manoj_venkat92
I was a RoR developer for two years and immensely enjoyed coding in Ruby --
well, technically, RoR (Ruby on Rails). It was at a startup where features were
being developed at a rapid pace, and it really felt like the right choice. But
I also noticed that as companies grow they have problems with maintenance of
the code as well as with performance, like Twitter moving away.

------
thunderbong
I'm seriously not able to distinguish how many of these comments are about
Ruby and how many are about Ruby on Rails (RoR). From the very beginning
Ruby's popularity and criticism have hinged on RoR. And most people tend to
conflate the two completely.

Personally, I love Ruby but don't like RoR. RoR seems too bloated to me. Of
course it does many things. But very rarely have I come across projects which
require all those things. And even then, I'm quite sure, we wouldn't be able
to use all that RoR has to offer.

For many, many, years now, my stack has been Roda / Sinatra for routing and
Sequel for DB access. And whenever we've come across some requirement which
needs heavy lifting, there are enough Ruby gems which do that without tying
you up with the RoR framework.

------
alexandernst
Kind of offtopic: I did some Ruby and I find some of the conventions really
confusing; for example, the lack of “()” when calling a method. You never know
if you’re accessing a property or calling a method until you grep the code. Am
I the only one who feels this?

~~~
ben-schaaf
I used to find this odd as well, but that feeling came from a misunderstanding
of what a "property" is. When you use `attr_reader :foo` what you're actually
doing is calling a method that defines a method on the class for accessing
`foo`. `a.foo` is not accessing a property, it's calling a method. `a.foo = 2`
is not writing to a property, it's calling a method called `foo=`. The "()"
can be left out because it's never ambiguous whether you're calling a method,
because you're always calling a method.
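Roughly what `attr_accessor :foo` expands to -- both forms below are ordinary method definitions, which is why `a.foo` and `a.foo = 2` are never ambiguous:

```ruby
class A
  # Equivalent, roughly, to: attr_accessor :foo
  def foo
    @foo
  end

  def foo=(value)
    @foo = value
  end
end

a = A.new
a.foo = 2  # calls the method named "foo="
a.foo      # calls the method named "foo" => 2
```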

~~~
mercer
I remember loving this when I learned Ruby. There's a certain elegance to it.

However, I'd say for practical purposes, it still leaves me with the problem
that in my day to day coding I'm left with the same ambiguity. I don't know if
I'm calling a simple-assignment auto-generated method that is setting a
property, or a method that does all sorts of stuff under the hood.

While I'm still fond of Ruby, I've come to prefer less ambiguity. I like to be
able to see at the calling site whether what I'm doing is calling a method or
setting a property, and parentheses are a great way to signal that.

As an aside, one thing I didn't like about Elixir was that it also made
parentheses optional. Once I realized _why_ this was necessary, and once the
formatter was added and enforced parentheses as a default, it stopped being a
problem. Still, every language I've used where they're optional have caused
weird little problems that never seemed to be outweighed by the convenience
(CoffeeScript comes to mind).

But all that said, it's not too high on my list of things to hate on :).

~~~
dragonwriter
> I like to be able to see at the calling site whether what I'm doing is
> calling a method or setting a property

In Ruby, as is also true in many languages that have a syntactic distinction,
setting a property is always ultimately calling a method, so the distinction
is illusory. What you may want is a purity guarantee (well, that getters are
pure and that setters have no effects except on the state specifically backing
the property, though the latter gets complicated to apply to nontrivial
properties), but most languages that let you distinguish whether properties or
methods are exposed in an API don't provide that kind of guarantee with
properties, either - just more boilerplate code to implement them.

And the whole point of properties over exposed data members is to abstract
behavior so that implementation changes don't change APIs.

~~~
int_19h
Even when it's not enforced by the language, it can be something enforced as
part of the coding style by the team - that Foo vs GetFoo() indicates presence
of side effects, or potentially expensive computation.

