Thank You, Guido (dropbox.com)
1269 points by pauloxnet 13 days ago | 375 comments

To be honest, it is a sad moment, but it was expected. When Guido stepped down as BDFL, I knew it was the beginning of the end.

And let's be honest, he's earned it.

Thank you, Guido. After learning Python I never looked back, and it has been a source of great joy for me over the last decade.

It's not the beginning of the end. Retirement is not a death sentence. Lots of folks go on to do some of the best work of their lives... not that Guido hasn't already done plenty of that.

I’m sure he’s going to take a break from the chaos of being at a unicorn in SF and the departure from BDFL, spend some time in the woods reading books and getting ready for the next thing. He’s got plenty of cash (presumably) so hopefully finances are not a source of concern. Then the itch will come and hopefully he will be poised to scratch it.

I’m sure we’ll see more from him!

> Thank you Guido, after learning Python, I never looked back

That matches my experience. All the pros and cons that in principle apply to any language, whether newer than Python or older, mean nothing to me in practice. Python is the one that feels right, the others feel like a chore. Since I started with Python 10 years ago, nothing has changed that.

There's ways I'll insist that Python is the best at [x], and ways I'll concede some other language is better at [y]. But it never amounts to anything close to a debate about what language I should use.

There are a lot of times when Python is an inappropriate language choice: any time you need real concurrency, for instance. Definitely any time you need to distribute an application directly to an end user.

Beyond times when Python is clearly wrong, there are other issues, mainly around tooling and documentation. Languages have come a long way in the past 20 years.

"But it never amounts to anything close to a debate about what language I should use."

So if your task was to deliver, let's say, a commercial-grade FFT library (substitute any other computationally intensive process) for consumption by others, would you write it in Python?

Given the success of the Python numerical computing stack (numpy, scipy, etc) it would probably be a good idea to do just that. Of course you’d profile the library and move some of the performance-critical bits to C / C++ / Rust. Thankfully Python has some well-trodden paths to that solution.
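As a hedged sketch of that profile-first workflow (the naive_dft hot spot below is my own illustrative stand-in, not anything from the thread), the stdlib profiler is enough to find the bits worth moving to C / C++ / Rust:

```python
import cProfile
import cmath
import io
import pstats

def naive_dft(xs):
    """O(n^2) DFT: the kind of hot spot a profile would flag for porting."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(xs))
            for k in range(n)]

prof = cProfile.Profile()
prof.enable()
result = naive_dft([float(i) for i in range(64)])
prof.disable()

# The top cumulative-time entries are the candidates for a native rewrite.
buf = io.StringIO()
pstats.Stats(prof, stream=buf).sort_stats("cumulative").print_stats(3)
```

The point is that the decision of what to port is driven by measurement, not guesswork.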

IMO a good and practical idea would be to write it in C/C++, expose a flat C API, and then offer wrappers in whatever popular consumer languages there are (Python included). The world of software does not end with Python.

Yes, definitely a good approach as well. One point in favour of the Python-first way is that you can prototype and test much quicker in Python than in languages that are closer to the metal.
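For instance (a minimal sketch, stdlib only, power-of-two lengths, no optimisation), a radix-2 Cooley-Tukey FFT prototype fits in a dozen lines of Python, and this is the kind of correctness reference you can later port and test a native version against:

```python
import cmath

def fft(xs):
    """Recursive radix-2 Cooley-Tukey FFT; len(xs) must be a power of two."""
    n = len(xs)
    if n == 1:
        return list(xs)
    evens = fft(xs[0::2])
    odds = fft(xs[1::2])
    # Twiddle factors combine the two half-size transforms.
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odds[k] for k in range(n // 2)]
    return ([evens[k] + twiddled[k] for k in range(n // 2)] +
            [evens[k] - twiddled[k] for k in range(n // 2)])

# The FFT of an impulse is flat (all ones); a constant input transforms
# to a single DC bin.
print(fft([1.0, 0.0, 0.0, 0.0]))
```

A prototype like this pins down the expected numerical behaviour before any lower-level rewrite begins.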

On my team, we've been successful with this rough approach:

1. write a prototype in Python to explore the idea/ensure correctness

2. extract the core library and re-implement it in Rust

3. consume the core from JavaScript via WASM and Python via pyo3 (Python/Rust bindings)

One point in favour of the Python-first way is that you can prototype and test much quicker in Python than in languages that are closer to the metal.

Interestingly, I've found quite the opposite for computational algorithms. The problem was that executing Python under a debugger was unbearably slow for me (I used VS Code in this case), so waiting until the code reached the point I needed was not fun. With C it would get there almost instantly, and compile time for an isolated computational function was nearly instant as well. Maybe debugging under VS Code was the culprit, but for this particular task Python did not feel any more interactive than plain C.

Yes, I can see that being tiresome. Our projects are closer to simulations, so rather than focusing on a tight core at first, we have a number of factors that can be in play during each time step.

Rather than worry about figuring out structures for our simulation objects like we might need to do in a lower-level language, Python lets us just throw together some numpy arrays and dicts as needed, and really iterate quickly on the solution. If I was working on something with a tight computational core I would probably take the low-level first approach like you have. Horses for courses, and all that.

Clearly his statement is predicated on the kind of work he does. Don't be obtuse.

If this is "predicated on the kind of work" then why make a generic statement? Am I the one being obtuse?

Do most generic statements start with "This matches my experience"?

Kinda seems like you are, mate.

Not sure how they start but sure look generic when they end like this:

There's ways I'll insist that Python is the best at [x], and ways I'll concede some other language is better at [y]. But it never amounts to anything close to a debate about what language I should use

Anyway, I should know better than to discuss Python.

Of course there are scenarios where you can't choose your favorite language, I'm not going to write an Xbox game client in Python.

To the extent I have a choice, I'll go out of my way to use Python, and I'll be a little creative to fit it into certain use cases where it's not a perfect fit, consciously accepting a reasonable penalty. But yeah, sometimes it won't fit at all.

The point is more that I'd go so far as to avoid taking a job or gravitating towards a project where the development would not be enjoyable based on the tools I'd have to use.

"The point is more that I'd go so far as to avoid taking a job or gravitating towards a project where the development would not be enjoyable based on the tools I'd have to use."

I can definitely understand this point of view. Well sorta. I was/am always goal oriented. I find satisfaction in creating a product used by other people. The actual process of implementation is 90% boring work and language for me is just a tool. I do not get too excited by languages.


Could you please review the site guidelines (all the way to the end!) and stick to them when posting here?


There are FFT libraries written in Cython, the superset of Python that compiles to C/C++. Just because the CPython runtime might be too slow for this type of task doesn't mean the language is.

And why would anyone write FFT code anyway? Robust implementations exist already - see netlib.org for algorithms implemented in C, C++ and Fortran. SciPy uses these and wraps them for use in Python, with newer code written in Cython.

So yes, FFT applications can be, and routinely are, written in Python if that is what you are most familiar with, as is the case for many natural scientists and data scientists.

And why would anyone write FFT code anyway? Robust implementations exist already

The FFT was just an example; you can substitute any SolveTheWorldHunger computationally intensive algorithm.

And you'd find that Python is a go-to language for people writing such algorithms. The black hole image which was in the news was produced from massively complex data sets from across the world. Would you like to guess which programming language was used?

Python looping is often the slowest part of data analysis algorithms, but no one uses Python loops in these cases. Vectorising this sort of analysis using fast extensions makes Python perfectly speedy enough for actual scientists.
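As a toy illustration of that point (assuming numpy is installed; the function names here are mine, not from the thread):

```python
import numpy as np

def mean_square_loop(xs):
    # Pure-Python loop: interpreter overhead on every element.
    total = 0.0
    for x in xs:
        total += x * x
    return total / len(xs)

def mean_square_vectorized(xs):
    # The same computation pushed into numpy's compiled C loops.
    a = np.asarray(xs, dtype=np.float64)
    return float(np.mean(a * a))

data = list(range(1, 1001))
# Both give the same answer; only the vectorized one scales well.
assert abs(mean_square_loop(data) - mean_square_vectorized(data)) < 1e-6
```

The loop body never runs in the interpreter in the vectorized version, which is where the order-of-magnitude speedups come from.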

"And you'd find that Python is a go-to language for people writing such algorithms."

If it works for them, sure. Did not work for me.

Questions asked in bad faith tend to be downvoted. Is this really a surprise?

I do not see any bad faith here; I just disagreed with the original point. Anyway, it looks to me like anything said against Python is a big no.

Btw. I do use Python, for complex shell scripts mostly.

I don’t like python and don’t use it but I still think your question was unnecessary.

The condescending and arrogant tone got you downvoted both here and in your previous comment

There is nothing condescending or arrogant here. It looks like the other side is being way too touchy. Anyway, I got the message: stay away from anything that questions Python being the best thing since sliced bread.

No, you're not down-voted because you don't like Python. You're down-voted because you come across as hostile and condescending, whether that was your intention or not.

FWIW, I use Python because it’s necessary. I’m not overly fond of it. Give me Erlang, please.

And I’d still call your statements condescending.

He’s left Dropbox; he hasn’t died.

Yeah, this reads like a PR statement to me. Like: "He was not unhappy here, he is retiring!"

A carefully controlled message.

Is he retiring only from Dropbox, or also from the Python committee (or whatever it is called)?

Yeah, he already did step down from being the BDFL.

PSF now has an elected steering council to replace him.


And Guido is currently one of the five members of the steering council.

Yes, had forgotten that for some reason. Thanks for the clarification.

I hated C++ and Matlab as an undergraduate. Hell, I hated programming as a whole, and effectively swore never to write a line of code once I was done with my BSc in Physics.

Then, at work, I was introduced to Python. It was so... obvious, for lack of a better word. It was like a language I always knew that I never spoke before.

I'm now a software engineer, I write Python almost all day, and looking back at the 18-year-old version of myself, I can only say "if only you knew".

That, and more, is what people like Guido and those who followed his path enable people like me to do: enjoy the art of programming.

Thanks Guido, and enjoy your well-deserved retirement.

Interesting. I myself come from a software engineering background, and started programming very young in Logo (a Lisp derivative), then BASIC, C, C++, Java, Ruby and a bunch of others. I have hated programming in Python every time I have had to do it (several times through my career). It is, I think, the only language that I have really hated... the language itself. I've done Z80 and 8086 assembly, I've done Prolog, Pascal, VB, VB.net, C#, R, Matlab, JavaScript, ActionScript and even Z (the formal language), but no other language has made me swear at it, haha.

I think it is about the person's thinking process. I assume you are >35; you grew up at a time when people thought about performance more than readability, because that was the bottleneck back then.

Now that you have bigger systems and more complex requirements and cheap high performance computing, the balance is tipped.

And because Python is (almost) on the opposite side of the spectrum from your thinking process, you might have a harder time wrapping your head around why it does something in a particular way.

I think this is also true for the Haskell evangelists here: they started acquiring the programming mindset through advanced math (in my experience most have PhDs), so Haskell fits the way they solve a problem in their head the best.

For me personally it is Python: the delay between solving a problem in my head and translating it to Python is small. You probably feel the same way about your own favourite programming language, because it is the tool that makes you most productive with the least effort.

> For me personally it is Python, the delay between solving a problem and having it translated to python in my head is slim. You probably feel the same way about your own favourite programming language.

I feel that with Ruby, which I do a lot nowadays and love

With regards to Python, it is things like having to write 'self' on each class method, having whitespace as part of the language syntax, inconsistent object orientation (len(string)?? instead of string.len), and the never-ending Python 2 vs 3 pain.

Python and Ruby, although in the same category, still have different mindsets. It is understandable that the way you think fits Ruby more.

I tried to learn Ruby and was put off by all the magic: the last value calculated is magically returned, the useful but cryptic method calls.

It is shorter and more implicit, I like the explicit approach more.

To give you my perspective on the points you made:

- self: more explicit. I would prefer to write it, because there are also classmethods (which take cls) and staticmethods (which take neither).

- spaces vs. braces: I don't hate braces, but spaces make it more readable. The fewer things I see (whitespace vs. {}), the less distracted my eyes are from the characters that do the actual work. You get used to it; I like it more, to be honest.

- len(string) vs. string.length: again, no preference, I am used to both.

- py 2 vs. py 3: it is not never-ending. I helped move a 170K Django project from Django 1.6 to Django 2.1 and Python 2.7.4 to 3.6 in the span of a year. We were mainly three people working on it on the side (our work was not even in the sprints): two senior engineers plus one really good QA & deployment engineer. It took a year because the platform was business-critical and we fixed things incrementally. At this point, whoever still runs py2 has either been lazy, does not care about technical debt, or works in a business that does not value solving technical debt, so they cannot allocate time to do it. This is my opinion; I might be wrong, and maybe there are other reasons too. But the change had to be made for the health and long-term viability of the language.
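For readers following along, a minimal sketch of the three method kinds behind the 'self' point (the class and names are illustrative only):

```python
class Counter:
    instances = 0  # class-level state shared by all Counter objects

    def __init__(self):
        Counter.instances += 1
        self.value = 0

    def bump(self):          # instance method: receives the object as `self`
        self.value += 1
        return self.value

    @classmethod
    def count(cls):          # classmethod: receives the class as `cls`
        return cls.instances

    @staticmethod
    def describe():          # staticmethod: receives neither
        return "a simple counter"

c = Counter()
print(c.bump(), Counter.count(), Counter.describe())
```

The explicit first parameter is what makes the three kinds visually distinguishable at the definition site.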

>spaces vs. braces: I don't hate braces, but spaces makes it more readable

Really?? This is my least favorite thing about Python. D: It can't possibly be more readable when you're talking about large code blocks. It just happens that we're never supposed to be talking about large code blocks in Python. "If you want to write a lot of code, you use another language," they say -- probably one with braces, which syntax highlighters can match-highlight on. In Python, you just hope the indentation is far enough to make it obvious where something ends and another thing begins.

If large indentations bother you, good. Braces or not, large indentations and large code blocks should make you start wondering how to refactor them.

And if Python's design seems to discourage over-indentation and oversized code blocks, brilliant.

Nope. The indentation bothered me for an hour one afternoon in the spring of 2001, then it dawned on me what a master stroke it was. Code with ~25% less redundant bullshit.

Less is definitely more here.

This is really interesting to me because Ruby and Python are really close to each other, yet have completely different paradigms.

The syntax, dynamic nature (lack of privates, etc) are so similar yet one is a lot more flexible (Ruby) and another is a lot more explicit (Python).

Ruby is incredibly expressive; the language puts full power into the author's hands and lets them go wild.

There are always good arguments on both sides of the camp; both have mature web frameworks backed by massive communities. I don't think one is inherently better than the other; it's more a matter of style/mindset.

I personally love coding in both; in fact, today I wrote some Ruby code that would run some Python code and do stuff with it!

Developer fashion is moving towards more restrictive, explicit, static languages that reduce bugs. The traditional drawbacks and ceremony of that approach are being mitigated through inference and better tooling.

This is likely one reason Ruby is falling out of favor while Python is hanging on and still gaining.

> having to write 'self' on each class method

Um, "end"?


I assume you are >35; you grew up at a time when people thought about performance more than readability, because that was the bottleneck back then.

From the old fart: conflating performance conscious code with the lack of readability is plain wrong.

When you see performance factors beyond 1000 on some algorithms between Python/Ruby and C/C++, and you see Python being deployed everywhere, it's no wonder where all the Earth's energy is going. At least these are promoted by silicon vendors.

Edit: I cannot complain too much, though; it's generating comfortable revenue for me as a consultant. Still, having to deploy nginx/haproxy to load-balance Python servers for only 200 requests per second each makes me pray for our planet, but it grants me the money the customer saved on developers ;-)

I do some consulting as well. Same thing: all Python on the back-end servers, all young programmers, some calling themselves Pythonistas. Many actively support the green movement, to the point of wearing "green"-labeled clothes. I loved their reaction when I showed them a table outlining the energy consumption of Python vs C (a 75x difference).

Most services are IO-bound, not CPU-bound. For the latter, there is Cython and C, as you mention.

I'm not sure how you can be annoyed by Python, but not by VB.

> I'm not sure how you can be annoyed by Python, but not by VB.

I think it might be that back when I worked with it I was young and did not know better. Nowadays I assume that if someone made me do something in VB6 it would be super painful.

> It was like a language I always knew that I never spoke before.

God that's well put

Agreed. Learning Python for side projects and hobby stuff. The zen is real. It’s making me dislike writing JavaScript at my day job.

Have you tried Ruby? It's the most "what you describe" language I've used.

I don't use it because Python's more popular and I'm a slave to fashion and critical mass, but it's so much easier and more expressive than even Python's syntax.

    for i in 0..5
I can't believe Python didn't adopt that syntax (it resembles Bash's {0..5} ranges)... I surmise it didn't because Python is actually something of the philosophical opposite of Ruby. In Python, there is one way to do things: you use the range() call, so we only use the range() call. In Ruby, the same thing can be done in many different ways. More expressive, although a little wild.

There's plenty of ways to do most things in Python. That's one of its strengths as much as anything, despite what the Zen says.

Also, I hate how discussions like this seem to always turn towards small syntactic differences rather than actual differences in functionality.

What do you do when you need to change the step? Unique syntax can fall down where a generic function call would handle it.
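For the record, Python's single call form already covers the step case, and itertools covers the open-ended one (stdlib only; a sketch of the point being made above):

```python
# range(stop), range(start, stop), range(start, stop, step) share one call form.
print(list(range(6)))          # counting 0 through 5
print(list(range(0, 6, 2)))    # every other value
print(list(range(5, -1, -1)))  # counting down

# For floating-point or open-ended steps, itertools fills the gap.
from itertools import count, islice
print(list(islice(count(0.0, 0.5), 4)))
```

One generic function call extends naturally where dedicated literal syntax would need new grammar.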

On the upside, it gave you motivation to double down on Python to make up for the lost time.

> obvious

Yes, coming from Perl that was a huge hit.

This article is a sign of class, well done Dropbox.

I too have to say Thank you Guido. Python 1.5 helped me a lot with several projects way back when. Then I got into Django later, and marvelled at how easy it was to use. Now I use Python (and other things) for AI.

I am sad about the motivation for Guido leaving the BDFL position, but I empathize and understand. I hope he has fair winds and following seas for whatever journey he sails next.

A language, its community and its development model are a reflection of its creator and his philosophy. So without ever meeting Guido, through the language and its community I can see he is very humble and cares about things besides solving computer science problems.

Maybe this is one of the reasons for the success of Python and its community, which is open, humble, friendly and inclusive.

In my startup we have been using Python as one of our primary development languages, and thanks to its community and Guido, we will continue to do so for a long time to come.

I recently watched a talk by the creator of the Elm language [1] [2], who is trying to create a community similar to Python's. I will try it, besides Lisp, since I like the ethos of Python-like communities and obviously PEP 20 [3].

From my whole team: thank you to Guido and the Python community.

[1] https://elm-lang.org/

[2] https://m.youtube.com/watch?v=uGlzRt-FYto

[3] https://www.python.org/dev/peps/pep-0020/

I'll also add "silly." A not-too-serious attitude from day one helps protect a community (IMO). When your language is named after a surrealist comedy troupe it kinda takes the wind out of the sails of moral outrage. Whimsy is surprisingly powerful.

Which is odd to me; I resisted using Python for a long time because of the rabid culture around it, and I tried, several times. They didn't get the nickname "Pythonistas" for nothing, nor in jest.

It is much, much better of late (I was thrust into a role a year ago where I now do 95% of my coding in Python). I'm not sure what's changed; I had assumed the "core" culturalists were diluted by the influx of ML people, but that is only an assumption.

That might have been an artifact of small sample sizes: I couldn’t think of any time in the last couple decades where I would have used “rabid” to describe Python culture as a whole.

> For the last several months, Guido and Sushma have been meeting once a week to talk about all things programming. But Sushma says the biggest takeaways were not just about how to do things, but how to become more confident and learn how to figure things out on your own.

The part about Guido personally mentoring women engineers is, despite his (obvious) technical contributions, the most impressive part about his Dropbox tenure to me. The article mentions him leading by example and it's inspiring to see someone of that stature and level of accomplishment willing to spend time with individuals. It's one thing to talk about fostering a welcoming community, it's quite another to offer up his time so generously when he could be doing "higher leverage" things with regards to diversity.

I was fortunate enough to work with Guido, albeit briefly, during my time at Dropbox. He was always a great co-worker. But you definitely didn't have to work directly with him to get to know him: because of his commute choices he was always at work when breakfast was being served (food, the true high point of working at Dropbox), and the intersection of Dropbox culture and Guido's personality meant it was easy to just go sit at the same table, have breakfast with him, and get to know him.

Thank you Guido! At the risk of sounding saccharine, Python has literally made life better for my colleagues and me. In the corporate, financial world where I’ve been working, people often see coding as someone else’s job. Python, and the community that has grown around it, have made programming accessible (socially acceptable?). It has saved months of work for us.

I hope you enjoy whatever you set yourself to next.

Indeed, new joiners in investment banks these days are increasingly expected to know Python at least -- and not just for technology roles, but junior traders and managers as well. For financial services it's the new VBA, but better.

I can only agree. A colleague and I are the only ones we know of using Python for HR (HRIS), but people are very interested when we tell them how we use it for data quality, automation and reporting. There will surely be more in the future.

Someone who worked at Dropbox shared an anecdote. Guido received an email from a recruiter along the lines of "You seem to have great experience in Python. How many years of experience do you have with Python?" To which he replied, "All of it!" I am not sure if it is a true story.

Haha, reminded me of a time when I had just quit from a company as head of engineering, and a couple of weeks later a new Jr. recruiter for said company sent me one of those generic emails telling me that she liked my LinkedIn profile, and how she would love to talk to me about an open developer position at the company.

I chuckled and forwarded the email to the then head of HR and the CEO. I am good friends with both, and I knew they would take it with humor.

It is difficult to find good recruiters.

I heard a story about when he used to work at Google, he would get pulled into interviews where the candidate was brave enough to write “Python expert” on their resumé. There’s definitely no one better to grill someone on their Python skills than the guy who invented it!

“When asked, I would give people my opinion that maintainable code is more important than clever code,” he said. “If I encountered clever code that was particularly cryptic, and I had to do some maintenance on it, I would probably rewrite it. So I led by example, and also by talking to other people.”

This is very sage advice.

That coincides with the following:

> Code is read many more times than it is written. Writing code costs something, but over time the cost of reading is often higher. Anyone who ever looks at a piece of code has to invest brain-power into figuring out what it does.[1]

This is also noted in PEP8:

> One of Guido's key insights is that code is read much more often than it is written. The guidelines provided here are intended to improve the readability of code and make it consistent across the wide spectrum of Python code. As PEP 20 says, "Readability counts".[2]

[1]: https://www.sandimetz.com/blog/2017/6/1/the-half-life-of-cod...

[2]: https://www.python.org/dev/peps/pep-0008/#a-foolish-consiste...

I’m finding that one of the biggest time sinks is code that looks like it’s doing something other than what it’s actually doing.

Code smells draw the eye when debugging. Your brain wants the problem to be in the code that’s clearly “wrong”. So smelly code anywhere near common code paths has a huge cost compared to smelly code in some leaf function in an obscure feature.

One of the wisdoms of XP, I think. The code you touch the most should be the sanest.

> Code is read many more times than it is written

I'm really curious how true this is. Partly because, if it really was read more than it is written, people would optimize to /write less/ of it - if you have to pick up everything you put on the floor ten times over, it will incentivize you to put fewer things on the floor in future. And people would say things about code which people say about writing, such as "you have to write for your audience", instead of the more common "it's either boolean readable or unreadable".

And partly because I suspect code is /skimmed over/ more than it is read; that is, people assume what it does, glance at its shape, then if there's no surprise triggered, move on.

I would be interested to know if anyone has studied what it means for code to be read, but my suspicions are that most code which works is almost never read, and most code which is read, is read because it doesn't work and therefore it will be self-selected to be poor code in some regard.

But then, I'm not a programmer working on a large codebase. Those of you who are, how much of it have you read in enough detail that you verified how it works - and no assuming that a method does what its name implies, or assuming that if it passes tests it must work, but actually verifying by studying it that it does what it should and that you understand it?

Who would have thought the best argument against the 80-character cap in PEP 8 would come from Guido himself?

Anyone who charitably reads what he writes.

> This is very sage advice.

I mean in real life (when you don't have Guido to do it for you), if people are writing code you can't understand then you need to tell them to fix it. And many people don't like being told they need to rewrite their code because you can't understand it, so it leads to conflict and eventually you often need to fire people.

So yes, it's good advice, but by cleaning up engineering debt you're often going to take on a lot of team debt.

There's a delicate balance that often lies in hiring process.

There are people who simply refuse to understand generics and/or stream APIs, which are to (most) programmers more readable than the same logic being drawn out in a loop.

Meanwhile some other programmer would just write code golf and brag about it.

At the end of the day, it should be about enforcing a coding style that defines what you can or can't do. Adding final to every variable that can be final, for instance, is a great start.

We have reached a middle ground in our organisation (not a tech company, so there is less code and a slower pace). We don't require rewrites, but simply recommend that next time the code be written in a more lenient style. The education aspect is a good lens through which to view the issue also, sometimes resistance to an unfamiliar style is due to a fixed mindset around a known set of language functionality. It could be an opportunity to learn something new which is genuinely useful.

You’d think it would be common sense when writing something that other people will read. We don’t see people writing books in acronyms or omitting words.

How come so many people try to use code to show how smart they are?

Honestly, if you're smart, it's just really fun to make things dense and logically concise. Think of it like a minor form of code golf.

My first CL at my first job out of college was writing a state machine at Google. I remember collapsing it into this dense, elegant representation by hinging on a couple of bits of state. It took me an extra half day, but I was really proud of the end result, and I remember being mildly put off by the request that I unroll it and make it explicit. Upon thinking about it for a bit, I decided it was entirely reasonable, and after a few more years at Google, I transformed into the kind of engineer insisting that every bit of code prioritize readability over everything else, in the absence of constraints like performance. Across the tech orgs and teams that I've run since, this has paid huge dividends. I'm writing a lot of tensorflow and C++ at a very fast-moving company working on a complex problem, and I don't know how anyone on my team would be productive at all without strictly prioritizing readability.

Being good at coding and having good engineering habits aren't the same skill, and the latter is about discipline and patience (and occasionally thinking like a dumber person), which are inherently less fun than unfettered technical challenges.

My best friend in college was a writer, pursuing a BA in Literature.

I’ve been thinking again lately about little bits I picked up via osmosis and my own early creative writing experiences. In fact I was just noting a couple weeks ago how refactoring resembles an exploratory writing exercise.

I think we need to embrace the creative writing similarities. Would we ever celebrate an author who published efficient, dense and cryptic text? Rarely.

One of the things we do want is someone who paints a clear picture in a few well chosen words. Another is to be inspired. Tricked or bored are not on that list.

Readability isn't just the inverse of code golf, but that's how it's often talked about. We want a clear picture in a few well chosen words, but writing padded with low effort filler is not clear, it's a drag. Whatever the answer is, it's not "shortest code possible" but it's also not "longest code possible". Given that, it seems quite plausible that the answer is "whatever you have learned to read is readable, and what you haven't, isn't".

We have ideas of a "grade 6 reading level", do we have such a scale for code? If not, why not?

Love the analogy. I'm a SE with an English degree. Maybe that is why I care so much (more than those around me) about editing code and finding the readability sweet spot.


> How come so many people try to use code to show how smart they are?

Writing clever code doesn't mean it's being done for that reason. As the article notes, Guido himself agreed that in the early stages of developing a piece of software, such as in an early stage startup, it probably makes sense to write clever code, because you can get it done faster and therefore iterate faster on improving the code to meet user needs. At this stage very few people are working on the code (often just one), so communicating with other developers is not a big issue.

The need for making it maintainable comes later, when the product is mature and many more people are working on the code, so the need for clear communication becomes much stronger.

We don't use code to show how smart we are. We just assume that other developers are as smart as we are.

> We don’t see people writing books in acronyms or omitting words.

Math books written for mathematicians do this all the time.

> We don't use code to show how smart we are. We just assume that other developers are as smart as we are.

I think it would be more accurate to say that we assume that other developers have the same amount of context about the problem as we do.

Going back and reading my own code from even a few months ago makes this very evident, as I frequently realize that I'd made various assumptions that required a much deeper level of understanding than I anticipated. The difficulty in providing a sufficient amount of context is readily apparent to me as I go back and fill in the gaps I unknowingly left.

Oh, I’ve definitely worked with people who really enjoy being clever.

And telling them their cleverness is hurting other people can be traumatic, for one or both parties.

I like being clever too. I’ve just sublimated that into more beneficial things like human factors.

> We just assume that other developers are as smart as we are.

I think that excuse falls over when you look at some old code you yourself wrote, and you find that you can't understand it.

That's presuming your smartness doesn't change much over time.

I didn't take the comment to mean that other developers can't understand clever code, rather that writing overly clever code just isn't the best approach when a more plain solution will do. Writing code that takes another developer an hour to read through and understand wastes a lot of time compared to code that can be read and understood at a glance.
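The contrast can be sketched with a made-up Python example (the `orders` shape and both function names are hypothetical, not from any real codebase); both functions return the same answer:

```python
def top_spenders_clever(orders, n=3):
    # Dense: a set comprehension, a negated-sum sort key, and a slice,
    # all in one expression. Correct, but it takes contemplation to verify.
    return sorted({o["user"] for o in orders},
                  key=lambda u: -sum(o["total"] for o in orders if o["user"] == u))[:n]

def top_spenders_plain(orders, n=3):
    # Plain: each step can be read at a glance while scanning.
    totals = {}
    for order in orders:
        totals[order["user"]] = totals.get(order["user"], 0) + order["total"]
    ranked = sorted(totals, key=totals.get, reverse=True)
    return ranked[:n]
```

The clever version is shorter, but the plain one is the version you can filter quickly when hunting for a bug.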

This is most important when scanning code.

When I’m trying to add functionality or fix a bug, I’m going to read dozens and dozens of functions to winnow down to a handful of candidates.

Simple code can be filtered quickly and cheaply. Clever code requires contemplation, which flushes your working memory.

Yes, I can understand your code. But I shouldn’t have to work for it.

The key difference being operational ownership and business continuity. If all mathematicians (and I say this as a mathematician and math lover) just up and had a heart attack, then the world would probably not notice (as a whole).

If a critical service is experiencing growth and built on cryptic stuff, then the world may notice if no one can maintain it and it falls over.

Sometimes it's not about showing how smart they are; sometimes developers just aren't clear about the code they are writing.

When I write code for a concept I don't understand completely, it tends to be complex and cryptic. After I take the time to understand the problem and the problem space better, I can go back and write simpler code.

You say that, but then there are more advanced writings, like scientific papers, which freely use domain-specific terms and/or maths. I'm not saying that's bad; it's just that, because I'm not at home with those terms and/or subjects, I won't invest the time in trying to decipher them. Also, unlike with code, I don't actually have to, of course.

I’ve ratcheted this up a notch with jr devs.

When are you having fun reading someone else’s code? You’re usually there trying to solve a problem. Your plans for the day have gotten away from you. You may even be having a Bad Day. And this is at least doubly true for failing tests.

Take pity on the person.

I've also begun to stress this in code reviews and mentoring interactions, but the question is: how do you present this idea in a way that "early in their career" developers (and not so early) can appreciate? Simply admonishing them to "keep it readable!" isn't effective in my experience.

An example that frequently occurs is decomposing a large method into smaller methods. Okay, there are now more lines of code (due to adding method declarations), but what was really gained? The gain to be had is identifying common use cases and factoring some of those methods out to utility classes that can be referenced from the same source file and others. This can be difficult to identify unless one learns to really look for these opportunities.
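A sketch of that tradeoff (hypothetical reporting code, all names invented for illustration): more lines after decomposition, but the helpers become reusable and the top-level function reads like an outline.

```python
def report(raw_rows):
    # Before: validation, aggregation, and formatting in one method.
    rows = [r for r in raw_rows if r.get("amount") is not None]
    total = sum(r["amount"] for r in rows)
    return f"{len(rows)} rows, total {total:.2f}"

def valid_rows(raw_rows):
    # Candidate for a shared utility: drop rows missing an amount.
    return [r for r in raw_rows if r.get("amount") is not None]

def total_amount(rows):
    return sum(r["amount"] for r in rows)

def format_summary(rows):
    return f"{len(rows)} rows, total {total_amount(rows):.2f}"

def report_decomposed(raw_rows):
    # After: each step has a name, and each helper can be reused elsewhere.
    return format_summary(valid_rows(raw_rows))
```

Both versions produce the same output; the difference only pays off once `valid_rows` or `total_amount` gets a second caller.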

The approach I've been taking recently is presenting the idea as there's always a minimal level of abstraction, and code should try to achieve it. I'm interested to hear what others think about this.

One of the things I love most about Python is the ease of decomposing functions. The ability to return an arbitrary number of objects at the same time makes it really easy.

Python not caring about the type of a parameter makes it easy to share code too.
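For instance (a toy sketch, with an invented function name), a helper can hand back several related results at once as a tuple, so splitting a big function apart doesn't require inventing a wrapper class:

```python
def summarize(numbers):
    """Compute several related results in one pass and return them together."""
    total = sum(numbers)
    count = len(numbers)
    mean = total / count if count else 0.0
    return total, count, mean  # tuple return: no wrapper class or out-params

# Unpacking at the call site keeps the decomposed pieces easy to wire together.
total, count, mean = summarize([2, 4, 6])
```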

> How come so many people try to use code to show how smart they are?

It's tricky, isn't it? The original quote was about "clever code" versus "maintainable code", but they are not orthogonal: it's possible to write code that's more maintainable (because it's harder to use incorrectly) at the cost of being more clever (because you need to learn more of the language or standard library to understand it). One person's "more maintainable" is going to be another person's "too clever" — you're going to get a bunch of different opinions on where the line lies. Not everybody is trying to show off their brains.

People enjoy doing something clever. It takes experience and careful reflection to realize the downsides of being overly clever. It takes maturity to not give in to the guilty pleasure. It can also stem from boredom or insecurity.

It's terrible. People use deliberate crypticness to flex on everyone who hasn't seen x subniche of programming style before. They designed it to be like this to be the gatekeeper at their organization.

That metaphor doesn't hold up. Literature is full of allegories, figures of speech, allusion, and other forms of imagery.

I agree that code should be as straightforward as possible, but I don't think books follow that rule.

Much of 18th century English-language literature was an attempt to show how smart the authors were...

This is a really good comparison and one I'm going to use going forward. Thank you.

I think this is what experience teaches. Sometimes I have the itch to write clever code but then more often than not it ends up being cryptic to other developers. If you really have to write clever code, wrap it up in a well defined interface and write a bunch of tests.

I've found it also becomes cryptic to your future self, who will have to figure out how to maintain it going forward.

The third option, of course, is to just comment your clever code to make it less cryptic.

Yes, you can have your cake and eat it too :)

Unless there’s a real and needed benefit, why complicate it?

I’ve seen clever code that was 300% faster, half the lines, completely unreadable, and called once per session, for a total time of 10ms in a workflow that was minutes long.

No amount of commenting could justify it.

The compiler doesn't care about comments, so in my experience comments quickly go out of date as others change the code and ignore comments.

There's nothing to say code can't be performant and legible. In fact I'm a bit confused what "clever" means here. Writing performant but illegible code does not take more cleverness than performant while legible.

It probably would have helped if he had added static typing from the get-go.

Interesting in the context of Dropbox, as they ship the client code as obfuscated and encrypted pyc files and a custom runtime that's pretty "clever": https://anvilventures.com/blog/looking-inside-the-box.html

Edit: Yes, I get the difference and the reasons they do this. The wordplay crossover (cryptic, clever, etc.) was notable to me, and reminded me of the link I posted above.

That has zero bearing on the point Guido was making. He is talking about development practices. The client code is not being sent to users for them to develop code and contribute to the Dropbox codebase.

We're not the intended audience for the code. We're just the consumers.

Just because the outcome of the code (i.e. obfuscated code) is "clever" doesn't mean that the code that does the obfuscation is clever

This is a cool read (and a shared sentiment!). I've always wondered what industry luminaries do day-to-day. And hasn't it been quite the draw for Dropbox, from a recruiting perspective, to say, "Oh, and Guido might want to ask you some questions about your code"?

I wonder what "retirement" means for Guido. The Python/software community could never hear from him again and he'd completely deserve the rest (and we the community would be immensely grateful for the work already poured in), but I get the sense that won't be the case. :D

There's a whole generation of programming language creators born in the early to mid 1950s. In no particular order: Guido van Rossum (Python), Bjarne Stroustrup (C++), James Gosling (Java), Rob Pike (Go), Larry Wall (Perl), Walter Bright (D).

Makes me wonder what kind of secret club they have going on...

Enjoy the retirement, Guido!

> Makes me wonder what kind of secret club they have going on

It's no secret; they lived through the time where the computing power was outstripping people's ability to express ideas harnessing that power, and the world was ripe for more expressive languages that leaned harder on the CPU to convert them into action.

And, that's still true. ;) New languages are being invented, some haven't caught on yet.

Another environmental factor is the end of Moore's Law. As single-core performance plateaus, we'll probably see more "bare-metal" native languages like Rust and Zig for applications that don't have the latency budget for higher-level dynamic languages with runtimes. We also see languages with good concurrency support like Erlang and Go become popular as applications need to become more multi-threaded to use multiple cores effectively.

A few others born in the 1950s that immediately come to mind: Martin Odersky (Scala), Guy Steele (Scheme), Robert Gentleman (R), John Ousterhout (Tcl), and (sort of) Simon Peyton-Jones (Haskell).

Wow, I didn't know that John Ousterhout created Tcl. Come to think of it, I never thought about who was behind it.

And of course Richard Stallman (Emacs Lisp) and Joe Armstrong (Erlang).

I can only dream of a panel or a TED talk with those computing sages! (I guess they are all still alive, right?)

This is pretty close! (Guido van Rossum, James Gosling, Larry Wall and Anders Hejlsberg) https://www.youtube.com/watch?v=csL8DLXGNlU

Wow, surprised to see this at a local meetup in Seattle. I'd have thought it would be the kind of event for a much bigger venue.

Thank you for creating an awesome lingua franca for a generation of programmers, engineers, and scientists, Guido! You'll never be forgotten. Love!

"I think at his core, Guido is a person who is trying to help make the world better in his own way. I think that was his philosophy when he started programming in Python and mentoring women is just another way that he contributes." -- Sushma Yadlapalli about Guido van Rossum

I'm travelling with poor cellular connection and it took me 10 minutes to load this page, while I could load and read content on HN perfectly fine. The text content loaded LAST. That website is terrible.

Welcome to the modern web, where you need to download megabytes of largely useless crap to display 1 kilobyte of text.

We don't just need ad blocking, we need crap blocking and text caching.

Agreed. I'm at a desktop with a 100 Mb connection and I had to reload it twice, and it still took about a minute to show itself.

Guido seems to have been incredibly good on the mentoring front, particularly with getting women into the community. I wonder if he'll continue doing that at all even after he goes into "retirement". It seems like a great asset to the community.

One of the things some of us learn with experience is that “clever code only the author can understand” is often a lie, too. I may be able to figure it out on the fly but code I haven’t touched in six months is less familiar than other people’s code that I touched a month ago.

When I leave hints in the code it’s enlightened self interest. It’s often as much for my own benefit as for others.

Off topic (sorry): Dropbox's overall typography is blatantly offensive to the eye.

It happens (no worries): We did see before; we do see forevermore.

( ~ might be putting the cart before the horse. ~ take it easy )

> When asked, I would give people my opinion that maintainable code is more important than clever code. If I encountered clever code that was particularly cryptic, and I had to do some maintenance on it, I would probably rewrite it.

It would be great to see some examples of this rewriting. I feel that we see a lot of examples of converting repetitive code to terse code, but fewer examples of "de-clevering" code.

If we judge leadership by the quality of followers, python's ecosystem says it all.

> The result of this is nearly four million lines of checked Python code, nearly 200,000 improved type definitions, and countless hours saved for engineers.

The more I code the more I appreciate statically typed languages.

When I learned/started coding, I liked dynamically typed languages. It was easier to get past the compiler; my code seemed to do something, and if there was anything wrong along the way, it would just continue to execute, making me feel that my code was at least doing something.

Now that I'm further into programming, I don't want to go back to duck-typed languages. I like it when the compiler gives me guarantees, along with exceptions being thrown at runtime when something goes wrong, crashing the program so I notice immediately that something is wrong.

I think language designers also recognize the benefits:

- PHP7 allows using strict types.

- Python has static type support

- JavaScript -> TypeScript
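Python's optional typing mentioned above is a small change on top of ordinary code: the annotations run as plain Python, and a checker like mypy verifies them separately. A minimal sketch (hypothetical function and data):

```python
from typing import Optional

def lookup_port(services: dict[str, int], name: str) -> Optional[int]:
    """Return the service's port, or None if it is not registered."""
    return services.get(name)

port = lookup_port({"web": 8080, "db": 5432}, "web")
# A call like lookup_port({"web": 8080}, 42) still runs at runtime, but
# mypy would reject it for passing an int where a str is expected.
```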

Disclaimer: I have used PHP and tried Ruby. Currently using C# and JavaScript. Looking forward to TypeScript.

Do dynamic-language people: 1. write more tests? 2. have more defective code? 3. write better code? 4. Or maybe there is just no correlation between code correctness and static/dynamic languages. I just wrote down MY feelings.

Dynamic language people also typically write _less_ code, at least when it comes to Python.

Expanding brain mode: TypeScript -> Elm

Thank you, Guido! I learned Python and have loved computer science ever since.

I first learned Python back in 1996, when there was just one Python forum where Guido answered any and all questions.

He was gracious and polite enough not to call me a dumbass for not realizing that tabs were a requirement, not an option.

That was also when I became a tab-convert.

Thank you Guido for this incredible tool that has helped so many others actualize their solution aspirations.

You may have meant this, but in case a newbie reads: Tabs are not a requirement, pep8 recommends 4 spaces per indent level.

I owe a lot to the Python language and the community. One thing that specifically stands out with Guido is humility. Smart, successful, and kind ... empathy for others - from the language design and its focus on readability, to helping newcomers - what an amazing role model! It is no surprise when you see that reflected in the Python community at large.

Python made programming fun for me again. Thanks, Guido.

The genius of Python is that it treads that fine line between allowing too little or too much control. Between being too strict or too liberal. Between trading speed for clarity and simplicity.

It is the ideal programming language for an increasing set of tasks.

The only really risky part of adopting it at enterprise scale is that it's flexible enough that a developer can trick themselves into thinking they can extend it and they end up falling into DSL hell (where you're technically still using Python, but you've bent it so far via its runtime-dynamism that static analysis tools can't help you and you're now writing both code and a toolchain to support writing code).

But you don't have to do that to yourself to use the language.

Usually any post about Python is about how shitty the language is and how one can't write anything larger than hello world in a dynamically typed language. This one is wholesome; I like it.

Thank you Guido for all you did for this community. While others have pointed out your contributions to the Python community, I have found inspiration in how you paved a path to the engineering culture all around us. In a time when career progression meant transitioning into management after a certain degree of tenure, your adamant insistence on staying technical and "just coding" has shaped my personal views and my own aspirations for my career. You will always be cherished.

After coding in Delphi, Java, C, and C++, Python was the first language that I really enjoyed coding in. When you enjoy a language, it's easy to start working on fun side projects that advance your skills, and you continue improving yourself. Even though I don't write much Python anymore, I still like and respect the language and would definitely recommend it to any newcomers to programming. Thanks for Python!

A more than deserved retirement. And I'm sure there will be plenty of free time coding as well.

Given the impact Python has had on my career I'm nothing but grateful for it.

Thank You

It's been some time since I programmed in Python, and it left a good impression, with some caveats. I remember being bothered by the lack of a strong type checker. I see that now there is mypy. Good to know. The other problem is the lack of a super fast runtime. A Python-to-C compiler would be great. I remember using Cython but didn't like it that much, even though it was much faster than normal Python.

Thank you Guido. Because of you and your work, incumbents have been toppled, careers have shifted or skyrocketed, and fortunes have been reaped. I hope retirement is as kind to Guido as Guido has been kind to us.

Also highly encourage people to read the PEPs, as well as mailing list archives and interesting bugs on bugs.python.org. The PSF highly encourages backwards compatibility so that history is all there and live :)

For those of us running ad-blockers, here is a version that is readable thanks to the Internet Archive: https://web.archive.org/web/20191030162322/https://blog.drop...

I'm running uBlock origin and I'm not sure what you're talking about, the original looks normal to me?

The article made it sound like Dropbox is no longer pure 100% Python. What other languages do they use now?

Go's the other best-supported infra language, Rust is used for things like block storage (Magic Pocket). There's some amount of Java and C++, but those are all considered legacy.

Yet, most application engineers are still writing Python on the backend and TypeScript for the frontend.

[I work at Dropbox]

They are heavily using Go.

My whole career was based around Python -- 16+ years of it. I used many other languages but Python was the most enjoyable and useful.

Thanks Guido!

Congrats Guido. But don't bullshit me about retirement - you will probably find something else to keep your mind busy :)

If he's anything like me he won't stay "retired" for long. The longest I lasted without anything coding-related to do is 9 months.

Why did this take so long to load?

Thank you, Guido!

Well deserved.

Did you notice this: "He has already put into motion the conversion of the Dropbox server code from Python 2 to Python 3." Even the company that hired Guido is still heavily dependent on Python 2, and it doesn't surprise me at all; my own employer still has a lot of it in internal tools.

I'm starting to long for a stack that doesn't constantly change. Just set the features and that's it. Security updates only after that. It feels like a constant grind keeping up with everything. Containers, clouds, programming languages, operating systems, frontend frameworks, transfer protocols - it seems like it takes so much effort to just build something and keep it going. That Python 2 is still around doesn't really surprise me. Many of the projects that were built upon it were built by non-developers (scientists, etc.) who have no strong reason to keep everything current.

Honestly: the Java environment is exceedingly good at that. Backwards incompatible changes are truly minimal, and only those that are strictly necessary are added.

And since Java 8 the language is pretty nice and offers good functional idioms.

Of course, major versions of libraries still change. But there's no match with the JS approach.

Yet we have multiple versions of Java at work (8 and 11) because some (pretty large) third-party components don't support the newer versions yet. Plus for things that do work there are a bunch of irritating warnings and whatnot.

It's not "exceedingly good" at it, it's just alright. Go has done much better for us in this area.

Go has the whole gomodules thing, which I’m not a huge fan of. Go with vendored dependencies sounds really stable.

Pretty sure generics are coming, and this is going to change best practices, and cause re-writes.

go mod vendor is what you are looking for then

I agreed until the disaster that was Jigsaw.

Maybe Jigsaw didn't impact you particularly hard, but you can't argue the JVM has a policy of minimal backwards incompatible changes given that debacle.

I work on Java at a really deep level, and I haven't had to interact with Jigsaw at all. What is it breaking for you?

A lot of things like older versions of Gradle (which at some point was a blocker to move to Java 9, but not now), Groovy (also fixed, but still has some warnings), which we use heavily for tests, several reflection-heavy frameworks (think dependency injection, annotation-processors etc), which to this day require some ugly hacks like --patch-module to "open" packages, JUnit tests (which also need the same hacks to work), even the location of resources loaded with class.getResource() had to be modified when modularizing a jar, so that resources go into the proper packages that are visible to the loader...

You seriously didn't suffer any of these?

No I didn’t experience any issues like that myself.

I’m a full-time Scala developer, and no move to the LTS releases of 9+ has taken me longer than a day.

It’s probably pretty helpful that the scala ecosystem doesn’t depend much on non-well trodden parts of the JDK

The downside is that also means a lot of the core APIs are a huge pain in the ass to work with, especially relative to something like Python, and the “solution” to this problem ends up being something like Spring which has to use so much reflection and metaprogramming to accomplish what it does that it also relies heavily on exceptions for control flow. So without even getting into the other issues, debugging becomes a lot more complicated than it would be otherwise.

Personally I would second Go as a good answer for this. It has its limitations, but its simplicity and its standard library more than make up for that from a pragmatic standpoint, imo.

Either you have a stable platform, and it reflects the state of the art from the year of its inception, or you have a platform that keeps on improving, but this means you need to change your code along with it to keep up with the improvements.

You can't have it both ways.

OTOH current JVM can still correctly run bytecode compiled by Java 1.0.1 (or maybe even earlier), so the backwards compatibility is indeed excellent.

Not necessarily. Java currently ships two date libraries, for example: java.util.Date and java.time.*. There are some duplicating classes like Vector from the old times and ArrayList from new times. There are two I/O frameworks: old I/O (File, FileInputStream, etc) and nio (Path, FileSystem, etc). While I don't like the particular way Java did that, basically you just have to ship all old versions and keep them working and ship new versions with some adapters to ease migration.

> Either you have a stable platform, and it reflects the state of the art form the year of its inception, or you have a platform that keeps on improving, but this means you need to change your code along with it, to keep up with the improvements.

> You can't have it both ways,

You can have a continuously improving platform with full backward compatibility, and all the improvements that aren't just efficiency of established operations are opt-in.

I am not a golang programmer, but I would love to revisit this conversation with golang twenty years from now. Please correct me if I am wrong, but I believe golang is essentially complete, right? And golang programmers like golang?

> Please correct me if I am wrong but I believe golang is essentially complete, right

I'm confident every programming language thought it was complete at some point... but eventually its users demand some new "hot" feature and then you're back to releasing new versions again.

The C programming language just had a release in June 2018, C18. For a language released 47 years ago... and it's a pretty basic language compared to most other languages.

Fortran was released 62 years ago, and just released Fortran 2018 in November of 2018.

Go is not complete. But it does not evolve by revolution as of now.

Complete is hard to define, no language stops evolving. But Go language is perfectly suitable for large projects, and is pretty much my Go-to nowadays. (Pun intended)

Java 8+ is so easy to work with I think it bores a lot of people so they go looking for something more. Spring boot + jooq is easy to work with and breaking changes are slow.

I'm also coming around on Go. The simplicity was a turn off at first, but now that I've used it to implement a graphql server I like the simplicity. I don't want to have to be a programming language researcher to quickly get work done. IMO Go delivers in that mission.

Java is a paid platform without clear pricing, licensing, or patent terms.

Many (most) JDK distributions are free for all usages and fully open source.

Can you give some references for the patents issue you mentioned?

So you agree pricing and licensing are not clear? If still not, here is the source: https://upperedge.com/knowledge-center/documents/oracle-podc... As for the patents, I don't have a source quickly available. Pricing and licensing are already bad enough. If I sell Java-based software, customers have to worry about licensing from Oracle. If I sell SaaS, I have to worry about what and how to pay. Those factors are big risks.

If you're using something like Spring or Spring-Boot (highly recommended!!!) - you rarely, if ever, are going to be debugging Spring's code... any issues are going to be in your code.

With Spring-Boot you can even start treating Spring and friends like a "Black Box", and stop caring about how it does what it does.

Could not agree more with the Spring Boot recommendation! It's an amazing toolbox; combine it with streams and functional interfaces, and programming in Java is pleasant again.

Personally I loathe black-box systems, regardless of how well the docs are written. I want to know how they do things... even if only to satisfy my curiosity.

It's not blackbox in that you can't look inside... it's blackbox in that you can choose to ignore it and be just fine.

Springboot takes an opinionated approach to a lot of things, but always lets you override it and do whatever you want wherever you want.

I suppose that's more of a greybox...

Ah, yes! Point taken.

You can use Spring without all that magic. I, personally, hate Spring Boot and would never use it, exactly because of all that magic. But Spring itself is perfectly fine and I can control every single bit of it with explicit configurations.

For us, springboot gave about 90% of what we wanted out of the box. The 10% that we wanted customized (special authentication related things or transparent multitenant database switching with repositories etc), it let us override and do what we wanted it to do. It certainly is a lot of magic, but nothing that cannot be deduced if you are familiar with Spring, and/or look at the source.

I used to be opposed to all the "magic"... but after you write a few MVC projects with Hibernate backends, etc... why keep having to do that all from scratch each time?

You end up making your own framework that has all that in it, so that you just change a few things here and there and can start getting to the business logic that much quicker.

Then you realize you're maintaining all of that code... when you'd much rather maintain the business logic bits...

Which leads to using Spring Boot and letting them maintain everything else.

> Honestly: the Java environment is exceedingly good at that.

I would argue that the .NET Framework is way better. The way you can have multiple versions installed side by side and have the framework pick the correct one to run is awesome. Granted, you can do something similar with the JVM by mucking around with env vars, but it's not the same thing.

Well, .Net Framework is analogous to Python 2.7. It will never receive anything but compatibility, security, and bug fixes going forward according to MS.

The Future is .Net Core.

Just for the sake of redundancy:

Python 2.7 is not getting any kind of updates, including security and bug fixes.

Not really. The newer open versions still don't match Oracle JDK 8. Wanna bundle a JVM with your app so your users don't have to worry about the Java runtime? Well, you can't do that anymore. Wanna use JavaFX? You have to jump to Java 11 and hope your dependencies don't fail with module packaging errors. Do you use Scala? GLHF.

Standing still in the Java ecosystem is fine. Keeping on with the advances is painful and exhausting. The Java tech stack is awesome when it works, but to solve issues you have to go so deep into the tech stack that I'm starting to think it just can't be done unless you're at least a mid-sized company with JVM specialists on the payroll. Not developers, but systems people.

> Wanna bundle a JVM with your app so your users don't have to worry about the Java runtime?

Why would that be the case?

> Wanna use JavaFX? You have to jump to Java 11 and hope your dependencies don't fail with module packaging errors.

Dependencies which still fail with module errors are basically abandoned and should probably be replaced. But even for those there's an escape hatch if you really want to continue using them, so that seems to be a non-issue?

> The Java tech stack is awesome when it works, but to solve issues you have to go so deep into the tech stack that I'm starting to think it just can't be done unless you're at least a mid-sized company with JVM specialists on the payroll. Not developers, but systems people.

I've worked either directly or indirectly with so many companies of all sizes using Java without any dedicated JVM specialists I'm pretty sure you're doing something very, very wrong with all the problems you seem to have.

> Why would that be the case?

I have a desktop Java app with platform-specific installers (.msi, .deb, .dmg). The Windows and Mac installers include a Java runtime, so the user doesn't have to worry about installing an external runtime. I'm forced to use Oracle JVM, because it's the only Java distribution that includes the tools needed. There's ongoing development on a tool that should bring these and more related features to the open JVMs, but it's not done yet.

> Dependencies which still fail with module errors are basically abandoned and should probably be replaced. But even for those there's an escape hatch if you really want to continue using them, so that seems to be a non-issue?

The issue is that the upgrade path isn't smooth. I have to upgrade everything at the same time, because what works on Java 11 doesn't work in Java 8, and vice versa. It makes everything way harder than it would be if there were a clear migration path.

> I've worked either directly or indirectly with so many companies of all sizes using Java without any dedicated JVM specialists I'm pretty sure you're doing something very, very wrong with all the problems you seem to have.

What I do falls into a niche. It's not "very wrong" but it's uncommon. The main issue is that Java was a non-issue, a stable platform to build upon. But the recent Oracle license change forced us to move to fully open versions of Java that do not have the same features, do not bundle the same libraries, and do not have documentation detailing all these differences.

I'm not asking for a community to finish whatever feature I need, or for a company to make what I need free to use. I'm just trying to explain my own problems caused by Oracle forcing us to move to open distributions of the Java platform.

> The newer open versions still don't match Oracle JDK 8

I'm currently working on a web service that was written under Java 8.

For giggles, I just ran the end-to-end tests suite under Java 8, 11, and 13. The only error I got was an incompatibility with Google's Error Prone linter, which I was able to fix with one line change.

This represents a few hundred thousand lines of production code that works just fine under every important JDK in use today.

The problem isn't usually with the runtime itself, but with all the ecosystem and tools around it. This becomes very apparent when venturing beyond web services and you start depending on those tools.

> The newer open versions still don't match Oracle JDK 8.

Oracle JDK 8 contained some small parts that were proprietary. And they mostly had the proper package name. You took the risk.

> Wanna bundle a JVM with your app so your users don't have to worry about the Java runtime? Well, you can't do that anymore.

You mean, bundling a JVM exactly like all JetBrains IDEs currently do?

> Wanna use JavaFX? You have to jump to Java 11 and hope your dependencies don't fail with module packaging errors.

JavaFX is a bit of a strange thing, and yes, it can be painful. And some modules struggle with the new module system. The module system was a breaking change, and in fact it took quite a while for the ecosystem to catch up.

> Scala

That's a problem with the Scala compiler AFAIK, not with Java.

> Oracle JDK 8 contained some small parts that were proprietary. And they mostly had the proper package name. You took the risk.

I did not. Some libraries I depend on did.

> You mean, bundling a JVM exactly like all JetBrains IDEs currently do?

Yes. I'm researching how they do it. Previously it could be done with one command. JetBrains use their own patched JVM and have JVM experts on their payroll. What they do goes way beyond my current skills.

> That's a problem with the Scala compiler AFAIK, not with Java.

From a JVM point of view, Scala is just one more library. The problem with Scala right now is that it doesn't support the module system. This means yet another dive at a lower level to try to fix any issue that appears.

That's a lot of FUD spewing.

It's my own personal and professional experience. Yours may differ, but please don't call it FUD.

How are people adapting to Oracle’s licensing changes?

The biggest change I've seen is that people are using OpenJDK itself (rather than Oracle's official distribution) a lot more, not only from the official OpenJDK distribution, but also the new Amazon's Corretto version, Azul's Zulu (the one I use at work, which has been backporting lots of security improvements to Java 8, which is very useful!), RedHat's OpenJDK, and even AdoptOpenJDK, all of which are completely free and open source...

If you want to quickly and easily switch between distributions, I highly recommend https://sdkman.io/ - it lets you install and use any of the above mentioned JVMs with a simple CLI... e.g. `sdk use java 13.0.1.j9-adpt` or `sdk use java 8.0.232-zulu`.

I would guess that Oracle's JVM popularity has gone down from around 75% of Java users, to something like 20% after the changes, even though I don't actually have the numbers.

It's not really that big a deal in my experience. We switched to the Corretto[1] distribution of the OpenJDK from Amazon and it was seamless.

If you want newer versions of the JDK, you can get them directly from Oracle[2] still, or from alternatives like AdoptOpenJDK[3].

[1]: https://aws.amazon.com/corretto/

[2]: https://jdk.java.net/13/

[3]: https://adoptopenjdk.net

Except, you know, fundamental stuff like how many cores does the JRE think it is running on.

The dance over that as we've gone from 8 to 9 to 10 to 11 is mind boggling.

As I get older and crankier I find simplicity and stability become a lot more interesting than reinventing the same wheels over and over again.

This is probably why I’ve come to appreciate Rich Hickey’s talks more and more as years go by.

I think Clojure is a good fit when it comes to stability as per their own development guidelines [1]:

The Clojure development team values a measured and thoughtful approach to language evolution with a strong emphasis on maintaining backward compatibility.

[1] https://clojure.org/dev/dev

To anyone who's interested in this topic, I strongly recommend this talk from Rich Hickey about "giving something to somebody that they can use and making a commitment": Spec-ulation (https://www.youtube.com/watch?v=oyLBGkS5ICk)

Go has a compatibility promise from 2012 that they've upheld for now seven years. They are still discussing whether to change something in Go 2.


> They are still discussing whether to change something in a backwards-incompatible way in Go 2.


Fortran from the 1980s is still supported...

Fortran from the 1960s is still supported. Almost all modern Fortran compilers support syntax that was deprecated decades ago.

Hell, I fixed something in Turing last week. My eyes are still bleeding.

You misspelled the 1960s.

And yet almost no one chooses to write new Fortran code for applications. We rely on a few ancient critical widely used libraries like BLAS, accessed from non-Fortran programs.

This is not true, there are niches where Fortran is still dominant for new code.

Also for BLAS, the underlying code is usually no longer Fortran.

I came to say this. I learned Go in the early years and have come back after not touching it in years and it all still sounds eerily familiar and a lot of the basics I learned are still relevant to what I do.

Go is good at being Pythonic in some senses too. Some would say to a fault.

They probably do not want to end up with a Perl 6 situation. I meant Raku, the language formerly known as Perl 6.

Use C.

Standardized since the 80s, it has had only minor, backwards compatible revisions in 99 and 11.

According to the TIOBE index, it's the second most popular language in the world. It works on almost any platform imaginable and it has decades of tooling available for it.

Of course it doesn't have any of the good new things developed recently, but if it's stability what you look for, then I don't see a better option.

Or Perl. Perl takes backwards compatibility very seriously. There was a lot of gnashing of teeth a while back over whether to make newer versions finally change the default behavior, so some new features could be opt-out instead of opt-in. And by new features, I mean stuff like "use strict", which has been best practice for a couple of decades now, but the change would cause problems for existing scripts that didn't turn that feature on.

The rule has generally been that you can take a 20 year old Perl script and it will still just run. This has its own complications though, as it makes it harder to advance the language, and then people feel it's being left behind (the Perl 6 stuff came after that feeling developed).

It's a longstanding problem in language design and communities, stability vs advancement.

Unless it uses CPAN stuff, or calls out to external tools. Those are often needed for key functionality, and they get stale much faster than perl itself.

Lots of CPAN stuff is just plain Perl code, so the, "it just works" keeps working, too. It's to the point where a lot of "competing" Perl modules that try to solve the same problem have similar/same API's (but under the hood, rely on different things) - sometimes that is to solve the, "Well, the more popular choice went off and did something interesting in $VERSION+1, but we like the old way"

Also, if you desperately need an older version, you can usually install that older version rather than the newest.

Anyways, there's things you can do. I still work on code I started in my dorm room (in the 90's). Cranky CPAN modules aren't too big of a problem.

The old versions of modules are available, if you don't happen to have the exact same code present.

Core modules are all I would worry about, but I'm pretty sure those are held to the same standard as far as backwards compatibility.

One or two modules might have been ejected from core or moved into core, but those should be easily found on CPAN as well.

In the end, getting something working from scratch on a newer Perl that has a lot of CPAN dependencies (as opposed to just updating Perl) might be a little annoying in that you have to track down all the module versions, but it's far from impossible, or even all that hard.

For example, the CGI module[1], which was a core module but historically seen as fairly bad (at least by today's standards; in fairness, it supported a wide set of use cases), was removed from core and housed on CPAN. If you follow the link provided and check the past versions (dropdown as part of the module path and name), you'll see there are many versions shown, possibly every public version, going back to 1998 for CPAN and 1995 for the BackPAN. Each of those is visible as an item with its documentation at the time and the module available to download and use. You can also access the CPAN testing matrix[2], and if you dig around you can actually find the results for some tests back in 1999.[3]

If I was responsible for getting some old Perl app to run on a more modern system, I would by much more worried about the OS changing in a complex way than I would about getting the Perl code to run again as expected. Which is to say, I wouldn't be worried much at all.

1: https://metacpan.org/release/CGI

2: http://matrix.cpantesters.org/?dist=CGI+4.44

3: http://matrix.cpantesters.org/?dist=CGI%202.55

SaaS modules too, a lot of services have their API modules and examples provided in various languages. Perl is often not one of them. In some cases where it is, it's out of date or plain old broken. As an example (which is understandable, no hard feelings) I had to edit the SendGrid modules to fix a bug when I tried, and there were issues on GitHub that have been ignored since before it became officially unsupported in 2016.

I usually use Python these days when dealing with SaaS apps since the ecosystem is much better, and I want to learn it anyway.

To me, C is a tale of the perils of maintaining backwards compatibility: there are lots of parts of C, and particularly its standard library, that should never be used because there's no way to do so safely, but can never be removed and so inevitably new code will continue to be written that uses them.

What is there? I'm thinking of gets() which was actually removed from the standard. Other functions, you can absolutely use them correctly, even if it's not a great idea. I don't think a large percentage of C bugs comes from the standard library.

If you depend on some functions in the standard library, it's super-easy to replace them. They're leaf dependencies.

Anyway, C has about the smallest standard library you can have. And if we're talking about its benefits in terms of compatibility, we're probably talking about the fact that it doesn't introduce new syntax / language features every other week.

Elixir 1.9 came out. No future core language changes are planned. There are no intentions of creating an Elixir 2.0. There will still of course be regular improvements and updates, but the core language is basically set. Read about it here:


If Elixir wasn’t an option for me, Clojure would be my next option. Clojure seems like it has a similar philosophy, and builds on a dynamic immutable core.

That being said I like writing code that can feel “done”. No “I wish I could refactor this object hierarchy to make it easier to use, etc” notions in the back of my mind. Not every piece of code is like that, but enough.

Surprisingly enough, PHP is a good example of environment that doesn't break backward compatibility much.

About the only change that had a significant impact on projects I worked on was removal of the 'mysql' DB extension (replaced by 'mysqli' and 'PDO'). For an older project I created a few shim functions and it still works just fine.

While frameworks come and go, the core of the language is highly backward compatible. The choices made early on were in the UNIX style, so they were right.

To quote an earlier post[1],


There is something to be said about PHP's staying power. The early versions of the language weren't considered "right". Except maybe they were right for the problem at hand.

The way I see it was PHP captured the vector of change, and left ample room for future developments. Both internal changes (hello, bytecode; hello, JIT), language level features (hi there, namespaces), and runtime level features (oh hai, countless built in classes and functions).

Moreover, unlike many frameworks that shone brightly and burned out quickly (Rails?), PHP captured the essence of the environment: HTTP is stateless, URLs aren't uniformly routable from either end, and well-formed HTML/XML/JSON is just a subset of tag soup.

Worse is better.


[1] https://news.ycombinator.com/item?id=19536132

If your timeframe goes back to PHP4, this most certainly isn't true. Just one glance of https://www.php.net/manual/en/reserved.keywords.php should prove this - every new keyword is a backwards incompatible change. And I mention this because I used to support software that used, as a variable name, something which became a reserved word...

as a variable name, wouldn't it have been prefixed with $ ?

the XML parsing was probably the biggest BC change I hit between 4 and 5.

I think I'm misremembering and they were method names. I believe it was clone, in fact.

Can't relate with that... I'm still hosting some WordPress websites for clients from a few years ago. I'm afraid to upgrade PHP versions because many things broke in the past. True, it's not the WordPress core but the plugins that usually break, but I can't think of a single client of mine that doesn't have at least 3 plugins installed.

Ditto. I just did a project upgrading hosted websites to the latest PHP from PHP 5.6. These are already sites that vetted past the obvious issues of the mysql and mcrypt APIs going away, and these sites were all maintained in at least the last 4 years (I host sites that haven't been touched for over a decade, they aren't getting upgrades).

Even with these "cherry picked sites", we had about a 30% failure rate dealing with the new PHP. Ranging from total non-function of some random CMS, to errors printing all over every page due to deprecated or changed code behavior in the site templates.

Nowhere on the level of Python 2 to 3 but every minor version upgrade of PHP breaks Drupal core (and some contrib as well) one way or another. It's OK, we love the new features :)

Try Common Lisp. The language has an ANSI standard, published in 1994. So the base has been stable and fixed for the past 25 years. A lot of Lisp code that old or even older works perfectly fine on a current CL implementation.

Not only that, the "Common" in Common Lisp is a reference to a standardization of several existing dialects, so a lot of Lisp code from the 1960s and the 1970s will run without any modifications.

Lots of people say Java but of the more fancy and convenient (and correct, cough I’m obviously biased cough) ways to do development, I can’t stress enough how amazing it has been to watch release after release of Clojure consisting of mostly trivial changes or library additions.

The language is arguably complete and any syntactic changes you could wish upon yourself are doable in form of a library because it’s a lisp.

And all that piggybacks on top of already quite stable java ecosystem.

Can’t get much better than that.

The world keeps moving, if you don't want to move with it then pick something that has already stopped.

Maybe C? Cobol? Ada? Fortran? Though even those languages have modern flavors & updates...

We make progress because we learn, not just new technology, but new approaches to old problems. When I see an "old" programmer who keeps writing C89 code because "it still works", it is painful to see them rejecting decades of learnings about how to write better software.

> I'm starting to long for a stack that doesn't constantly change.

I looked up "engraving marble" to find an example of the above, but no luck there: They use lasers for that now.

I think Python 2 had hit the perfect balance of "just enough but not too much" which I'm very deeply worried that Python 3 is going to deviate from.

Python 3 is barely different in my experience and it's been about 10 years at this point.

I do enjoy some of the "new" features of python 3.

I used `async` as the name of a function decorator in some Python 3.4 code, which then broke in 3.7, which made `async` a reserved keyword.

What was the level of effort to remediate that?

Probably around 4-6 hours of work all told.

We had to come up with a new name, do a find/replace, review the changes, test, fix a couple of issues because the replace wasn't perfect/complete, update the dev documentation, make a new patch release, and then create packages for the various supported platforms. Oh and looking back over it now, the filename had to be changed too, because apparently you can't use a keyword as a module name in Python either. So where the documentation linked directly to that file, that had to be updated as well.

[1] https://github.com/saulpw/visidata/issues/164

[2] https://github.com/saulpw/visidata/releases/tag/v1.2.1
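The breakage described above is easy to reproduce. This sketch (decorator name and rename are illustrative, not from the linked project) shows code that compiled on Python 3.5/3.6, where `async` was only a soft keyword, failing once `async` became fully reserved:

```python
# Legal on Python <= 3.6, a SyntaxError on Python >= 3.7,
# where `async` is a reserved keyword.
old_code = """
def async(func):              # decorator named `async`
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper
"""

try:
    compile(old_code, "<legacy>", "exec")
    print("compiled")         # what you'd see on Python <= 3.6
except SyntaxError:
    print("SyntaxError")      # what you see on Python >= 3.7

# The remediation was a rename, e.g. `async` -> `asyncthread`:
fixed_code = old_code.replace("def async(", "def asyncthread(")
compile(fixed_code, "<fixed>", "exec")   # compiles on any Python 3
```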

sed 's/async/whatever/g' ...

Some of the syntax changes are barely different (print becoming a function instead of a statement, good) but a lot of new things are added that I would consider to be syntactic sugar (the walrus operator, for example) that threaten to detract from the two biggest things that drew me to Python: clean syntax and obvious understandability.
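For context, the walrus operator (Python 3.8+) makes assignment an expression. A minimal before/after sketch:

```python
data = [1, 2, 3, 4]

# Traditional two-line version:
n = len(data)
if n > 3:
    print(f"long list: {n}")    # prints "long list: 4"

# With the walrus operator - more compact, arguably less obvious:
if (m := len(data)) > 3:
    print(f"long list: {m}")    # prints "long list: 4"
```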

Python allows imperative code with side effects in imported modules, as well as the "else" statement for "for" loops, so Python has never guaranteed obvious understandability.
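The for/else construct mentioned above is a good example of the surprise: the `else` block runs only when the loop finishes without hitting `break`, which is easy to misread as "runs when the loop body never ran". A small illustration (function name is just for the example):

```python
def find(needle, haystack):
    for item in haystack:
        if item == needle:
            print("found")
            break
    else:
        # Runs only if the loop completed without a `break`.
        print("not found")

find(2, [1, 2, 3])   # prints "found"
find(9, [1, 2, 3])   # prints "not found"
```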

Sure there are plenty of things that aren't perfect in Python, but I think it was the best of all viable candidates that came closest to that goal.

Not very different, but fallout from the Unicode switch is quite a headache.

I generally run with LANG=C, and run into Python3 issues regularly as a result.

Also, if you have to support 'async' code that someone wrote, that could be a heavy load as well.

> I'm starting to long for a stack that doesn't constantly change

Come on over to Javaland, it's nice, warm, and the sun is always shining!

With the exception of the Java 8 -> Java 11 transition (skipping the non-LTS versions), things are pretty stable. You don't have to use any new-fangled features if you don't want... and your Java 3 code compiled nearly 20 years ago will still run just fine on modern JVMs.

> Come on over to Javaland, it's nice, warm, and the sun is always shining!

My oracle predicts much worse weather.

> My oracle

Funny joke, but Oracle is all but out of the equation at this point.

They've stepped back and become yet another JVM vendor that takes OpenJDK source and compiles it into a binary for users, sprinkled with some special sauce here and there, and just happens to charge for support.

That puts them in line with Azul, IBM, Amazon, AdoptOpenJDK Project, SAP, Red Hat, and a ton more.

You get your pick of JVM vendors now... That's a fantastic thing for Java.

Python 2 (the OP topic) from 2000 still works too.

Running code is the easy part. Writing it is the hard part.

And if running on a VM is your standard, my old DOS programs still run on DOSBox :-)

Clojure has had only a single backward breaking change since its creation in 2008, it took 10 full years before being introduced, and it was a pretty minor one at that.

Java and the JVM themselves value backward compatibility, and I'd say Clojure takes it a notch above that.

I guess I have missed that. What was the change?

When they added clojure.spec. It started validating macros at compile time. If you were previously using an illegal syntax, but that somehow didn't cause any issues, those would now throw a compile time error.

Overall it was for the best, since prior to that it was just weird code with a bit of undefined behavior, since it wasn't using the macro in the proper way. But it still meant we had to go and fix a few of them so they compiled again. Though in the process we realized that we were misusing the macros to begin with.

One day tech historians are gonna look back at this era and shake their heads at all the time, energy, and talent being wasted by people just trying to keep up with tech trends for the sheer purpose of remaining employable.

I don't think it's that bad. I've never heard of someone who got fired because they didn't know the latest JS framework.

Also, new tech trends are not just hot air, there's also a lot of innovation happening. You just have to see through the hype and wait until the dust settles.

How many haven't been hired because they didn't know the latest JS framework?

Reinventing wheels isn't innovation, particularly if the reinvention isn't round.

> I've never heard of someone who got fired because they didn't know the latest JS framework.

Sure, but I've worked with a number of individuals who have either resigned (on their own), or have been let go, solely because they had no desire or interest in keeping up with their craft.

I think that's effectively the same thing, though.

Ultimately, the notion of not wanting to stay abreast of the current industry trends is neither something I can relate to, nor is it something I see as being compatible with one's employability.

If we are talking client side JS I think they're just trying to solve the problems that desktop had solved ages ago. And they keep smashing heads in a process.

There is also the situation that someone gets hired for their tech skills, but the thing that keeps them in a job is their ability to pick up the domain. The tech choices are often point in time decisions which are irrelevant to any outcome. The domain knowledge builds is worth more than any framework knowledge.

If you are successful (i.e. the business or product keeps being developed), you still need to be aware of the internal weaknesses of your original choices and the external opportunities of updating to something more modern. The challenge is in managing the transition, which is a skill in itself.

My general rule is that I don't even think about learning a new technology until at least two years after I first heard about it.

I assume that I hear about new stuff fairly early in their adoption, since I try to keep up with these things, so if it's still a thing two years later, it might have enough staying power to be worth learning. After that it still takes a while for me to get around actually learning the things.

Go and Rust are languages that have been on my todo list for a while now, for example, but I still haven't touched a JS framework because I haven't seen one with staying power yet.

Do you think React can be going away any time soon?

Java is fairly stable I've found.

Just get away from JavaScript, and aside from once-in-20-years platform language changes like Swift, most platforms don't have that much churn.

Delphi still lets you build programs that used to run on Windows 3 with minimal changes, and definitely anything 32-bit.

It's part of the value proposition: the language and libraries are planned for forward compatibility and stability for use ten years in the future, as well as now.

Actually there were some problems (very long procedures, I think, would be one example). Other than that, compatibility shines.

To save others the same DDG search I did -> Delphi is a Pascal language IDE that compiles to native Windows binaries. Apparently there is this thing called Lazarus that is a free version of it?

FreePascal is the language/compiler and Lazarus is an IDE. The Lazarus IDE runs on all major platforms and produces native code. Suitable for anything from system code up to complex GUI apps.

You want to find a stack that's small / concise / organized enough that you could support it if you needed to.

It's ideal if it has a stable community, but it should be a small group of core devs, so they don't have time to change things that aren't broken. It helps if the community has a culture of stability.

In my experience, FreeBSD and Erlang fit this bill. You may need to manage some small changes now and again when upgrading between releases, but generally the most apparent changes are things you were doing before work better. And you can run a fleet with rather divergent versions, as long as you're careful not to use new features (it helps to make sure your dev environment matches your most out of date hosts)

Have you tried Fortran?

C also shows excellent stability and backwards compatibility.

Cobol is also quite stable, but IDK about the quality of open-source tools for it.

The problem of stability is that you are pinned to the moment when the interfaces were defined and committed to, and this moment goes further and further into the past. Quite often this means that you can't use the new things which give advantage to your competitors.

Then how do you improve?

Currently there is no all-encompassing theory for the design of languages or anything. So there's no provable way to design a language that can always be improved and always be backward compatible.

I imagine that such a language would be incredibly simple. Minimal syntax and all sugar and conveniences would be built with libraries.

I feel like pure Ruby meets this. I've been using Sinatra for years, it's been basically the same that entire time, and the language feels pretty stable in my usage (some nice added things, like stabby syntax and `.then`, but those are really just sugar).

Rails is so-so.

> I'm starting to long for a stack that doesn't constantly change. Just set the features and that's it.

I hear every day: don't change a thing in D, just add these two features.

I think the only option there is C. It hasn't changed much for a long time, for better or worse.

I think you are looking for COBOL.

Python 2 is 19 years old.

Python 3 is 11 years old.

11 years is a long time to married to the bugs of the past.

20 years is a long time to be married to a fixed language.


Sounds like mathematics.

Can't you just ignore those comments?

I, for one, would very much enjoy if posts about Python on HN didn't -- without exception -- contain threads about py2 vs py3.

We get it, some people are unhappy that py2 is deprecated. But whatever sympathy I might hold for that viewpoint is drowned out by the incessant and unproductive whining I keep seeing here.

Can we please move on now?

I recently went through this process for the Python portions of our large central codebase. It took a few months but most of that was due to coordinating delivery and careful backwards-compatible rollout to multiple streams and licensees, guiding licensees through upgrading their own scripts, etc. Maintaining backwards compatibility wasn't as painful as I expected, but I'm happy that we got to deprecate P2 support on schedule - constantly testing both for every change review gets a little tedious!

How so? It’s automatic with tox, isn’t it?

Sorry, what is automatic with tox? Tox just helps manage environments.
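For what it's worth, the dual-version testing mentioned upthread is usually driven by a tox.ini along these lines (env names and test command here are illustrative, not from the original comment):

```ini
# Minimal sketch: run the same suite under Python 2.7 and 3.7.
[tox]
envlist = py27, py37

[testenv]
deps = pytest
commands = pytest tests/
```

tox creates a virtualenv per interpreter and runs the suite in each, so "testing both" is one `tox` invocation, but you still pay the cost of keeping the code itself compatible with both.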

What surprises me most is his insistence on a non-incremental update. To this day, I have no idea what the logic behind "no intermediate steps" is. Codebases are written in Python that would dwarf Margaret Hamilton's famous stack. Why should everyone have to adapt the string functions to unicode and get rid of different-type comparisons and change syntax in one go, without a good way to sound out bugs inbetween?

Regarding string, the breaking change was necessary because it was a double-duty type, sometimes acting as a byte-array and other times acting as a string. Meaning some of its functions, like string.length(), gave a value that only made sense for string-as-byte-array, but not as string-as-simple-list-of-characters. More detail on Stackoverflow if you want (https://stackoverflow.com/questions/5471158/typeerror-str-do...). Anyhow the double-duty type needed to be disambiguated into two separate types, a breaking change.
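The double-duty split described above is easy to see in Python 3, where text (str) and raw bytes are distinct types with different notions of "length":

```python
s = "café"             # str: a sequence of Unicode code points
b = s.encode("utf-8")  # bytes: the UTF-8 encoding of that text

print(len(s))  # 4 - four code points
print(len(b))  # 5 - five bytes, since é encodes to two bytes

# Mixing the two is now a TypeError instead of silent corruption:
try:
    s + b
except TypeError:
    print("TypeError")
```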

string.length() still doesn't make sense in Python 3

There's nothing "simple" about "list of characters" in Unicode. Discussed at length at: https://news.ycombinator.com/item?id=18154667

Here's how a Unicode expert with 20 years of experience in Unicode explained it: https://blog.golang.org/strings

> Some people think Go strings are always UTF-8, but they are not: only string literals are UTF-8. As we showed in the previous section, string values can contain arbitrary bytes; as we showed in this one, string literals always contain UTF-8 text as long as they have no byte-level escapes.

> In fact, the definition of "character" is ambiguous and it would be a mistake to try to resolve the ambiguity by defining that strings are made of characters.

> [Exercise: Put an invalid UTF-8 byte sequence into the string. (How?) What happens to the iterations of the loop?]

If you aren't in Unicode, then "character" is the same as "byte", so splitting the types gets you nothing of value.

Assuming you are not implementing a low-level Unicode library, what's your use case for "list of Unicode characters" ?

Even in Unicode-aware applications, it is generally not that useful to segment just by codepoint, you more often really want to segment by extended grapheme cluster.
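A stdlib-only sketch of the codepoint-vs-grapheme gap: "é" written as a base letter plus a combining accent is one user-perceived character but two code points:

```python
import unicodedata

decomposed = "e\u0301"   # 'e' + COMBINING ACUTE ACCENT
composed = unicodedata.normalize("NFC", decomposed)

print(len(decomposed))   # 2 - code points, not "characters"
print(len(composed))     # 1 - after NFC normalization
```

Normalization rescues this particular case, but not things like emoji ZWJ sequences or regional-indicator flags; proper extended-grapheme-cluster segmentation needs a dedicated library (e.g. the third-party `regex` module's `\X`).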

In Python 2.7 they also introduced "bytes", although it was just another alias for "str". I wonder why they didn't make Python 2.7 distinguish between them and add an option (like another import from __future__) to throw errors when these types were mixed.

This would help a lot in preparing code incrementally to work on python3.
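The aliasing is easy to probe directly. On Python 2.7 the first line prints True (bytes is just another name for str); on Python 3 the types are genuinely distinct:

```python
print(bytes is str)      # True on Python 2.7, False on Python 3
print(b"abc" == "abc")   # True on 2.7, False on 3 (different types)
```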

In the mean time, mypy in python 2 mode doesn't warn about str <-> bytes either :/

I mean, it seems like you are suggesting 3 incremental non-backwards compatible changes. I'm not sure how that's better. Nobody is going to sign on to the first two, so you end up with the exact same situation we have now, but took a more complex approach to get there.

Yes, that's what I'm suggesting. I know a big Python 2 codebase (SageMath) that has been "close" to Python 3 adoption several times within the last 5 years but had to call it off every time because something came in the way. By the time the next serious attempt was started, lots of the changes had rotted. Anything that could be done incrementally (Python 2.7.5, from-future imports, prints with parentheses) was done long ago.

There has to be a term for when the business builds something with one tech stack, and then later, as if it was totally by surprise, realizes it needs to replace the whole thing, for basically no added business-value, and this happens every few years. I mean, everyone has known they would have to do this eventually, but it just kept being put off, like the business is a bad procrastinator. And they all do this.

It's just delaying maintenance. We're bad at recognizing that software requires maintenance and periodic updates/replacement, same as anything physical.

Businesses think once it's written it will always just work...and that's true if the underlying OS and hardware never change, and the business never changes, and requirements never change. But even if it's a completely niche application whose requirements never change, after a while the hardware breaks, and we can't buy the same, and we can't license the old OS or it has documented problems that won't be fixed, so we have to use a new one, and we can't get the old software dependencies to work on the new one, etc etc.

The software version of driving a car until it falls apart instead of doing more than the bare minimum to keep it running. It's not always the wrong choice but it's wrong to be surprised when faced with the expected outcome.

Add to that list "and there is basically one guy still qualified to work on the system", which is practically the case now for old COBOL banking systems. See http://cobolcowboys.com/

Same as every other industry.

Being a CADD person, I had a lot of contact with A/E/C, facilities management, space planning, etc.

Seems like everyone suffers legacy, neglects maintenance, prefers do overs.

Cultural? Misaligned incentives? Bad accounting? Complexity catastrophe (when cost of any change outweighs benefits)?

If someone knows, please tell.

It's always easier and sexier to build than maintain. All capital expense decisions are made at a point in time.

Sustained operations are hard unless you have growth, because flat budgets are cuts given inflation, etc. Given the choice between stocking the toilet paper or doing preventative maintenance on the elevator that may impact your successor a few years from now, what do you choose?

The only times that facilities are run well is when there is a retired senior military nco in charge. Usually someone who was a chief of the boat in the Navy.

I hadn't thought to mine knowledge from old seadogs, a la Peter Drucker. Thanks.

When I was working for Thomson Reuters WestLaw, they sold the Danish/Swedish holdings, and we had to move off the platform we had just spent a year moving onto (the sale happened about 7 months after the move was completed). We had one year to move off or we would have had to pay approximately $180,000 a month for using their platform (approximately, as it was denominated in DKK, not USD). Of course we managed the move, and also delivered quite a lot of added business value along the way.

I assume you are referring to "technical debt", unless you were being ironic and I didn't catch your irony emoji. (Is there an ironic emoji?)

I don't think that quite applies. Technical debt comes from making choices that help you in the short term but cost you more in the long run (i.e. you're paying interest). Needing to move off a deprecated tech stack wouldn't really qualify unless there was a longer-term option available at the time you originally decided on your tech stack.

Would it really be technical debt when we talk about transition from py2->py3? Technical debt is about the consequences of bad design. But py3 is more the consequence of growth and gain of experience. And for a company it is an external debt.

I would call it more something like upgrade debt or time debt, as it's a problem that appeared because an external situation changed over time. Similar problem in that regard are operating systems which change and let old software behind, or in case of linux old packages not working anymore, repos disappearing, etc.

py2 to py3 would be handling the technical depreciation of value.

This is different. Tech debt is a factor of letting things get slowly worse. Those things can be improved if you just work on your tech debt.

What I'm talking about can't be improved or fixed, or higher-ups refuse to improve or fix it. Instead, it has to be completely replaced. It's like a Replicant with a four-year lifespan. You literally know it's going to die in four years, and nobody orders a new one until the old one is practically dead, and everyone has to rush to replace it.

Sounds like Egyptian pharaohs and pyramids

There almost is: an irony mark, or percontation point, "⸮" https://en.wikipedia.org/wiki/Irony_punctuation

stack underflow.

Python 2 to Python 3 breakage is the worst thing that ever happened to Python. Even simple things like retaining the print statement (which could certainly coexist peacefully with the print() function) would have made things easier.

It's not too late though. We could add better backward compatibility into Python 3.

The print function replacement can be done in a few seconds using at least 3 different tools with no risk (futurize, modernize, pyupgrade). The amount of time spent talking about why it’s hard to change greatly exceeds the time required to do so.
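The straddling idiom those tools emit is essentially a one-line __future__ import plus parenthesized prints:

```python
# With this import, Python 2 treats print as a function too, so the
# same source runs unchanged on both interpreters. On Python 3 the
# import is a no-op.
from __future__ import print_function

print("hello", "world", sep=", ")
```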

Use logging for non-trivial apps.

What is 'even' about this, given that Dropbox was founded well before the first version of Python 3 was released? Feels like less of a 'did you notice' and more of a 'let's have an interminable python2 v python3 thread no matter the topic at hand'.

Then why did Dropbox shove a Python2 v Python3 comment in the middle of a "happy retirement" blog post?

No really, what was the relevance of that comment? A task from a decade ago, with maintenance reaching end of life in two months, and they're off-topic boasting about "already starting" it. What, they think it's early and applause-worthy? They're trying to show that even Guido had to deal with it? Or that even with Guido they couldn't complete it quickly? "Here are some words about Python to make the PR representative look hip and with it"?

Nothing about that interjection reflects particularly well on Dropbox - it's far off the tone of the rest of the post, they would have been better off not mentioning it at all.

Guido has been working there for 6 years yet only now is starting the conversion?

Is the Python 2->3 process and decisions surrounding it an important part of Guido van Rossum's long career? Sure. Does the fact that Dropbox, quite predictably and like all businesses with large Python codebases has also had to deal with that transition add anything interesting, unexpected, "curiosity-gratifying" to a story about his retirement? I'm not seeing it. It's a long thread of stridently off-topic comments most of which aren't even about Python, never mind the dude's retirement.

Business decisions. Dropbox needs to stay competitive especially with all the alternatives from FA(N)AG (removing Netflix). Many alternatives come bundled with other services.

Not all companies have the big tech or mid/early stage VC money to throw around. Who knows what kind of other tech debt Dropbox has accumulated.

The article doesn't say "only now".

It says "He has already put into motion the conversion of the Dropbox server code from Python 2 to Python 3. "

"already" refers to "sometime in the past 6 years".

Motion describes something that isn’t stopped, meaning not yet complete. This implies it’s more recent, or the conversion is more difficult than it should be.

Perhaps it’s already done, in which case this word choice is poor.

There was a blog article maybe a year ago, that Dropbox finished migrating client side code to Python 3.

He also spent a large amount of time working on mypy (a type checker), which, to my understanding, is also used in the migration to address unicode issues.

Definitely stood out for me. We haven't started that process either, but we're hoping to soon.

Don't they have source to source compilers that do the migration for you? I don't understand why the friction is so strong with python.

Yes but the source-to-source compilers do a fraction of the work needed to get a code base migrated. I've tried both the builtin 2to3 utility and many proprietary converters, including fancy ones that run type inference on Python 2 code and produce type annotations. None of them, as expected, can deal with the str/bytes split without manual intervention.

Fundamentally the issue is that the Python language is too dynamic for static analyzers to fully understand the code and perform conversions. I'm not just talking about type systems, but also monkey patching, reflections, etc.

What we really need is a tool that runs your Python 2 code, records types and other dynamic manipulations, and then produce Python 3 code. But I'm not aware of such a tool.

Funny that you mentioned it, because such tools exist, and one of them originates from Dropbox - PyAnnotate[1]. There's also MonkeyType[2] from Instagram, but it only works on Python 3. I'll also mention PyType[3] from Google, although that one tries to statically determine the types, and it felt a bit buggy when I used it.

[1] https://github.com/dropbox/pyannotate [2] https://github.com/instagram/MonkeyType [3] https://github.com/google/pytype

> What we really need is a tool that runs your Python 2 code, records types and other dynamic manipulations, and then produce Python 3 code. But I'm not aware of such a tool.

That's a great idea, but it sounds pretty intense to implement. I ended up becoming addicted to ML-family languages pretty early (Scala, F#, OCaml, Rust) for a variety of reasons, but I never fully grasped how much strong/static types help for refactoring until I tried refactoring some Ruby code. I can't imagine trying to write an automated tool for the task.
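A crude sketch of the runtime-recording idea behind tools like PyAnnotate and MonkeyType. Everything here (`record_types`, `observed`) is hypothetical illustration, not any tool's real API: wrap a function, note the concrete argument and return types it actually sees, and use that record to write annotations later.

```python
import functools

# Maps function name -> set of (argument type names, return type name)
# observed at runtime.
observed = {}

def record_types(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        observed.setdefault(fn.__name__, set()).add(
            (tuple(type(a).__name__ for a in args), type(result).__name__)
        )
        return result
    return wrapper

@record_types
def add(a, b):
    return a + b

add(1, 2)      # records (('int', 'int'), 'int')
add("x", "y")  # records (('str', 'str'), 'str')
print(observed["add"])
```

Real tools hook the profiler instead of requiring decorators, but the principle is the same: only run-time observation can pin down types that static analysis of dynamic Python cannot.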

Unfortunately that's not as easy, because python is a dynamic language, and allows you to do some shenanigans.

For example, in python 3 they made import statements more predictable: they are now absolute by default, whereas it was a bit ambiguous in python 2. Ignoring the ambiguity (which is a problem by itself), some people got creative and, instead of doing standard imports like god intended, did them dynamically using importlib; that can't be easily fixed if the file you're importing is a variable.

Another problem: unicode. Most python 2 code is actually broken and can crap out whenever you use characters outside of ascii. That's because it conflates bytes with text; python 3 makes the distinction strict, so you have to address it properly. Though in 99% of applications the user meant text, so perhaps this assumption could be made?

- division: in python 2, division worked similarly to C; if both operands were integers, you got an integer result. Python 3 changed / to true division, which is what most people expect, while // keeps the prior floor behavior (and // was already available in python 2), so I guess this could be automated.

There are probably others too.
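Two of those obstacles as runnable Python 3 snippets:

```python
import importlib

# Dynamic imports: the module name is only known at runtime, so a
# source-to-source tool can't see what is being imported, let alone
# rewrite it.
name = "json"  # could just as well come from config or user input
mod = importlib.import_module(name)
print(mod.dumps({"ok": True}))

# Division: Python 2's 7 / 2 evaluated to 3; Python 3 makes / true
# division and keeps the old floor behavior under //.
print(7 / 2)   # 3.5
print(7 // 2)  # 3
```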

Python is too dynamic to do a source-to-source compilation that works 100% of the time. Yes, you can get 80% there or maybe even 90% there, but the last 10% is a killer. For a small program, you can stop all development, certify the program under Py3 and then go on.

Dropbox's server code base is literally millions of lines of Python 2. The "stop all development" approach simply won't work there. The only thing that works is to port the whole code base to "straddling code" that works on both interpreters, then switch interpreters. Automated tooling to prevent regressions is essential.
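A tiny example of what "straddling code" looks like in practice, assuming the usual __future__ imports and explicit encode/decode at the boundaries:

```python
# Runs identically on Python 2.7 and Python 3: text is always unicode,
# wire data is always bytes, and the conversion is explicit.
from __future__ import print_function, unicode_literals

text = "caf\u00e9"
data = text.encode("utf-8")      # bytes for I/O
assert data.decode("utf-8") == text
print(text)
```

Once the whole codebase is in this style, switching interpreters becomes a deployment change rather than a rewrite.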

The friction is that people don't like changing things that work, and a lot of Python 3 is available in Python 2 via __future__ backports anyway.

People migrate when the allure of Python 3 outweighs the hassle of juggling 2 versions. Naturally, people who grew up on Python 2 are more hesitant than people who started later on Python 3.

Dropbox could be much more dependent on Python 2 than the average user, because from what I've read, they use a heavily modified version of the interpreter for obfuscation purposes (e.g. scrambling of opcodes). On internal projects, that is likely to be different, of course.
