
It seems weird to me that the article lists several gamified apps without mentioning advertising. Gamified apps like Duolingo are obviously incentivized to keep eyeballs glued to screens because advertisers pay per view, and it's strange not to mention this as a reason why we see so much gamification in this space. Maybe the author thought it was too obvious to mention.


Advertising makes up only 9% of Duolingo's revenue, and it's usually a small share of revenue for mobile games too. Mobile games rely on "whales" (people who pay a lot, like at a casino), and Duolingo makes almost all of its revenue on subscriptions. Both require sticky retention, but advertisers don't drive the business model or executive decisions at these companies.


Absolute percentage of revenue (or, for that matter, profit) is irrelevant. If employees are able to justify their salary/promotion by increasing a metric such as ad impressions or "engagement", you're gonna see more advertising, even if it hurts long-term profitability or even kills the product.


Although I agree, most times when I mix any of these without parentheses I end up having to explain it, either in code review or when somebody does a `git blame` a couple of months later. Rather than waste everyone's time explaining it, it's easier to just use the parentheses, since that's what most people expect and everyone else can read it well enough.
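
For example (a minimal sketch in Python, assuming the operators in question are boolean `and`/`or`; the same precedence idea holds in most C-family languages):

    # `and` binds tighter than `or`, so these two expressions are equivalent...
    is_admin, is_active, is_owner = False, True, True

    implicit = is_admin and is_active or is_owner
    explicit = (is_admin and is_active) or is_owner

    # ...but only the parenthesized one survives code review without questions.
    assert implicit == explicit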


Agreed, with the caveat that you also need a policy/culture that the escape hatch should be avoided whenever possible and the low-code work should not be done by developers.

Whenever I've discussed this, the common theme is that business users keep requesting features that could easily be achieved in the low-code platform being used. It's hard to blame them; that's been standard procedure for them their entire career.

But if you're not strict about saying "no", you still end up writing all the same methods but now on top of that you have a GUI that's not providing any value. Or maybe worse, your developers end up maintaining all of the low-code stuff too when they could have just written the code, switching context pointlessly and (probably, depending on the platform) not using source control.


An interesting thing with developers getting involved in those boxes-and-arrows UI things is outages. I made a mistake with one once, and the postmortem quite reasonably asked:

* Where was the design doc?

* Where were the alerts?

* Where was the code review?

* Why didn't you write an integration test?

* What do you mean it just rolled out to production instantly?

When we're weighing options before building something, the boxes-and-arrows tool looks like a more time-efficient, less wasteful alternative to programming. But having built it, everyone acknowledges that what we've done is programming, and now they wonder why we've programmed so badly.

Maybe the standard IDEs, Git, code review, CI, metrics, and incremental deploy workflows were fine actually?


For me, none of these learnings is a direct result of using a low-code, arrow-boxes environment. I can deploy instantly to production using any programming environment, and I don't automatically get design documents just because I'm using a "much-code" environment.

Without discipline, any programming environment can lead to failures.

It is true that there aren't any well-defined workflows for using arrow-boxes environments, but that does not mean these environments can't support such workflows.


Are these environments attractive because boxes and arrows are actually better than characters for expressing programs? Or are they attractive because they encourage you to skip steps that turn out to have been important? Sure, you can replicate a normal, responsible development process with a no-code tool, but at that point do you really have a compelling alternative to a more traditional programming environment?


This xkcd seems relevant also: https://xkcd.com/303/

One thing that jumps out at me is the assumption that compile time is wasted time. The linked Martin Fowler article justifies this by saying that longer feedback loops are an opportunity to get distracted or drop out of a flow state while, e.g., checking email or getting coffee. The thing is, you don't have to go work on a completely unrelated task. The code is still in front of you, and you can still be thinking about it, realizing there's yet another corner case you need to write a test for. Maybe you're not getting instant gratification, but surely a 2-minute compile doesn't imply 2 whole minutes of wasted time.


If you can figure out something useful to do during a two minute window, I envy you.

I really struggle with task switching, and two minutes is the danger zone: just enough time to get distracted by something else, too little time to start meaningful work on anything else...

Hour-long compiles are okay; I plan for them and have something else to do while they're building.

30 second compiles are annoying, but don't affect my productivity much (except when doing minor tweaks to UI or copywriting).

2-10 minute compiles are the worst.


Spot on. The mind often needs time and space to breathe, especially after it's been focused and bearing down on something. We're humans, not machines. Creativity (i.e., problem solving) needs to be nurtured. It can't be force fed.

More time working doesn't translate to being more effective and more productive. If that were the case, then why do a disproportionate percentage of my "Oh shit! I know what to do to solve that..." moments happen in the shower, on my morning run, etc.?


I love those moments. Your brain has worked on it in the background like a 'bombe' machine cracking the day's Enigma code. And suddenly: "ding... the day's code is in!"


You might like the book "Your Brain at Work" by Dr. David Rock. In fact, I'm due for a re-read.

https://davidrock.net/about/


I agree to some extent, though I don't think it has to be a trade-off. After a sub-5-second compile, I can still go get a coffee and ponder the actual results of the compile rather than imagining what those results might be. Taking time to think is not mutually exclusive with a highly responsive dev process.


I get what you are saying but I still think fast compilation is essential to a pleasant dev experience. Regardless of how fast the compiler is, there will always be time when we are just sitting there thinking, not typing. But when I am implementing, I want to verify that my changes work as quickly as possible and there is really no upside to waiting around for two minutes.


Yes! Pauses allow you to reflect on your expectations of what you're actually compiling. As you sit in anticipation, you reflect on how your recent changes will manifest and how you might QA test it. You design new edge cases to add to the test suite. You sketch alternatives in your notebook. You realize oh compilation will surely fail on x because I forgot to add y to module z. You realize your logs, metrics, tests and error handling might need to be tweaked to unearth answers to the questions that you just now formulated. This reflection time is perhaps the most productive time a programmer will spend in their day. Calling it "wasted" reflects a poor understanding of the software development process.


It's for this reason that I appreciate this article, even though it has a (playful, well-intentioned) negative tone toward Godot which is a project I donate to.

This is wonderful criticism! It's thoughtful and well-researched. Hell, even I'm inspired to finally dive into Godot's internals, which I've yet to do despite following the project for several years. I hope this inspires even more contribution and constructive criticism.


It feels like a breath of fresh air to me. Yes, random people, come in and start using Godot; the more eyes on this project the better. They're articulating gripes I have had but couldn't express. It would be great if all the attention became an impetus to make Godot really fast, like it did for JS.


Totally agree, that's why I submitted this article. I'm personally invested in seeing Godot become successful, but this type of constructive criticism is great for the ecosystem (even though some people get very defensive). I think it's exciting that there's so much room for improvement and shining light into Godot internals is a great way to expose what needs to be done.


Back when I decided it was time to add a scripting language, Perl and Python seemed like the obvious choices, and in my mind were equally good options. I asked my best friend which I should choose, and he more or less said, "You can't go wrong with either one, but when you ask for help Perl people are assholes and Python people are nice."

I can't confirm his thoughts on Perl and I haven't interacted much with Ruby, but the Python community is definitely welcoming and patient in my experience. I wouldn't be surprised if this was a significant factor in Python's prevalence over Perl, Ruby, or anything else.


Yep, the Perl community kind of had issues around the turn of the millennium, and the Perl 6 debacle did a lot to convince people that Perl was a dead end.

I don't think there was any toxicity in the Ruby community, but it was made up of working programmers, whereas the big leading voices in the Python community were teaching assistants and students, so it might have been more tailored to newbies.

I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older, so I think the real answer lies in why the educational sector decided that teaching Python was easier, and significant whitespace plays a huge part there.


> I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older

Yeah, that's my recollection too. Around 2011-ish there weren't a lot of jobs in Python yet. Perhaps in SV, but not out in the real world. Several startups were doing it, including YouTube and Google at the time.

But in the F500 world, Python wasn't used at all. I started using it around 2008-2009.


> [...] the Perl 6 debacle did a lot to convince people that Perl was a dead end.

Not GP, but the Python 2 vs. 3 holy wars were also something that kept me from adopting Python as a scripting language for a couple of years.


Yeah, the Python 2→3 transition was painful. But I would argue that was self-inflicted: Guido and company chose not to develop a 2.8/2.9/etc. series where people could move their codebases over incrementally.

I mean, I love python, but that sucked!

Yes, it would have been more work for the devs, but the amount of work it meant for users was worse.

In fact, they pretty much just threw away anything before Python 3.6 now anyway. Many things introduced in the 3.x series before 3.6 just don't work anymore (the old asyncio coroutine syntax being the notable one).
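
Presumably that refers to the generator-based coroutine style; a minimal sketch of the change (the old decorator form was deprecated in 3.8 and removed in CPython 3.11):

    import asyncio

    # Old, generator-based style (Python 3.4):
    # deprecated in 3.8, removed in CPython 3.11.
    #
    # @asyncio.coroutine
    # def main():
    #     yield from asyncio.sleep(1)

    # Current async/await style (Python 3.5+):
    async def main():
        await asyncio.sleep(1)

    asyncio.run(main())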


> Guido and company chose not to develop a 2.8/2.9/etc. series where people could move their codebases over incrementally.

That is literally what 2.7 was, along with some P2 features being reintroduced in later P3 releases (up to 3.4).

The core team definitely had the wrong transition model at the start, and it took some time for the community to convince them and then decide which items were the most important, but let's not act like they didn't get there in the end.

> In fact, they pretty much just threw away anything before Python 3.6 now anyway. Many things introduced in the 3.x series before 3.6 just don't work anymore (the old asyncio coroutine syntax being the notable one).

What?


What would have been a better transition model? Are there any languages with major breaking changes that have done the upgrade smoothly?


> What would have been a better transition model?

Better supporting cross-version transition codebases.

The core team initially saw the transition as "run 2to3, fix what's left, publish updates, P2 is gone", but aside from 2to3 being quite limited, such a transition is quite hard on dependencies: they either leave all older dependents behind entirely (dependents which might be the primary users of, e.g., company-sponsored projects), or they have to keep two different codebases in sync (which is hard), plus there were the limitations of PyPI in terms of version segregation.

What ended up happening instead was that libraries would update to 2.7 and then to cross-version codebases, so their downstreams could migrate at their leisure; a few years down the line, people started dropping P2.
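
For reference, a minimal sketch of what those cross-version codebases looked like (a hypothetical snippet, not from any particular library):

    # Runs unchanged on both Python 2.7 and Python 3.x.
    from __future__ import print_function, unicode_literals

    try:
        from urllib.parse import urlparse   # Python 3
    except ImportError:
        from urlparse import urlparse       # Python 2

    print(urlparse("https://example.com/path").netloc)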

> Are there any languages with major breaking changes that have done the upgrade smoothly?

Some, but usually statically typed languages. E.g., Elm's upgrade tool worked pretty well as long as you didn't have native modules and all your dependencies had been ported. I think the Swift migrator ended up working pretty well after a while (Swift broke compatibility a lot "initially"), though I have less experience with that.

An alternative, again for statically typed languages more than dynamically typed ones, is to allow majorly different versions of the language to cohabit e.g. editions in Rust (Rust shares the stdlib between all editions but technically you could version the stdlib too).

Not workable for Python, not just because it doesn't really have the tooling (it has some with the __future__ imports, but nowhere near enough) but also because it changed runtime data-model components, specifically the entire string data model, which is not a small matter (and was by far the most difficult part of the transition, and why they piled on smaller breakages while they were at it).


The only Ruby person I've met was insistent that Ruby was the one true way, and he tried to force it into everything. That attitude turned me off.

Of course, I already knew Python, and so did the rest of my team, so we had been doing tools in Python (the guy wasn't on my team). But until he pushed Ruby into places where Python would have been better (importing a Python library rather than shelling out to a program), I was willing to accept it was probably fine.


> The only Ruby person I've met was insistent that Ruby was the one true way, and he tried to force it into everything. That attitude turned me off.

I mean, if you read almost any Elixir article that has hit the front page of HN, there are always comments from Pythonistas saying, "Why bother when there's Python?" Similar attitude. Obviously it's not everyone, but it's not everyone in the Ruby community either.


It’s such a bizarre reason. “One person using something rubbed me the wrong way so I decided not to use it”. Did the person extrapolate to an entire community from a sample size of 1?


    The only Ruby person I've met was insistent that Ruby was the one true way
That sucks. I've been doing Ruby full-time since 2014 at 4 companies and I've never seen that sentiment, even from people who really love it. My experiences have been really positive.


I agree. I've found Ruby and its developers to be pretty friendly and open to other languages and styles.


My experience with Python was simply people wanting to get shit done. This was circa 2008. They weren't really engaging in language wars, but doing innovative things like extending Java with Jython.

I was arguing for the F500 company I was working at to explore using Jython to write unit tests for Java code.

Why not have a scripting language to write unit tests for Java code?
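
For instance (a hypothetical sketch; com.example.Calculator is a made-up Java class, but importing Java classes straight into Python code like this is exactly what Jython enables):

    import unittest
    from com.example import Calculator  # a Java class, imported directly under Jython

    class CalculatorTest(unittest.TestCase):
        def test_add(self):
            calc = Calculator()
            self.assertEqual(5, calc.add(2, 3))

    if __name__ == '__main__':
        unittest.main()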

I see this with Rust trying to extend Python in interesting ways. I don't see it with Java trying to extend C/C++ or Python.


Python and Ruby have some things where the intuitions are exactly inverted from one another. It took me a long time to figure out why Python rubbed me the wrong way, and that if I dig up how I used to structure code in Pascal, it's fine.

Not that I care much these days since I prefer writing in Elixir.


> I haven't interacted much with Ruby

“Matz is nice and so we are nice” https://en.wiktionary.org/wiki/MINASWAN :)

The Rails community is another story, unfortunately.


That’s funny because that’s one of the reasons I tend to point beginners to R instead of Python for data work.


I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.


> I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.

TBH, community had nothing to do with Python's enormous success over its competitors (Perl and Ruby, possibly Tcl too).

Nor did any technical merit, nor ergonomics.

There's one, and only one, reason why Python exploded at the expense of the other competitors: The ease and acceptance of using the language as glue for C functions.

Python's popularity is built on a solid foundation of compatibility with C.

If, in the 90s, C++ had taken off enough to displace C, I doubt Python would be as popular as it is. Python owes its ubiquity to C, because if C was not ubiquitous, Python wouldn't be either.

(It's only recently, like in the last 10 years or so, that I started seeing popular Python programs which didn't have a dependency on C libraries. And even now, it's still rare.)
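
A minimal sketch of that glue role, using the standard ctypes module (one of several FFI mechanisms; this assumes a Linux system where the C runtime is available as libc.so.6):

    import ctypes

    # Load the C standard library and call one of its functions directly.
    libc = ctypes.CDLL("libc.so.6")
    print(libc.abs(-42))  # -> 42, computed by the C library, not by Python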


I don't think your analysis is accurate.

My experience of trying to get my own C functions usable from Python has been nightmarish. Yes, you can do it... if you have exactly the same compiler and version used to produce the Python interpreter itself.

C's only usefulness to Python is that it allows optimizing along the 80/20 or 90/10 rule, so performance doesn't have to totally suck with Python.

Python 'won', IMHO, because it hit a sweet spot: simple enough for beginners (beginner-friendly, in fact), but with a good basic set of datatypes (lists, tuples, sets, plus the usual ints, floats, and complex) that allows complex ideas to be compactly expressed. The ability to switch between functional and imperative styles also helped.
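
A tiny illustration of that compactness (my own example, using nothing beyond the builtins):

    words = ["perl", "python", "ruby", "python"]

    # Imperative style
    unique_lengths = set()
    for w in words:
        unique_lengths.add(len(w))

    # The same idea as a one-line set comprehension
    assert unique_lengths == {len(w) for w in words} == {4, 6}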

Python is a 'good enough' lisp. MIT switched, and Norvig has said as much.

No, the astonishing thing is that Python survived the 2→3 transition and came out stronger on the other end. Language cleanups, new syntactic sugar (e.g., @ as the decorator syntax), and what you see is Python actively trying to steal all the successful programming paradigms under one unified syntax.

Is Python perfect? Hardly. But it's beginner-friendly and expert-optimized. And, unlike C++ (at least for me), you can get ALL of Python into your head at the same time. (Libraries aside, but that's true in any language.) In this specific sense, it is exactly like C: you can keep it all in your head, even the edge cases.

There are newer languages gunning for a piece of Python's mindshare (Zig, Nim). But because Python is a moving target, getting better and better, the others will need to provide a spectacular use-case advantage, and I just don't see that happening.


"Matz (Ruby's author) is nice and so we are nice" isn't so bad. It's twee-sounding, but saying you're going to follow the example set by the founder is absolutely fine. Is it any worse than Python's 'benevolent dictator for life' example?


I'm going to answer your question directly: No, it's not worse.

My interactions with Guido haven't been awesome, but people put up with him regardless. The other people in Python have been awesome.


I'm genuinely curious: what's "cringe" about MINASWAN?

(I write mostly Python these days, but have been involved in both communities for a long time, and MINASWAN never particularly stood out to me other than as a cute reminder to be nice.)


There is no problem with someone being nice. It's only a problem when they want you to be nice exactly like them.


It was meant tongue-in-cheek, as I was defending Ruby as the most welcoming community. It's just a bit twee, like the voice-over on the London Underground: "See it, say it, sorted."


Oh, I have met Ruby people, and it's a big factor in why I never learnt the language.


This is how I met most of my current friends, as well. I was at a local bar one night on what turned out to be trivia night. There were a few of us who didn't know anybody well but had seen each other at the bar before, so we decided to form a team to win some of our tab back.

Fast forward 8 or so years. We no longer do trivia, but get together just about every week along with the siblings, childhood friends, girlfriends, and wives that have joined us over the years.

It's a little strange to think of how different my life would be had I not been at that bar on that night, but that's how these things work: you put yourself in a situation where something social is going on, and all that's standing between you and making some new friends is saying "yes."


Imagine the friends you didn't make because you stayed in that one night when you were on the fence about going out. Good thing too -- it kept you out of prison!

I mean, imagining what 'wouldn't have been' is a lot easier than imagining what 'could have happened', because you literally have no idea. You can imagine life without your friend group, but try imagining a life where you ended up grabbing a drink at Jeffrey Dahmer's apartment, or any of a million other things.


Chromebooks that are a couple of years old seem to run pretty cheap, especially refurbished. Installing Linux is simple enough, although some (all?) have non-standard key layouts which can require some additional setup to get working comfortably.

I've had a few of these over the years that I take to coffee shops/bars to work. It's nice not to feel nervous about a $1000+ investment just because the server is coming around to refresh my water.


I highly doubt the phone and PC markets will converge any time soon.

Apple has seemingly made the most progress toward this, and it isn't hard to imagine someone plugging their iPhone into a screen when they arrive at work and resuming their Excel spreadsheet with the connected keyboard, no different from the company-issued laptop today. But I don't see what incentive Apple has to make that a reality when they can keep selling people two separate $1000+ devices.

Edit: I would love to be proven wrong, so any opinions/examples to the contrary are very welcome.


Relevant xkcd: https://xkcd.com/2347

I was a little sad to read that Mills in fact lives in Delaware, not Nebraska. I always had a sneaking suspicion that this comic was NTP-related.


To be fair, that's only off by 0.0069625621 seconds. Since time advances at the speed of light, you could say everyone lives in Nebraska and still be correct well within the margin of error.
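
(A quick sanity check of that figure; the Nebraska-to-Delaware distance here is my own rough assumption:)

    c = 299_792_458          # speed of light, m/s
    distance_m = 2_087_000   # assumed ~2,087 km between Nebraska and Delaware
    print(distance_m / c)    # ~0.00696 s, in line with the number above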


The title text for the image implies it might be about ImageMagick, but that's probably dated now, what with the likes of vips.

