Hacker News | kpw94's comments

> But when people talk about "fairness", it usually means quality of life right?

Seems you're focusing on the Floor whereas pg refers to the Ceiling?

It's normal that enterprise sales lovers end up as taller poppies than pottery lovers. You could take it a step further and say: it's normal that Tom Brady and Ronaldo end up rich while mediocre football players make $0 from football, even though both have an interest in football.

That's the typical Ikigai diagram stuff: if someone's "what you love" naturally aligns with "what you can be paid for", pg's point is that this person will be richer. (On top of this, if their "what you're good at" also naturally aligns with "what you can be paid for", they'll be richer still.)

But you're approaching a different question: do pottery lovers have a good enough quality of life? Do they deserve one if nobody needs any of their pottery stuff?

Does everyone deserve a good quality of life regardless of what their passion is? What about people with antisocial passions (crime, exploiting others, etc.)?


I think you make a really good point at the end, that those with antisocial and pathological passions shouldn't be encouraged, for the sake of societal health. Those passions can yield a huge profit if pursued in a certain way, right? Exploiting others through dark patterns or scams can definitely reap huge rewards.

As a layman, I'm curious what you and others think about what standards should be held to meet the floor, and what standards to reach the ceiling.


> France is not very accommodating to non French speaking foreigners. Trying to get around in life with just English there outside of Paris is not easy.

I don't really get this kind of comment... People usually say the same thing when visiting Japan: "This restaurant only has a menu in Japanese, and the staff only speak Japanese!!"

That's true a bit everywhere in the world, isn't it? In the US, apart from places with, say, a huge Spanish-speaking presence, you'd better interact in English.

Try "getting around in life" using only, say, French, Portuguese, or Japanese in a random US city like Portland, NYC, or Chicago.


I didn't sense any judgement there, just a statement of fact. Learning a new language as an adult is doable, but not trivial, so it's certainly a factor in making a decision to relocate to another country for a job.


> That's true a bit everywhere in the world, isn't it?

No, it's not: Western Europe, for example, has a bunch of countries where English is almost as good as native. But obviously that's not the common case across the world, and like you say, there's nothing wrong with expecting people to know the local language.


>That's true a bit everywhere in the world, isn't it?

That has not been my experience. I'm not arguing it should be this way, but for better or worse I've gotten by very well with English virtually anywhere with tourists, and most places without. I lived in eastern Ukraine for two years and learned Russian - enough people spoke English there that at times it could be hard to get practice time in Russian. This was with a younger student crowd; most non-students and older people did not speak English, but the point stands that you can find English-speaking people and get by in most situations.


Not really, the world isn't either black or white but various shades of gray. Everything north of Benelux is a lot friendlier and more open to speaking English and doing things in English outside of capitals, compared to places like France, where not speaking French gives you a severe handicap in life and career.


I'm sure Clojure is a great language for some tasks...

But, looking at the examples (picked the Wordle one since I know that game): https://github.com/HumbleUI/HumbleUI/blob/main/dev/examples/...

I find it extremely hard to read. Even small snippets, say lines 56 to 74, which define "color", "merge-colors" and "colors"... and then the "field" one at lines 76 to 117 is even harder.

is it more natural read for people familiar with writing functional programs? (am I permanently "broken" due to my familiarity with imperative programing?)

I wonder what the same Wordle example would look like in, say pure Flutter.

I also wonder how that code would look with external dependencies (say, hitting a server to get the word of the day) and navigation (maintaining state between those pages).


"is it more natural read for people familiar with writing functional programs? (am I permanently "broken" due to my familiarity with imperative programing?)"

As just one person who has written a great deal of functional code, it reads well to me. I think because I am used to reading it "inside out"? Reading lisp-likes is probably helpful.

Take 'color' for example. It opens with a 'cond' with three branches. The first branch: if the idx-th position in word is the same as letter, return green. The second: if the word includes the letter at all, yellow. Otherwise we're grey.

That took me a few seconds to grok. Just one anecdote for you. Don't think you're broken but reading/writing this kind of code even a little bit will change the way you see code IMO.


This is the function that confused the person you're responding to, ported to Python:

    def color(word, letter, idx):
        if word[idx] == letter:
            return GREEN
        elif letter in word:
            return YELLOW
        else:
            return GREY
I know which one I'd prefer to grok at 2AM with alerts going off.


That's because you are more familiar with whatever style of code you are used to.

Don't confuse familiarity with readability.


> I know which one I'd prefer to grok at 2AM with alerts going off.

At that time I'd just opt for sleep. Or sex. Or drink. Reading code doesn't belong to things one should do at 2AM.


And yet we've all done it.


No. We didn't. At least not all of us.

That's just a myth spread by a few workaholic programmers. Luckily, there are enough 9-5 programmers to clean up the mess created by those 2AM committers.


note how they said grok and not work? this is what oncall looks like, reading code at 2 AM


I'll invoke a no true Scotsman argument here.


> I know which one I'd prefer to grok at 2AM with alerts going off.

I hate meaningless statements like this. It means nothing, other than maybe that you know Python. 20 years ago people might have said that about Python itself - I even know many people today who would say that about Python.


I was in a "101" undergrad compsci class the first year the program used Java (1997, I think?) and so this asst prof was showing a simple example of some Java syntax.

I had been programming in C for a while, learning from K&R, to build ray tracing input files and that sort of thing so I was kind of disappointed but whatever, I was a mature student who had rediscovered computers a couple of years before (had a C64 in the 80s) and was just happy to be there.

Anyway, this guy in the back yells out "I could do that in 2 lines of Q-BASIC" or something to that effect (Q-BASIC was definitely part of his pithy one-liner). Little did I know he was representing so many of the people I would encounter over the next decades.


Honestly both read about the same to me, and I'm largely unfamiliar with Clojure. The main difference appears to be the 3 `str` calls, which appear extraneous, as the following version works just the same:

    (defn color [word letter idx]
      (cond
        (= (nth word idx) letter) :green
        (str/includes? word letter) :yellow
        :else :gray))
Interesting that even with the `str` calls removed, the function still appears to work on other datatypes, such as:

    (def s (seq "test1"))
A lazy sequence, but one Clojure still lets you index into, in O(n) time. That's probably what the `str` conversion was trying to speed up.

Python, meanwhile, fails on lazy input as it isn't indexable.

    word = (c for c in "test1")
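To make that concrete, a quick sketch (my own check, not code from the thread):

```python
word = (c for c in "test1")   # lazy: a generator, not a string

try:
    word[0]                   # the ported color() does word[idx]
    indexable = True
except TypeError:
    indexable = False         # generators don't support indexing
```

(Note that `letter in word` would still work on a generator, but it consumes it, so the fallback branch would only behave correctly once.)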
I guess I'll be checking out Clojure this weekend.


I'm guessing that str allow it to work when the inputs are symbols? So that they are compared as strings rather than by identity. There could be more than one symbol named "foo"; if you want those to compare the same, you can't use regular symbol equality.

Or possibly the code even uses non-symbols for some of the arguments. Suppose that letter is sometimes the integer 1.


I do definitely get more 2AM alerts going off when I work with Python, so it's got that going for it.


Having written a wordle clone recently, this produces the wrong result, by the way. For example guess SASSY with the answer STICK.
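For anyone curious, here's a sketch of the usual two-pass fix (my own port, not code from the library; `score` is an invented name): greens are claimed first, and yellows are limited by how many unmatched copies of each letter remain.

```python
def score(guess, answer):
    # Pass 1: mark exact matches green and count the remaining
    # (unmatched) letters of the answer.
    result = ["grey"] * len(guess)
    remaining = {}
    for i, (g, a) in enumerate(zip(guess, answer)):
        if g == a:
            result[i] = "green"
        else:
            remaining[a] = remaining.get(a, 0) + 1
    # Pass 2: yellows consume from the remaining-letter budget,
    # so SASSY vs STICK gets only one colored S, not three.
    for i, g in enumerate(guess):
        if result[i] != "green" and remaining.get(g, 0) > 0:
            result[i] = "yellow"
            remaining[g] -= 1
    return result
```

With SASSY against STICK this gives one green S and greys elsewhere, instead of the extra yellows the per-letter version produces.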


Kotlin time (since we're in the JVM context for Clojure)

    fun color(word: String, letter: Char, idx: Int) =
      when (letter) {
        word[idx] -> GREEN
        in word -> YELLOW
        else -> GRAY
      }


And here's what cond could look like in Python syntax:

    def color(word, letter, idx):
        cond:
            word[idx] == letter: return GREEN
            letter in word: return YELLOW
            True: return GREY
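For what it's worth, you can approximate cond in real Python today with predicate/thunk pairs (a sketch; `cond` here is an invented helper, not a stdlib function):

```python
GREEN, YELLOW, GREY = "green", "yellow", "grey"

def cond(*clauses):
    # Evaluate (predicate, result) thunk pairs in order;
    # the first truthy predicate decides the result.
    for pred, result in clauses:
        if pred():
            return result()
    return None

def color(word, letter, idx):
    return cond(
        (lambda: word[idx] == letter, lambda: GREEN),
        (lambda: letter in word,      lambda: YELLOW),
        (lambda: True,                lambda: GREY),
    )
```

The lambdas keep both sides lazy, which is the part the imaginary `cond:` block gets for free.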


> It opens with a 'cond', with three branches. First branch is if the idx-th position in word is the same as letter, return green. Second branch is if the word includes the letter at all, yellow.

This is a tangent, but I've been thinking about how I feel when the conditions of an if-else ladder rely on the order they're listed in.

This is an example; if you swapped the order of those branches around, the coloration would become incorrect.

I'm a little happier when the conditions are described completely, such that swapping the order of the checks doesn't change which of them evaluates true or false, but it's also true that this can add quite a bit of complexity over an order-sensitive set of conditions.

Thoughts?


You could do something like this in Clojure:

    (first (filter identity
                   [(and (condition-three) (action-three))
                    (and (condition-one) (action-one))
                    (and (condition-four) (action-four))
                    (and (condition-two) (action-two))]))
And you could write a macro to do it with nice syntax. A bit more work and you could parallelize it.

You probably wouldn't want to most of the time, but if the conditions are slow to test but otherwise inexpensive, it might be a useful optimization.


Having the order matter and matching the first true branch makes for more readable and less wordy if statements. Otherwise, when you have two conditions with any overlap, such as A and B, the A branch needs `and not B` and the B branch needs `and not A`. This can create very long expressions. Having them evaluate in order, matching only the first true one, makes this unnecessary.
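A small sketch of that trade-off, using the thread's Wordle color function in both styles (my own illustration):

```python
# Order-dependent: the first true branch wins, so "yellow"
# implicitly means "in the word AND not an exact match".
def color_ordered(word, letter, idx):
    if word[idx] == letter:
        return "green"
    elif letter in word:
        return "yellow"
    else:
        return "grey"

# Order-independent: every condition is spelled out in full, so
# the branches could be reordered without changing the result,
# at the cost of wordier, partly redundant checks.
def color_explicit(word, letter, idx):
    if word[idx] == letter:
        return "green"
    if word[idx] != letter and letter in word:
        return "yellow"
    if word[idx] != letter and letter not in word:
        return "grey"
```

With three branches the duplication is mild; with N overlapping conditions, each branch can end up excluding the other N-1.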


I mean, if the checks are expensive and they're on a hot path, then that's wasteful. It might also require you to use more nesting of ifs, which isn't necessarily nicer.


It's possible to write some pretty unreadable code with Clojure, just like it's possible in any programming language.

I can tell you that this code is very easy to read if you are familiar with Clojure. In fact, this example is lovely! Seeing this code really makes me want to try this library! Clojure has this terse, yet readable aesthetic that I like a lot.

But I completely understand you because at some point Clojure code also looked alien to me. You are not broken for having familiarity with some style of code. Familiarity is something you can acquire at any time, and then this code will be easy to read.

True hard-to-read code is one that is hard to understand even if you master the language it is written in.


>> I find it extremely hard to read. Even small snippets, say line 56 to 74 which define this "color"

I think you make a great point, a point that once someone has gotten used to lisp is harder to fully appreciate. I'm at the stage now in my lisp journey that I didn't find those hard to read, but it wasn't that long ago that I felt almost nerd-sniped by this weird language. I think it's worth pointing out that in a more advanced example, I'd still have been comfortable because of the repl - I can navigate between each sub-expression with a keystroke, send each to the repl with a keystroke, and see what they do. Lisp really makes it easy to do this kind of bottom-up assembly - both when you're writing and when you're understanding someone else's code.

A corollary to that, and what was key to me falling in love with lisps, is that the signal-to-noise ratio is off the charts. Whatever you want to implement probably doesn't require a lot of code. Wordle in 189 lines is pretty decent. There's just less to fit in your head, and what's there tends to be solving the problem at hand, not boilerplate.


Just don't mention Electric Clojure, because that might cause some head explosions. (Fully reactive multi-tier programs where an entire web UI is 60 lines of code and implements infinite scroll with search and dynamic viewport resizing.)

If you know Clojure, the code presented in the example seems fairly straightforward. Parentheses demarcate the syntax tree; and the last expression in any tree is the result carried forward.


It's just a matter of familiarity. If you showed me an article written in Italian I would struggle to read it, but that's not because Italian is inherently an unreadable language.


As an experienced Clojure programmer, I found that code easy to read. It uses quite a few idioms that are specific to Clojure or at least Lisp.

Examples include cond, let [{:keys ...}], for being a list comprehension rather than a loop, #(%) function literals, and @ deref.


Also found it easy to read even though I haven't written any Clojure in about a decade (spent a LOT of time with it when it was new).


Let's see the features used in that snippet:

* The cond macro which works similarly to C switch

* Hashmap functions like merge and merge-with

* Destructuring

* The for macro which is similar to the "for each in" statements

None of these are unfamiliar to common programming languages, so that code will not be hard to understand once you get over the initial syntax and idiom hump. The syntax makes things much easier once you get used to it; I think all Clojure programmers like it.


> I find it extremely hard to read.

I avoided Clojure for nearly 15 years because I thought so too.

Turned out I spoke English and couldn't read Russian. But that didn't mean Russian was unreadable—I just didn't know how. It had nothing to do with whether it was "readable"; it was easy to read (and understand) once I learned how.

After about two weeks, I found reading Clojure to be just as easy as any other code. I did that at 46, so I don't think age is a major barrier. (I've read and written code my entire life.)

I'm now writing Clojure code every day and am much happier as a developer. (That's why I made the effort initially, and it definitely paid off.)

One thing that really helped was asking ChatGPT or Claude to explain a piece of Clojure code to me, when I had questions. Especially early on, that was invaluable.

Also, learning structured code editing made a big difference—I consider it to be essential. It was extremely frustrating until I spent an afternoon doing that.

Clojure code is "read" differently than, say, Python or JavaScript or C and that's reflected in how you navigate and edit the code.

YMMV


Can you expand on structured code editing?


Structural editing commands are like 'slurp' - swallows an expression inside another one; 'barf' - spits out a thing out of an expression; You can also do it from the left or right side. 'wrap' - wraps a selection into an expression; 'unwrap' - does the opposite; 'transpose' - swaps two expressions at point, and there are more commands.

Once you learn the basic structural editing commands, writing code becomes like composing poetry out of haiku pieces. Instead of thinking "how do I grab these vars used inside this function and refactor them into their own unit?...", you just grab some expressions and move them around, like bricks or Lego pieces. It is an extremely satisfying way of writing programs. The only drawback of that approach is that later it becomes harder to work with "more traditional" PLs; you just can't easily manipulate code the same way in Python, JS/TS, Kotlin, Java, C++, etc. - you need a "structured", homoiconic, lispy language for that trick to work. Tree-sitter makes an effort to improve the process, but it's still not on the same level of simplicity.


I think they mean when you learn the shortcuts for selecting and manipulating entire blocks between matching parentheses in an editor that helps balance them and so on, making it rather easy to test things out and refactor Lisp-like code.


This one's pretty clean for Clojure code, due to the simplicity of the data model in the most conventional sense: the state of the program is just the word, the guesses, and the grid of colored characters for the guesses.

External dependencies you manage like in most other applications nowadays: you don't hit external services in the "guts" of your code unless you really need to. For performance, testability, and to keep the less reliable parts of your code isolated, you keep the interactions with external services as close to the "main" of the application as you can.

Where things break down is with more complex data models, not even so much because of the language itself but because Clojure programmers actively reject using record types and interfaces and just pass dictionaries around. You wind up with code that, bafflingly, gives the impression of being very simple and neat, but you can't tell what it's actually doing.


Clojure is a lisp. Lisp languages represent code as trees. That's why you have so many parentheses. The trees contain language constructs (if, cond, let, defn, ...), data (numbers, strings,...) and names (function names, names of value bindings). There is also some more advanced syntax for quoting and macros.

When reading lisp code, you navigate it like a tree. Indentation matters, and clean lisp code has the same indentation level for all sibling nodes (with minor deviations for special constructs). Most code follows the pattern of defining "variable" bindings (e.g. via `let`) and then has one final expression that uses all these bindings to calculate a value.

  (defn name-of-a-function-that-i-define [first-argument second-argument]
    (let [sum-of-some-numbers (+ 1 2 3)
          product-of-some-numbers (* 1 2 3)]
      (+ first-argument
         second-argument
         sum-of-some-numbers
         product-of-some-numbers)))


(I'll just make it clear that indentation matters to users as a strongly recommended convention for ease of reading. To the interpreter, you can write everything on a single line.)


I'll add that the related 'power' many Lisp acolytes talk about stems from the fact that everything is a list of lists. Due to this, you can write programs that take syntax (a list of lists) and modify that syntax to do something else (another list of lists).

Imagine a language that has a built-in parser and code-generation library.
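A toy Python illustration of that idea (all names invented for the sketch): expressions as nested lists, plus a "macro" that rewrites the tree before evaluation.

```python
import operator

OPS = {"+": operator.add, "*": operator.mul}

def evaluate(expr):
    # An expression is either a number or [op, arg1, arg2].
    if isinstance(expr, list):
        op, *args = expr
        return OPS[op](*(evaluate(a) for a in args))
    return expr

def swap_plus_for_times(expr):
    # A "macro": a program that rewrites the syntax tree itself.
    if isinstance(expr, list):
        op, *args = expr
        new_op = "*" if op == "+" else op
        return [new_op] + [swap_plus_for_times(a) for a in args]
    return expr

tree = ["+", 2, ["+", 3, 4]]   # (+ 2 (+ 3 4))
```

In a real Lisp this is more direct, since the reader hands you these lists for free and macros run at compile time; the sketch only shows the "code is data you can transform" shape.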


Correct. This is not Python.


It's readable, your (justified) problem is with his names. There's no language where "colors" and "field" will be descriptive function names that make it clear what's going on.


I use Clojure professionally, so it's readable to me. I would personally encode the color-merging rules differently, since they look too obscured as-is. But I think what doesn't help is the lack of types (or schemas) and the unhelpful function names, as described by another commenter.

Also,

  (apply merge-with {} ...)
is pretty "evil" in the sense that it's a very roundabout data transformation and would likely not pass code review at my company.


> I find it extremely hard to read. Even small snippets,

And unfortunately, you won't get much compiler assistance either with Clojure, beyond basic things. So it's easy to have bugs that will take a while to track down in a complex codebase.


Clojure codebases may be easier or more difficult to maintain depending on factors such as team experience, project complexity, and code quality. The ease of tracking down bugs varies across programming languages and is influenced by development practices and tooling. Clojure codebases are not inherently more difficult to maintain than those in other PLs.

- Clojure has strong type inference, catching many errors at compile-time.

- The REPL provides immediate feedback and testing capabilities.

- Clojure's immutability and functional paradigms reduce bug-prone code.

- Tools like core.spec offer runtime type checking and data validation.

- IDEs like Cursive provide advanced static analysis and refactoring support.

- Clojure's simplicity and consistency make bugs easier to spot and fix.

- You're also completely ignoring Clojure's rich ecosystem of testing frameworks and tools.


> IDEs like Cursive provide advanced static analysis and refactoring support.

Can you give an example? Would these tools allow you to define a custom type with fields and ensure it is correct everywhere at compile time like a static language?


While Clojure is dynamically typed, tools like clj-kondo, Cursive, and clojure-lsp can offer some static analysis benefits, like warnings about undefined vars or functions. There isn't "true" static checking, but you can use Spec and Malli for runtime checking. That doesn't provide the same level of compile-time guarantees as a statically typed language, yet it offers some unique capabilities that many statically typed languages struggle to match, like:

- Dynamic predicates - Spec allows you to define types using arbitrary predicates, which can be more expressive than traditional static type systems;

- Runtime generative testing - Spec can automatically generate test data based on your specifications, which is powerful for property-based testing;

- Flexible validation - You can validate complex nested data structures and apply specs selectively, which is often more flexible than static type checking;

- Extensibility - Specs can be added to existing types without modifying their source. And they're data-driven by nature - specs are just data and can be manipulated programmatically.
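To make the "arbitrary predicates" point concrete, here's a tiny Python sketch of the idea (names invented; clojure.spec itself is far richer):

```python
def conforms(spec, value):
    # A "spec" here is just any predicate function.
    return bool(spec(value))

def nilable(spec):
    # A combinator that also accepts None, in the spirit of s/nilable.
    return lambda v: v is None or spec(v)

# A property a simple static type can't express: a positive even int.
positive_even = lambda v: isinstance(v, int) and v > 0 and v % 2 == 0
```

Because specs are ordinary functions (and, in Clojure, ordinary data), they compose and can check properties well beyond what most type systems encode.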


Yep, I'd add these advantages that schema systems have:

- Power - you can do arbitrary checks on the data; static type systems are quite weak in what kinds of properties they can verify (far from Turing complete)

- Flexibility to do checking wherever you want at runtime (e.g. check data at API boundaries)

- Ability to treat schemas as data, generate them, output them as data, and share them between systems (for example in databases); through conversions it's possible to interop with other platforms (e.g. JSON Schema)

- Loose coupling to your programming language - they're just libraries


I've tried to learn Clojure a few times and just bounced right off every time. I found it impossible to read and frustrating and tedious to write. Everyone else who tries it seems to fall in love, but I really don't get it.


> I found it impossible to read and frustrating and tedious to write.

Perhaps you've done it wrong? To read any Lisp code one needs a REPL. And you don't typically type directly into it; you connect to it and eval things from source files. Once you're connected to a REPL, you can eval any expression and sub-expression, and with practice, you learn to grok the code without a REPL.

And for writing Lisp, you only need structural editing support in your editor. Once you learn the basic commands, moving structures around is a far more enjoyable process than writing things in an unstructured language.

I am far more productive using Clojure instead of Java and Clojurescript instead of Javascript, Fennel instead of Lua, etc. - it's easier to read, easier to modify, easier to maintain. But, yeah, it does require some practice, just like any other skill.


I am well aware of the benefits of a REPL, and find it pretty essential for learning any language. It didn't help me grok clojure any better, though.

I'm not sure what you mean by structural editing support. I usually find things like autocomplete or automatic parentheses to be more of a nuisance than a help.


> find it pretty essential for learning any language

No, REPLs in other languages are not equal to REPLs in Lisp dialects. I bet what you are describing is not the workflow an average Clojurian would use. In other languages you typically type directly into the REPL console. With Clojure, you typically connect your editor to a running REPL instance and then manipulate things directly from the source code - you basically write the program while living inside it. Your codebase becomes a living, breathing, malleable entity.

Structural editing has little to do with autocomplete, it's just a way to manipulate expressions - move them around, raise them, transpose them, wrap/unwrap, etc.

I suppose you tried to understand Clojure by looking at the code, and that could be challenging - without proper REPL and structural editing support, it may not be the same joyful experience that many Clojurians know.


Yes, I've heard this sales pitch before. Yes, I used an actual REPL tied to an editor, among other configurations. I found it rather underwhelming. I tried a few other lisp dialects as well, but had the same experience. Interactivity is great, but it doesn't make up for the language itself.


"Underwhelming"? Seriously? I don't think you actually tried the real thing, did you? I don't know about you, but I find it extremely satisfying when you can run a basic Puppeteer or Playwright script and then explore the DOM structure directly from your editor, without having to copy/paste, move or change your code, or even use devtools (everything controlled from your editor), and then navigate the page interactively from your editor and execute pieces of code (that run in the browser) without any preliminary ritual, even without having to save the file.

Or you'd run a curl command once and continue exploring the data - parsing, slicing, dicing, grouping, sorting any way you like - or even have it visualized in the Portal tool with charts and graphs.

Look, I'm currently writing tests for a thing, while my IDE is connected to a service running on a Kubernetes pod in the cloud - I can eval any function that affects the execution of the service, I can explore the db tables, change the routes and re-run the tests - all that without having to restart the pod, without having to deploy anything, without even having to save any files (if I don't have to).

Lisp REPL-driven development gives you immediate feedback, allows you to modify and debug running programs on-the-fly, allows you to experiment, it's great for understanding language features interactively, and it's a real, tangible productivity boost - it's superb for rapid prototyping.

> Interactivity is great, but it doesn't make up for the language itself.

The language is what allows that great interactivity and exploratory programming. I mean, I get it - while it may initially appear challenging to read, much like how sigma notation for loops in mathematics can be difficult to comprehend, with practice it becomes intuitive. One wouldn't go to math.stackexchange to complain about sigmas and other mathematical symbols being unintuitive, would they?


Strange, I use Lisp and type to a REPL all the time. Now you tell me that is a feature of other languages, not of Lisp?


It does take some patience, but once it clicks, it's just awesome.


Hickey talks about readability of Clojure in this talk (timestamp at 6m40s) https://www.youtube.com/watch?v=SxdOUGdseq4#t=6m40


As someone who’s written a lot of Clojure and have been using it on and off since 2009, this looks like decent quality code to me.

I think it’s just a familiarity thing. Clojure is different from most languages in that it’s a lisp and it’s immutable-first functional. That gives it a bit of a learning curve compared to other languages, but I find other simpler languages quite dificulte to read until I’m familiar with them, too.


> I find it extremely hard to read.

Having a bit of Lisp experience (really not a lot), I find it very easy and elegant to read.

> is it more natural read for people familiar with writing functional programs? (am I permanently "broken" due to my familiarity with imperative programing?)

No, most people who say something like this are simply unwilling to invest an evening into a language they're not already familiar with.


>I wonder what the same Wordle example would look like in, say pure Flutter.

Try this with ChatGPT, Claude or Gemini. All LLMs are really good at these translation tasks.


Not going to say you're wrong or right, but learning lisps/Clojure fairly deeply, you can find worse examples. Also, when people learn it, their minds tend to find it a very consistent language overall, visually. I've moved on after a long stint in Clojure to Elixir, and while I like it, most other languages pale in consistency.


The single space indent seems the most weird thing about that example...


If you haven't used a Lisp-inspired language, yeah, it's going to seem different than all those imperative Algol-derived languages that have been popular for so long.

I don't use Clojure professionally, but I've spent years using the language personally. It might be hard to believe, but it really is beautiful once you learn it, and extremely powerful as well.


I've done a bunch of Clojure development professionally and don't find that code extremely hard to read.

One thing to add to what others have said, when you're met with code like this in the wild and you need to modify/add/remove something but don't have a 100% understanding yet, the common workflow is that you explore this code with your "evaluator" (basically a REPL connected to your editor).

So if you come across snippets of code you don't understand very well, you place your cursor at the various forms and execute them. You'll start at the inner forms and slowly work your way outwards, and by the end you've verified assumptions both about the function's internal workings and how you'll use it from the outside.


Lisp becomes easier to read if you have an IDE plugin that fades out the parentheses.


They're easy to spot, at least for recent sedans & SUVs: they're the ones with a front grille looking like beaver teeth, and the grille looks more solid vs the traditional vent (which makes sense, since there's no combustion engine that needs a cold-air intake to cool down).

https://www.google.com/amp/s/www.businessinsider.com/bmw-ix-...


Many listings for apartments are on on-site.com (which is "a RealPage company") and are publicly available...

So it could be argued it's not

"private prices" + "autoaccept" + "compliance"

But rather

"public prices" + "autoaccept" + "compliance"

Still problematic behavior (and probably add "private knowledge of inventory forecast" on top of that), but I'd argue price signaling of the available inventory isn't the main issue.


There is more than just pricing at question here. If you go to your typical local gas station with a 5000-gallon tank and fill up, the station will raise its prices above the other stations in the area, because it will only have a small amount of gas in its tanks and wants everyone to go elsewhere until the next delivery fills the tanks up again. (Depending on when you fill up, the station may not even have 5000 gallons in its tanks.) How much tank is left at any station is NOT public information shared with other stations in the area.

RealPage though has information on how many apartments are empty and uses that in algorithms even though it isn't public information.


>are publicly available

The list price is different from the final accepted monthly rate/term for the renter. RealPage is getting the actual rental information.

In addition, the occupancy of the building is also not public data.


>So it could be argued it's not "private prices"

I added "private prices" as one of the factors because the official DOJ wording in the complaint mentions "nonpublic/confidential/sensitive" prices in 3 different places:

>The complaint alleges that RealPage contracts with competing landlords who agree to share with RealPage nonpublic, competitively sensitive information about their apartment rental rates

>“We allege that RealPage’s pricing algorithm enables landlords to share confidential, competitively sensitive information and align their rents.

>Landlords agree to share their competitively sensitive data with RealPage in return for pricing recommendations and decisions that are the result of combining and analyzing competitors’ sensitive data.


I don’t think they’re talking about the price they’re renting the apartment for; I can’t imagine that number is secret in any meaningful sense of the word. Who rents an apartment without knowing the price?

I think they’re talking about more sensitive internal numbers. What are the costs and margins on the unit? How quickly are units moving at a certain price? What’s the turnover at particular prices?

I think the core mechanics bear some similarities to insider trading, with a third party “washing” the non-public information.


Another big one is when leases end, for inventory control. RealPage is why apartment dwellers report getting options to renew at the cheapest price for an odd number of months, like 17. RealPage is trying to prevent a bunch of leases from ending at once and flooding the market.


> I think they’re talking about more sensitive internal numbers. What are the costs and margins on the unit? How quickly are units moving at a certain price? What’s the turnover at particular prices?

Yeah that's my understanding of it too "competing landlords who agree to share with RealPage nonpublic, competitively sensitive information about their apartment rental rates and other lease terms".

I.e. RealPage has an oracle view of all the lease end times etc., so it knows, for instance, that next July there's going to be very little availability, so it can boost all the rents by X%.

I guess there's a "private price" if the apartment complex shares what, after all negotiations, the renter actually signed for. It can be more than what was on the public website if they signed for a shorter lease, or less if the apartment ended up needing to throw in "first month free" etc.

There's also the private price of lease renewals done before the unit is put up for rent on the website.


The author provides an example of the bad "Loading bar writing" but unfortunately not a good example of what they call "Outline speedrunning writing"

pg, who's good at writing essays, does provide a good example of the latter, with https://byronm.com/13sentences.html.

This is the writing process that led to https://paulgraham.com/13sentences.html. (The https://code.stypi.com/hacks/13sentences?doomed=true URL on pg's blog is now a dead link. Previous discussion: https://news.ycombinator.com/item?id=6993060.)


What I do for my blog is I write everything at once. Then I figure out where to put images, then I publish it!

It makes me go back and read it carefully since I have already published it, and then I polish, rewrite sections and add stuff that I missed.


I have done something similar in the past and was very happy with the results. At the time I was starting up a consulting business and got the first few gigs directly from engagement with my blog.

I also time boxed myself when writing. I wouldn't write unless I had a really clear topic in mind, then I'd give myself an hour to get it published. A few times I ran out of time and ended it with a "follow-up post coming soon to dive into ___" type message and that worked just fine.


Follow up posts are important! It takes the pressure off having to cram everything into a single publication. Nice!


The author provides a very good example in one of the video illustrations, with the left-hand side showing "loading bar" writing and the right-hand side simultaneously showing "outline speedrunning" writing.


> I also finally learned how signals work from top to bottom, and boy is it ugly. I’ve always felt that this was one of the weakest points in the design of Unix and this project did nothing to disabuse me of that notion.

Would love any resources that go into more detail, if any HN-er or the author himself knows of some!


If you haven't already, I would start with Advanced Programming in the Unix Environment by Stevens

https://www.amazon.com/Advanced-Programming-UNIX-Environment...

It is about using all Unix APIs from user space, including signals and processes.

(I am not sure what to recommend if you want to implement signals in the kernel, maybe https://pdos.csail.mit.edu/6.828/2012/xv6.html )

---

It's honestly a breath of fresh air to simply read a book that explains clearly how Unix works, with self-contained examples, and which is comprehensive and organized. (If you don't know C, that can be a barrier, but that's also a barrier to reading blog posts.)

I don't believe the equivalent information is anywhere on the web. (I have a lot of Unix trivia on my blog, which people still read, but it's not the same)

IMO there are some things for which it's really inefficient to use blog posts or Google or LLMs, and if you want to understand Unix signals that's probably one of them.

(This book isn't "cheap" even used, but IMO it survives with a high price precisely because the information is valuable. You get what you pay for, etc. And for a working programmer it is cheap, relatively speaking.)


Not positive, but pretty sure that this, and the Unix Network book were golden for us in the 90s when we were writing MUDs. Explained so much about Socket communications (bind/listen/accept,...) Been a long time since I looked at that stuff, but those were fun times.


I believe that's the book I still have on my shelf. IIRC "UNIX Network Programming" and I learned a lot about networking and a lot about how UNIX works reading it cover to cover. I think I learned more from that book than any other.

Mr Stevens replied to something I wrote back in the day. I can't recall if it was a Usenet post or email, but I was over the moon!


I believe this was the 3rd time I’ve seen this book being recommended this week. It must mean something.


It is a must for anyone serious about UNIX programming.

Additionally one should get the TCP/IP and UNIX streams books from the same collection.


Is the Unix streams book “Unix Systems V network programming”?


That one is also relevant, yeah.

Although I made a mistake: I was thinking of all the Richard Stevens books on networking, which go beyond plain TCP, UDP, IP.

https://en.wikipedia.org/wiki/W._Richard_Stevens

Unfortunately, given their CS focus, they are kind of on the expensive side; I read most of them via libraries, eventually getting my own copies.


It's been the standard reference for decades for a reason. I learned from it, too. There's really nothing else quite like it available.


It might mean the Baader–Meinhof effect.


It's well written and full of practical advice and fun to read.


Signals are at the intersection of asynchronous IO/syscalls and interprocess communication. Async and IPC are also weak points in the original Unix design, not originally present. Signals are an awkward attempt to patch some async IPC into the design. They're prone to race conditions. What happens when you get a signal while handling a signal? And what to do with a signal when the process is in the middle of a system call is also a bit unclear. Delay? Queue? Pull the process out of the syscall?

If all syscalls are async (a design principle of many modern OSes) then that aspect is solved. And if there is a reliable channel-like system for IPC (also a design principle of many modern OSes) then you can implement not only signals but also more sophisticated async inter-process communication/procedure calls.


As I wrote in some older discussion about UNIX signals on HN, the root problem (IMHO, of course) is that signals conflate three different useful concepts. The first is asynchronous external events (SIGHUP, SIGINT) that the process should be notified about in a timely manner and given an opportunity to react; the second is synchronous internal events (SIGILL, SIGSEGV) caused by the process itself, so it's basically low-level exceptions; and the third is process/scheduling management (SIGKILL, SIGSTOP, SIGCONT) to which the process has no chance to react, so it's basically a way to save up on syscalls/ioctls on pidfds. An interesting special case is SIGALRM, which is an asynchronous internal event.

See the original comment [0] for slightly more spelled-out ideas on better designs for those three-and-a-half concepts.

[0] https://news.ycombinator.com/item?id=39595904


At least the first two are also conflated in a typical CPU’s trap/interrupt/whatever-your-architecture-calls-it model, which is what Unix signals are essentially a copy of. So this isn’t necessarily illogical.


SIGHUP and SIGINT have no CPU-level equivalent.


Sure. What I meant is, a CPU’s trap/interrupt mechanism is very often used to signal both problems that arise synchronously due to execution of the application code (such as an illegal instruction or a bus error) and hardware events that happen asynchronously (such as a timer firing, a receiver passing a high-water mark in a buffer, or a UART detecting a break condition). This is not that far away from SIGSEGV vs SIGHUP.

Some things (“imprecise traps”) sometimes blur the difference between the two categories, but they usually admit little in the way of useful handling. (“Some of the code that’s been executing somewhere around this point caused a bus error, now figure out what to do about it.”)


IPC was actually introduced in "Columbus UNIX."

https://en.wikipedia.org/wiki/CB_UNIX


A story about the problem with delivering interrupts to a process in kernel mode in unix:

https://www.dreamsongs.com/RiseOfWorseIsBetter.html


Unix signals do... a lot of things that are separate concepts imo, and I think this is why there are people who don't like it or take issue with it.

You have SIGSTOP/SIGCONT/SIGKILL, which don't even really signal the process, they just do process control (suspend, resume, kill).

You have simple async messages (SIGHUP, SIGUSR1, SIGUSR2, SIGTTIN, SIGTTOU, etc) that get abused for reloading configuration/etc (with hacky workarounds like nohup for daemonization) or other stuff (gunicorn for example uses the latter two for scaling up and down dynamically). There are also, in this category, bizarrely specific things like SIGWINCH.

You also have SIGILL, SIGSEGV, SIGFPE, etc for illegal instructions, segmentation violations, FP exceptions, etc.

And also things that might not even be good to have as async things in the first place (SIGSYS).

---

As an aside, it's not the only approach and there's definitely tradeoffs with the other approaches.

Windows has events, SEH (access violations, other exceptions), handler routines (CTRL+C/CTRL+BREAK/shutdown,etc), and IOCPs (async I/O), callbacks, and probably some other things I'm forgetting at the moment.

Plan 9 has notes, which are strings... that lets you send arbitrary data to another process, which is neat, but using the same mechanism for process control IMO has the same drawbacks as *nix, except now they're strings instead of a single well-defined number.


The Windows mechanisms you're mentioning were also added over the course of many, many years. Much of Windows also happened a long time after UNIX signals were invented.

If you're including all that other stuff, it's probably fair to include all of the subsequent development of notification mechanisms on the UNIX side of the fence as well; e.g., poll(2), various SVR4 IPC primitives, event ports in illumos, kqueue in FreeBSD, epoll and eventually io_uring in Linux.


Except much of this later UNIX development was done by the derivatives, and is often available with a certain degree of incompatibility among them (or not at all).


Yeah, it definitely is (especially since SIGIO is a thing :)). Even the Unix signals had more added to them over time (SIGWINCH and friends iirc came from the BSDs).

A lot of the mechanisms are very OS specific but I do think they're good comparisons to have with signals as well.


"signalfd is useless" is a good article: https://ldpreload.com/blog/signalfd-is-useless

It goes into the problems with Unix signals, and then explains why Linux's attempt to solve them, signalfd, doesn't work well.


That is a good article. I found myself nodding in agreement while reading it, thinking "Yeah, I've been bitten by that before".

How does Windows handle this? There's still signals, but I believe/was under the impression that signals in Windows are an add-on to make the POSIX subsystem work, so maybe it isn't as broken (for example, I think it doesn't coalesce signals).


Windows has a slightly better concept: Structured Exceptions (https://learn.microsoft.com/en-us/windows/win32/debug/struct...). It is a universal concept to handle all sorts of unexpected situations like divide by zero, illegal instructions, bad memory accesses... For console actions like Ctrl+C it has a separate API which automatically creates a thread for the process to call the handler: https://learn.microsoft.com/en-us/windows/console/handlerrou... . And of course Windows GUI apps receive the Window close events as Win32 messages.

Normal Windows apps don't have a full POSIX subsystem running under them. The libc signal() call is a wrapper around structured exceptions. It is limited to only a couple of well-known signals. MSVCRT does a bunch of stuff to provide an emulation for Unix-style C programs: https://learn.microsoft.com/en-us/cpp/c-runtime-library/refe...

In contrast to Unix signals, structured exceptions can give you quite a bit more information about what exactly happened like the process state, register context etc. You can set the handler to be called before or after the OS stack unwinding happens.


I am such a moron. Every one of those three links above is colored as 'visited' for me.

I have obviously read this up before and just didn't remember :-(


There were differences between BSD and SYSV signal handling that were problematic in writing portable applications.

https://pubs.opengroup.org/onlinepubs/009604499/functions/bs...

It's important to remember that code in a signal handler must be re-enterant. "Nonreentrant functions are generally unsafe to call from a signal handler."

https://man7.org/linux/man-pages/man7/signal-safety.7.html


Reentrancy is not sufficient here, at least not the kind provided by mutex-style exclusion. The interrupted thread may have actually been the one holding the lock, so if the signal handler enters a queue to wait for it, it may be waiting quite a while.


That's why the word reentrant is used, not thread safe.


I always felt VMS' mailbox system was much more elegant, but I imagine it's an ugly mess under the surface too.

https://wiki.vmssoftware.com/Mailbox



I wanted to say the exact same thing! I would love to get more details about that.


Would love to read a blog post about that.


patio11's thread on that topic was quite interesting. His move to Japan was apparently 90% motivated by the wrong prediction of all the jobs moving abroad:

https://twitter.com/patio11/status/1779978518656332212


Great links, especially the last one, referencing the Goto paper:

https://www.cs.utexas.edu/users/pingali/CS378/2008sp/papers/...

>> I believe the trick with CPU math kernels is exploiting instruction level parallelism with fewer memory references

It's a collection of tricks to minimize all sorts of cache misses (L1, L2, TLB, page misses, etc.), improve register reuse, leverage SIMD instructions, transpose one of the matrices if it provides better spatial locality, and so on.


The trick is indeed to imagine how the CPU works with the Lx caches and keep as much info in them as possible. So it's not only about exploiting fancy instructions, but also about thinking in engineering terms. Most software written in higher-level languages cannot effectively use L1/L2, which results in this constant slowdown of algorithms that look similar from an asymptotic-analysis perspective.


No question, always nice to read your yearly updates.

Last year looked really promising: your revenue was already high but profits weren't there yet. I'm happy to see you turned a nice profit this year! Congratulations!

