devmunchies's comments | Hacker News

Related to the "blissful" feeling, an underemphasized criterion when choosing current tasks for engineers from a backlog is which feature they are most excited to work on right now.

The motivation and tinkering can be similar to a side project, and the result is higher-quality work IMO. Obviously there are urgent tasks, but it's an ignored vector in the "weighting system" for choosing work for engineers.

If you wait to assign the task in the next sprint, the excitement for that particular task might be gone.


The federal govt can't be the majority of technological innovation. If we're lucky, this vacuum will be filled by an even larger private-sector innovation hub like Xerox PARC and Bell Labs.


Various institutions that, among other things, receive federal grant money account for an enormous amount of technological innovation. The federal government deciding it doesn't want to pay out money that it has already contracted to pay is not going to help these institutions succeed at their innovation mission.

I, personally, believe that much of the current financial structure of the universities is broken, and the structure of the "indirect costs" causes strongly misaligned incentives, but arbitrarily and massively lowering the rates on zero notice is not the solution. I'll note that no one involved in DOGE seems to have an actual proposal to improve the situation; they just seem to want to shut everything off.


Oh my god dude, who do you think is the major subsidizer of private industry research?


I didn’t say I was in favor of axing all govt positions and grants, dude.


You didn't say that, but you're supporting the guys who did.


This is the first thread I've commented on since Trump was inaugurated, and it's sad to see HN has turned into Reddit.

Emotionally charged, bad-faith assumptions at the smallest whiff of independent thought.


Indeed you did.


US innovation is looking at how it can extract as much profit as it can in the next quarter.


They say this on the internet, itself a direct result of DARPA funding.


Just like in healthcare, right? Where we are lucky the private industry has created... oh wait, the most expensive healthcare system in the world, which doesn't cover a third of the population and doesn't even crack the top ten in outcomes.


The regulators ain’t regulating. That’s their main job, not innovation.


Private sector research basically does not exist anymore, especially basic science research. A lot of truly revolutionary stuff starts in academia and spins into tech and biotech startups.


Not sure how you have such optimism. It will take a lot more than luck to rebuild what's being destroyed and we don't have what it takes.

[edit: Found a less condescending way to make my point.]


> If we're lucky, this vacuum will be filled by an even larger private-sector innovation hub like Xerox PARC and Bell Labs.

Why would they do that when stock buybacks are more profitable for shareholders?


Why do you think the vacuum won't be filled by other countries willing to fund their scientists?

This is, after all, the same reasoning used to give tax breaks to get a factory to set up in town.


It is not a zero-sum game. Less spending in the US does not magically become more spending elsewhere. It's actually likely to become less spending in several countries, as military spending increases because of geopolitical instability.


Every single American cellular operator was part of this program.


It will be largely filled by Chinese universities and Chinese companies working together. Non-Chinese researchers will probably have to go to Europe.


It's going to be Huawei.


You do realize PARC and Bell Labs are long gone, right?


But the memories will live on in our hearts.


> I imagine the entire point of RTO mandates is to keep cities sustainable.

The most important factor IMO is mentorship of junior talent. (I'm speaking of technical orgs.)

If you view the organization as a living organism where each employee is a "cell", there are material benefits in the "cellular replication" of talent and the rejuvenation of the next generation.

It can definitely be true that RTO is worse for an individual engineer but better for the long-term health of the organization. Both can be true.

In my experience, remote-only companies tend to prefer a higher ratio of senior employees for this reason. It's plug-and-play.


Completely agree. I think a lot of senior engineers work well independently and feel more productive at home. They miss that this isn't always the case for more junior ones.


I think it's less about capability and more about the cost of goods. You can't compete with items that are half the cost to produce in East Asia.

Manufacturing was always going to move to where it costs less.

It's bad for a productive American economy, but it's a prisoner's dilemma, so you can't blame businesses or consumers.

The government is the only party that had the power to do anything. It's 100% an economics problem.


I didn't use it 10 years ago, but I've been using it for the last 4 years on macOS and Linux exclusively.

Microsoft seems to be prioritizing "cloud" on all their developer products (rather than just Windows). I don't feel disadvantaged by NOT using dotnet on Windows.


> string_of_int, int_of_string

That didn't bother me so much because I speak Spanish and can read French, and OCaml is of French origin. `string_of_int` is a bad English translation; it should have been `string_from_int`.

I like F# where I can use the `int` or `string` functions:

    let myString = "2024"
    let myInt = int myString
    let myStringAgain = string myInt


Wait, all the music I bought on bandcamp can disappear if the artist removes it from their public inventory?


No, making an album private does not make the music disappear for people who have purchased it.

However, Bandcamp does abide by the laws that require them to stop distributing music at the rights holders' request.

No music service, or any service, for that matter, will guarantee access to files without regard to laws. Some will try harder, some have tried harder and been beaten down.


This has happened to me once, appropriately enough with some hauntology tracks. The songs are, weirdly enough, still available to play in the iOS app but not to download via the web. Presumably they're still somewhere in Bandcamp behind a boolean, but I never got around to downloading them (to be fair, I have them on a _cassette_ that originally included Bandcamp codes, so I mean, I really can't complain, I knew what I was getting into).


This happened to me a bunch. I think in one case an artist released an album, but wanted to disallow buying individual tracks so they relisted it after I bought one track.

The most infuriating thing is that Bandcamp gaslights you about it. I eventually confirmed that yes, I had bought that song, after tracking down the receipt in my email provider or something (it was hidden from purchase history, Bandcamp's "I can't find my music" link didn't mention this, etc., etc.).

This isn't just an issue for playing music, though: if you buy a bunch of stuff but don't download it right away, you could lose it too. I bought the track I mentioned right after it was released, and the substitution happened within a day or two of that.

Like, I get it, if there was some legal issue (pirated work, and then they should issue refunds). But the fact that Bandcamp tries to hide it just means that they know they have no moral grounds here.


I recently learned about buying MP3s on Amazon. Most CD purchase pages have a "purchase options" section where you can get MP3s. I do that for mainstream things for my kids that aren't on Bandcamp (such as music from a kids' TV show).

I'm actually working on an IoT device where one of the main goals was self-hosting audio content for my kids. It uses AI for the user interface. Similar to Alexa, but battery powered. Still in private beta (orders are closed right now), but here is the link for anyone curious: https://heycurio.com/


Ah. AG Talking Bear meets LLM. I started working on something similar a year back, but tried to keep it restricted to offline use, which made it more challenging since inference on the CPU of a Raspberry Pi limits you to very small models.

Sending voice clips of children to an always listening server is just a bit too dystopian for me.


What a beautiful website! Absolute joy to explore it.


It exploded. I get only server errors now.


What do I even pay Vercel for?! Gah.


Seems like a smart setup


One thing I dislike about Erlang-based languages (both Gleam and Elixir) is that they use "<>" for string concatenation.

In F#, "<>" is the equivalent of "!=". Postgres also uses <> for inequality, so my queries and F# code have that consistency.


Ha, ok so I gotta give one of these "that's a really strange thing to get hung up on" responses.

Erlang and Elixir don't overload the `+` operator. In fact, they don't overload ANY operators. If you can forgive the syntactic choice of the operator itself (which I think is pretty fair considering Erlang predates Postgres by a decade and F# by two decades), this allows them to be dynamic while maintaining a pretty high level of runtime type safety. For example, one of the "subtle bugs" people refer to when criticizing dynamic languages (even strongly typed dynamic languages) is that the following would work when both args are strings or both are numbers:

    function add(a, b) { return a + b; }
Erlang/Elixir eliminate this particular subtle bug (and it goes beyond strings and numbers) since:

    def add(a, b), do: a + b
will only work on numbers and raise if given strings.
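A quick illustration (a hypothetical iex session, but the behavior matches stock Elixir) of how the unoverloaded operators keep types apart:

    iex> 1 + 2            # + is arithmetic only
    3
    iex> "1" + "2"        # no coercion, no string "addition"
    ** (ArithmeticError) bad argument in arithmetic expression
    iex> "1" <> "2"       # string (binary) concatenation has its own operator
    "12"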


ML (which is the precursor to OCaml/F#), Pascal, BASIC, and SQL use <>. If you consider that <, <=, etc. are comparison operators, it makes sense for <> to be in that camp. I actually never thought of it that way.

Interesting table here highlighting old programming languages https://en.wikipedia.org/wiki/Relational_operator#Standard_r...


It doesn't predate SQL, and certainly not its use in mathematics. There are other options for concatenation, so this is an unfortunate error.

Shouldn't copy Erlang; otherwise, might as well use it.


> It doesn't predate SQL, and certainly not its use in mathematics.

What do you mean by "its use in mathematics"? To my knowledge, <> was invented by the Algol language creators for inequality. There was no previous use in mathematics. And in my opinion, that was an unfortunate error.


Interesting; I must have learned it so long ago... Pascal? that I conflated it with math class. Still, ~1958 is rather venerable.

The plot thickens: apparently ++ is what Erlang uses. So I still find it a poor choice.


++ is for concatenating lists; Erlang isn't the only functional language that uses it.

Really though, who cares? `=` is already misused in most programming languages.
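For reference, a small Elixir sketch of the distinction (example is mine, not from the parent comment):

    iex> [1, 2] ++ [3, 4]     # ++ concatenates lists
    [1, 2, 3, 4]
    iex> "ab" <> "cd"         # strings are binaries, so they use <> instead
    "abcd"

Erlang's "strings" are lists of character codes, which is why ++ works on them there.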


When looking at new languages, getting the basics right is the first thing I look at. Clumsy string concatenation is a blocker in my business, where it's something like 75% of the code.


Actually, in Elixir, when doing string building you want to use "improper" lists (iodata), which let you very efficiently build up a string without doing any copying.
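A minimal sketch of what that looks like (the example is mine; IO.iodata_to_binary/1 is the standard Elixir function for flattening at the end):

    # Appending by nesting, including "improper" tails, just builds structure;
    # nothing is copied until the final flatten.
    acc = [["Hello" | ", "] | "world!"]
    IO.iodata_to_binary(acc)
    #=> "Hello, world!"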


Oh ha, duh me, I did not consider it wasn't invented by Postgres.


Oh really? What's the operator for adding two floating point numbers then?

The solution to type confusion is not separate operators for every type, it's static types!


Ha, I was going to mention this, but there is none. `+` is for both ints and floats. OCaml, which is statically typed, has separate operators for ints and floats, though.

I don't want to get into it, but Erlang is dynamic by design. There have been several attempts over the years to bring static typing to it, which have failed. People are still trying, though!
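For illustration, the single numeric + in Erlang/Elixir (my example), in contrast to OCaml's separate + for ints and +. for floats:

    iex> 1 + 2        # integer + integer
    3
    iex> 1 + 2.5      # mixing int and float promotes to float
    3.5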


One thing I hate about F# and SQL is that they use <> as a "not equals" operator. In Haskell, <> is the binary operator of any Semigroup instance.


> One thing I dislike about Erlang-based languages (both Gleam and Elixir) is that they use "<>" for string concatenation.

Erlang doesn't use <> for concatenation, so it's odd to name it in this comment, as if that language and its developers have anything to do with your complaint. If it upsets you so much, lay it at the feet of the actual groups that chose <> for concatenation instead.


I just assumed it was an Erlang thing since Elixir and Gleam both do it. Now it seems even more odd that Erlang doesn't do it but they both chose it.


- In Haskell, <> is the binary operator of a Monoid.

- In Elixir, <> is the binary concatenation operator; it concatenates two binaries. This seems like it might be kind of a joke, actually, purposefully confusing "binary operator" with "an operator that takes two binaries" for humorous effect?

- In Gleam, <> is the string concatenation operator.

As far as I can see, they are taking inspiration from Haskell, where <> denotes the monoid binary operation; one concrete example is the monoid of lists, whose binary operator is list concatenation, of which String is one instance.

But really, <> for inequality is also a kind of dumb and nonstandard idea (from a mathematical-notation perspective), originating from Algol. !=, which C popularized, is clearer and corresponds to the mathematical symbol; of course =/= would be even closer, but that is one more character.

ML originally used <> for inequality, following the (CS) standard set by Algol, and it was Haskell which deviated from that tradition. So F# still follows the Algol tradition, while Haskell uses /= and C and others use !=, which are closer to mathematical and logical notation.


Well, binaries are <<>>, so that's consistent at least. And << >> are quotation marks in several languages, including French.
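A small Elixir example of that <<>> syntax (mine, for illustration); strings are just binaries, so the two notations describe the same value:

    iex> <<104, 105>>            # the bytes for "hi"
    "hi"
    iex> "hi" == <<104, 105>>
    true
    iex> <<>>                    # the empty binary is the empty string
    ""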


Guillemets are not the same and have their own symbols.


Yeah, OK. Go back to 1986 and tell the Erlang team to go use Unicode guillemets.


Gleam is from the past few years.


« and » are also the hyperoperators in Perl 6/Raku:

https://docs.perl6.org/language/operators#Hyper_operators



I don't like languages that use > a lot, simply because if I accidentally paste a code snippet into my Bash shell, it's likely to redirect output to some file.

Also, <> was != in BASIC, I believe.

PS: Don't paste this comment in your shell.


F# inherits <> from ML, which inherits it from Algol, which invented it. But that was actually a bad idea, since it deviates from mathematical practice. To follow math, it would be better to use != as in C and those inspired by it, or /= as in Haskell. Or maybe even =/= if you really want to go for the mathy-looking notation.

Elixir uses <> as an operator for concatenation of binaries (which do form a monoid, of course), not to be confused with how Haskell uses <> as the binary operator of a Monoid, though it was surely inspired by it. And Gleam probably picked it up from them, to use for a special case of a list monoid, String. And Haskell created <> for Monoid because it would be too confusing to use the multiplication sign for the binary operation like mathematicians do. It would not be OK in a programming context.


Then Gleam (and others) use "|>" when piping, where "|" would make more sense, except that's a bitwise OR, not to be confused with "||" which is... string concatenation (in Postgres).
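For anyone who hasn't seen it, a minimal Elixir example (mine) of what |> does: it feeds the value on the left in as the first argument of the call on the right.

    "  hello world  "
    |> String.trim()
    |> String.upcase()
    #=> "HELLO WORLD"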


My main criticism of this article and the article it references is that it refers to Marc Andreessen as a "not-so-bright billionaire" who "doesn't build shit". Do people forget that he built the first popular web browser (Mosaic) and then built Netscape? He is much more a part of "hacker" culture than the other names mentioned.


Maybe thirty years ago. Today, his money is the most relevant thing about him.


I'm very much on the side of the hackers in this argument, but honestly, why is a hacker turned businessman seen as a bad thing? Most people on this forum and in this community are trying to do the same thing.


It’s selling out. Trading in your idealism to work on spyware/ad tech or whatever.


Oh, please. "Selling out" is such an immature take.

Andreessen was one of the first people who saw the potential of the web in the early days, worked on the first graphical web browsers and built a highly influential business around it. Then he went on to create one of the most powerful VC firms in the industry, which in turn funded some of the most successful companies. His impact wouldn't have been anywhere near what it is today had he stayed working as a software developer. His ideals and goals were clearly much higher than that, and he got rich in the process. Sounds like a capitalist success story if I ever heard one.


>Then he went on to create one of the most powerful VC firms in the industry, which in turn funded some of the most successful companies.

Funding crypto nonsense, "apocalyptical" AI, and questionable politics. Meanwhile, he writes "manifestos" complaining that people don't recognize how great he and his peers are. I mean, thanks for Netscape and all, but Andreessen is as bad as the rotting VC culture can get.


So you only approve of VCs that fund companies you like? Or do you disapprove of the VC culture in general?

It's kind of ironic we're discussing this here, considering YC is following the same model, and has made similar investments. And yet both firms have also funded many companies outside of the sectors you dislike. Invoking ad hominems doesn't change the impact that both YC and a16z have had on the industry.


VC, YC, and pg get roasted here all the time. This site itself and its community get criticized. If HN hadn't displaced Slashdot, we'd be doing that over there.


I think they are just answering your question “why is a hacker turned businessman seen as a bad thing”. You don’t have to believe VC or YC are platonic forms of goodness to post here.


Well, a capitalist success story isn't necessarily incompatible with moving away from hacker beliefs like "information should be free"; in fact, you can probably make a lot of money by locking it up and selling access. And getting rich isn't intrinsically a hacker ideal either.

