can3p's comments

One of the things that surprised me in the article was their use of J2K. It ships as part of IntelliJ, alright, but why did they have to run it headless? They’ve even mentioned that it was open sourced. And later they said that they were not able to make many improvements because it was in maintenance mode at JetBrains.

I mean, with the resources Meta has, I’m sure they could have rewritten the tool, made a fork or found some other way to incorporate their changes (they talk about overrides), or transformed the tool into something that better fits their approach. Maybe it has been done, it's just not clear from the article.


Local state is indeed a problem that's exacerbated by swapping logic. Simple example: you have a form with a collapsible block inside, or maybe a dynamic set of inputs (imagine you're adding items to a catalog and want to allow submitting more than one). If you save the form and simply swap the HTML, the block state will be reset. Of course you could stash the toggle state somewhere to pass it along to the backend, but that's already a pain compared to the SPA approach, where you don't even bother with that since no state is reset.

You could use the query string as mentioned in the article, but that's not really convenient when done in a custom way for every single case.
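
To make it concrete, here's roughly what round-tripping that toggle state looks like on the backend. This is a minimal Go + htmx sketch with made-up handler and field names, not a recommendation:

    // Hypothetical handler: the form round-trips a hidden "block_open" field
    // so the server can re-render the collapsible block in the same state
    // after htmx swaps the response in.
    package main

    import (
        "html/template"
        "net/http"
    )

    var formTmpl = template.Must(template.New("form").Parse(`
    <form hx-post="/items" hx-swap="outerHTML">
      <input type="hidden" name="block_open" value="{{if .BlockOpen}}1{{end}}">
      {{if .BlockOpen}}<details open>{{else}}<details>{{end}}
        <summary>More options</summary>
        <!-- collapsible inputs live here -->
      </details>
      <button type="submit">Save</button>
    </form>`))

    func saveItem(w http.ResponseWriter, r *http.Request) {
        open := r.FormValue("block_open") == "1" // state we were forced to pass along
        // ... persist the submitted item here ...
        formTmpl.Execute(w, struct{ BlockOpen bool }{BlockOpen: open})
    }

    func main() {
        http.HandleFunc("/items", saveItem)
        http.ListenAndServe(":8080", nil)
    }

And you still need a bit of client-side glue to keep that hidden field in sync when the user toggles the block, which is exactly the annoyance I mean.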

Having said that, I think a way to go could be a community effort to settle on ways to handle different UI patterns and interactions, with some bigger project to serve as a testing ground. And that includes the backend part too; we can look at what Rails does with Turbo.

My opinion is that the htmx (and similar) approach is different enough to potentially require a different set of UI interactions, and it usually hurts when one tries to apply React-friendly interactions to it.


Nice! Personally I think the more niche social networks we have, the better. The big problem with the mainstream networks is that they've evolved from a medium for communicating and keeping in touch with real people into a platform for influencers and businesses.

The common complaint I hear about Instagram, for example, is that every second connection of yours tries to sell/teach something, and that's just garbage if all you need is to keep in touch with your friends.

The main problems to tackle imo are:

- Information propagation speed. This is good in case you want a quick update, but it's also a double-edged sword, since it allows information attacks, trolls, etc.

- Scale. Anything at big scale becomes a problem in itself, since it becomes economically viable to target the platform with bots, scams, etc.

- Incentives. I think we should get to the point where social networks are run by non-profits.

I've posted the link a couple of times already; I'm working on my personal take on this problem[0]. My approach is the following:

- Slow down information propagation. Every post is visible to your direct connections, to their connections if you allow it, but no further (see the rough sketch after this list)

- No way to get a connection request from a stranger. Either you specifically allow it, or the connection is introduced by one of your direct connections

- No federation, since my idea was to have small communities

- Fully open in the sense of data formats, import/export, etc. Migrating between instances is as easy as exporting posts in bulk, creating an account on another instance and doing the import. You could do bulk updates the same way
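
To be concrete about the first point, the visibility rule boils down to something like this. It's a simplified sketch with made-up types, not the actual code from pcom:

    // Sketch of the "no further than friends-of-friends" visibility rule.
    package main

    import "fmt"

    type Visibility int

    const (
        DirectOnly   Visibility = iota // only my direct connections see the post
        SecondDegree                   // also connections of my connections
    )

    type Post struct {
        AuthorID   string
        Visibility Visibility
    }

    // connections maps a user to the set of users they are directly connected to.
    func canSee(post Post, viewerID string, connections map[string]map[string]bool) bool {
        if viewerID == post.AuthorID {
            return true
        }
        direct := connections[post.AuthorID]
        if direct[viewerID] {
            return true
        }
        if post.Visibility == SecondDegree {
            for friend := range direct {
                if connections[friend][viewerID] {
                    return true // viewer is a connection of one of my connections
                }
            }
        }
        return false // never visible beyond the second degree
    }

    func main() {
        conns := map[string]map[string]bool{
            "alice": {"bob": true},
            "bob":   {"alice": true, "carol": true},
        }
        post := Post{AuthorID: "alice", Visibility: SecondDegree}
        fmt.Println(canSee(post, "bob", conns))   // true: direct connection
        fmt.Println(canSee(post, "carol", conns)) // true: connection of bob
        fmt.Println(canSee(post, "dave", conns))  // false: too far away
    }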

Also, it's all Go + htmx, just in case anyone else is also tired of the modern frontend mess. I have a couple of videos on the features[1], if you like. The design is not great, since I wanted to focus on the idea itself.

[0]: https://github.com/can3p/pcom

[1]: https://www.youtube.com/playlist?list=PLa5K-kCUS-FozB6Cw7rJL...


I've got to chime in here, because of how much this overlaps with the project I've been working on called Haven[1].

A lot of these problems go away with a decentralized/open-source private model. If your posts aren't public, then there is no spam. If everyone runs their own node of open-source (or better yet: open-protocol, i.e. RSS) software, then there is no centralized entity with an incentive to profit off the platform.

Information propagation speed is a good call-out as dangerous. Even with all the spam/shilling/trolls removed, it still leads to the girl who's having a great time on her snowboarding trip until she posts pictures on Instagram and drops into a foul mood because not enough people immediately liked her posts.

I'd love to connect and share thoughts; feel free to reach out[2].

[1]: https://github.com/havenweb/haven

[2]: https://havenweb.org/contact.html


Good post. Have you already taken a look at Nostr?

It permits both private/niche communities and public (global) texts.


Just checked it, thanks for pointing it out. I think it's more of a decentralized encrypted messaging platform, whereas my idea was to have a way to constrain the visibility of conversations to naturally connected groups of people while giving a way to slowly expand the connections, rather than fighting censorship.

More or less like in real life, where you chat a lot with your friends, but not necessarily with some of their friends you don't know that well. In that case you would ask your friends for an introduction, and that's what I've tried to model.

One other feature I've been thinking about is to make moderation automatic, in the sense of making signups possible only via invitation and putting some weight on it. Basically, if you invite somebody who misbehaves on the platform and they get flagged, you get penalized as well, unless you flag them first. My theory is that this should make users care about their digital surroundings.
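
Purely for illustration, the rule could look something like the sketch below. Nothing like this is implemented yet, and all the names and numbers are made up:

    // Hypothetical invite-tree penalty: when a user is flagged, their inviter
    // shares the penalty unless the inviter was the one who flagged them first.
    package main

    import "fmt"

    type User struct {
        ID        string
        InvitedBy string // empty for founding members
        Score     int
    }

    func applyFlag(users map[string]*User, flaggedID string, flaggedByInviter bool) {
        flagged := users[flaggedID]
        flagged.Score -= 10 // the misbehaving user always takes a hit

        inviter, ok := users[flagged.InvitedBy]
        if !ok || flaggedByInviter {
            return // no inviter, or the inviter reported it first
        }
        inviter.Score -= 5 // otherwise part of the penalty propagates up
    }

    func main() {
        users := map[string]*User{
            "ann": {ID: "ann"},
            "bob": {ID: "bob", InvitedBy: "ann"},
        }
        applyFlag(users, "bob", false)
        fmt.Println(users["ann"].Score, users["bob"].Score) // -5 -10
    }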


By default all texts are open. There is encrypted messaging, albeit only used for private messages inside a group or to another person.

What you mention could be achieved with a Nostr relay. Just admit whoever you want inside, but anyone can keep participating on the internet at large with exactly the same account.

But if you want to moderate everything inside, then Mastodon or a traditional web forum might be a better fit.


I’ve been scratching my own itch lately trying to build a communication medium that I like.

IMO the problem with current social networks is their scale and public-only approach. Any network that goes this way ends up with lots of bad actors, and the public-only approach means that it’s easy to harass people and that bots are economically viable.

I’ve addressed both points [0]. Visibility of the posts is limited to direct connections, you need a proxy connection to make a new one and at the same time it’s mega easy to import/export, markdown support and apis are there etc. That was my way to get miningful discussions back.

In general, you need to look to small-scale places.

[0]: https://github.com/can3p/pcom


Small nitpick - the author mentions upsert at the beginning of the article only to "forget" about it and then use it at the end (INSERT ... ON CONFLICT DO NOTHING), and that's the obvious solution there.

The final query is very neat though! And special thanks for mentioning "MERGE ... RETURNING" in PG17; that's really cool.
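
For anyone skimming, the two patterns look roughly like this, driven from Go via database/sql. A hedged sketch: the table, columns and connection string are made up for illustration, not the article's schema or queries:

    // Rough illustration of the plain upsert vs. PG17's MERGE ... RETURNING.
    package main

    import (
        "database/sql"
        "fmt"

        _ "github.com/lib/pq" // any Postgres driver will do
    )

    func main() {
        db, err := sql.Open("postgres", "postgres://localhost/demo?sslmode=disable")
        if err != nil {
            panic(err)
        }
        defer db.Close()

        // Plain upsert: insert if missing, do nothing otherwise. Note that
        // RETURNING here would only yield rows that were actually inserted,
        // which is usually why such articles end up with a fancier query.
        if _, err := db.Exec(`
            INSERT INTO tags (name) VALUES ($1)
            ON CONFLICT (name) DO NOTHING`, "postgres"); err != nil {
            panic(err)
        }

        // PG17: MERGE can report what it did per row via RETURNING.
        rows, err := db.Query(`
            MERGE INTO tags t
            USING (VALUES ($1::text)) AS v(name) ON t.name = v.name
            WHEN NOT MATCHED THEN INSERT (name) VALUES (v.name)
            RETURNING merge_action(), t.id, t.name`, "postgres")
        if err != nil {
            panic(err)
        }
        defer rows.Close()
        for rows.Next() {
            var action, name string
            var id int64
            if err := rows.Scan(&action, &id, &name); err != nil {
                panic(err)
            }
            fmt.Println(action, id, name)
        }
    }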


I think the whole article can be generalized as follows: one cannot make assumptions about relative values.

You can replace UTC with US dollars and Amsterdam time with euros and talk about conference prices instead. If you convert 100 euros to US dollars at the time of record creation, the price on the day of the event may not be 100 euros anymore.

Ideally you record a tuple that allows you to define the value unambiguously (e.g. time 2022-02-21 9:00, timezone Europe/Amsterdam), because then you can always resolve the absolute time value later.

This is a typical hiccup for any business expanding into multiple locations with different parameters (time zones, currencies, units etc).

One case that I find particularly interesting is an event that happens in different locations at the same logical (but different absolute) time. Let's say you want to send an email to all your users across the globe on Monday at 9:00. If you want to add to the complexity, you could think about sending an email to all your users across the globe on the first working day of next week.
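
Just to illustrate the "record the tuple, resolve later" idea in Go (the struct and field names are made up):

    // Store local wall-clock time plus an IANA timezone name, and only resolve
    // to an absolute instant when you actually need one.
    package main

    import (
        "fmt"
        "time"
    )

    // Hypothetical record: what the user meant, not a UTC instant.
    type ScheduledSend struct {
        LocalTime string // "2022-02-21 09:00"
        TimeZone  string // "Europe/Amsterdam"
    }

    func (s ScheduledSend) Resolve() (time.Time, error) {
        loc, err := time.LoadLocation(s.TimeZone)
        if err != nil {
            return time.Time{}, err
        }
        return time.ParseInLocation("2006-01-02 15:04", s.LocalTime, loc)
    }

    func main() {
        // "Monday 9:00" means a different absolute instant per user.
        users := []ScheduledSend{
            {LocalTime: "2022-02-21 09:00", TimeZone: "Europe/Amsterdam"},
            {LocalTime: "2022-02-21 09:00", TimeZone: "America/New_York"},
            {LocalTime: "2022-02-21 09:00", TimeZone: "Asia/Tokyo"},
        }
        for _, u := range users {
            t, err := u.Resolve()
            if err != nil {
                panic(err)
            }
            fmt.Println(u.TimeZone, "->", t.UTC()) // same local time, different UTC
        }
    }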


I think the module import APIs are a Python 2/3 moment for the Node.js ecosystem. There is no clearly superior way, and as a consequence not too many people care, but it hurts for real.

The proposal to disable Node.js-style imports will just split the ecosystem and make a large part of the industry stick to an ancient version or make a fork. Is that really worth the gain? Just check how long it took some bigger projects to migrate from Python 2 to Python 3.


The difference is that the language/standard in question didn't originate with NodeJS, and NodeJS is not now, nor has it ever been, led by the people behind the language/standard (unlike Python)...

When you hear "NodeJS", you really need to be thinking of it in the same category as IE (w.r.t. browser behavior) or Visual C/C++. It's but one, often (knowingly/deliberately) quirky, non-standard implementation by a group that doesn't necessarily have your best interests at heart, or the interests of those outside their own platform umbrella.


The part that blows my mind all the time is that there are all sorts of cups - https://en.wikipedia.org/wiki/Cup_(unit)

The mile is another unit of the same sort - https://en.wikipedia.org/wiki/Mile - with the exception that only two kinds have survived, if I'm not mistaken: the US mile and the nautical mile.

I guess it can be very simple in the US, especially if mug/cup volumes are defined in terms of multiples of a "cup", but in Europe I get totally lost when I see the cup unit in a recipe, since I don't know what volume it refers to.
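
For my own sanity I keep a little cheat sheet of the usual definitions. The millilitre values below are approximate, so double-check before baking anything precise:

    // Approximate volumes of the various "cups" one runs into, in millilitres.
    package main

    import "fmt"

    var cupsML = map[string]float64{
        "US customary cup": 236.6, // what most US recipes mean in practice
        "US legal cup":     240,   // used on US nutrition labels
        "metric cup":       250,   // common in AU/NZ recipes
        "imperial cup":     284,   // old UK cookbooks
        "Japanese gō":      180,   // the rice-cooker cup
        "Dutch kopje":      150,   // the "little cup"
    }

    func main() {
        const recipeCups = 1.5 // e.g. a recipe asking for 1.5 cups
        for name, ml := range cupsML {
            fmt.Printf("%.0f ml if it's a %s\n", recipeCups*ml, name)
        }
    }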


I always do cup = 245ml. If you then undershoot or overshoot by 5ml, you’re always okay.

That Wikipedia page is also missing the Dutch “kopje” (little cup) measurement, which is 150ml.


The US “statute mile” is also used in the UK (but not Canada, surprisingly to me, as I’d assumed the reverse before I’d visited).

As for cooking, US volume measurements are strictly defined, despite the history of where some of the unit names came from. So it should always be possible to convert to metric volume if you know it’s a US recipe.


The USA had two different miles until a year ago: the international standard mile (based on Carl Edvard Johansson’s inch), and the US Survey mile. https://en.wikipedia.org/wiki/Mile#US_survey


Yes! It includes the Japanese cup (go), familiar to any American who owns a rice cooker.


The volume of 240mL of water. Fill a standard 500mL water bottle to the bottom of the cone: half of that.

     =  500mL
    /_\ 2 cups
    |_| 1 cup 
    |_| zilch
* Not to scale.


The Swedish Mile (= 10km) is still in use and once tripped me up when searching for a used car.

They were all wonderfully cheap until I realized that 50,000 on the website actually meant 500,000km driven.


Nice post, thanks! Do I read it right that using the JIT results in the worst max times? What could be the reason, in your opinion?


Two parts: I did the benchmark on a laptop and didn't spend enough time forcing its runtime power management into a fixed state; I'll run a real pgbench on my desktop once I implement all the required opcodes for it. And since JIT compilation requires a minimum amount of time (about 300us in my tests), on such small runtimes this can quickly overcome the benefits.


I think Linux has become much better in recent years, not least due to things moving to the web. Of course there is still a ton of specialized Windows-only software, but it has become much less relevant for ordinary people.

Just remember all the people struggling with OpenOffice (no longer relevant because of Google Docs and the like) and with video/audio codecs (Spotify, Netflix, etc.). In general, the number of desktop apps needed to be productive has shrunk.

The desktop is just good enough, even though the app store apps are quite terrible on both Ubuntu and Fedora in my experience.

On the other hand, with PipeWire we can finally have working Bluetooth headphones, and after years of an endless shitshow, simple things like screen sharing are working again.


> Just remember all the people struggling with OpenOffice (no longer relevant because of Google Docs and the like) and with video/audio codecs (Spotify, Netflix, etc.).

You can call me an old curmudgeon if you like, but I just don't trust the Cloud for storage. (Ever since a badly configured Dropbox instance removed around a gigabyte of files.) So I insist on keeping all my Documents at home on my main system, along with all of my 'write once, keep forever' files.

In my early days, I tried to maintain the ability to use MS Word documents, as they were all that was circulated in the 90s and early noughties. That included the use of WABI and other kludges. But with the later advent of ubiquitous PDF files, that pressure for MS Office compatibility has disappeared. Consequently, I find I need only LibreOffice Calc to do my spreadsheets, and LibreOffice Writer to do my text word-processing. They can do everything I need.

When you say "App Store apps", are you talking about the packages managed by package managers like Synaptic, which is used on Ubuntu/Debian/Mint/etc.?


I recently bought a Windows laptop for the primary reason that I wanted a full version of Excel, not LibreOffice, and not 365.

But yea, aside from the fact that it's a lot newer hardware than my Slackware 15.0 desktop, there's really not much in the way of anything it can do that makes me go "wow, this is what I was missing out on."

Some games that don't play nice with Lutris are still around, but honestly, I almost have more fun fighting to get games to run than I do actually playing them anyway.

