Hacker News | natrys's comments

I think tree-sitter's relationship with JavaScript is entirely syntactic. You don't need any JS runtime installed to write grammars: the tree-sitter CLI bundles its own JS engine, which it uses to convert your grammar first to an intermediate JSON format and then to generated parser code in C. That C code gets compiled into a shared library, which is what editors like Emacs use, so you definitely don't need a JS runtime to use tree-sitter modules either.
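To illustrate, a rough sketch of that pipeline (the grammar name `mylang` and the compiler flags are hypothetical; exact flags depend on your platform):

```shell
# grammar.js is evaluated by the JS engine bundled inside the tree-sitter
# CLI itself, so no system Node.js is needed.
tree-sitter generate    # grammar.js -> src/grammar.json -> src/parser.c

# The generated C is then compiled into a shared library, which is the
# only artifact an editor such as Emacs actually loads.
cc -shared -fPIC src/parser.c -o libtree-sitter-mylang.so
```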

Very impressive demo. From VM creation to vibe coding something running on port 8000, Shelley just worked in minutes. I imagine quite a few technically impressive things are happening under the hood; I'd be interested in reading more about those.

Small nit: I think you should make it clearer in the docs (if not on the landing page) that one can just use any key with the ssh command the very first time, and it automatically gets registered. The web UI should also let you add SSH keys; I logged into the web UI first and was a bit confused.

I think the pricing is alright for the resources and remote-development features, though it might be a bit much for someone who doesn't need that level of resources and is deploying something that's mostly already developed.

Anyway, this reminds me of a product called Okteto that had a similar UX. They were focused on leveraging k8s for declarative deployment, but for some reason they suspended their managed cloud/SaaS offering for individual/non-enterprise clients; I wonder if it was because they couldn't make the pricing work. Hope that doesn't happen here.


That's Kimi K2 Thinking; this post seems to be talking about the original Kimi K2 Instruct, though, and I don't think an INT4 QAT (quantization-aware training) version was released for that one.


I am going to try to stick with Prolog as much as I can this year. Plenty of problems involve a lot of parsing and searching; both can be expressed declaratively in Prolog, and it just works (though you do have to keep the execution model in mind).


Well they do that too: https://huggingface.co/deepseek-ai/DeepSeek-Prover-V2-671B

But I suppose the bigger goal remains improving their language model, and this was an experiment born from that. These works are symbiotic; the original DeepSeekMath resulted in GRPO, which eventually formed the backbone of their R1 model: https://arxiv.org/abs/2402.03300


It can hardly be called resistance to improvement when everyone does improve it, just in their own way. The default isn't some fashion statement, some aesthetic that's objectively good (though I am sure some people do subjectively like it); it's meant to be the least presumptuous blank slate that everyone can radically overhaul. So arguably it's an encouragement of improvement, just like everything else in Emacs, which focuses on making the tools for improvement easier to use.

It's just that "improvement" as a matter of public consensus, something everyone can agree on to elect the next blank slate, has been impossible to settle on. But the counterculture here, broadly, might be an extreme reluctance to inconvenience even a minority of existing users in pursuit of market share/growth.


Wasn't aware of user-var-changed, cool write-up!

I had used urxvt forever before, and the simple solution that works (even over ssh, for example) is to ring the terminal bell; urxvt then sets the window urgency hint. I just do that unconditionally in my shell prompt, because if it's triggered in a focused window, nothing happens, but if it comes from a different workspace, I get a nice visual cue in my top bar for free.
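For reference, a minimal sketch of that setup, assuming bash and the `urgentOnBell: true` resource set for urxvt in ~/.Xresources:

```shell
# Emit the BEL control character (0x07) before every prompt.  With
# URxvt.urgentOnBell: true, urxvt turns a bell in an unfocused window
# into an X11 urgency hint; in a focused window it is a no-op.
PROMPT_COMMAND='printf "\a"'

# The bell is just a single byte on stdout:
printf '\a' | od -An -tx1
```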

But features like setting urgency aren't available in wezterm (understandably, as it's not a cross-platform thing). I could have patched it into the source, but the Emacser in me chose to do something more unholy. By default, wezterm starts Lua in safe mode, which means loading shared C modules is forbidden. I disabled that, and now use a bunch of the missing functionality written in Rust and Zig, interfaced through cffi. I don't recall ever having a crash, so I am surprised by some of the other comments.


Yes it seems the binaries are here: https://ferron.sh/download

I will say this, though: it's probably not rational to be okay with blindly running some opaque binary from a website but then flip out at running an install script from the same people and the same domain, behind the same software. From a security PoV I don't see how there should be any difference. It's true, though, that install scripts can be opinionated and litter your system by putting files in unwanted places, so there are strong arguments against them outside of security.


Qwen's Max series has always been closed-weight; it's not a policy change like you're alluding to.

What exactly is Huawei's flagship series, anyway? Their PanGu line is open-weight, but Huawei isn't really in the LLM-making business yet; their models are mainly meant to signal that training and inference are possible on their hardware, that's all. No one actually uses those models.



