Hacker News | Gazoche's comments

I had not heard of it, but sounds pretty interesting.


I watched that talk a long time ago, but maybe on some level! Though I don't think the idea is particularly novel.


Thanks again for making Wasmi :)

> It might interest you that newer versions of Wasmi (v0.45+) extended the resumable function call feature to make it possible to yield upon running out of fuel:

That is really interesting! I remember looking for something like that in the Wasmi docs at some point but it must have been before that feature was implemented. I would probably have chosen a different design for the WASM apps if I had it.
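Roughly, I imagine using it would look something like this (a sketch only, assuming the `wat` crate for the test module and Wasmi's wasmtime-style API; the exact behaviour and return types around yield-on-out-of-fuel are assumptions, so check the Wasmi docs for the real signatures):

```rust
// Sketch only: assumes the `wasmi` and `wat` crates; fuel/resume details may
// differ slightly between Wasmi versions.
use wasmi::{Config, Engine, Linker, Module, Store};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Enable fuel metering so the engine counts (and limits) executed instructions.
    let mut config = Config::default();
    config.consume_fuel(true);
    let engine = Engine::new(&config);

    // Tiny guest with an infinite loop, guaranteeing that fuel runs out.
    let wasm = wat::parse_str(r#"(module (func (export "run") (loop $l (br $l))))"#)?;
    let module = Module::new(&engine, &wasm)?;

    let mut store = Store::new(&engine, ());
    store.set_fuel(100_000)?;

    let linker = <Linker<()>>::new(&engine);
    let instance = linker.instantiate(&mut store, &module)?.start(&mut store)?;
    let run = instance.get_typed_func::<(), ()>(&store, "run")?;

    // Pre-0.45 behaviour: running out of fuel traps and the invocation is lost.
    // The v0.45+ feature discussed above lets a resumable call yield instead,
    // so the host can refill fuel (store.set_fuel(...)) and resume the same
    // invocation; check the Wasmi docs for the exact variants returned here.
    let call = run.call_resumable(&mut store, ())?;
    let _ = call;
    Ok(())
}
```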


I am really sorry I have waited so long to extend Wasmi's resumable calls with this very useful feature. :S Feel free to message me if you ever plan to adjust your design to make use of it.


Please don't take it as a reproach! Not your fault at all, and at least it forced me into creative problem-solving ;)


Thanks! I tried to get wasmtime working but it was too much of a pain to compile in no_std mode, so I settled for wasmi instead.


Wasmi author here. Glad to see Wasmi being used in embedded contexts where it really shines. :)

I just watched the demo video of Munal OS and am still in awe of all of its features. Really impressive work!


Thank you! And thanks for making Wasmi, it's a really impressive project and it's the reason why I decided to go this whole WASM sandbox route (because I could embed it easily) :)


Awww, makes me very happy to hear! :) Thank you!


Yeah, it's one of those projects where I'm so impressed that I'm saying nothing because there's nothing to say; it's just really impressive. I'm not sure what will come of this project, but it has a lot of potential to at least inspire other projects or spark important discussions around its innovations.


Wasmtime maintainer here - curious to hear what went wrong. I and several other users of wasmtime have production embeddings under no_std, so it should do everything you need, including building out WASI preview 2 support. You can find me on the Bytecode Alliance Zulip if you need help.


I think I was a bit spooked by the examples (https://github.com/bytecodealliance/wasmtime/tree/main/examp...), and the need to implement platform dependencies in C code (which would have complicated the build process). Makes sense since it's a more complex and mature project, but Wasmi on the other hand was just a pure Rust dependency that only required a single line in the Cargo.toml. So in short I went the lazy route :)


All of the C primitives there can be implemented in (unsafe) Rust, but we built that example for an audience that already had some platform elements in C. We'll try to improve the example so that both integrating with C and using pure Rust are covered.


I'm not the OP, but I've had a similar experience with Motor OS: wasmi compiles and works "out of the box", while wasmtime has a bunch of dependencies (e.g. target-lexicon) that won't compile on custom targets even if all features are turned off in wasmtime.


Not sure how to help with this much information, but I've built and run wasmtime on some pretty squalid architectures (xtensa and riscv32 microcontrollers among others); the right collection of features might not be obvious, though. We can help you find the right configuration on the Bytecode Alliance Zulip or the wasmtime issue tracker if you need it.


> Not sure how to help with this [...]

I guess not much can be done at the moment: dependencies are often the primary obstacle in porting crates to new targets, and just comparing the list of dependencies of wasmtime vs wasmi gives a pretty good indication of which crate is a bit more careful in this regard:

https://crates.io/crates/wasmtime/33.0.0/dependencies

https://crates.io/crates/wasmi/0.47.0/dependencies


Wasmtime has many capabilities that wasmi does not, and therefore has more optional dependencies, but the required set of dependencies has been portable to every platform I've targeted so far. If anything does present a concrete issue we are eager to address it. For example, you could file an issue on target-lexicon describing how to reproduce your issue.


> If anything does present a concrete issue we are eager to address it.

That's great to hear! I think it is a bit too early to spend extra effort on porting Wasmtime to Motor OS at the moment, as there are a couple of more pressing issues to sort out (e.g. FS performance is not yet where it should be), but in a couple of months I may reach out!


Is that wasmtime in interpreter mode? I didn't see an rv32 backend to wasmtime (in Cranelift), or did I not look in the right place?

What are the min memory requirements for wasmtime/cranelift?


There’s now an interpreter in wasmtime called Pulley. It’s an optimizing interpreter based on Cranelift: it generates interpreter opcodes that are more efficient to traverse than directly interpreting the Wasm binary.

I have run wasmtime on the esp32 microcontrollers with plenty of ram to spare, but I don’t have a measurement handy.



But if this benchmark is right, then wasmtime is 5x faster than wasmi for it:

https://github.com/khvzak/script-bench-rs


Wasmtime, being an optimizing JIT, usually is ~10 times faster than Wasmi during execution.

However, execution is just one metric that might be of importance.

For example, Wasmi's lazy startup time is much better (~100-1000x) since it does not have to produce machine code. This can result in cases where Wasmi is done executing while Wasmtime is still generating machine code.

Old post with some measurements: https://wasmi-labs.github.io/blog/posts/wasmi-v0.32/

Always benchmark and choose the best tool for your usage pattern.
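A tiny engine-agnostic harness like the following (helper and names made up, plug in whichever engine you are evaluating) is usually enough to see the startup-vs-execution split for your own workload:

```rust
use std::time::Instant;

/// Hypothetical helper: time a setup phase (e.g. compiling and instantiating
/// a Wasm module) separately from the execution phase (calling into it).
fn time_phases<S, R>(setup: impl FnOnce() -> S, run: impl FnOnce(&mut S) -> R) -> R {
    let t0 = Instant::now();
    let mut state = setup();
    let t1 = Instant::now();
    let result = run(&mut state);
    let t2 = Instant::now();
    println!("setup: {:?}, execution: {:?}", t1 - t0, t2 - t1);
    result
}

fn main() {
    // Stand-in workload; replace with engine setup + a guest function call.
    let sum = time_phases(
        || (0u64..1_000_000).collect::<Vec<_>>(),
        |v| v.iter().sum::<u64>(),
    );
    println!("sum = {sum}");
}
```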


That's a good point I hadn't thought about.

I guess it's like V8 compared to QuickJS.

Anyway, all this talk about WASM makes me want to write a scriptable Rust app!


For the love of everything, CHATGPT IS NOT A PRIMARY SOURCE. Always assume every fact it spits out is made up.


Good thing then that I didn't use it as a primary source. :-)


It's not a source at all; it's definitionally bullshit because the LLM has no concern for the truth. Never post LLM text. Anyone can get that for themselves. To do so is an insult to the comment section. I'm not here to read bot summaries. If you ask the bot something and then it leads you to a source and then you read that and then you put that in your own words, that's a comment worth reading.

what you did is like farting in a crowded space. STINKY AND RUDE


Unless, you know, it gives you verifiable sources to dig into and verify - much like a Google search (minus the ads on top) :)


But you wouldn't post your google search either.


Lmgtfy links were extremely common around 2010, so ppl totally did

(Which is beside the point anyway, as the comment referencing ChatGPT also provided links to sources.)


why wouldn’t I if asked to provide source(s) for my “claim(s)”?


I remember buying the bundle and thinking "I'll just play that lightweight 2D platformer while waiting for the bigger, more interesting games in the bundle to download"...and then spending the entire evening on it.


Agreed, coming from Python it's one of the main things I miss in Rust. You can achieve something similar with the builder pattern or with structs + the Default trait, but it takes much more effort.
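For example, a rough sketch of the struct + Default approach (the type and field names here are made up); it gets close to Python keyword arguments, but you still have to define and maintain a dedicated options struct:

```rust
#[derive(Debug)]
struct PlotOptions {
    title: String,
    width: u32,
    height: u32,
    grid: bool,
}

impl Default for PlotOptions {
    fn default() -> Self {
        Self {
            title: String::new(),
            width: 800,
            height: 600,
            grid: true,
        }
    }
}

fn plot(opts: PlotOptions) {
    println!("plotting with {opts:?}");
}

fn main() {
    // Roughly the equivalent of Python's `plot(width=1024)`:
    // struct update syntax fills the remaining fields from Default.
    plot(PlotOptions { width: 1024, ..Default::default() });
}
```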


Looks like the fake recovery scams have leaked out of X and into HN.


Oh boy, HSBC. I have never seen a bank with such a byzantine and convoluted login process.


I heard that was a well-known trick at my old uni dorm. There was a single thermostat for the whole floor so once people figured out where the sensor was, the ones who lived closest to it would often leave packs of frozen food on it.

