baudaux's comments | Hacker News

I easily managed to build QuickJS to WebAssembly for running in https://exaequOS.com. So I need to do the same for MicroQuickJS!

I'm curious what practical purpose you could have for running a JS execution engine in an environment that already contains a (substantially faster) JS execution engine. Is it just for the joy of doing it? (If so, good for you; there's absolutely nothing wrong with that.)

Sandboxing.

Figma, for example, used QuickJS, the prior version of the library this post is about, to sandbox user-authored JavaScript plugins: https://www.figma.com/blog/an-update-on-plugin-security/

It's pretty handy for things like untrusted, user-authored JS scripts that run on a user's client.
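For a concrete sketch of that pattern, the quickjs-emscripten npm package ships a WASM build of QuickJS behind a small JS wrapper. Something like the following (API names are taken from its docs, so double-check against the current release; the "plugin" snippet is just a stand-in for user code):

  // Sketch: evaluate untrusted "plugin" code inside QuickJS compiled to WASM,
  // using the quickjs-emscripten package (ESM, top-level await).
  import { getQuickJS } from "quickjs-emscripten"

  const QuickJS = await getQuickJS()   // loads the WASM build of QuickJS
  const vm = QuickJS.newContext()      // isolated realm: no DOM, no host globals

  const untrusted = `1 + 2 * 3`        // stand-in for user-authored plugin code
  const result = vm.evalCode(untrusted)
  if (result.error) {
    console.log("plugin threw:", vm.dump(result.error))
    result.error.dispose()
  } else {
    console.log("plugin returned:", vm.dump(result.value))
    result.value.dispose()
  }
  vm.dispose()

The sandboxed code only sees what you explicitly expose into the context, and if I recall the docs correctly there are also knobs for memory limits and interrupt handlers to bound runaway plugins.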


WebAssembly also runs in places other than the web, where there isn't a JavaScript interpreter at hand. It'd be nice to have a fast JavaScript engine that integrates inside the WebAssembly sandbox, and can call and be called by other languages targeting WebAssembly.

That way, programs that embed WebAssembly in order to be scriptable can let people use their choice of languages, including JavaScript.
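To make the non-web case concrete, here's roughly what hosting a WASI (preview1) module looks like from Node with the built-in node:wasi module (still marked experimental; guest.wasm is a placeholder for whatever wasm32-wasi binary you built, e.g. a JS engine):

  // Sketch: run a WASI preview1 module outside the browser with node:wasi.
  import { readFile } from "node:fs/promises"
  import { WASI } from "node:wasi"

  const wasi = new WASI({ version: "preview1", args: ["guest"], env: {} })
  const wasm = await WebAssembly.compile(await readFile("./guest.wasm"))
  const instance = await WebAssembly.instantiate(wasm, wasi.getImportObject())

  // start() expects the module to export _start and memory, as WASI commands do.
  wasi.start(instance)

The same .wasm file should also run unchanged under standalone runtimes like wasmtime, which is the portability point.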


It allows you, for example, to create bindings, as I did for the raylib graphics library. exaequOS can run any program that can be built to WebAssembly. It will soon support WASI p1 and p2, so many more programming languages will be usable for creating programs targeting exaequOS.

Is there not a way to use the browser's native JS execution environment for that? You lose a non-trivial amount of performance running JS inside QuickJS inside of Wasm vs. the browser's native JS engine. I wouldn't be surprised if that's 10 or even 20 times slower, and of course it requires loading more code into the browser (slower startup, more RAM usage). Maybe you don't care about that, but all of that is pretty orthogonal to the environments an embedded engine like this is intended for.

Maybe with plugins. The WebAssembly way is cross-platform. You would be very surprised by the performance of WebAssembly. I have built a Fibonacci test program in Rust that runs faster when built for WASI than for the native target on my MacBook.

This is because the execution is very predictable, so the JIT in the runtime can emit optimized code with knowledge of how the code is going to run. Embedding unpredictable code (like a JavaScript interpreter) typically has substantially worse performance when executing under a JIT. This is in addition to the fact that QuickJS (despite being pretty good) can't match the performance of sophisticated JIT implementations like V8 or JavaScriptCore.
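If anyone wants to put a number on that for their own machine, a crude way is to time the same snippet in the host engine and inside QuickJS-in-WASM, e.g. with the quickjs-emscripten package (a sketch, not a real benchmark; the loop is arbitrary):

  // Sketch: same snippet in the host JIT engine vs. interpreted in QuickJS-in-WASM.
  import { getQuickJS } from "quickjs-emscripten"

  const src = `
    let s = 0;
    for (let i = 0; i < 1e6; i++) s += i % 7;
    s
  `

  const t0 = performance.now()
  const nativeResult = (0, eval)(src)                // host engine, JIT-compiled
  const t1 = performance.now()

  const QuickJS = await getQuickJS()
  const vm = QuickJS.newContext()
  const t2 = performance.now()
  const handle = vm.unwrapResult(vm.evalCode(src))   // interpreted inside WASM
  const t3 = performance.now()

  console.log(nativeResult, vm.dump(handle))
  console.log(`host: ${(t1 - t0).toFixed(1)} ms, QuickJS-in-WASM: ${(t3 - t2).toFixed(1)} ms`)
  handle.dispose()
  vm.dispose()

Numbers will vary a lot by engine and snippet, which is why the 10-20x figure above is only a rough guess.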

Or both the browser and WASI, as I am doing with exaequOS.


I think this kind of thing could be really useful for a project I'm building.


What is your project?


I'll email you over the weekend

Is there a WebAssembly/WASI version of SWI-Prolog?


Not sure... Another Prolog compiled to WASM with very good performance is Ciao: https://ciao-lang.org/playground/

The same toplevel also runs from 'node'.


Thanks, I will have a look. I would like to integrate a Prolog into exaequOS.


I am building a WebAssembly WASI runtime for exaequOS (https://exaequos.com), an OS fully running in the web browser. It will support WASI 0.1 and 0.2. A basic implementation can be tested by running 'wex' in the terminal.


I have built a Fibonacci wasm/WASI executable in Rust. When I execute it in https://exaequos.com (with the wex runtime, which is under development), it is faster than the native binary on my MacBook.


It looks like what we can do with bc, the basic calculator.



I put QuickJS in https://exaequos.com. You can build graphics apps with raylib.


Is it targeting WASI?



You can play with Uxn on the web with https://exaequos.com. See the online documentation: https://www.exaequos.com/doc/build/html/dev.html#uxn

