
The State of WebAssembly – 2023 and 2024

Welcome to 2024 and our latest article looking at the current state of WebAssembly (Wasm)! In this article, I’ll kick things off by looking at what happened in 2023. Then, I’ll give you my thoughts on what might happen this year.

Before I start, I’ll mention that when discussing browser support for WebAssembly features in this article, I only mention whether a feature is supported in Chrome, Firefox, or Safari. Edge is built on the same Chromium open-source project as Chrome, so it usually inherits the work being done around WebAssembly. Because of this, if a WebAssembly feature is supported in Chrome, it’s usually also supported in Edge.

That said, if you’d like to check and see if a feature is supported in your browser, you can use the following website. The website will have a checkmark in the Your browser column to the right of the feature’s name if your browser supports the feature.
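If you’d rather check from code, the same kind of test can be done programmatically: WebAssembly.validate returns true only if the engine accepts the bytes you hand it. Here’s a minimal sketch (the byte arrays are hand-assembled for illustration):

```javascript
// WebAssembly.validate returns true if the engine can validate the bytes.
// The 8-byte sequence below ("\0asm" magic number plus version 1) is the
// smallest possible valid module.
const emptyModule = new Uint8Array([0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00]);
console.log(WebAssembly.validate(emptyModule)); // true in any Wasm-capable engine

// Bytes that aren't a valid module are rejected rather than throwing.
const garbage = new Uint8Array([1, 2, 3, 4]);
console.log(WebAssembly.validate(garbage)); // false
```

Feature-detection libraries like wasm-feature-detect apply this same pattern: they validate a tiny module that uses each proposal’s instructions and see whether the engine accepts it.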

2023 in Review

This past year was incredible for WebAssembly, but it did have a few surprises, like the Tail Calls proposal taking an unexpected turn.

Tail Calls

Tail calls are useful for compilation optimizations and certain forms of control flow, like recursive functions. Some programming languages also need this feature in order to target WebAssembly. If you want to learn more about tail calls, please check out the following article.
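As a rough illustration (written in JavaScript purely for readability; most JavaScript engines don’t optimize tail calls, but WebAssembly’s return_call instruction does), here’s a recursive sum rewritten into tail form, where the recursive call is the last thing the function does:

```javascript
// Plain recursion: each call must wait for the inner call's result, so every
// call keeps a stack frame alive, and deep inputs can overflow the stack.
function sum(n) {
  if (n === 0) return 0;
  return n + sum(n - 1);
}

// Tail form: the recursive call is the final action, carrying the running
// total in an accumulator. An engine with tail-call support can reuse the
// current frame, turning the recursion into a constant-space loop.
function sumTail(n, acc = 0) {
  if (n === 0) return acc;
  return sumTail(n - 1, acc + n);
}

console.log(sum(10));     // 55
console.log(sumTail(10)); // 55
```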

Tail Calls sat behind a flag in Chrome for quite some time but couldn’t be released until the proposal reached the standardization phase. Meanwhile, for the proposal to be standardized, there needed to be at least two web VMs implementing it.

When I mention that something is ‘behind a flag,’ I’m referring to options that a user can turn on or off. In a browser, this is typically done by developers to allow testing of new features that aren’t quite ready for general use. Be very careful if you adjust the flags because they can change how the browser handles certain things. In some cases, because the feature may not be complete, there may be security issues with the implementation that haven’t been addressed yet.

In 2023, it looked like Safari would be the second VM to implement tail calls because they had an implementation in Technology Preview 161 that was expected to be released as part of Safari 16.4.

Because it looked like it was about to be released by Safari, Chrome scheduled it to come out from behind a flag. At the same time, the WebAssembly Community Group voted to move the proposal to phase 4 for standardization because two web VMs were about to go live with it. Unfortunately, issues were found with Safari’s implementation, and the feature was pulled before Safari 16.4 was released. This resulted in the proposal being standardized and out from behind a flag in Chrome but with only one web VM having the feature.

By December, Firefox released support for Tail Calls, so we now have two web VMs that support it. Unfortunately, it’s still not available in Safari.

Garbage Collection

In 2022, the garbage collection proposal advanced quickly to phases 2 and 3 after sitting in phase 1 since 2017. I saw a lot of work being put into this feature by the browser makers, and several languages were also adding support. Because of this, I thought garbage collection might be released in 2023, but I figured it would be released behind a flag for testing and that we’d only see it fully released in 2024.

I was pleasantly surprised that I was wrong about my prediction on this. While it was released behind a flag, as I expected, it wasn’t behind a flag for as long as I expected. In fact, it came out from behind a flag a version earlier than initially scheduled in Chrome. 

In October and November of 2023, Chrome and Firefox, respectively, released support for garbage collection.

Safari is working on garbage collection but still has a little way to go.

Fixed-width SIMD

SIMD (Single Instruction, Multiple Data) is a type of parallel processing that takes advantage of a CPU’s SIMD instructions to perform the same operation on multiple points of data simultaneously. This can result in large performance gains for things like image processing. Because there are multiple types of SIMD, fixed-width 128-bit SIMD operations were chosen as a starting point.
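As a conceptual sketch, a single 128-bit SIMD instruction such as i32x4.add produces four 32-bit results at once, where the scalar equivalent needs one operation per lane:

```javascript
// A 128-bit vector holds four 32-bit lanes. One SIMD add (i32x4.add) would
// produce all four sums in a single instruction; scalar code needs a loop.
const laneA = new Int32Array([1, 2, 3, 4]);
const laneB = new Int32Array([10, 20, 30, 40]);

const result = new Int32Array(4);
for (let i = 0; i < 4; i++) {
  result[i] = laneA[i] + laneB[i]; // four separate adds in scalar code
}
console.log(Array.from(result)); // [11, 22, 33, 44]
```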

By mid-2021, all major browsers, except Safari, supported 128-bit fixed-width SIMD operations. 

Safari was very close to releasing fixed-width SIMD support by the end of 2022 when it was included in Technology Preview 161. 

In March 2023, Safari rounded out browser support for this feature with the release of Safari 16.4.

.NET 8 and Uno Platform support for WebAssembly now include SIMD support by default, enabling faster execution of vectorized algorithms.

Multiple Memories

Up until this point, modules could only have one block of memory. If the module needed to communicate with JavaScript or another module by sharing its memory, it risked information being disclosed or accidentally corrupted if it allowed direct reads and writes. With this proposal, a module can now have multiple memory blocks. That opens possibilities like a module being able to keep one block for internal data while exposing another block for sharing information, for example.
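The WebAssembly.validate trick works here too: a module that imports two memories only validates on an engine that supports this proposal. The bytes below are hand-assembled for illustration and import memories named "a" and "b" from a module named "env":

```javascript
// A module that imports a single memory is valid on any Wasm engine.
const oneMemory = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00, // "\0asm" magic + version 1
  0x02, 0x0a, 0x01,                               // import section, 1 entry
  0x03, 0x65, 0x6e, 0x76, 0x01, 0x61,             // from "env" import "a"
  0x02, 0x00, 0x01,                               // kind: memory, min 1 page
]);

// The same module with a second memory import only validates on an engine
// that supports the multiple-memories proposal.
const twoMemories = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,
  0x02, 0x13, 0x02,                                      // import section, 2 entries
  0x03, 0x65, 0x6e, 0x76, 0x01, 0x61, 0x02, 0x00, 0x01, // memory "a"
  0x03, 0x65, 0x6e, 0x76, 0x01, 0x62, 0x02, 0x00, 0x01, // memory "b"
]);

console.log(WebAssembly.validate(oneMemory));   // true
console.log(WebAssembly.validate(twoMemories)); // true only with multi-memory support
```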

I hoped this feature would be released in 2022, but because I hadn’t seen any movement by the browser makers, I thought they decided it wasn’t a high enough priority, so I wasn’t expecting it in 2023.

Chrome released support for this feature at the year’s end, and it’s now behind a flag in Firefox.

.NET Improvements around WebAssembly

Nearly every .NET release sees some WebAssembly support added or improved, and this past year was no exception with the release of .NET 8.

Because .NET code uses managed memory, it requires garbage collection, and up until recently, browsers didn’t have WebAssembly garbage collection. To get around this, .NET uses the Mono runtime, which is compiled to WebAssembly and handles the garbage collection. Unless your .NET app is AOT (Ahead of Time) compiled, your code is being interpreted in the browser by the .NET runtime.

By AOT compiling your code, you get a WebAssembly module that runs faster and can access certain features like SIMD. Unfortunately, AOT compiling also typically means your module is a lot bigger because the .NET code it depends on needs to be included. This larger file size results in page load delays that are sometimes noticeable.

With .NET 8, a middle ground was found: you keep the faster downloads of the interpreted approach but also gain faster code execution, because a form of just-in-time (JIT) compilation called the Jiterpreter was introduced. Now, as the code is running, the Jiterpreter can optimize it by creating WebAssembly code on the fly, resulting in better performance the next time that code runs. This feature is also available for Uno Platform apps.
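The general pattern, interpret first and then compile code that turns out to be hot, can be sketched as follows. This is a purely illustrative toy (makeTieredFunction and HOT_THRESHOLD are invented for this sketch, not part of .NET):

```javascript
// Illustrative tiering sketch: run interpreted, count executions, and swap in
// a compiled version once a function turns out to be "hot".
const HOT_THRESHOLD = 3;

function makeTieredFunction(interpret, compile) {
  let calls = 0;
  let compiled = null;
  return (...args) => {
    if (compiled) return compiled(...args);             // fast tier
    if (++calls >= HOT_THRESHOLD) compiled = compile(); // promote hot code
    return interpret(...args);                          // slow tier
  };
}

// Toy example: the "compiled" tier is just another closure here; in the
// Jiterpreter's case, it would be freshly generated WebAssembly code.
const square = makeTieredFunction(
  (x) => { /* imagine bytecode dispatch here */ return x * x; },
  () => (x) => x * x
);

for (let i = 0; i < 5; i++) square(2); // early calls interpreted, later ones compiled
console.log(square(4)); // 16
```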

Another issue with .NET code running in the browser was that, up until this point, your code and the supporting assemblies were downloaded as .dll files. The files weren’t executed natively, but downloading .dlls gave some people pause, and some servers blocked those downloads as a security precaution. As a result, .NET 8 introduces a new format for the files called WebCIL, where the .NET assemblies are repackaged as WebAssembly files.


Containers

In 2022, we saw Docker and Kubernetes add experimental WebAssembly support for using a WebAssembly runtime directly instead of the traditional container runtime: https://www.docker.com/blog/announcing-dockerwasm-technical-preview-2/ 

Using a WebAssembly runtime directly has a number of benefits, such as reducing the size of containers because you don’t need things like Linux libraries or Node.js to launch your module. With fewer libraries and dependencies, the container is more secure because there’s less of an attack surface with fewer items to keep up to date. Being smaller, the container’s startup time improves, and you can fit more containers on the same device. You can leverage your existing container infrastructure to create functions-as-a-service yourself if you wish, and it allows you to use the same infrastructure you use for your normal applications to share and distribute your WebAssembly modules.

At Microsoft’s Build 2023 conference, they gave a teaser of something they’ve been working on called Hyperlight. The following video shows how they use a hypervisor to run what they’ve termed micro virtual machines, which in turn run WebAssembly modules.

Even with the short video, it’s pretty exciting to see the possibilities with more places where WebAssembly modules can run.

WebAssembly System Interface (WASI)

WASI is a standardization effort led by the Bytecode Alliance for WebAssembly’s use outside the browser to ensure things are done securely and consistently. It also includes creating a set of interfaces that your modules can access. For example, your module can use a ‘wasi-http’ set of interfaces to send and receive HTTP requests.
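To give a feel for what a set of interfaces means in practice, a WASI runtime hands those interfaces to your module as imports. The stub below sketches the shape of that contract using two real WASI Preview 1 function names, fd_write and proc_exit; the implementations are placeholders for illustration, not a real runtime:

```javascript
// Conceptual sketch: a WASI runtime provides system functions to the module
// as imports under the "wasi_snapshot_preview1" namespace (Preview 1 naming).
// Real runtimes (Wasmtime, WasmEdge, Node's node:wasi) implement these for you.
const wasiImports = {
  wasi_snapshot_preview1: {
    // fd_write(fd, iovsPtr, iovsLen, nwrittenPtr) returns an errno; a real
    // implementation reads the io vectors out of the module's linear memory
    // and writes the data to the file descriptor. This stub reports success.
    fd_write: (fd, iovsPtr, iovsLen, nwrittenPtr) => 0,
    // proc_exit(code) ends the program.
    proc_exit: (code) => { throw new Error(`exit(${code})`); },
  },
};

// The runtime hands these to the module at instantiation time, e.g.:
// const { instance } = await WebAssembly.instantiate(wasmBytes, wasiImports);
console.log(Object.keys(wasiImports.wasi_snapshot_preview1)); // ['fd_write', 'proc_exit']
```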

WASI sits as a layer above WebAssembly and uses a feature that’s being developed called the Component Model. The component model is a way to describe your module and allows for several WebAssembly modules, potentially written in different programming languages, to be linked together. When you create a component, the module’s types are described at a high level. This allows the runtime to help the modules communicate based on the types specified for the imports and exports. If you’re interested in learning more, this site has a detailed explanation of the component model and the related WASI work.

The WASI work is being released in phases and is currently at Preview 1, but they’re very close to releasing Preview 2. The plan is for 3 previews and then the WASI 1.0 release. The following July update explains what the plans are for each step of the release process. 

A lot of people are excited about the possibilities of WASI. Still, some felt things weren’t moving fast enough, so they created something called WASIX, which sits on top of WASI and adds additional features.

At first glance, WASIX reminds me of TypeScript sitting on top of JavaScript. My hope is that there won’t be a duplication of the standards and, as new WASI features are finalized, WASIX will switch to those.

The main limitation I see with WASIX at the moment is that it’s only supported in the Wasmer runtime, while WASI is supported in a variety of runtimes. That said, you can use Wasmer in quite a few locations. We’ll see how this plays out, but if you’d like to learn more about WASIX, you can check out the following site.

WebAssembly’s Use

I like having metrics, but I’m unaware of any that measure WebAssembly’s use outside the browser. My only way to judge interest there is based on the conversations I’m seeing, the number of people speaking about WebAssembly at conferences, and product announcements. Based on what I’m seeing, there’s quite a bit of momentum outside the browser. 

In the browser, we have a partial glimpse of WebAssembly’s use. Over the past year, its use has increased by about 1%, and it’s now used on a little over 3% of the sites visited by Chrome users. That’s just Chrome, so the number would be higher if we factored in Safari, Edge, and Firefox. Given the size of the web, 1% is a pretty good yearly increase.

The web usage metrics that I mentioned can be found here if you’re interested (click the ‘Show all historical data’ checkbox).

With 2023 in the books, what might we expect in 2024?

Expectations for 2024

Extended Constant Expressions

Extended Constant Expressions is a feature that’s more of a tooling item than something the average developer will directly leverage. This proposal is meant to help with the dynamic linking of modules by adding additional instructions for initializing globals and specifying table or data offsets that aren’t known until runtime.

Extended constant expressions are now part of Safari’s Technology Preview 184, released in December 2023. I expect that Safari will release this in the coming months, and with that, it will round out browser support for this feature.

Garbage Collection

Safari is hard at work implementing garbage collection. Based on their current list of tickets and the number that are still open, they’re around 75% complete. It’s hard to say how long the remaining tickets will take or if others will be added to the list. The following is an umbrella ticket that holds a list of all of Safari’s garbage collection tickets that they need to address: https://bugs.webkit.org/show_bug.cgi?id=247394

Once the feature is considered complete, it gets added to a Technology Preview and then later to a release, so this takes a bit of time.

My guess is that if Safari does release it this year, it’ll probably happen toward the end of the year.

Relaxed SIMD

When SIMD was proposed for WebAssembly, 128-bit fixed-width SIMD was chosen as a starting point because it was seen as having the most hardware support.

There are additional SIMD instructions that are possible depending on the hardware. The Relaxed SIMD proposal aims to take advantage of some of those additional instructions. If you’d like to learn more about Relaxed SIMD, the proposal can be found here. 

This feature is behind a flag in Chrome and Firefox, but the proposal is now in the standardization phase. I expect this to come out from behind a flag in both browsers soon.

It’s possible that I’m not looking in the right place, but I’m not seeing any activity from Safari on this feature at this time. The Safari team appears to be putting a lot of effort into getting the garbage collection feature implemented, so maybe they’re just heads down on that and will tackle this afterwards.

Multiple Memories

The multiple memories proposal is now in the standardization phase. From what I’m seeing, the feature will be coming out from behind a flag in Firefox, but a date hasn’t been published yet. I expect this will be released this year. 

Unfortunately, at the moment, I do not see anything from Safari on this.

Other possible features that might arrive in 2024

Several features sit behind Chrome and Firefox flags, like Memory64 and Type Reflection. These two proposals are still in phase 3, so they’ll need to move to phase 4 first, but they could be released this year if that happens.

Chrome is working on the JavaScript Promise Integration feature. They plan to add it to an Origin trial at the end of January, starting with version 122 and continuing until version 130. Typically, when I see features enter an Origin trial, they go live afterward. Because this is also a work in progress in Firefox, based on Chrome’s timeline, I would guess this could go live in Chrome and Firefox around November if the proposal is moved to the standardization phase around the same time.

Like with Relaxed SIMD and multiple memories, I’m not seeing anything from Safari around Memory64, Type Reflection, or JavaScript Promise Integration.

WebAssembly System Interface (WASI)

Preview 2 of WASI is expected to be released in early 2024. I’m unsure how long it’ll take to implement the final features needed to reach Preview 3. Even though it’s Preview 3 that will be considered the GA release, it looks like Preview 2 will have a fair number of features that can be used to start building components and experimenting with WASI now.

The following article goes into detail about the plans that the Bytecode Alliance has for 2024 if you’d like to dig deeper: 

WASI with .NET

In .NET 7, experimental support was added for WebAssembly threads, and it was on the slate for release in .NET 8. Unfortunately, it didn’t come out from behind a flag in .NET 8, but it’s back on the list of possibilities for .NET 9’s release in November.

I would like to see .NET support the WebAssembly garbage collection proposal, but it looks like there are a number of issues with adopting it at this point based on this issue.

Even without the technical challenges, we’re still waiting on Safari support, which might only arrive later in the year. Without full browser support, I would suspect that the .NET team would focus on other, more pressing issues first.

The WebAssembly working group has a post-MVP plan for garbage collection, and the .NET team has indicated that they may contribute to that specification as they look at ways to address their current technical issues. 

We’ll see .NET add WebAssembly garbage collection support at some point, but I don’t think it will happen this year.

On another note, it’s interesting that the .NET team has been experimenting with WASI for server-side WebAssembly and included WASI Preview 1 support in .NET 8. With .NET 9, they’re planning on including WASI Preview 2 support. Their WASI work is expected to remain experimental until WASI 1.0 is released. 

This article goes into detail about this experimentation, how to use it, and the results of some of their experiments. 

Steven Sanderson also has a demonstration of several different ways to use WASI from .NET 8 if you’d like to check that out.

In Conclusion

2023 was another fantastic year for WebAssembly, with several standardized proposals, like Tail Calls, Garbage Collection, Multiple Memories, and Relaxed SIMD. Several of these features have already been implemented in some browsers and runtimes, with work progressing in the other browsers. 

We’re seeing more programming languages, like Kotlin, Dart (along with its Flutter framework), and OCaml, adding support for WebAssembly, thanks partly to garbage collection.

Tooling support continues to improve. An example is .NET’s Jiterpreter, which dynamically creates WebAssembly code on the fly, giving you the faster downloads of the interpreted approach and the fast execution of the AOT approach. Thanks to WASI, the .NET team is also experimenting with running WebAssembly on the server.

2024 is shaping up to be another exciting year for WebAssembly. There’s a feeling that WebAssembly’s use is about to take off both as normal WebAssembly modules and as WASI components, especially given that WASI Preview 2 will be released very soon.

Finally, I’m hearing from a number of people that WebAssembly is a really good fit with AI. We’ve already seen some work in this space, with one example being Google Meet using WebAssembly and Machine Learning to handle background blur or replacement in video calls. That was back in 2020, and technology has improved since then. Many people expect to see this use of WebAssembly with AI expand this year.

Gerard Gallant
