Meet the next iteration of JavaScript on Compute
From the start, we built the JavaScript SDK to be both highly secure and performant by running the JavaScript virtual machine inside a WebAssembly sandbox. Compute’s isolation technology creates and destroys a sandbox for each request flowing through our platform in microseconds. This approach minimizes the attack surface while preserving scale and performance, and keeps your code completely isolated from every other request on the platform.
Since the beta launch of the JavaScript SDK, we’ve learned that JavaScript on Compute performs on par with our competitors for some application workloads, but not all, and that it trails our Rust SDK. Today we’re happy to announce several improvements to Compute that substantially boost the performance of the JavaScript runtime, as well as the overall developer experience on the platform.
With these improvements, we’re excited to announce the latest release of our JavaScript SDK, supporting full production use. Let’s explore what’s new.
Improved compute performance
We’ve improved the raw performance of running our customers’ code through a number of key changes. For example, we switched from the Lucet WebAssembly runtime to Wasmtime, which has undergone many performance optimizations over the last two years.
We also developed a more efficient mechanism for scheduling workloads, significantly reducing scheduling overhead. This change is particularly important for JavaScript, roughly doubling performance for most JavaScript workloads.
Reduced overhead for creating service instances
In addition to running your code as fast as we can, there’s another key aspect to maximizing performance: how quickly we get to the point of executing it after a request comes in. We’ve made significant improvements to our platform to reduce the overhead of creating new service instances. As a result, you’ll experience faster time to first byte, or TTFB.
Reduced variability in instantiation time
We’ve also improved how our operating system reclaims memory when a service hasn’t been used for a while. These changes largely eliminate delays for subsequent service instantiations after the first.
Additional functionality
The changes above improve the Compute platform as a whole, so the JavaScript SDK and our other supported languages all benefit. For the JavaScript developer experience specifically, the latest release also adds the following to support full production use:
30+ JavaScript code snippets on our Developer Hub to help you get started developing serverless applications on Compute
Support for environment variables (a short example follows this list)
Significantly improved spec-compliance of web APIs such as fetch, Request, Response, and URL, leading to improved compatibility with libraries in the JavaScript ecosystem (see the fetch-handler example after this list)
Support for efficiently combining multiple upstream Response bodies into a single downstream Response with minimal JavaScript execution overhead (a streams sketch follows this list)
Many bug fixes and improvements, leading to a better developer experience and better results for our customers
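As a taste of the environment variable support, here’s a minimal sketch that reads one of the platform-provided variables and echoes it back. The fastly:env module and env() helper shown here follow the current SDK documentation; the exact import has varied across SDK versions, so treat the details as illustrative.

```javascript
// Minimal sketch: read a platform-provided environment variable and echo it.
// The "fastly:env" module and env() helper follow the current SDK docs;
// older releases exposed this differently, so check your SDK version.
import { env } from "fastly:env";

addEventListener("fetch", (event) => {
  // FASTLY_HOSTNAME identifies the host serving this request.
  const host = env("FASTLY_HOSTNAME");
  event.respondWith(
    new Response(`Served by: ${host}`, {
      status: 200,
      headers: { "content-type": "text/plain" },
    })
  );
});
```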
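The improved spec-compliance means the familiar fetch-handler pattern works the way you’d expect from other JavaScript runtimes. The sketch below parses the incoming URL and forwards matching requests to a backend; the backend name origin_0 is a placeholder for one configured on your own service.

```javascript
// Sketch of the spec-compliant web APIs: URL, Request, Response, and fetch.
// The backend name "origin_0" is a placeholder; on Compute, fetch() routes
// through a backend declared on your service.
async function handleRequest(event) {
  const url = new URL(event.request.url);

  if (url.pathname.startsWith("/api/")) {
    // Forward the original request to the configured backend.
    return fetch(event.request, { backend: "origin_0" });
  }

  return new Response("Not found", {
    status: 404,
    headers: { "content-type": "text/plain" },
  });
}

addEventListener("fetch", (event) => event.respondWith(handleRequest(event)));
```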
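And for combining upstream bodies, the sketch below shows the shape of the result using standard web streams: two backend responses stitched into a single downstream Response. The backend names and URLs are placeholders, and the SDK’s optimized path concatenates host-backed bodies with far less JavaScript involvement than a hand-rolled stream like this one.

```javascript
// Generic web-streams sketch: stitch two upstream bodies into one response.
// Backend names and URLs are placeholders. The SDK's optimized mechanism
// avoids pulling every chunk through JavaScript; this only illustrates the
// resulting shape with standard ReadableStream APIs.
async function handler(event) {
  const [first, second] = await Promise.all([
    fetch("https://example.com/part-1", { backend: "origin_a" }),
    fetch("https://example.com/part-2", { backend: "origin_b" }),
  ]);

  const readers = [first.body.getReader(), second.body.getReader()];
  let current = 0;

  // Drain each upstream body in order, forwarding chunks downstream.
  const combined = new ReadableStream({
    async pull(controller) {
      while (current < readers.length) {
        const { value, done } = await readers[current].read();
        if (!done) {
          controller.enqueue(value);
          return;
        }
        current += 1;
      }
      controller.close();
    },
  });

  return new Response(combined, {
    status: 200,
    headers: { "content-type": "text/plain" },
  });
}

addEventListener("fetch", (event) => event.respondWith(handler(event)));
```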
Find out more
Developers are building interesting — not to mention performant — things on Compute using the JavaScript SDK. One ecommerce company improved its SEO rankings by serving highly performant redirects from the edge, removing that processing load from its origin servers. A media delivery company is using it to perform dynamic ad insertion on the fly, resulting in a highly personalized, low-latency experience for its end users.
For more information on how to get started with the JavaScript SDK, check out our documentation and code examples on the Developer Hub. And if you’re not yet using our serverless platform, explore Compute in depth for free.