1.2 History of Deno

Let's begin by exploring the reasons that led to the development of a fresh server-side runtime, Deno. Node.js has been around for over 13 years and has become a widely recognized, proven, and resilient runtime for server-side applications. It has gained significant popularity over time and is trusted by major corporations such as PayPal, Netflix, Trello, NASA, Twitter, Walmart, and more. These companies rely on Node.js to power exceptionally swift web applications.
Given the established success of Node.js, you might wonder why there was a need to introduce a new runtime. Let's delve into the motivations behind this decision and uncover the rationale for creating Deno.

Who's behind Deno?

The Deno project is led by Ryan Dahl, who is also the original creator of Node.js. Following the initial launch of Node.js, Ryan led the project for its first three to four years. In 2012, he handed leadership of the project to Isaac Schlueter, the creator of NPM (Node Package Manager). After this phase, Ryan ventured into other endeavors, including a period of time spent at Google.
In 2018, Ryan made a notable return by delivering a widely viewed talk in which he candidly discussed his lingering regrets about Node.js. He reflected openly on various design decisions he wished he had made differently, and he closed the talk by introducing a fresh runtime named Deno. The driving force behind Deno was his determination to address those regrets, which propelled him to build an entirely new runtime from the ground up.
The overarching aim behind Deno was to craft a contemporary and secure runtime that would rectify the shortcomings he perceived within Node.js: a runtime that incorporates modern practices while offering enhanced security measures.
Now, let's take a succinct look at the specific areas of regret.

The regrets

At JSConf EU 2018, Ryan delivered a talk titled "10 Things I Regret About Node.js". The talk gained significant popularity and can be watched on YouTube at https://www.youtube.com/watch?v=M3BM9TB-8yA. It is well worth watching for anyone intrigued by the reasons driving the creation of a fresh runtime.
Exploring a few of the foremost regrets will provide us insight into the initial motivations that led to the birth of Deno. By delving into these aspects, we can better comprehend the foundations upon which Deno was eventually built.

Promises

Shortly after the initial release of Node.js, Ryan added promises to the platform in June 2009. They were initially integrated into the core of Node.js but were removed again in February 2010. At that time, promises, the abstraction on which async/await would later be built, were still relatively new, and the core Node.js APIs were written around callbacks instead, a pattern that persists to this day.
In the modern programming landscape, promises are essential because they underpin code written in the async/await style. Because of that early removal, however, a significant portion of the Node.js codebase still relies on callback-based structures. Despite the evolution of asynchronous programming techniques, the original callback-oriented code remains functional and effective.
As of now, the likelihood of the established callback-based core ever being completely replaced with promises or async/await constructs appears minimal. The callback-centric foundation has proven robust and dependable, and it continues to ensure the stable functioning of Node.js applications.
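To make the contrast concrete, here is a minimal sketch of the two styles. The function names are illustrative, not real Node.js APIs; the wrapper shows the pattern that Node.js's util.promisify() automates:

```typescript
// Callback style, as used throughout the Node.js core APIs:
// the last argument is a function receiving (error, result).
function addLater(
  a: number,
  b: number,
  callback: (err: Error | null, sum?: number) => void,
): void {
  setTimeout(() => callback(null, a + b), 10);
}

// Promise wrapper: resolve on success, reject on error.
function addAsync(a: number, b: number): Promise<number> {
  return new Promise((resolve, reject) => {
    addLater(a, b, (err, sum) => {
      if (err) reject(err);
      else resolve(sum as number);
    });
  });
}

// With the wrapper in place, callers can use modern async/await.
async function main(): Promise<void> {
  const sum = await addAsync(2, 3);
  console.log(sum); // 5
}

main();
```

Each core callback API needs such a wrapper before it fits the async/await paradigm, which is why retrofitting promises onto a callback-first core is so costly.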

Open access

Both Node.js and Deno utilize Google's V8 JavaScript engine to execute their JavaScript programs. However, there is a distinction in how they handle security and isolation.
V8 itself executes JavaScript within a highly secure sandboxed environment: code running in the engine is confined and cannot, by itself, reach sensitive system resources. Node.js, on the other hand, has historically taken a more open approach, giving a Node.js program access to the full range of resources available to the user-level process running it. This open access was characteristic of all Node.js versions prior to v20.
From v20 onwards, Node.js has taken gradual steps towards sandboxing applications through an experimental permission model. Although this feature is still experimental, it signifies a significant shift in Node.js's approach to security and isolation. Maturing it is likely to take a considerable amount of time as it undergoes refinement and enhancement.
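As an illustration of the difference, the following command-line sketch contrasts the two models (the file names and paths are hypothetical; the flag names come from the Deno and Node.js documentation):

```shell
# Deno: deny-by-default. Without a grant, file access fails at runtime.
deno run app.ts                       # permission error on any file read
deno run --allow-read=/data app.ts    # reads under /data are permitted

# Node.js v20+: full access unless the experimental permission
# model is explicitly switched on with a flag.
node --experimental-permission --allow-fs-read=/data app.js
```

In Deno the sandbox is the default and permissions are opted into; in Node.js the sandbox itself is the opt-in.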

NPM or package.json

The concept behind package.json came from NPM, the widely used package manager for Node.js. In its early stages, Ryan endorsed the adoption of package.json by enabling Node.js's require() function to consult the package.json file during module resolution. Over time, NPM evolved into the primary and immensely popular registry for Node.js modules, its scope expanding until it was arguably the largest package registry spanning any programming language. The drawback lies in NPM's private control: although NPM originated as an independent company, it was acquired by GitHub, which is itself owned by Microsoft. This history has tightly intertwined Node.js and NPM, and Ryan came to regret that Node.js packages could not be sourced from diverse repositories.

Module resolution

The module resolution algorithm in Node.js quickly grew intricate. Initially, the idea of vendored modules seemed promising, yet the use of $NODE_PATH posed issues, and the approach diverged sharply from the conventions used in web browsers. Ryan aspired for module resolution to align with established web standards, but this goal was never achieved within Node.js, and its resolution logic remains distinctively tailored to its own environment.
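The heart of that intricacy is the node_modules walk: when resolving require("pkg"), Node.js probes a node_modules directory at every level from the requiring file up to the filesystem root. A simplified sketch of that walk as a pure function (the real resolver also honors $NODE_PATH, package.json fields, and symlinks):

```typescript
// Build the ordered list of node_modules directories Node.js would
// probe when resolving require("pkg") from a file in `fromDir`,
// walking from that directory up to the filesystem root.
function nodeModulesPaths(fromDir: string): string[] {
  const paths: string[] = [];
  let dir = fromDir;
  while (true) {
    // Directories already named node_modules are not probed again.
    if (!dir.endsWith("node_modules")) {
      paths.push(dir === "/" ? "/node_modules" : `${dir}/node_modules`);
    }
    if (dir === "/") break;
    // Step up one directory; an empty result means we reached the root.
    dir = dir.slice(0, dir.lastIndexOf("/")) || "/";
  }
  return paths;
}

console.log(nodeModulesPaths("/home/app/src"));
// ["/home/app/src/node_modules", "/home/app/node_modules",
//  "/home/node_modules", "/node_modules"]
```

A browser, by contrast, resolves a module specifier to exactly one URL with no probing at all, which is the standards-aligned behavior Ryan wanted.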

Implicit require

Allowing require("module") without the ".js" extension was a convenience invented specifically for Node.js, and it does not align with how JavaScript works in web browsers: omitting the ".js" extension from a script tag's src attribute simply isn't valid there. To support the shorthand, the module loader must probe various file system locations, attempting to deduce which file the user actually meant.
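That guessing game can be sketched as the ordered list of candidate files the loader probes. This is a simplification: the real algorithm also consults a directory's package.json "main" field before falling back to index files.

```typescript
// Candidate files Node.js probes, in order, for require("./mod")
// when the request has no extension: the literal path, the known
// extensions, then index files inside a directory of that name.
function requireCandidates(request: string): string[] {
  const exts = [".js", ".json", ".node"];
  return [
    request,                                    // ./mod (exact file)
    ...exts.map((e) => request + e),            // ./mod.js, ./mod.json, ./mod.node
    ...exts.map((e) => `${request}/index${e}`), // ./mod/index.js, ...
  ];
}

console.log(requireCandidates("./mod"));
// ["./mod", "./mod.js", "./mod.json", "./mod.node",
//  "./mod/index.js", "./mod/index.json", "./mod/index.node"]
```

Every one of those candidates may require a file system check, whereas a browser-style import with an explicit extension names exactly one resource.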

index.js

Ryan originally found index.js charming because of its resemblance to index.html. That seemingly innocent touch, however, added needless complexity to the module loading system, and the complexity became even more apparent once require() began accommodating package.json files.

Conception of Deno

The preceding section highlighted the primary sources of regret. However, there are additional points of regret discussed in Ryan's presentation during JSConf 2018 (https://www.youtube.com/watch?v=M3BM9TB-8yA).
During his talk, Ryan expressed remorse over certain decisions he had taken in the early, critical years of Node.js development. Most notably, a significant portion of his regrets stems from the decision to grant a private company, NPM, authority over third-party modules, and through them, considerable influence over the Node.js ecosystem itself.
Nevertheless, it's worth noting that some of the things he regretted, such as NPM integration, have since found their way into Deno. This adjustment was necessitated by practicality: Deno struggled to remain viable without access to the extensive library of 1.4 million NPM packages. Further details about this transformation follow in the upcoming section.
These moments of regret prompted Ryan to conceive a new runtime, one that strives to make the right choices from the outset. This new runtime is constructed using contemporary technologies, emphasizing an authentically open nature and steering clear of external control.