6.4 Transpile


Once the module graph is prepared, the next step is to transpile the code. To prioritize a fast startup, Deno constrains this phase to transpilation only: the 'deno run' command performs no type checking at all.
In the preceding chapter, transpilation covered just a single module because there were no dependencies. The hello v2 program, in contrast, has several imports, so every module in the graph that contains TypeScript code must be transpiled.


Transpilation must run for every module in the module graph that contains TypeScript, converting the code into plain JavaScript that V8 can execute. Modules without any TypeScript code are excluded from this step, as there is nothing to convert.
Modules that already have a valid output from a previous transpilation, known as an "emit," are also skipped, unless a reload is explicitly requested. By omitting unnecessary transpilation of modules with existing valid emits, Deno reduces the work done before execution, leading to a faster startup.
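To illustrate the skip logic, here is a hypothetical sketch of an emit cache keyed by a source hash. The names, the `EmitCache` class, and the FNV-1a hash are illustrative assumptions, not Deno's actual internals; the point is that an emit is considered valid only if it was produced from exactly the same source text:

```typescript
// Hypothetical emit cache: an emit is reused only when the stored
// source hash matches the hash of the current source text.
function hashSource(source: string): number {
  // simple FNV-1a hash; Deno uses a different hash internally
  let hash = 2166136261;
  for (let i = 0; i < source.length; i++) {
    hash = (hash ^ source.charCodeAt(i)) >>> 0;
    hash = Math.imul(hash, 16777619) >>> 0;
  }
  return hash;
}

class EmitCache {
  private emits = new Map<string, { sourceHash: number; code: string }>();

  // returns the cached emit only when the stored hash matches
  get(specifier: string, sourceHash: number): string | undefined {
    const entry = this.emits.get(specifier);
    return entry && entry.sourceHash === sourceHash ? entry.code : undefined;
  }

  set(specifier: string, sourceHash: number, code: string): void {
    this.emits.set(specifier, { sourceHash, code });
  }
}

const cache = new EmitCache();
const source = "const x: number = 1;";
cache.set("file:///main.ts", hashSource(source), "const x = 1;");

// same source -> cache hit, transpilation is skipped
const hit = cache.get("file:///main.ts", hashSource(source));
// edited source -> cache miss, the module must be re-transpiled
const miss = cache.get("file:///main.ts", hashSource("const x: number = 2;"));
```

A forced reload (e.g. a `--reload` run) would simply bypass the `get` call, so every module gets a fresh emit.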
For a quick recap, here is the code for the main transpile function:
pub fn transpile(&self, options: &EmitOptions) -> Result<TranspiledSource> {
  let program = (*self.program()).clone();
  let source_map = Rc::new(SourceMap::default());
  let source_map_config = SourceMapConfig {
    inline_sources: options.inline_sources,
  };
  let file_name = match ModuleSpecifier::parse(self.specifier()) {
    Ok(specifier) => FileName::Url(specifier),
    Err(_) => FileName::Custom(self.specifier().to_string()),
  };
  source_map.new_source_file(file_name, self.text_info().text().to_string());
  // needs to align with what's done internally in source map
  assert_eq!(1, self.text_info().range().start.as_byte_pos().0);
  // we need the comments to be mutable, so make it single threaded
  let comments = self.comments().as_single_threaded();
  let globals = Globals::new();
  crate::swc::common::GLOBALS.set(&globals, || {
    let top_level_mark = Mark::fresh(Mark::root());
    let program = fold_program(
      program,
      options,
      source_map.clone(),
      &comments,
      top_level_mark,
      self.diagnostics(),
    )?;
    let mut src_map_buf = vec![];
    let mut buf = vec![];
    {
      let mut writer = Box::new(JsWriter::new(
        source_map.clone(),
        "\n",
        &mut buf,
        Some(&mut src_map_buf),
      ));
      writer.set_indent_str("  "); // two spaces
      let config = crate::swc::codegen::Config {
        minify: false,
        ascii_only: false,
        omit_last_semi: false,
        target: ES_VERSION,
      };
      let mut emitter = crate::swc::codegen::Emitter {
        cfg: config,
        comments: Some(&comments),
        cm: source_map.clone(),
        wr: writer,
      };
      program.emit_with(&mut emitter)?;
    }
    let mut src = String::from_utf8(buf)?;
    let mut map: Option<String> = None;
    {
      let mut buf = Vec::new();
      source_map
        .build_source_map_with_config(&src_map_buf, None, source_map_config)
        .to_writer(&mut buf)?;
      if options.inline_source_map {
        src.push_str("//# sourceMappingURL=data:application/json;base64,");
        base64::encode_config_buf(
          buf,
          base64::Config::new(base64::CharacterSet::Standard, true),
          &mut src,
        );
      } else {
        map = Some(String::from_utf8(buf)?);
      }
    }
    Ok(TranspiledSource {
      text: src,
      source_map: map,
    })
  })
}
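The inline-source-map branch above appends the serialized map to the emitted code as a base64 data URL. Here is a minimal sketch of just that final step, assuming a plain `btoa`-based encoder instead of the Rust `base64` crate that Deno uses:

```typescript
// Sketch: append a source map to emitted code as an inline data URL.
// (Simplified; the real code base64-encodes the map produced by swc.)
function inlineSourceMap(code: string, map: object): string {
  const json = JSON.stringify(map);
  const base64 = btoa(json); // safe here: the map JSON is pure ASCII
  return code + "//# sourceMappingURL=data:application/json;base64," + base64;
}

const emitted = inlineSourceMap('console.log("hi");\n', {
  version: 3,
  sources: ["file:///main.ts"],
  mappings: "AAAA",
});
```

Debuggers and stack-trace rewriters locate the trailing `sourceMappingURL` comment and decode it back to JSON, which is how errors in the emitted JS can point at the original TypeScript.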
To cover the process end to end, here is the code that walks the module graph during transpilation:
pub fn cache_module_emits(
  &self,
  graph: &ModuleGraph,
) -> Result<(), AnyError> {
  for module in graph.modules() {
    if let Module::Esm(module) = module {
      let is_emittable = matches!(
        module.media_type,
        MediaType::TypeScript
          | MediaType::Mts
          | MediaType::Cts
          | MediaType::Jsx
          | MediaType::Tsx
      );
      if is_emittable {
        self.emit_parsed_source(&module.specifier, module.media_type, &module.source)?;
      }
    }
  }
  Ok(())
}

pub fn emit_parsed_source(
  &self,
  specifier: &ModuleSpecifier,
  media_type: MediaType,
  source: &Arc<str>,
) -> Result<ModuleCode, AnyError> {
  let source_hash = self.get_source_hash(source);
  if let Some(emit_code) =
    self.emit_cache.get_emit_code(specifier, source_hash)
  {
    Ok(emit_code.into())
  } else {
    // this will use a cached version if it exists
    let parsed_source = self.parsed_source_cache.get_or_parse_module(
      specifier,
      source.clone(),
      media_type,
    )?;
    let transpiled_source = parsed_source.transpile(&self.emit_options)?;
    // ... store the emitted code in the cache and return it
  }
}
These are the steps explained in detail:
  1. Set up the TypeScript compilation configuration.
  2. Go through all the modules in the graph.
  3. If JavaScript should be ignored, skip the JS files.
  4. If a reload wasn't requested and the existing emit is valid, skip the file.
  5. Parse the module if it hasn't been parsed already.
  6. Transpile the module.
  7. Output statistics and the list of loadable modules.
Let's see the same in a flowchart:

Transpile hello world v2

Now that we understand the transpile logic, let's see how it operates for the hello v2 program. There are two files to transpile: the main application file and the machine_id module. Nanoid is already JavaScript, so it doesn't require transpilation.
  • Transpile file:///Users/mayankc/Work/source/denoExamples/helloV2.ts
  • Transpile https://deno.land/x/[email protected]/mod.ts
It's easy to see that the sequence of transpilation follows the order of adding nodes to the graph.
When all modules in the graph are transpiled, the result is a set of modules called "loadable modules": modules that can be handed to V8 as they are. The main module is excluded from this list since it's naturally loadable. Here's the list of loadable modules once transpilation is complete:


The output of transpilation consists of two JS files. First, the main application file:
import { nanoid } from "npm:nanoid";
import { getMachineId } from "https://deno.land/x/machine_id/mod.ts";
const id = nanoid();
const machineId = await getMachineId();
const homeDir = Deno.env.get("HOME");
function printNumber(input) {
  // ...
}
function printString(input) {
  // ...
}
console.log("Nanoid=", id, ", MachineId=", machineId, ", homeDir=", homeDir);
Second, machine_id's mod.ts:
const { run, build, readAll, readFile, env } = Deno;
// Get machine ID
// Permission in Windows: --allow-run --allow-env
// Permission in MacOS: --allow-run
// Permission in Linux: --allow-read
export async function getMachineId() {
  switch (build.os) {
    case "linux":
      return getMachineIDLinux();
    case "windows":
      return getMachineIDWin();
    case "darwin":
      return getMachineIDMac();
  }
  throw new Error(`Not support your operate system '${build.os}'`);
}
function parse(bytes) {
  const output = new TextDecoder().decode(bytes);
  switch (build.os) {
    case "linux":
      return output.trim();
    case "windows":
      return output
        .replace(/\r+|\n+|\s+/gi, "");
    case "darwin": {
      const lines = output.split("\n");
      for (const line of lines) {
        // here is the match line
        // "IOPlatformUUID" = "A8226C69-2364-5B3E-83CC-1A72D7531679"
        if (line.indexOf("IOPlatformUUID") > 0) {
          const [_, val] = line.split(/\s*=\s*/);
          return val.replace(/^"|"$/g, "");
        }
      }
      return "";
    }
  }
  throw new Error(`Not support your operate system '${build.os}'`);
}
async function getMachineIDWin() {
  const winDir = env.get("windir");
  const ps = run({
    stdout: "piped",
    cmd: [
      // ...
    ],
  });
  const output = await readAll(ps.stdout);
  return parse(output);
}
async function getMachineIDMac() {
  const ps = run({
    stdout: "piped",
    cmd: ["ioreg", "-rd1", "-c", "IOPlatformExpertDevice"],
  });
  const output = await readAll(ps.stdout);
  return parse(output);
}
async function getMachineIDLinux() {
  // dbusPath is the default path for dbus machine id.
  const dbusPath = "/var/lib/dbus/machine-id";
  // dbusPathEtc is the default path for dbus machine id located in /etc.
  // Some systems (like Fedora 20) only know this path.
  // Sometimes it's the other way round.
  const dbusPathEtc = "/etc/machine-id";
  return parse(
    await readFile(dbusPath).catch(() => {
      // try fallback path
      return readFile(dbusPathEtc);
    }),
  );
}
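The darwin branch of parse() is the trickiest part of this module: it scans ioreg output for the IOPlatformUUID line and strips the surrounding quotes. Here is a standalone sketch of just that extraction; the sample serial number below is invented, and the UUID is the one from the module's own comment:

```typescript
// Extract IOPlatformUUID from ioreg-style output (mirrors the darwin
// branch of parse(); the sample input is made up for illustration).
function extractPlatformUUID(output: string): string {
  for (const line of output.split("\n")) {
    if (line.indexOf("IOPlatformUUID") > 0) {
      const val = line.split(/\s*=\s*/)[1] ?? "";
      return val.replace(/^"|"$/g, ""); // strip the surrounding quotes
    }
  }
  return "";
}

const sample =
  '    "IOPlatformSerialNumber" = "C02XXXXXXXXX"\n' +
  '    "IOPlatformUUID" = "A8226C69-2364-5B3E-83CC-1A72D7531679"\n';
const uuid = extractPlatformUUID(sample);
```

Note the `> 0` check: a matching ioreg line is always indented, so the key never appears at index 0, and non-matching lines return -1.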
All the modules have now been recursively processed and converted; no TypeScript code remains at this point. It's time to hand the emitted code to V8. We'll cover the registration and instantiation process in the following section.