
Hello World: The Advanced Playground

Every example below prints "Hello, World!" using a feature most developers have encountered but few have truly internalized.

Pradeep Davuluri · February 26, 2026 · 11 min read
JavaScript & TypeScript

"Hello, World!" But Make It Advanced

Every example below prints "Hello, World!", because apparently the first thing any self-respecting language feature should do is prove it can say two words. The output is known. The only thing left to focus on is the mechanism.

00

The Mental Models First

Every feature you've ever struggled to understand (Proxies, Generators, conditional types, infer) has one thing in common: the examples that teach them are too complicated. You're trying to understand the mechanism and the problem it solves and the domain it lives in, all at once.

Hello World fixes that. The output is already known. The problem is already solved. The only thing left to focus on is the mechanism. That's why we use it at the beginning. That's also why we should use it at the deep end.

Before the code, here's what this article is actually teaching:

Feature | Mental Model
Proxy / Reflect | Intercept the language's own operations
Generators | Pausable, resumable computation
Symbol.iterator | Protocol-based polymorphism
Tagged Templates | Transform strings at call-time
SharedArrayBuffer + Atomics | True shared memory between threads
Async Iterators | Pull-based async data streams
Conditional Types + infer | Branching logic at the type level
Template Literal Types | Pattern matching on string shapes
Mapped Types + as | Transform object types structurally
Decorators | Declarative, composable metaprogramming

Now the code.

01

Proxy & Reflect: Intercepting the Language Itself

const handler = {
  get(target, prop) {
    return prop in target
      ? Reflect.get(target, prop)
      : `Hello, ${prop}!`;
  }
};

const greeter = new Proxy({}, handler);
console.log(greeter.World); // "Hello, World!"

Proxy wraps an object and intercepts fundamental operations: property access, assignment, function calls, in checks, delete, even new. The thirteen interceptable operations are called traps.

Reflect is its counterpart: it exposes those same operations as functions, so you can invoke the default behaviour explicitly. Think of it as "do what JavaScript would normally do here." This pattern (intercept, do some work, then call Reflect) is how you build transparent wrappers without breaking anything.

Where you've already seen this: This is how Vue 3's reactivity system works. When you access a reactive object's property, a get trap records the dependency. When you set it, a set trap triggers re-renders. The component tree never knows it's being observed.

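That pattern fits in a few lines. Below is a toy sketch, not Vue's actual code: `reactive` and `watchEffect` are hypothetical minimal stand-ins for the real APIs.

```typescript
// Toy reactivity sketch (hypothetical reactive / watchEffect, not
// Vue's real implementation): get traps record who read a property,
// set traps re-run them.
type Effect = () => void;

let activeEffect: Effect | null = null;
const subscribers = new Map<string | symbol, Set<Effect>>();

function reactive<T extends object>(obj: T): T {
  return new Proxy(obj, {
    get(target, prop, receiver) {
      if (activeEffect) {
        if (!subscribers.has(prop)) subscribers.set(prop, new Set());
        subscribers.get(prop)!.add(activeEffect); // record the dependency
      }
      return Reflect.get(target, prop, receiver); // then do the normal thing
    },
    set(target, prop, value, receiver) {
      const ok = Reflect.set(target, prop, value, receiver);
      subscribers.get(prop)?.forEach(effect => effect()); // trigger re-runs
      return ok;
    },
  });
}

function watchEffect(effect: Effect) {
  activeEffect = effect;
  effect(); // first run registers dependencies via the get trap
  activeEffect = null;
}

const state = reactive({ who: 'there' });
let rendered = '';
watchEffect(() => { rendered = `Hello, ${state.who}!`; });

state.who = 'World'; // the set trap re-runs the effect
console.log(rendered); // "Hello, World!"
```

Note how both traps end by calling Reflect: the wrapper stays transparent, and the bookkeeping happens on the side.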
02

Generators: Computation You Can Pause

function* messageStream(words) {
  let sentence = '';
  for (const word of words) {
    sentence += (sentence ? ', ' : '') + word;
    yield sentence + '!';
  }
}

const gen = messageStream(['Hello', 'World']);
gen.next();               // { value: 'Hello!', done: false }
console.log(gen.next().value); // "Hello, World!"

A generator function returns an iterator. Each call to .next() runs the function until it hits a yield, suspends there, and hands the value back to the caller. The function's entire local state (variables, loop position, call stack) is frozen until .next() is called again.

This is genuinely unusual in programming. Most functions run to completion. Generators are coroutines: they cooperate with their caller, passing control back and forth.

What's built on top of this: Two things. First, async/await: the runtime desugars it into a generator coordinated by a Promise-aware driver. Second, infinite sequences: a generator can yield forever, and callers decide when to stop pulling. You can model an infinite Fibonacci sequence, an unbounded event stream, or a paginated data source as a generator.

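The infinite-sequence case is short enough to show. `fibonacci` and the `take` helper below are illustrative names, not library functions:

```typescript
// An infinite sequence: fibonacci yields forever, and the caller-side
// take helper decides when to stop pulling.
function* fibonacci(): Generator<number> {
  let [a, b] = [0, 1];
  while (true) {
    yield a; // suspend here; resume only on the next pull
    [a, b] = [b, a + b];
  }
}

function take<T>(iterable: Iterable<T>, n: number): T[] {
  const out: T[] = [];
  for (const value of iterable) {
    if (out.length === n) break; // stop pulling; the generator never resumes
    out.push(value);
  }
  return out;
}

console.log(take(fibonacci(), 8)); // [0, 1, 1, 2, 3, 5, 8, 13]
```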
03

Symbol.iterator: The Protocol Behind for…of

const hello = {
  parts: ['Hello', 'World'],
  [Symbol.iterator]() {
    let i = 0;
    return {
      next: () =>
        i < this.parts.length
          ? { value: this.parts[i++], done: false }
          : { value: undefined, done: true }
    };
  }
};

console.log([...hello].join(', ') + '!'); // "Hello, World!"

for...of, spread (...), and destructuring don't know or care about Arrays. They know about the iterator protocol: if an object has a [Symbol.iterator]() method that returns an object with a .next() method, it's iterable. That's the entire contract.

This is protocol-based polymorphism, the same idea as interfaces in typed languages, except enforced by convention and Symbols rather than types. Arrays, Sets, Maps, Strings, NodeLists, and generator objects all implement this protocol. Your own objects can too.

Why Symbol and not a string? The Symbol namespace exists specifically to avoid naming collisions. Symbol.iterator is a globally unique key that will never clash with a user-defined property named "iterator". This is how the language safely adds new protocol hooks without breaking existing code.
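A small sketch of why the Symbol key matters: an object can carry an ordinary string property named iterator and still implement the protocol, with no collision.

```typescript
// A string-keyed "iterator" property and the Symbol.iterator protocol
// coexist on the same object without clashing.
const range = {
  iterator: 'just an ordinary property', // never confused with the protocol
  from: 1,
  to: 3,
  *[Symbol.iterator]() {
    for (let i = this.from; i <= this.to; i++) yield i;
  },
};

console.log([...range]);     // [1, 2, 3]
console.log(range.iterator); // "just an ordinary property"
```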
04

Tagged Template Literals: Functions That Own Their Syntax

function emphasize(strings, ...values) {
  return strings.reduce((result, str, i) => {
    const val = values[i - 1];
    return result + (val ? `*${val.toUpperCase()}*` : '') + str;
  });
}

const who = 'world';
console.log(emphasize`Hello, ${who}!`); // "Hello, *WORLD*!"

A tagged template is a function call with unusual syntax. The tag function receives two things: an array of the static string segments, and the interpolated values as separate arguments. It can then do anything with them: transform, escape, validate, translate, or throw.

The static parts and dynamic parts are delivered separately, which means the tag function sees the structure of the string, not just the final interpolated result. It has full control over assembly.

Libraries built on this pattern: styled-components uses it to write CSS in template literals and know which parts are static styles vs. dynamic prop-based values. gql parses GraphQL at the call site. SQL libraries use it to safely parameterize queries without string concatenation. The tag function receives the query structure and the values separately, making injection attacks structurally impossible.

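The SQL case fits in a few lines. The `sql` tag below is a hypothetical minimal version of what such libraries do, not any specific library's API:

```typescript
// A hypothetical minimal sql tag: static parts become the query text
// with $1, $2… placeholders; values travel separately to the driver.
interface Query {
  text: string;
  values: unknown[];
}

function sql(strings: TemplateStringsArray, ...values: unknown[]): Query {
  // reduce without an initial value: starts from strings[0] and weaves
  // a $i placeholder in front of each following static part.
  const text = strings.reduce((query, part, i) => query + '$' + i + part);
  return { text, values };
}

const userInput = "World'; DROP TABLE users; --";
const q = sql`SELECT * FROM greetings WHERE name = ${userInput}`;

console.log(q.text);   // "SELECT * FROM greetings WHERE name = $1"
console.log(q.values); // the raw value, bound by the driver, never spliced in
```

The malicious input never touches the query text; it only ever exists as a bound parameter.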
05

SharedArrayBuffer & Atomics: Shared Memory Between Threads

// main.js
const sab = new SharedArrayBuffer(1024);
const arr = new Uint8Array(sab);
new TextEncoder().encodeInto('Hello, World!', arr);

const worker = new Worker('worker.js');
worker.postMessage(sab);

// worker.js
self.onmessage = ({ data: sab }) => {
  const arr = new Uint8Array(sab);
  console.log(new TextDecoder().decode(arr).replace(/\0.*/, ''));
  // "Hello, World!"
};

JavaScript is single-threaded, except when it isn't. Web Workers run in separate threads with their own event loops and memory. Normally, data passed between them is copied via structured clone. SharedArrayBuffer breaks that rule: it's a region of memory that both threads can read and write simultaneously.

That word "simultaneously" is where Atomics comes in. Without coordination, two threads writing to the same location produce a data race: undefined, non-deterministic behaviour. Atomics provides operations guaranteed to be indivisible: compareExchange, add, wait, notify. These are the building blocks of lock-free concurrent algorithms.
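Those operations can be tried directly, even on a single thread; the guarantees only start to matter once a worker shares the same buffer. (Node exposes SharedArrayBuffer without the browser's header requirements.)

```typescript
// Atomics operations on shared memory, demonstrated on one thread.
const sharedBuf = new SharedArrayBuffer(4);
const counter = new Int32Array(sharedBuf);

Atomics.add(counter, 0, 5);                 // indivisible read-modify-write
Atomics.compareExchange(counter, 0, 5, 42); // set to 42 only if still 5
console.log(Atomics.load(counter, 0));      // 42
```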

Not for everyday use: This is for compute-heavy workloads (image processing, physics engines, WASM interop) where copying data between threads is too expensive. SharedArrayBuffer requires Cross-Origin-Opener-Policy and Cross-Origin-Embedder-Policy headers to be enabled, a security requirement introduced after the Spectre vulnerability.

06

Async Iterators: Pull-Based Async Streams

async function* streamWords(words, delay = 100) {
  for (const word of words) {
    await new Promise(res => setTimeout(res, delay));
    yield word;
  }
}

(async () => {
  const parts = [];
  for await (const word of streamWords(['Hello', 'World'])) {
    parts.push(word);
  }
  console.log(parts.join(', ') + '!'); // "Hello, World!"
})();

An async generator combines two ideas: the pausable execution of generators with the async resolution of Promises. Each yield suspends the function. Each await inside waits for a Promise. The caller, using for await...of, pulls one value at a time and waits for each one to resolve before asking for the next.

This is pull-based: the consumer controls the pace, the producer only runs when asked. It's the right model for paginated APIs, real-time event streams, and anywhere you'd otherwise reach for a callback-heavy stream library.

Where the protocol already exists: Node.js readable streams implement the async iterator protocol natively, as of Node 10. So does the Fetch API's response.body. This means you can for await...of a file read stream, a network response body, or any source that drips data over time, with no library required.
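The paginated-API case can be sketched with a stub. `fetchPage` below is a hypothetical stand-in for a real API call:

```typescript
// A paginated source as an async generator: each page is fetched only
// when the consumer pulls for more.
async function fetchPage(page: number): Promise<{ items: string[]; last: boolean }> {
  const pages = [['Hello'], ['World']]; // stand-in for a real API response
  return { items: pages[page], last: page === pages.length - 1 };
}

async function* allItems(): AsyncGenerator<string> {
  for (let page = 0; ; page++) {
    const { items, last } = await fetchPage(page); // fetched on demand
    yield* items;
    if (last) return;
  }
}

const result = (async () => {
  const parts: string[] = [];
  for await (const item of allItems()) parts.push(item);
  return parts.join(', ') + '!';
})();

result.then(console.log); // "Hello, World!"
```

The consumer never sees pages; it sees a flat stream of items, and pagination stays an implementation detail of the generator.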

07

Conditional Types & infer: Logic at the Type Level

type Unwrap<T> = T extends Promise<infer U> ? U : T;

type Message = Unwrap<Promise<'Hello, World!'>>;
// resolves to: 'Hello, World!'

const msg: Message = 'Hello, World!';
console.log(msg);

Conditional types give the type system an if/else. The expression T extends Promise<infer U> ? U : T reads: "if T is a Promise wrapping some type U, resolve to U; otherwise resolve to T itself." The infer keyword captures that inner type U in the same step as the pattern match.

This is how TypeScript's built-in utility types are implemented. ReturnType<T> uses infer to capture what a function returns. Awaited<T> uses it recursively to unwrap nested Promises. Parameters<T> uses it to extract a function's argument tuple. None of these would be possible with simple generics alone.
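None of those built-ins are magic; hand-rolled versions (hypothetical `MyReturnType` / `MyParameters` names) take one line each:

```typescript
// Hand-rolled versions of two built-in utility types, each a single
// conditional type with infer.
type MyReturnType<T> = T extends (...args: any[]) => infer R ? R : never;
type MyParameters<T> = T extends (...args: infer P) => any ? P : never;

function greet(name: string) {
  return `Hello, ${name}!`;
}

// The compiler fills these in from greet's signature:
const args: MyParameters<typeof greet> = ['World'];    // [name: string]
const msg: MyReturnType<typeof greet> = greet(...args); // string

console.log(msg); // "Hello, World!"
```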

The mental shift: Stop thinking of types as labels you attach to things. Start thinking of them as values you can compute with, values that branch, recurse, and transform based on the shape of their inputs. That shift is what separates intermediate TypeScript from advanced TypeScript.

08

Template Literal Types: Pattern Matching on Strings

type Greeting = `Hello, ${string}!`;

const a: Greeting = 'Hello, World!';      // ✅
const b: Greeting = 'Hello, TypeScript!'; // ✅
const c: Greeting = 'Goodbye, World!';    // ❌ Type error

console.log(a); // "Hello, World!"

Template literal types bring string-level pattern matching into the type system. Not just "this is a string" but "this is a string that starts with 'Hello, ' and ends with '!'."

Combined with infer, they become a compile-time string parser:

type ExtractName<T extends string> =
  T extends `Hello, ${infer Name}!` ? Name : never;

type Who = ExtractName<'Hello, World!'>; // 'World'

Where this shows up in real codebases: Typed routing libraries use this to infer URL parameters from path strings like '/users/:id/posts/:postId'. Typed event emitters use it to map event name strings to their payload types. It moves validation that previously happened at runtime (throwing errors on unknown event names or malformed routes) into the compiler, where the feedback loop is instant.

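The routing case can be sketched with hypothetical `Params` / `RouteParams` helpers: a recursive conditional type pulls every `:param` segment out of a path string.

```typescript
// Recursively extract each `:param` segment from a path string.
type Params<Path extends string> =
  Path extends `${string}:${infer P}/${infer Rest}`
    ? P | Params<Rest>
    : Path extends `${string}:${infer P}`
      ? P
      : never;

type RouteParams<Path extends string> = { [K in Params<Path>]: string };

// Params<'/users/:id/posts/:postId'> resolves to 'id' | 'postId', so the
// compiler rejects a missing or misspelled parameter at the call site.
const params: RouteParams<'/greet/:who'> = { who: 'World' };
console.log(`Hello, ${params.who}!`); // "Hello, World!"
```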
09

Mapped Types + as: Structural Type Transformation

type Prefixed<T extends Record<string, unknown>> = {
  [K in keyof T as `hello_${K & string}`]: T[K];
};

type Result = Prefixed<{ world: string; everyone: string }>;
// { hello_world: string; hello_everyone: string }

const obj: Result = {
  hello_world: 'Hello, World!',
  hello_everyone: 'Hello, Everyone!',
};
console.log(obj.hello_world); // "Hello, World!"

Mapped types iterate over the keys of a type and produce a new type, the type-level equivalent of Array.map. The as clause (added in TypeScript 4.1) allows key remapping: rename, filter, or transform keys as part of the mapping.

Filtering is particularly powerful: mapping a key to never removes it from the result type entirely:

type OnlyStrings<T> = {
  [K in keyof T as T[K] extends string ? K : never]: T[K];
};

The foundation of the standard library: Pick, Omit, Readonly, and Partial are all implemented as mapped types. Understanding this means you can build your own structural transformations instead of hunting npm for a utility that does exactly what you need.

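Hand-rolling two of them (as hypothetical `MyPartial` / `MyPick` names) shows how little is underneath:

```typescript
// Hand-rolled equivalents of two standard-library mapped types.
type MyPartial<T> = { [K in keyof T]?: T[K] };
type MyPick<T, K extends keyof T> = { [P in K]: T[P] };

interface Greeting {
  salutation: string;
  target: string;
  punctuation: string;
}

// MyPartial makes every field optional; MyPick keeps only the named keys.
const draft: MyPartial<Greeting> = { salutation: 'Hello' };
const core: MyPick<Greeting, 'salutation' | 'target'> = {
  salutation: draft.salutation ?? 'Hi',
  target: 'World',
};

console.log(`${core.salutation}, ${core.target}!`); // "Hello, World!"
```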
10

Decorators: Annotating Behaviour Declaratively

function log(target: Function, context: ClassMethodDecoratorContext) {
  return function (this: unknown, ...args: unknown[]) {
    const result = target.apply(this, args);
    console.log(`[${String(context.name)}] →`, result);
    return result;
  };
}

class Greeter {
  @log
  greet(name: string) {
    return `Hello, ${name}!`;
  }
}

new Greeter().greet('World');
// [greet] → Hello, World!

A decorator is a function applied to a class, method, accessor, or field at definition time. It receives the thing being decorated and a context object, and can return a replacement. The Stage 3 decorators shown here (shipping in TypeScript 5+) are cleaner than the legacy experimental ones, with no dependency on Reflect.metadata.

The real power is composability. Stack multiple decorators and each one wraps the previous:

class API {
  @cache
  @retry({ times: 3 })
  @log
  async fetchUser(id: string) { ... }
}

The frameworks that depend on this: Angular's component system, NestJS's routing, and MikroORM's entity definitions are all built on decorators: behaviour declared at the class level, applied automatically by the framework. The decorator is a contract between the class and the framework, expressed in syntax rather than configuration.

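A runnable version of stacking, with two hypothetical minimal decorators (requires TypeScript 5+ standard decorators):

```typescript
// Two minimal method decorators (illustrative, not framework code).
function log(target: Function, context: ClassMethodDecoratorContext) {
  return function (this: unknown, ...args: unknown[]) {
    const result = target.apply(this, args);
    console.log(`[${String(context.name)}]`, result);
    return result;
  };
}

function exclaim(target: Function, _context: ClassMethodDecoratorContext) {
  return function (this: unknown, ...args: unknown[]) {
    return target.apply(this, args) + '!';
  };
}

class StackedGreeter {
  @log      // applied second: sees the exclaimed result
  @exclaim  // applied first: wraps the original method
  greet(name: string) {
    return `Hello, ${name}`;
  }
}

console.log(new StackedGreeter().greet('World')); // "Hello, World!"
```

Decorators closest to the method apply first, so the outermost decorator wraps everything beneath it, exactly like nested function calls.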
11

The Point

None of this is trivia. Every feature here reflects a real design decision, a problem the language designers faced and solved in a specific way.

Proxies exist because frameworks needed reactivity without cooperative objects. Generators exist because async control flow needed to be expressible without callback nesting. infer exists because the type system needed to be expressive enough to model real APIs. Decorators exist because cross-cutting concerns were being bolted on with fragile conventions anyway.

When you understand the why, the syntax stops being something you look up and starts being something you reach for.

Hello World didn't teach you what programs do. It proved you could run one. These examples don't teach you what these features do. They prove you can think in them.

The output is always the same. The understanding never is. Every example in this article resolves to the same two words. That's the whole point. When the result is already known, your cognitive budget goes entirely to the mechanism, which is the only thing worth learning here.