
[DotJs2025] Node.js Will Use All the Memory Available, and That’s OK!

In the pulsating heart of server-side JavaScript, where applications hum under relentless loads, a persistent myth endures: Node.js’s voracious appetite for RAM signals impending doom. Matteo Collina, co-founder and CTO at Platformatic, dismantled this notion at dotJS 2025, revealing how V8’s sophisticated heap stewardship—far from a liability—empowers resilient, high-throughput services. With over 15 years sculpting performant ecosystems, including Fastify’s lean framework and Pino’s swift logging, Matteo illuminated the elegance of embracing memory as a strategic asset, not an adversary. His thesis: judicious tuning transforms perceived excess into a catalyst for latency gains and stability, and he urged developers to recalibrate their preconceptions in pursuit of enterprise-grade robustness.

Matteo commenced with a ritual lament: weekly pleas from harried coders convinced their apps hemorrhage resources, only to confess manual terminations at arbitrary thresholds—no crashes, merely preempted panics. This vignette unveiled the crux: Node’s default 1.4GB cap (64-bit) isn’t a leak’s harbinger but a deliberate throttle, safeguarding against unchecked sprawl. True leaks—orphaned closures, eternal event emitters—defy GC’s mercy, accruing via retain cycles. Yet, most “leaks” masquerade as legitimate growth: caches bloating under traffic, buffers queuing async floods. Matteo advocated profiling primacy: Chrome DevTools’ heap snapshots, clinic.js’s flame charts—tools unmasking culprits sans conjecture.
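
To ground the distinction between a true leak and legitimate growth, here is a minimal sketch of snapshot-driven triage using Node’s built-in v8 module; the leaky-emitter shape and the file names are illustrative, not drawn from the talk:

```js
const v8 = require('v8');
const EventEmitter = require('events');

const emitter = new EventEmitter();
emitter.setMaxListeners(0); // silence the listener-count warning for this demo

// A classic "eternal event emitter" leak: every request registers a
// closure that is never removed, so handlers accumulate indefinitely.
function handleRequest(id) {
  emitter.on('flush', () => console.log(`flushing request ${id}`));
}

// Snapshot before and after load; diffing the two files in Chrome
// DevTools' Memory tab shows which constructors grew, and why.
v8.writeHeapSnapshot('before.heapsnapshot');
for (let i = 0; i < 10_000; i++) handleRequest(i);
v8.writeHeapSnapshot('after.heapsnapshot');
```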

Delving into V8’s bowels, Matteo traced the Orinoco collector’s cadence: minor sweeps scavenging new-space detritus, majors consolidating old-space survivors. Latency lurks in these pauses; unchecked heaps amplify them, stalling event loops. His panacea: hoist the ceiling via --max-old-space-size=4096, bartering RAM for elongated intervals between majors. Benchmarks corroborated: raising a Fastify service’s heap to 4GB slashed P99 latency by 8-10%, throughput surging analogously—thinner GC curves yielding smoother sails. This alchemy, Matteo posited, flips economics: memory’s abundance (cloud’s elastic reservoirs) trumps compute’s scarcity, especially as SSDs eclipse HDDs in I/O velocity.
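
The flag itself is stock Node; a minimal sketch, assuming a hypothetical server.js entry point, of raising the ceiling and then verifying from inside the process what limit V8 actually adopted:

```js
// Launch with a 4 GiB old-space ceiling, mirroring the talk's example:
//   node --max-old-space-size=4096 server.js

// Then confirm the limit from within the process:
const v8 = require('v8');

const { heap_size_limit, used_heap_size } = v8.getHeapStatistics();
console.log(`heap limit: ${(heap_size_limit / 1024 ** 2).toFixed(0)} MiB`);
console.log(`heap used:  ${(used_heap_size / 1024 ** 2).toFixed(0)} MiB`);
```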

Enterprise vignettes abounded. Platformatic’s observability suite, Pino’s zero-allocation streams—testaments to lean design—thrive sans austerity. Matteo cautioned: leaks persist, demanding vigilance—nullify globals, prune listeners, wield weak maps for caches. Yet, fear not the fullness; it’s V8’s vote of confidence in your workload’s vitality. As Kubernetes autoscalers and monitoring recipes (his forthcoming tome’s bounty) democratize, Node’s memory ethos evolves from taboo to triumph.
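
A minimal sketch of that hygiene; the memoize and subscribe helpers are hypothetical names for illustration:

```js
// A WeakMap-keyed cache: entries die with their keys, so cached values
// never outlive the requests they belong to.
const cacheByRequest = new WeakMap();

function memoize(req, compute) {
  if (!cacheByRequest.has(req)) cacheByRequest.set(req, compute(req));
  return cacheByRequest.get(req);
}

// Register and deregister listeners symmetrically: every .on() needs a
// matching .off() on teardown, or closures outlive their purpose.
function subscribe(emitter, event, handler) {
  emitter.on(event, handler);
  return () => emitter.off(event, handler); // call this on shutdown
}
```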

Demystifying Heaps and Collectors

Matteo dissected V8’s realms: new-space for ephemeral allocations, old-space for tenured stalwarts—Orinoco’s incremental majors mitigating stalls. Defaults constrain; elevations liberate, as 2025’s guides affirm: monitor via --inspect, profile with the heapdump module, and tune for roughly 10% latency dividends sans leaks.
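
Continuous observation complements snapshot diffing; a lightweight loop over Node’s process.memoryUsage(), with an illustrative 15-second cadence:

```js
// Log resident set size and heap occupancy on a fixed interval.
setInterval(() => {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  const mib = (n) => (n / 1024 ** 2).toFixed(0);
  console.log(`rss=${mib(rss)}MiB heap=${mib(heapUsed)}/${mib(heapTotal)}MiB`);
}, 15_000).unref(); // unref() keeps the timer from pinning the process open
```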

Trading Bytes for Bandwidth

Empirical edges: Fastify’s trials evince heap hikes yielding throughput boons, GC pauses pruned. Platformatic’s ethos—frictionless backends—embodies this: Pino’s streams, Fastify’s routers, all memory-savvy. Matteo’s gift: enterprise blueprints, from K8s scaling to on-prem Next.js, in his 296-page manifesto.


[NodeCongress2021] Can We Double HTTP Client Throughput? – Matteo Collina

HTTP clients, the sinews of distributed dialogues, harbor untapped vigor amid presumptions of stasis. Matteo Collina, Node.js TSC stalwart, Fastify co-architect, and Pino progenitor, challenges this inertia, unveiling Undici—an HTTP/1.1 vanguard doubling, nay tripling, Node’s native throughput by evading head-of-line (HOL) blocking.

Matteo’s odyssey traces TCP/IP genesis: Nagle’s algorithm coalesces small writes, holding them until outstanding ACKs arrive—elegant for telnet, anathema for HTTP’s pipelined pleas. Keep-alive sustains sockets, reusing them across requests; yet core http’s single in-flight request per connection bottlenecks bursts.
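
Core http does expose handles for both behaviors; a minimal sketch, with example.com as a placeholder origin. Note that even with keep-alive, each socket still carries only one request at a time, the very bottleneck Undici attacks:

```js
const http = require('http');

// keepAlive reuses sockets across requests instead of paying a fresh
// TCP handshake per call; maxSockets caps parallelism per origin.
const agent = new http.Agent({ keepAlive: true, maxSockets: 10 });

const req = http.get({ host: 'example.com', path: '/', agent }, (res) => {
  res.resume(); // drain the body so the socket returns to the pool
});
req.setNoDelay(true); // switch Nagle's coalescing off for chatty workloads
```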

Undici disrupts: connection pools parallelize, pipelining dispatches volleys sans serialization. Matteo benchmarks: the native client plateaus at its baseline; Undici’s agents—with configurable concurrency—surge roughly 3x, its stream API minimizing redundant buffering and JSON parses.
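
A minimal sketch of that pooled, pipelined dispatch; the localhost origin and the connection and pipelining counts are illustrative:

```js
const { Pool } = require('undici');

const pool = new Pool('http://localhost:3000', {
  connections: 10, // parallel sockets opened to the origin
  pipelining: 10,  // requests allowed in flight per socket
});

async function main() {
  const { statusCode, body } = await pool.request({ path: '/', method: 'GET' });
  console.log(statusCode, await body.text());
  await pool.close();
}

main().catch(console.error);
```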

Mitigating Head-of-Line Shadows

HOL’s specter—prior stalls cascade—yields to Undici’s ordered queues, responses slotted sans reordering. Matteo codes: fetch wrappers proxy the native API, agents tune per-origin behavior—a pipelining depth above one unleashes floods.
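
A sketch of the wrapper idea using current Undici exports (fetch and setGlobalDispatcher both ship with the library today, post-dating this 2021 talk), with illustrative tuning values and a placeholder origin:

```js
const { Agent, setGlobalDispatcher, fetch } = require('undici');

// The Agent manufactures a pool per origin, so these knobs apply to
// every destination the process talks to.
setGlobalDispatcher(new Agent({ connections: 8, pipelining: 4 }));

fetch('http://localhost:3000/health')
  .then((res) => res.json())
  .then(console.log)
  .catch(console.error);
```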

Comparisons affirm: Undici’s spec-strictness trumps core’s leniency, and the APIs diverge—request and stream grant granular control. Fastify’s proxy needs birthed Undici, and Robert Nagy’s polish primed it for production.
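
That granularity is easiest to see in the stream API, where the response body is piped straight into a writable of your choosing, skipping intermediate buffering; the origin and file path below are placeholders:

```js
const { stream } = require('undici');
const fs = require('fs');

stream(
  'http://localhost:3000/report',
  { method: 'GET' },
  ({ statusCode }) => {
    console.log('status:', statusCode);
    return fs.createWriteStream('./report.json'); // body is piped here
  }
).catch(console.error);
```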

Matteo’s clarion—agents mandatory, Undici transformative—ushers HTTP’s renaissance, slashing latencies in microservice meshes.
