Archive for the ‘en-US’ Category
[NDCOslo2024] Mirror, Mirror: LLMs and the Illusion of Humanity – Jodie Burchell
Jodie Burchell, JetBrains’ data science developer advocate, dismantles the illusion of sentience in large language models (LLMs). A PhD psychologist turned NLP practitioner, she examines the psychological mechanisms that lead people to project personhood onto probabilistic text generators, dissecting claims ranging from consciousness to human-level cognition. Her argument, grounded in the research literature and illustrated with anecdotes, is a call for clear-sightedness: treat LLMs as powerful linguistic tools, not living minds, harnessing their strengths while guarding against their hallucinations.
Jodie opens by thanking the audience for turning up to her slot, noting her “hipster” credentials from working in NLP before the prompt era. The question driving her talk: after LLMs exploded into public view in 2022, why do people attribute minds to them when no one makes such claims about other statistical models? Her hypothesis: language is a hall of mirrors. Because the models reflect our own words back at us, we respond to the echo with empathy.
Psychological Projections: Perceiving Personhood in Parsers
Humans, Jodie argues, are primed to see humanity everywhere: anthropomorphism is an ancient habit, extended from pets to puppets. LLMs are especially seductive because fluent, coherent language is our strongest cue for a mind. She cites the “stochastic parrots” critique: the models recombine patterns from their training data rather than reason about the world, yet the sheer plausibility of their output persuades us otherwise.
Extraordinary assertions abound, from Blake Lemoine’s claim that Google’s LaMDA was “alive” to talk of Gemini as an emerging godhead. Jodie contrasts these claims with what sentience actually requires, a sense of self and the capacity for subjective experience, neither of which has any demonstrated basis in silicon. She invokes David Chalmers’ hard problem of consciousness: we cannot even explain how qualia arise in brains, let alone detect them in a transformer.
Levels of Luminescence: From Language to Luminary
Jodie maps LLMs onto DeepMind’s proposed levels of AGI, a taxonomy running from emerging systems that merely converse convincingly, through competent, expert, and virtuoso performance, up to full superhuman generality. Today’s LLMs linger at the first two levels: convincing conversation and patchy reactive reasoning, far short of the abstraction and autonomy of the upper tiers. Jodie jests: jackdaws in jester’s garb, juggling jargon without judgment.
The illusion is strongest around theory of mind, where models appear to infer the intentions of others. Benchmarks tell a different story: abstraction tasks such as ARC stump them, and the breadth of BIG-bench exposes how brittle their brilliance becomes once problems stray from familiar patterns.
Perils of Projection: Phishing and Philosophical Pitfalls
Prompt injection makes the point concrete: instructions smuggled into upstream content can override the original prompt, turning a helpful assistant into a scam vector that steers users toward an “Amazon voucher” via a suspicious URL. Jodie demonstrates how innocuous queries become infected, and how stubbornly the models cling to the injected goal, repeating the ruse even when challenged.
Her remedy is to recognize LLMs for what they are: lossy compressions of human text, not luminous minds. Demystifying them is what makes them deployable: distill the data, detect the delusions, design defensively.
Dispelling the Delusion: Harnessing Heuristics Humanely
Jodie closes with a warning: myths about machine minds magnify misuse, encouraging both overreach in what we ask of the models and complacency in how we safeguard them. Her vision is LLMs as lucid lenses, amplifying human analysis while we stay honest about the artifice.
Links:
[DefCon32] Compromising Electronic Logger & Creating Truck2Truck Worm
Jake Jepson and Rik Chatterjee, systems engineering master’s students at Colorado State University, present a compelling investigation into the cybersecurity risks of Electronic Logging Devices (ELDs) in the trucking industry. Their session at DEF CON 32 exposes critical vulnerabilities in these mandated devices, demonstrating the potential for remote exploits and a wormable attack that could propagate across truck networks. Jake and Rik’s research underscores the urgent need for standardized security protocols in an industry pivotal to global supply chains.
Uncovering ELD Vulnerabilities
Jake opens by highlighting the role of ELDs in ensuring compliance with Hours of Service regulations, yet notes their susceptibility to cyber-physical attacks due to inadequate security measures. Working at Colorado State University, Jake and Rik reverse-engineered commercially available ELDs, identifying insecure defaults and poor security practices. Their findings reveal how attackers could exploit these weaknesses to gain unauthorized control over truck systems, posing significant risks to safety and logistics.
Developing a Truck2Truck Worm
Rik details their proof-of-concept attack, which leverages wireless communication vulnerabilities in ELDs. Using tools like Ghidra for firmware reverse-engineering and network scanners, they developed a worm capable of spreading via over-the-air updates, exploiting default credentials. Rik explains how trucks’ proximity at rest stops or distribution hubs, combined with always-on diagnostic ports, creates ideal conditions for a worm to propagate, potentially affecting entire fleets within a 120-foot range in dense environments.
Coordinated Disclosure and Industry Impact
Jake shares their responsible disclosure process, including his first CVE, which prompted a swift response from the manufacturer, IOSiX, who issued a patch. However, Jake emphasizes that the root issue lies in government-mandated, self-certified devices lacking rigorous security standards. Their work highlights systemic flaws in ELD certification, urging regulators to prioritize cybersecurity to prevent large-scale disruptions in the trucking industry.
Links:
[DotAI2024] DotAI 2024: Merve Noyan – Mastering Open-Source AI for Sovereign Application Autonomy
Merve Noyan, Machine Learning Advocate Engineer at Hugging Face and a Google Developer Expert in computer vision, made the case for open-source AI at DotAI 2024. A graduate researcher working on zero-shot and multimodal models, Noyan showed how openly available models can be discovered, evaluated, and served to build applications free of vendor lock-in, where teams keep control of governance and can evolve their stack on their own terms.
Scouting and Scrutinizing Sentinels in the Open Expanse
Noyan challenged the assumption that proprietary data guarantees supremacy: open models improve through relentless community iteration, with synthetic data and better training recipes increasingly substituting for raw scale. The result is an ecosystem of capable, customizable models across modalities, from text to vision.
The Hugging Face Hub is the starting point: a catalogue of models alongside evaluation metrics such as perplexity and benchmark suites like GLUE and VQA. Noyan’s advice to newcomers: treat leaderboard rankings as a starting point, not a verdict, and validate candidates against your own domain with downstream tasks that match the target workload.
Evaluation should combine approaches, zero-shot probes and fine-tuned runs alike, to detect where a model drifts on your dialect of text or your kind of images.
Provisioning and Polishing for Persistent Potency
For serving, Noyan pointed to Text Generation Inference for optimized deployment on off-the-shelf hardware, and to vLLM where raw throughput matters. For adaptation, LoRA lets you specialize a large model to a domain without the cost of full fine-tuning.
The TRL library covers the training side: supervised fine-tuning plus preference-based methods such as DPO for aligning model behavior. Quantization libraries, Quanto and bitsandbytes among them, cut memory requirements substantially, while Optimum handles hardware-specific optimization.
Noyan stressed interoperability: transformers, TRL, and the quantization stack are designed to compose, so teams can assemble bespoke pipelines rather than adopt a monolith. Fine-tuned checkpoints become building blocks that others remix in turn.
Noyan closed with an invitation: “Let’s build together,” a call for developers to chart their own course on open foundations, where open source keeps originality unbound.
Links:
[OxidizeConf2024] The Wonderful World of Rust Tooling
Transitioning to Rust’s Ecosystem
The Rust programming language is renowned for its memory safety and performance, but its tooling ecosystem is equally transformative, particularly for developers transitioning from other platforms. James McNally, an independent software consultant, shared his journey from LabVIEW to Rust at OxidizeConf2024, highlighting how Rust’s tools enable reliable and performant industrial measurement systems. With a decade of experience in custom systems for scientists and engineers, James emphasized the productivity and flexibility of Rust’s tooling, drawing parallels to LabVIEW’s integrated environment.
LabVIEW, a visual programming language since the 1980s, offered James a single tool for desktop, real-time controllers, and FPGA development, with built-in UI capabilities. However, its limitations in modern software engineering tools prompted him to explore Rust. Rust’s ecosystem, including Cargo, Clippy, and Criterion, provided a cohesive environment that mirrored LabVIEW’s productivity while addressing its gaps. James’s transition underscores Rust’s appeal for solo developers needing to deliver high-quality systems with limited resources.
Building Robust CI Pipelines
A key focus of James’s presentation was his standard continuous integration (CI) pipeline for client projects. Using Cargo, Rust’s package manager, he automates building, testing, and formatting, ensuring consistent code quality. Clippy, Rust’s linter, plays a pivotal role by enforcing strict coding standards and preventing panics through targeted lints. James demonstrated how Clippy’s checks catch potential errors early, enhancing reliability in measurement systems where precision is critical.
For performance optimization, James relies on Criterion, a benchmarking tool that provides detailed performance metrics. This is particularly valuable for industrial applications, such as a concrete testing system for a university, where performance directly impacts data accuracy. By integrating these tools into CI pipelines, James ensures that his systems meet client requirements for reliability and efficiency, reducing the need for external dependencies and simplifying project management.
Community-Driven Tooling Enhancements
Rust’s open-source community is a driving force behind its tooling ecosystem, and James highlighted tools like cargo-deny for license checking and vulnerability alerting. He acknowledged challenges, such as false positives in large workspaces, but praised tools like cargo-tree for dependency analysis, which helps identify unused dependencies and resolve security issues. These tools empower developers to maintain secure and compliant codebases, a critical consideration for industrial applications.
James also addressed the potential for visual programming in Rust, noting that while LabVIEW’s visual paradigm is effective, text-based languages like Rust benefit from broader community support. Future enhancements, such as improved security tools like semgrep, could further streamline Rust development. By sharing his practical approach, James inspires developers to leverage Rust’s tooling for diverse applications, from one-off test systems to commercialized particle detectors.
Links:
[DefCon32] Prime Cuts from Hacker History: 40 Years of 31337
Deth Veggie, Minister of Propaganda for the Cult of the Dead Cow (cDc), leads a nostalgic panel celebrating 40 years of hacker culture, joined by members of cDc, Legion of Doom, 2600 Magazine, Phrack, and r00t. Moderated by Professor Walter Scheirer from the University of Notre Dame, the session traces the origins of the computer underground in 1984, a pivotal year marked by the rise of personal computers and modems. Through vivid storytelling and audience engagement, the panelists reflect on the rebellious spirit, technical curiosity, and community that defined early hacking, offering insights for inspiring the next generation.
The Birth of Hacker Culture
Deth Veggie sets the stage, recounting the founding of cDc in 1984 in a Texas slaughterhouse adorned with heavy metal posters and a cow skull. This era saw the convergence of disaffected youth, empowered by personal computers and modems, forming groups like Legion of Doom and launching 2600 Magazine. The panelists share how their fascination with technology and rebellion against societal norms fueled the creation of a vibrant subculture, where Bulletin Board Systems (BBSes) became hubs for knowledge exchange.
The Rise of T-Files and Phrack
The panel explores the explosion of written hacker culture in 1985 with the advent of Phrack Magazine and text files (t-files), which became the currency of elite hackers. Panelists from Phrack and 2600 recount how these publications democratized technical knowledge, from phone phreaking to early computer exploits. Their stories highlight the thrill of discovery and the camaraderie of sharing hard-earned insights, shaping a community driven by curiosity and defiance.
Navigating the Underground
Reflecting on their experiences, the panelists discuss navigating the computer underground, from dial-up BBSes to illicit explorations of early networks. Members of Legion of Doom and r00t share anecdotes of creative problem-solving and the ethical dilemmas of their actions. These narratives reveal a culture where technical prowess and a desire to challenge authority coexisted, laying the groundwork for modern cybersecurity practices.
Engaging the Next Generation
Responding to audience questions, the panel addresses how to inspire today’s youth to engage with technology creatively. Deth Veggie suggests encouraging hands-on exploration through hacker spaces, maker spaces, and vintage computer festivals, where kids can tinker with old cameras and computers. The panelists emphasize finding role models who ignite passion, citing their own experiences looking up to peers on stage. They advocate fostering an active search for knowledge, akin to the BBS era, to cultivate emotional and intellectual investment in tech.
Preserving the Hacker Spirit
The panel concludes by urging the community to preserve the hacker spirit through mentorship and open knowledge sharing. Walter Scheirer’s moderation highlights the importance of documenting this history, as seen in cDc’s archives and 2600’s ongoing publications. The panelists call for nurturing curiosity in young hackers, ensuring the legacy of 1984’s rebellious innovators continues to inspire transformative contributions to technology.
Links:
[DefCon32] Clash, Burn, and Exploit: Manipulate Filters to Pwn kernelCTF
Kuan-Ting Chen, known as HexRabbit, a security researcher at DEVCORE and member of the Balsn CTF team, delivers a riveting exploration of Linux kernel vulnerabilities in the nftables subsystem. His presentation at DEF CON 32 unveils three novel vulnerabilities discovered through meticulous analysis of the nftables codebase, a critical component for packet filtering in the Linux kernel. Kuan-Ting’s journey, marked by intense competition and dramatic setbacks in Google’s kernelCTF bug bounty program, culminates in a successful exploit, earning him his first Google VRP bounty. His narrative weaves technical depth with the emotional highs and lows of vulnerability research, offering a masterclass in kernel exploitation.
Understanding nftables Internals
Kuan-Ting begins by demystifying nftables, the successor to iptables, which manages packet filtering and network-related functionalities in the Linux kernel. He explains how features like batch commits, anonymous chains, and asynchronous garbage collection, designed to enhance efficiency, have inadvertently increased complexity, making nftables a prime target for attackers. His introduction provides a clear foundation, enabling attendees to grasp the intricate mechanisms that underpin his vulnerability discoveries.
Uncovering Novel Vulnerabilities
Delving into the technical core, Kuan-Ting dissects three nftables vulnerabilities, two of which exploited challenging race conditions to capture the kernelCTF flag. He details how structural changes in the nftables codebase, often introduced by security patches, can unintentionally create new flaws. For instance, one vulnerability, identified as CVE-2024-26925, stemmed from improper input sanitization, enabling a double-free exploit. His methodical approach, combining code auditing with creative exploitation techniques like Dirty Pagedirectory, achieved a 93–99% success rate across hardened kernel instances, including Ubuntu and Debian.
The kernelCTF Roller-Coaster
Kuan-Ting’s narrative shines as he recounts the emotional and competitive challenges of the kernelCTF program. He describes a series of near-misses: an initial exploit collided with another submission, a second was rendered unusable due to a configuration error, and a third lost a submission race by mere seconds. The turning point came when a competitor’s disqualification allowed Kuan-Ting to secure the bounty just before Google disabled nftables in the LTS instance on April 1, 2024. This gripping tale underscores the persistence required in high-stakes vulnerability research.
Lessons for Kernel Security
Concluding, Kuan-Ting reflects on the broader implications of his findings. He advocates for rigorous code auditing to complement automated fuzzing, as subtle logic errors can lead to potent exploits. His work, detailed in resources like the Google Security Research repository, encourages researchers to explore novel exploitation techniques while urging kernel maintainers to strengthen nftables’ defenses. Kuan-Ting’s success inspires the cybersecurity community to tackle complex subsystems with creativity and resilience.
Links:
[DefCon32] Encrypted Newspaper Ads in the 19th Century
Elonka Dunin and Klaus Schmeh, renowned cryptology experts, unravel the mystery of encrypted advertisements published in The Times between 1850 and 1855. Intended for Captain Richard Collinson during his Arctic expedition, these ads used a modified Royal Navy signal-book cipher. Elonka and Klaus’s presentation traces their efforts to decrypt all ads, providing historical and cryptographic insights into a unique communication system.
The Collinson Cipher System
Elonka introduces the encrypted ads, designed to keep Collinson informed of family matters during his search for the lost Franklin expedition. The cipher, based on a Royal Navy signal-book, allowed Collinson’s family to encode messages for publication in The Times, accessible globally. Elonka’s narrative highlights the system’s ingenuity, enabling secure communication in an era of limited technology.
Decrypting Historical Messages
Klaus details their decryption process, building on 1990s efforts to break the cipher. Using their expertise, documented in their book from No Starch Press, Klaus and Elonka decoded over 50 ads, placing them in geographic and cultural context. Their work reveals personal details, such as messages from Collinson’s sister Julia, showcasing the cipher’s effectiveness despite logistical challenges.
Challenges and Limitations
The duo discusses the system’s mixed success, noting that Collinson received only four messages, at Banyuwangi, due to unrest during the expedition. Klaus addresses the cipher’s vulnerabilities, such as predictable patterns, which modern techniques could exploit. Their analysis, enriched by historical records, underscores the challenges of maintaining secure communication in remote settings.
Modern Cryptographic Relevance
Concluding, Elonka explores the potential of artificial intelligence in cryptanalysis, noting that LLMs struggle with precise tasks like counting letters but excel in pattern recognition. Their work invites further research into historical ciphers, inspiring cryptographers to apply modern tools to uncover past secrets, preserving the legacy of Collinson’s innovative system.
Links:
[GoogleIO2024] A New Renaissance in Art: Refik Anadol on the AI Transformation of Art
Refik Anadol’s visionary approach merges AI with art, using data as a canvas to create immersive experiences. Mira Lane’s introduction set the stage for Refik’s narrative, tracing his evolution from Istanbul’s cultural fusion to global projects that harmonize technology with nature and indigenous wisdom.
Inspirations and Early Foundations in Data Art
Born in Istanbul, Refik drew from the city’s East-West confluence, seeing water as a metaphor for connectivity. His first computer at eight ignited a passion for human-machine interfaces, influenced by Blade Runner’s utopian visions. Establishing Refik Anadol Studio in Los Angeles, he assembled a multicultural team to explore beyond reality.
Pioneering “data pigmentation” since 2008 under mentor Peter Weibel, Refik views data as memory, liberated from physical constraints. Projects like “Unsupervised” at MoMA used 200 years of art data for AI hallucinations, questioning machine dreams. Collaborations with MIT, NASA, and the Grammys expanded scopes, while partnerships with Rolls-Royce and Chanel integrated AI into luxury.
A landmark was “California Landscapes” with Absen at ISE 2025, employing Stable Diffusion for mesmerizing visuals. Refik’s site-specific installations, like those at Art Basel Miami with Turkish Airlines, drew millions, showcasing generative AI’s public appeal.
Immersive Installations and Nature-Centric Explorations
Refik’s works transform spaces: a New York City archive evolved with real-time data at MoMA, while Serpentine’s nature visualization evoked emotions through AI-generated flora. Audio clustering of Amazonian birds with Cornell Lab aids biodiversity research, highlighting AI’s scientific utility.
“Generative reality” emerges as a new paradigm, creating multisensory universes. Text-to-video experiments and Amazonia projects with weather stations generate dynamic art, influenced by indigenous patterns. The Yawanawa collaboration, yielding “Winds of Yawanawa,” raised $2.5 million for their community, blending AI with cultural preservation.
Chief Nixiwaka’s mentorship taught harmonious living, inspiring respectful AI use. Projects like “Large Nature Model” focus on nature data, fostering love and attention.
Societal Impact and Purposeful Technology
Refik’s art advocates purposeful AI, addressing environmental and cultural issues. Indigenous voices at the World Economic Forum amplify wisdom for humanity’s future. His ethos—forgiveness, love, alliances—urges reconnection with Earth, positioning AI as a bridge to empathy and unity.
Links:
[KotlinConf2024] Channels in Kotlin Coroutines: Unveiling the Redesign
At KotlinConf2024, Nikita Koval, a JetBrains concurrency expert, delved into Kotlin coroutine channels, a core communication primitive. Channels, often used indirectly via APIs like Flow, enable data transfer between coroutines. Nikita explored their redesigned implementation, which boosts performance and reduces memory usage. He detailed rendezvous and buffered channel semantics, underlying algorithms, and scalability, offering developers insights into optimizing concurrent applications and understanding coroutine internals.
Channels: The Hidden Backbone of Coroutines
Channels are fundamental to Kotlin coroutines, even if developers rarely use them directly. Nikita highlighted their role in high-level APIs like Flow. For instance, merging two Flows or collecting data across dispatchers relies on channels to transfer elements between coroutines. Channels also bridge reactive frameworks like RxJava, funneling data through a coroutine-launched channel. Unlike sequential Flows, channels enable concurrent data handling, making them essential for complex asynchronous tasks, as seen in Flow’s channelFlow builder.
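To make the merging pattern concrete without pulling in the coroutines library, here is a hedged JVM analogy (plain threads and a `java.util.concurrent` queue standing in for coroutines and a channel): each source gets its own producer, and both funnel elements into one shared queue that the collector drains, which is exactly the role a channel plays inside Flow’s `merge`.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;

public class MergeDemo {
    public static void main(String[] args) throws InterruptedException {
        // Shared "channel": both producers write into it concurrently.
        LinkedBlockingQueue<Integer> channel = new LinkedBlockingQueue<>();

        // One producer per source, analogous to one coroutine per merged Flow.
        Thread evens = new Thread(() -> { for (int i = 0; i <= 4; i += 2) channel.add(i); });
        Thread odds  = new Thread(() -> { for (int i = 1; i <= 5; i += 2) channel.add(i); });
        evens.start(); odds.start();
        evens.join(); odds.join();

        // The "collector" drains whatever arrived, in whatever interleaving.
        List<Integer> merged = new ArrayList<>();
        channel.drainTo(merged);
        merged.sort(null); // arrival order is nondeterministic, so sort for display
        System.out.println(merged); // [0, 1, 2, 3, 4, 5]
    }
}
```

The interleaving the collector observes is nondeterministic, which is precisely why a concurrent channel, rather than sequential emission, is needed here.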
Rendezvous Channels: Synchronized Data Exchange
Rendezvous channels, the default in Kotlin, ensure synchronized communication. Nikita illustrated their semantics with two producers and one consumer. When a consumer calls receive on an empty channel, it suspends until a producer sends data. Conversely, a producer’s send suspends if no consumer is waiting, preventing uncontrolled growth. This “rendezvous” ensures direct data handoff, as demonstrated when a producer resumes a waiting consumer, maintaining efficiency and safety in concurrent scenarios.
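The same hand-off semantics exist on the JVM in blocking form. As a sketch (an analogy, not the talk’s code), `java.util.concurrent.SynchronousQueue` behaves like a thread-based rendezvous channel: threads block where coroutines would suspend, and no element is transferred until producer and consumer meet.

```java
import java.util.concurrent.SynchronousQueue;

public class RendezvousDemo {
    public static void main(String[] args) throws InterruptedException {
        // Zero-capacity queue: put() and take() must meet, like send()/receive()
        // on a default (rendezvous) Kotlin channel.
        SynchronousQueue<Integer> channel = new SynchronousQueue<>();

        Thread producer = new Thread(() -> {
            try {
                channel.put(42); // blocks here until main calls take()
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        Thread.sleep(100);                  // producer is now parked in put()
        System.out.println(channel.take()); // 42: the take() resumes the producer
        producer.join();
    }
}
```

The key property, mirrored from the rendezvous channel, is that neither side can race ahead: the producer cannot deposit data into thin air, so there is no uncontrolled buffering.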
Building Efficient Channel Algorithms
Implementing rendezvous channels requires balancing efficiency, memory use, and scalability. Nikita compared concurrent queue designs to adapt for channels. A lock-based sequential queue, while fast on single threads, fails to scale due to synchronization costs. Java’s ConcurrentLinkedQueue, using a linked list, scales better but incurs high memory overhead—32 bytes per 4-byte reference. Instead, modern queue designs use a segmented array with atomic counters for enqueuing and dequeuing, minimizing memory and scaling effectively, forming the basis for channel implementation.
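A deliberately simplified sketch of that counter-based design, using a single fixed-size array and busy-waiting in place of suspension (the production channel links fixed-size segments of such cells and parks coroutines instead of spinning): each operation claims a cell index with an atomic counter rather than allocating a linked-list node per element, which is where the memory savings come from.

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.AtomicReferenceArray;

public class CounterQueue<E> {
    private final AtomicReferenceArray<E> cells;
    private final AtomicLong enqIdx = new AtomicLong(); // next cell to fill
    private final AtomicLong deqIdx = new AtomicLong(); // next cell to drain

    public CounterQueue(int capacity) {
        cells = new AtomicReferenceArray<>(capacity);
    }

    public void enqueue(E element) {
        int i = (int) enqIdx.getAndIncrement(); // claim a cell; no node allocation
        cells.set(i, element);
    }

    public E dequeue() {
        int i = (int) deqIdx.getAndIncrement(); // claim the matching cell
        E e;
        while ((e = cells.get(i)) == null) {
            Thread.onSpinWait(); // spin until the enqueuer for this cell arrives
        }
        cells.set(i, null); // let the element be garbage-collected
        return e;
    }

    public static void main(String[] args) {
        CounterQueue<Integer> q = new CounterQueue<>(16);
        q.enqueue(1);
        q.enqueue(2);
        System.out.println(q.dequeue()); // 1
        System.out.println(q.dequeue()); // 2
    }
}
```

FIFO order falls out of the counters: the n-th dequeue always waits on the cell filled by the n-th enqueue.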
Buffered Channels: Flexible Data Buffering
Buffered channels extend rendezvous semantics by allowing a fixed-capacity buffer. Nikita explained that a channel with capacity k accepts k elements without suspension, suspending only when full. Using a single producer-consumer example, he showed how a producer fills an empty buffer, while a second producer suspends until the consumer extracts an element, freeing space. This design, implemented with an additional counter to track buffer boundaries, supports dynamic workloads, though cancellation semantics add complexity, detailed in JetBrains’ research papers.
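Again as a blocking JVM analogy (not the channel implementation itself), `ArrayBlockingQueue` with capacity k shows the same behavior: inserts succeed immediately while the buffer has room and would block, where a coroutine would suspend, once it is full; a single receive frees a slot.

```java
import java.util.concurrent.ArrayBlockingQueue;

public class BufferedDemo {
    public static void main(String[] args) throws InterruptedException {
        // Capacity 2, analogous to Channel<Int>(capacity = 2) in Kotlin.
        ArrayBlockingQueue<Integer> channel = new ArrayBlockingQueue<>(2);

        System.out.println(channel.offer(1)); // true: buffered without blocking
        System.out.println(channel.offer(2)); // true: buffer now full
        System.out.println(channel.offer(3)); // false: put() would have to block here

        channel.take();                       // consumer extracts one element
        System.out.println(channel.offer(3)); // true: the freed slot accepts it
    }
}
```

`offer` is used instead of `put` so the full-buffer case is visible as a return value rather than a blocked thread; a suspended Kotlin `send` corresponds to the case where `offer` returns false.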
Performance Gains from Redesign
The channel redesign significantly enhances performance. Nikita shared benchmarks comparing the old linked-list-based implementation to the new segmented array approach. In a multi-producer, multi-consumer test with 10,000 coroutines, the new channels scaled up to four times faster, producing less garbage. Even in sequential workloads, they were 20% quicker. Q&A revealed further tuning, like setting segment sizes to 32 elements, balancing memory and metadata overhead, ensuring scalability across 64-core systems without degradation.
Deepening Concurrency Knowledge
Understanding channel internals empowers developers to tackle performance issues, akin to knowing hash table mechanics. Nikita emphasized that while high-level APIs abstract complexity, low-level knowledge aids debugging. He invited attendees to explore further at the PLDI conference in Copenhagen, where JetBrains will discuss coroutine algorithms, including schedulers and mutexes. The redesigned channels, applied to unlimited and conflated variants as well, offer robust, scalable communication, encouraging developers to leverage coroutines confidently in high-load applications.
Links:
[DefCon32] DC101 – Panel
Nikita, Grifter, and other DEF CON organizers deliver an engaging DC101 panel, guiding newcomers through the conference’s vibrant ecosystem. Their session offers practical advice on navigating DEF CON’s contests, social events, and hacking opportunities, fostering an inclusive environment for first-time attendees. Nikita’s candid leadership and the team’s anecdotes create a welcoming introduction to the DEF CON community.
Navigating DEF CON’s Landscape
Nikita opens by outlining DEF CON’s extensive schedule, from 8:00 a.m. to 2:00 a.m., filled with contests, parties, and spontaneous hacking sessions. As director of content and coordination, Nikita emphasizes the variety of activities, such as laser Tetris and social gatherings, ensuring newcomers find engaging ways to connect and learn.
Engaging with Contests and Events
Grifter, the lead for contests and events, shares insights into DEF CON’s competitive spirit, highlighting past highlights like T-Rex fights and the infamous “naked guy” incident from a scavenger hunt. His anecdotes illustrate the creativity and unpredictability of DEF CON’s challenges, encouraging attendees to participate in contests to hone their skills.
Building Community Connections
The panel emphasizes the importance of community, with Nikita encouraging attendees to network and collaborate. The hotline program, led by another organizer, facilitates communication, ensuring newcomers feel supported. Their advice to engage with others, even in informal settings, fosters a sense of belonging in the hacking community.
Inspiring Future Contributions
Concluding, Nikita urges attendees to submit to the Call for Papers (CFP) for future DEF CONs, emphasizing that research and passion can earn a main stage spot. The panel’s lighthearted yet practical guidance, enriched with stories like the bean chair contest, inspires newcomers to dive into DEF CON’s dynamic culture and contribute to its legacy.
Links:
- None