
Posts Tagged ‘KotlinConf’

[KotlinConf’2023] Coroutines and Loom: A Deep Dive into Goals and Implementations

The advent of OpenJDK’s Project Loom and its virtual threads has sparked considerable discussion within the Java and Kotlin communities, particularly regarding its relationship with Kotlin Coroutines. Roman Elizarov, Project Lead for Kotlin at JetBrains, addressed this topic head-on at KotlinConf’23 in his talk, “Coroutines and Loom behind the scenes”. His goal was not just to answer whether Loom would make coroutines obsolete (the answer being a clear “no”), but to delve into the distinct design goals, implementations, and trade-offs of each, clarifying how they can coexist and even complement each other. Information about Project Loom can often be found via OpenJDK resources or articles like those on Baeldung.

Roman began by noting that Project Loom, introducing virtual threads to the JVM, was nearing stability, targeted for Java 21 (late 2023). He emphasized that understanding the goals behind each technology is crucial, as these goals heavily influence their design and optimal use cases.

Project Loom: Simplifying Server-Side Concurrency

Project Loom’s primary design goal, as Roman Elizarov explained, is to preserve the thread-per-request programming style prevalent in many existing Java server-side applications, while dramatically increasing scalability. Traditionally, assigning one platform thread per incoming request becomes a bottleneck due to the high cost of platform threads. Virtual threads aim to solve this by providing lightweight, JVM-managed threads that can run existing synchronous, blocking Java code with minimal or no changes. This allows legacy applications to scale much better without requiring a rewrite to asynchronous or reactive patterns.

Loom achieves this by “unmounting” a virtual thread from its carrier (platform) thread when it encounters a blocking operation (like I/O) that has been integrated with Loom. The carrier thread is then free to run other virtual threads. When the blocking operation completes, the virtual thread is “remounted” on a carrier thread to continue execution. This mechanism is largely transparent to the application code. However, Roman pointed out a potential pitfall: if blocking operations occur within synchronized blocks or native JNI calls that haven’t been adapted for Loom, the carrier thread can get “pinned,” preventing unmounting and potentially negating some of Loom’s benefits in those specific scenarios.
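
To make the unmounting and pinning behavior concrete, here is a minimal Kotlin sketch targeting JDK 21; it illustrates the mechanism described above and is not code from the talk.

```kotlin
// Minimal sketch (JDK 21+): virtual threads running blocking code.
// Thread.ofVirtual() is the standard Java 21 API; no extra libraries needed.
fun main() {
    val lock = Any()

    // Each virtual thread blocks on sleep; the JVM unmounts it from its carrier,
    // so 10_000 of these do not require 10_000 OS threads.
    val threads = (1..10_000).map {
        Thread.ofVirtual().start {
            Thread.sleep(10) // Loom-integrated blocking call: the virtual thread unmounts
        }
    }

    // Pinning pitfall: blocking inside a synchronized block keeps the virtual
    // thread mounted on its carrier for the duration (as of JDK 21).
    val pinned = Thread.ofVirtual().start {
        synchronized(lock) {
            Thread.sleep(10) // carrier thread is pinned here
        }
    }

    threads.forEach { it.join() }
    pinned.join()
}
```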

Kotlin Coroutines: Fine-Grained, Structured Concurrency

In contrast, Kotlin Coroutines were designed with different primary goals:

  1. Enable fine-grained concurrency: Allowing developers to easily launch tens of thousands or even millions of concurrent tasks without performance issues, suitable for highly concurrent applications like UI event handling or complex data processing pipelines.
  2. Provide structured concurrency: Ensuring that the lifecycle of coroutines is managed within scopes, simplifying cancellation and preventing resource leaks. This is particularly critical for UI applications where tasks need to be cancelled when UI components are destroyed.

Kotlin Coroutines achieve this through suspendable functions (suspend fun) and a compiler-based transformation. When a coroutine suspends, it doesn’t block its underlying thread; instead, its state is saved, and the thread is released to do other work. This is fundamentally different from Loom’s approach, which aims to make blocking calls non-problematic for virtual threads. Coroutines explicitly distinguish between suspending and non-suspending code, a design choice that enables features like structured concurrency but requires a different programming model than traditional blocking code.
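
As a small illustration of that programming model (assuming kotlinx-coroutines on the classpath), launching a very large number of suspending tasks looks like this:

```kotlin
import kotlinx.coroutines.*

// A suspending call releases the underlying thread instead of blocking it.
suspend fun fetchGreeting(id: Int): String {
    delay(10)               // suspends; the thread is free to run other coroutines
    return "hello #$id"
}

fun main() = runBlocking {
    // Fine-grained concurrency: 100_000 coroutines share a small thread pool.
    val jobs = List(100_000) { id ->
        launch { fetchGreeting(id) }
    }
    jobs.joinAll()          // structured: runBlocking also waits for its children
    println("done")
}
```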

Comparing Trade-offs and Performance

Roman Elizarov presented a detailed comparison:

  • Programming Model: Loom aims for compatibility with existing blocking code. Coroutines introduce a new model with suspend functions, which is more verbose for simple blocking calls but enables powerful features like structured concurrency and explicit cancellation. Forcing blocking calls into a coroutine world requires wrappers like withContext(Dispatchers.IO), while Loom handles blocking calls transparently on virtual threads.
  • Cost of Operations:
    • Launching: Launching a coroutine is significantly cheaper than starting even a virtual thread, as coroutines are lighter-weight objects.
    • Yielding/Suspending: Suspending a coroutine is generally cheaper than a virtual thread yielding (unmounting/remounting), due to compiler optimizations in Kotlin for state machine management. Roman showed benchmarks indicating lower memory allocation and faster execution for coroutine suspension compared to virtual thread context switching in preview builds of Loom.
  • Error Handling & Cancellation: Coroutines have built-in, robust support for structured cancellation. Loom’s virtual threads rely on Java’s traditional thread interruption mechanisms, which are less integrated into the programming model for cooperative cancellation.
  • Debugging: Loom’s virtual threads offer a debugging experience very similar to traditional threads, with understandable stack traces. Coroutines, due to their state-machine nature, can sometimes have more complex stack traces, though IDE support has improved this.

Coexistence and Future Synergies

Roman Elizarov concluded that Loom and coroutines are designed for different primary use cases and will coexist effectively.

  • Loom excels for existing Java applications using the thread-per-request model that need to scale without major rewrites.
  • Coroutines excel for applications requiring fine-grained, highly concurrent operations, structured concurrency, and explicit cancellation management, often seen in UI applications or complex backend services with many interacting components.

He also highlighted a potential future synergy: Kotlin Coroutines could leverage Loom’s virtual threads for their Dispatchers.IO (or a similar dispatcher) when running on newer JVMs. This could allow blocking calls within coroutines (those wrapped in withContext(Dispatchers.IO)) to benefit from Loom’s efficient handling of blocking operations, potentially eliminating the need for a large, separate thread pool for I/O-bound tasks in coroutines. This would combine the benefits of both: coroutines for structured, fine-grained concurrency and Loom for efficient handling of any unavoidable blocking calls.
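
A hedged sketch of what that synergy could look like today, using the existing asCoroutineDispatcher() bridge over a JDK 21 virtual-thread executor; the LoomDispatcher name is a local val for illustration, not an official API:

```kotlin
import kotlinx.coroutines.*
import java.util.concurrent.Executors

// On JDK 21+, back a coroutine dispatcher with virtual threads so that blocking
// calls inside coroutines are handled by Loom instead of a large platform-thread pool.
val LoomDispatcher = Executors.newVirtualThreadPerTaskExecutor().asCoroutineDispatcher()

fun blockingLookup(id: Int): String {
    Thread.sleep(50)        // stands in for blocking I/O
    return "record-$id"
}

fun main() = runBlocking {
    val results = (1..1_000).map { id ->
        async(LoomDispatcher) { blockingLookup(id) }   // each call runs on a virtual thread
    }.awaitAll()
    println(results.size)
    LoomDispatcher.close()
}
```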

Links:

Hashtags: #Kotlin #Coroutines #ProjectLoom #Java #JVM #Concurrency #AsynchronousProgramming #RomanElizarov #JetBrains

[KotlinConf’23] The Future of Kotlin is Bright and Multiplatform

KotlinConf’23 kicked off with an energizing keynote, marking a highly anticipated return to an in-person format in Amsterdam. Hosted by Hadi Hariri from JetBrains, the session brought together key figures from both JetBrains and Google, including Roman Elizarov, Svetlana Isakova, Egor Tolstoy, and Grace Kloba (VP of Engineering for Android Developer Experience at Google), to share exciting updates and future directions for the Kotlin language and its ecosystem. The conference also boasted a global reach with KotlinConf Global events held across 41 countries. For those unable to attend, the key announcements from the keynote are also available in a comprehensive blog post on the official Kotlin blog.

The keynote began by celebrating Kotlin’s impressive growth, with compelling statistics underscoring its widespread adoption, particularly in Android development where it stands as the most popular language, used in over 95% of the top 1000 Android applications. A major emphasis was placed on the forthcoming Kotlin 2.0, centered around the new K2 compiler, which promises significant performance improvements, enhanced stability, and a robust foundation for the language’s future evolution. The K2 compiler is nearing completion and is slated for release as Kotlin 2.0. Additionally, the IntelliJ IDEA plugin will adopt the K2 frontend, ensuring alignment with IntelliJ releases and a consistent developer experience.

The Evolution of Kotlin: K2 Compiler and Language Features

The K2 compiler was a central theme of the keynote, signifying a major milestone for Kotlin. This re-architected compiler frontend, which also powers the IDE, is designed to be faster, more stable, and to enable quicker development of new language features and tooling capabilities. Kotlin 2.0, built upon the K2 compiler, is set to bring these profound benefits to all Kotlin developers, improving both compiler performance and IDE responsiveness.

Beyond the immediate horizon of Kotlin 2.0, the speakers provided a glimpse into potential future language features that are currently under consideration. These exciting prospects included:

Prospective Language Enhancements

  • Static Extensions: This feature aims to allow static resolution of extension functions, which could potentially improve performance and code clarity.
  • Collection Literals: The introduction of a more concise syntax for creating collections, such as using square brackets for lists, with efficient underlying implementations, is on the cards.
  • Name-Based Destructuring: Offering a more flexible way to destructure objects based on property names rather than simply their positional order.
  • Context Receivers: A powerful capability designed to provide contextual information to functions in a more implicit and structured manner. This feature, however, is being approached with careful consideration to ensure it aligns well with Kotlin’s core principles and doesn’t introduce undue complexity.
  • Explicit Fields: This would provide developers with more direct control over the backing fields of properties, offering greater flexibility in certain scenarios.

The JetBrains team underscored a cautious and deliberate approach to language evolution, ensuring that any new features are meticulously designed and maintainable within the Kotlin ecosystem. Compiler plugins were also highlighted as a powerful mechanism for extending Kotlin’s capabilities without altering its core.

Kotlin in the Ecosystem: Google’s Investment and Multiplatform Growth

Grace Kloba from Google took the stage to reiterate Google’s strong and unwavering commitment to Kotlin. She shared insights into Google’s substantial investments in the Kotlin ecosystem, including the development of Kotlin Symbol Processing (KSP) and the continuous emphasis on Kotlin as the default choice for Android development. Google officially championed Kotlin for Android development as early as 2017, a pivotal moment for the language’s widespread adoption. Furthermore, the Kotlin DSL is now the default for Gradle build scripts within Android Studio, significantly enhancing the developer experience with features such as semantic syntax highlighting and advanced code completion. Google also actively contributes to the Kotlin Foundation and encourages community participation through initiatives like the Kotlin Foundation Grants Program, which specifically focuses on supporting multiplatform libraries and frameworks.
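
For readers unfamiliar with the Kotlin DSL for Gradle, a build script fragment like the one below is what Android Studio now generates by default; the module name, versions, and dependency are placeholders, not taken from the keynote:

```kotlin
// build.gradle.kts — illustrative Android module configuration in the Kotlin DSL.
plugins {
    id("com.android.application")
    kotlin("android")
}

android {
    namespace = "com.example.app"      // hypothetical package name
    compileSdk = 34

    defaultConfig {
        applicationId = "com.example.app"
        minSdk = 24
    }
}

dependencies {
    implementation("org.jetbrains.kotlinx:kotlinx-coroutines-android:1.7.3")
}
```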

Kotlin Multiplatform (KMP) emerged as another major highlight of the keynote, emphasizing its increasing maturity and widespread adoption. The overarching vision for KMP is to empower developers to share code across a diverse range of platforms—Android, iOS, desktop, web, and server-side—while retaining the crucial ability to write platform-specific code when necessary for optimal integration and performance. The keynote celebrated the burgeoning number of multiplatform libraries and tools, including KMM Bridge, which are simplifying KMP development workflows. The future of KMP appears exceptionally promising, with ongoing efforts to further enhance the developer experience and expand its capabilities across even more platforms.

Compose Multiplatform and Emerging Technologies

The keynote also featured significant advancements in Compose Multiplatform, JetBrains’ declarative UI framework for building cross-platform user interfaces. A particularly impactful announcement was the alpha release of Compose Multiplatform for iOS. This groundbreaking development allows developers to write their UI code once in Kotlin and deploy it seamlessly across Android and iOS, and even to desktop and web targets. This opens up entirely new avenues for code sharing and promises accelerated development cycles for mobile applications, breaking down traditional platform barriers.

Finally, the JetBrains team touched upon Kotlin’s expansion into truly emerging technologies, such as WebAssembly (Wasm). JetBrains is actively developing a new compiler backend for Kotlin specifically targeting WebAssembly, coupled with its own garbage collection proposal. This ambitious effort aims to deliver high-performance Kotlin code directly within the browser environment. Experiments involving the execution of Compose applications within the browser using WebAssembly were also mentioned, hinting at a future where Kotlin could offer a unified development experience across an even broader spectrum of platforms. The keynote concluded with an enthusiastic invitation to the community to delve deeper into these subjects during the conference sessions and to continue contributing to Kotlin’s vibrant and ever-expanding ecosystem.

Hashtags: #Keynote #JetBrains #Google #K2Compiler #Kotlin2 #Multiplatform #ComposeMultiplatform #WebAssembly

Kotlin Native Concurrency Explained by Kevin Galligan

Navigating Kotlin/Native’s Concurrency Model

At KotlinConf 2019 in Copenhagen, Kevin Galligan, a partner at Touchlab with over 20 years of software development experience, delivered a 39-minute talk on Kotlin/Native’s concurrency model. He explored the restrictive yet logical rules governing state and concurrency in Kotlin/Native and addressed why they are controversial among JVM and mobile developers, explaining the model’s mechanics, its rationale, and best practices for multiplatform development. This post covers four key themes: the core rules of Kotlin/Native concurrency, the role of workers, the impact of freezing state, and the introduction of multi-threaded coroutines.

Core Rules of Kotlin/Native Concurrency

Kevin Galligan began by outlining Kotlin/Native’s two fundamental concurrency rules: mutable state is confined to a single thread, and immutable state can be shared across multiple threads. These rules, known as thread confinement, mirror mobile development practices where UI updates are restricted to the main thread. In Kotlin/Native, the runtime enforces these constraints, preventing mutable state changes from background threads to avoid race conditions. Kevin emphasized that while these rules feel restrictive compared to the JVM’s shared-memory model, they align with modern platforms like Go and Rust, which also limit unrestricted shared state.

The rationale behind this model, as Kevin explained, is to reduce concurrency errors by design. Unlike the JVM, which trusts developers to manage synchronization, Kotlin/Native’s runtime verifies state access at runtime, crashing if rules are violated. This strictness, though initially frustrating, encourages intentional state management. Kevin noted that after a year of working with Kotlin/Native, he found the model simple and effective, provided developers embrace its constraints rather than fight them.

Workers as Concurrency Primitives

A central concept in Kevin’s talk was the Worker, a Kotlin/Native concurrency queue similar to Java’s ExecutorService or Android’s Handler and Looper. Workers manage a job queue processed by a private thread, ensuring thread confinement. Kevin illustrated how a Worker executes tasks via the execute function, which takes a producer function whose result is checked before it is transferred between threads. The execute function supports safe and unsafe transfer modes; Kevin strongly advised against the unsafe mode because it bypasses those state checks.

Using a code example, Kevin demonstrated passing a data class to a Worker. The runtime freezes the data—making it immutable—to comply with concurrency rules, preventing illegal state transfers. He highlighted that while Worker is a core primitive, developers rarely use it directly, as higher-level abstractions like coroutines are preferred. However, understanding Worker is crucial for grasping Kotlin/Native’s concurrency mechanics, especially when debugging state-related errors like IllegalStateTransfer.
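
For those who have not seen the API, a minimal sketch of the Worker pattern under the legacy memory model looks roughly like this; the Request type and the work performed are invented, not the code from Kevin’s slides:

```kotlin
import kotlin.native.concurrent.*

data class Request(val query: String)

// Legacy Kotlin/Native Worker usage: the producer's result crosses threads,
// and freezing it keeps the transfer legal under the strict rules.
fun runOnWorker(): String {
    val worker = Worker.start()

    val future = worker.execute(
        TransferMode.SAFE,                        // SAFE verifies the transferred state
        { Request("popular items").freeze() }     // producer: frozen data may be shared
    ) { request ->
        // Runs on the worker's private thread; `request` is immutable here.
        "handled: ${request.query}"
    }

    val result = future.result                    // blocks until the worker finishes
    worker.requestTermination().result
    return result
}
```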

Freezing State and Its Implications

Kevin Galligan delved into the concept of freezing, a runtime mechanism that designates objects as immutable for safe sharing across threads. Freezing is a one-way operation, recursively applying to an object and its references, with no unfreeze option. This ensures thread safety but introduces challenges, as frozen objects cannot be mutated, leading to InvalidMutabilityException errors if attempted.

In a practical example, Kevin showed how capturing mutable state in a background task can inadvertently freeze an entire object graph, causing runtime failures. He introduced tools like ensureNeverFrozen to debug unintended freezing and stressed intentional mutability—keeping mutable state local to one thread and transforming data into frozen copies for sharing. Kevin also discussed Atomic types, which allow limited mutation of frozen state, but cautioned against overusing them due to performance and memory issues. His experience at Touchlab revealed early missteps with global state and Atomics, leading to a shift toward confined state models.
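
A few lines are enough to illustrate the pitfalls and the tools mentioned above; ViewState is a hypothetical class, and the calls are the legacy kotlin.native.concurrent APIs (freeze, ensureNeverFrozen, AtomicInt):

```kotlin
import kotlin.native.concurrent.*

class ViewState {
    var count = 0
}

// Sketch of the freezing rules under the legacy Kotlin/Native memory model.
fun freezingPitfalls() {
    val state = ViewState()

    // Catch accidental freezing early: throws if anything later freezes `state`.
    state.ensureNeverFrozen()

    val shared = ViewState().freeze()     // one-way, recursive freeze
    // shared.count = 1                   // would throw InvalidMutabilityException

    // AtomicInt allows limited mutation of shared state, at a cost.
    val counter = AtomicInt(0)
    counter.addAndGet(1)

    state.count = 1                       // mutable state stays confined to this thread
}
```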

Multi-Threaded Coroutines and Future Directions

A significant update in Kevin’s talk was the introduction of multi-threaded coroutines, enabled by a draft pull request in 2019. Previously, Kotlin/Native coroutines were single-threaded, limiting concurrency and stunting library development. The new model allows coroutines to switch threads using dispatchers, with data passed between threads frozen to maintain strict mode. Kevin demonstrated replacing a custom background function with a coroutine-based approach, simplifying concurrency while adhering to state rules.
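
A rough sketch of that coroutine-based replacement, assuming the multi-threaded (native-mt) coroutines artifact; UserData and the work done off the main thread are invented for illustration:

```kotlin
import kotlinx.coroutines.*
import kotlin.native.concurrent.freeze

data class UserData(val name: String)

// Replace a hand-rolled background function with withContext: the work runs on
// a background dispatcher and the result is frozen before crossing threads.
suspend fun loadAndShow() {
    val user = withContext(Dispatchers.Default) {
        UserData("Kevin").freeze()        // frozen, so it may legally cross back
    }
    println("Loaded ${user.name}")        // back on the calling context
}
```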

This development clarified the longevity of strict mode, countering speculation about a relaxed mode that would mimic JVM-style shared memory. Kevin noted that multi-threaded coroutines unblocked library development, citing projects like AtomicFu and SQLDelight. He also highlighted Touchlab’s Droidcon app, which adopted multi-threaded coroutines for production, showcasing their practical viability. Looking forward, Kevin anticipated increased community adoption and library growth in 2020, urging developers to explore the model despite its learning curve.

Conclusion

Kevin Galligan’s KotlinConf 2019 talk demystifies Kotlin/Native’s concurrency model, offering a clear path for developers navigating its strict rules. By embracing thread confinement, leveraging workers, managing frozen state, and adopting multi-threaded coroutines, developers can build robust multiplatform applications. This talk is a must for Kotlin/Native enthusiasts seeking to master concurrency in modern mobile development.

Hashtags: #KevinGalligan #KotlinNative #Concurrency #Touchlab #JetBrains #Multiplatform

[KotlinConf2018] Mathematical Modeling in Kotlin: Optimization, Machine Learning, and Data Science Applications

Lecturer

Thomas Nield is a Business Consultant at Southwest Airlines, balancing technology with operations research in airline scheduling and optimization. He is an author with O’Reilly Media, having written “Getting Started with SQL” and “Learning RxJava,” and contributes to open-source projects like RxJavaFX and RxKotlin. Relevant links: O’Reilly Profile (publications); LinkedIn Profile (professional page).

Abstract

This article explores mathematical modeling in Kotlin, addressing complex problems through discrete optimization, Bayesian techniques, and neural networks. It analyzes methodologies for scheduling, regression, and classification, contextualized in data science and operations research. Implications for production deployment, library selection, and problem-solving efficiency are discussed, emphasizing Kotlin’s refactorable features.

Introduction and Context

Mathematical modeling solves problems that lack a deterministic, brute-force solution, such as scheduling 190 classes or minimizing train costs. Kotlin’s pragmatic features enable clear, evolvable models for production.

Context: Models underpin data science, machine learning, and operations research. Examples include constraint programming for puzzles (Sudoku) and real-world applications (airline schedules).

Methodological Approaches

Discrete optimization uses libraries like OjAlgo for linear programming (e.g., minimizing train costs with constraints). Bayesian classifiers (e.g., Naive Bayes) model probabilities for spam detection.

Neural networks: Custom implementations train on MNIST for digit recognition, using activation functions (sigmoid) and backpropagation. Kotlin’s extensions and lambdas facilitate intuitive expressions.
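
As a self-contained illustration of those ingredients (a sigmoid activation and gradient-based weight updates), the toy example below learns the OR function; it is not the MNIST network from the talk:

```kotlin
import kotlin.math.exp

// Toy single-neuron classifier: sigmoid activation trained by gradient descent.
fun sigmoid(x: Double) = 1.0 / (1.0 + exp(-x))

fun main() {
    val inputs = listOf(doubleArrayOf(0.0, 0.0), doubleArrayOf(0.0, 1.0),
                        doubleArrayOf(1.0, 0.0), doubleArrayOf(1.0, 1.0))
    val targets = doubleArrayOf(0.0, 1.0, 1.0, 1.0)   // OR truth table

    val w = doubleArrayOf(0.0, 0.0)
    var b = 0.0
    val lr = 0.5

    repeat(5_000) {
        for (i in inputs.indices) {
            val x = inputs[i]
            val y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            // Squared-error gradient: (y - t) * y * (1 - y), the sigmoid derivative.
            val grad = (y - targets[i]) * y * (1 - y)
            w[0] -= lr * grad * x[0]
            w[1] -= lr * grad * x[1]
            b -= lr * grad
        }
    }
    inputs.forEach { x -> println(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) }
}
```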

Graph optimization: Dijkstra’s algorithm for shortest paths, applicable to logistics.
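
A compact Kotlin sketch of Dijkstra’s algorithm over an adjacency map; the node names and edge weights below are illustrative, not from the talk:

```kotlin
import java.util.PriorityQueue

// Shortest distances from `start` to every reachable node.
fun dijkstra(graph: Map<String, List<Pair<String, Int>>>, start: String): Map<String, Int> {
    val dist = mutableMapOf(start to 0)
    val queue = PriorityQueue<Pair<String, Int>>(compareBy { it.second })
    queue.add(start to 0)

    while (queue.isNotEmpty()) {
        val (node, d) = queue.poll()
        if (d > dist.getValue(node)) continue            // stale queue entry
        for ((next, weight) in graph[node].orEmpty()) {
            val candidate = d + weight
            if (candidate < (dist[next] ?: Int.MAX_VALUE)) {
                dist[next] = candidate
                queue.add(next to candidate)
            }
        }
    }
    return dist
}

fun main() {
    val graph = mapOf(
        "depot" to listOf("a" to 4, "b" to 1),
        "b" to listOf("a" to 2, "c" to 5),
        "a" to listOf("c" to 1)
    )
    println(dijkstra(graph, "depot"))   // {depot=0, a=3, b=1, c=4}
}
```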

Analysis of Techniques and Examples

Optimization: Linear models minimize objectives under constraints; graph models solve routing (e.g., traveling salesman via genetic algorithms).

Bayesian: Probabilistic inference for sentiment/email classification, leveraging word frequencies.
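
The word-frequency idea can be sketched in a few dozen lines of plain Kotlin; the training messages below are invented, and the Laplace smoothing choice is ours rather than the talk’s:

```kotlin
import kotlin.math.ln

// Tiny Naive Bayes sketch for spam/ham classification from word frequencies.
class NaiveBayes(spam: List<String>, ham: List<String>) {
    private val spamCounts = wordCounts(spam)
    private val hamCounts = wordCounts(ham)
    private val pSpam = spam.size.toDouble() / (spam.size + ham.size)

    private fun wordCounts(docs: List<String>) =
        docs.flatMap { it.lowercase().split(" ") }.groupingBy { it }.eachCount()

    private fun logLikelihood(words: List<String>, counts: Map<String, Int>): Double {
        val total = counts.values.sum()
        val vocab = (spamCounts.keys + hamCounts.keys).size
        // Laplace smoothing keeps unseen words from zeroing out the probability.
        return words.sumOf { ln((counts[it] ?: 0) + 1.0) - ln(total + vocab.toDouble()) }
    }

    fun isSpam(message: String): Boolean {
        val words = message.lowercase().split(" ")
        val spamScore = ln(pSpam) + logLikelihood(words, spamCounts)
        val hamScore = ln(1 - pSpam) + logLikelihood(words, hamCounts)
        return spamScore > hamScore
    }
}

fun main() {
    val classifier = NaiveBayes(
        spam = listOf("win money now", "free prize money"),
        ham = listOf("meeting at noon", "lunch with the team")
    )
    println(classifier.isSpam("free money"))   // likely true
}
```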

Neural networks: Multi-layer perceptrons for fuzzy problems (image recognition); Kotlin demystifies black boxes through custom builds.

Innovations: Kotlin’s type safety and conciseness aid refactoring; libraries like Deeplearning4j for production.

Implications and Consequences

Models enable efficient solutions; choose based on data/problem nature (optimization for constraints, networks for fuzzy data).

Consequences: Custom implementations build intuition but libraries optimize; Kotlin enhances maintainability for production.

Conclusion

Kotlin empowers mathematical modeling, bridging optimization and machine learning for practical problem-solving.

Links

[KotlinConf2018] Optimizing Unit Testing in Kotlin: Philipp Hauer’s Best Practices for Idiomatic Tests

Lecturer

Philipp Hauer is a team lead at Spreadshirt in Leipzig, Germany, developing JVM-based web applications. Passionate about Kotlin, clean code, and software sociology, he blogs and tweets actively. Relevant links: Philipp Hauer’s Blog (publications); LinkedIn Profile (professional page).

Abstract

This article explores Philipp Hauer’s best practices for unit testing in Kotlin, focusing on leveraging its language features for readable, concise tests. Set in JVM development, it examines test lifecycles, mocking, assertions, and data classes. The analysis highlights innovations in idiomatic testing, with implications for code quality and developer efficiency.

Introduction and Context

Philipp Hauer addressed KotlinConf 2018 on unit testing, emphasizing Kotlin’s potential to create expressive tests. At Spreadshirt, he uses Kotlin for Android and web applications, where testing ensures reliability. The context is a need for idiomatic, maintainable test code that leverages Kotlin’s features like data classes and lambdas, moving beyond Java’s verbosity.

Methodological Approaches to Unit Testing

Hauer outlined a comprehensive setup: Use JUnit5 for lifecycle management, ensuring clear beforeEach/afterEach blocks. For mocking, he recommended MockK, tailored for Kotlin’s null safety. Assertions employed Kotest for fluent checks, avoiding Java’s clunky AssertJ. Data classes simplified test data creation, with named parameters enhancing readability. Spring integration used @MockBean for dependency injection. Test methods used descriptive names (e.g., shouldSaveUser) and parameterized tests for coverage.
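
A short example in the style described above, combining JUnit 5, MockK, and Kotest assertions; UserRepository, UserService, and their methods are hypothetical, not taken from the talk:

```kotlin
import io.kotest.matchers.shouldBe
import io.mockk.every
import io.mockk.mockk
import io.mockk.verify
import org.junit.jupiter.api.Test

data class User(val id: Int, val name: String)

interface UserRepository { fun save(user: User): User }

class UserService(private val repository: UserRepository) {
    fun register(name: String): User = repository.save(User(id = 0, name = name))
}

class UserServiceTest {

    private val repository = mockk<UserRepository>()
    private val service = UserService(repository)

    @Test
    fun `should save user with given name`() {
        every { repository.save(any()) } answers { firstArg<User>() }

        val saved = service.register("Ada")

        saved.name shouldBe "Ada"                 // Kotest's fluent assertion
        verify(exactly = 1) { repository.save(any()) }
    }
}
```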

Analysis of Innovations and Features

Kotlin’s data classes innovate test data setup, reducing boilerplate compared to Java POJOs. MockK’s relaxed mocks handle Kotlin’s nullability, unlike Mockito. Kotest’s assertions provide readable failure messages. Parameterized tests cover edge cases efficiently. Compared to Java, Kotlin tests are more concise, though complex setups require careful lifecycle management.

Implications and Consequences

Hauer’s practices imply higher-quality tests, improving code reliability. Concise tests enhance maintainability, accelerating development cycles. Consequences include a learning curve for MockK and Kotest, but their Kotlin alignment justifies adoption.

Conclusion

Hauer’s guidelines establish a robust framework for idiomatic Kotlin testing, leveraging its features for clarity and efficiency, setting a standard for modern JVM testing.

Links

[KotlinConf2018] Taming State with Sealed Classes: Patrick Cousins’ Approach at Etsy

Lecturer

Patrick Cousins is a software engineer at Etsy with nearly 20 years of programming experience, passionate about new patterns and languages. He is known for his work on state management and seal-related puns. Relevant links: Etsy Code as Craft Blog (publications); LinkedIn Profile (professional page).

Abstract

This article examines Patrick Cousins’ use of Kotlin sealed classes to manage complex state in Etsy’s mobile apps. Contextualized in event-driven architectures, it explores methodologies for event streams with RxJava and when expressions. The analysis highlights innovations in exhaustiveness and type safety, contrasting Java’s limitations, with implications for robust state handling.

Introduction and Context

Patrick Cousins spoke at KotlinConf 2018 about sealed classes, inspired by his blog post on Etsy’s engineering site. Etsy’s mobile apps juggle complex state—listings, tags, shipping profiles—forming a “matrix of possibilities.” Sealed classes offer a type-safe way to model these, replacing Java’s error-prone instanceof checks and visitor patterns. This narrative unfolds where mobile apps demand reliable state management to avoid costly errors.

Methodological Approaches to State Management

Cousins modeled state as sealed class hierarchies, emitting events via RxJava streams. Using filterIsInstance and when, he ensured exhaustive handling of state types like Loading, Success, or Error. This avoided Java’s polymorphic indirection, where unrelated types forced artificial interfaces. Sealed classes, confined to one file, prevented unintended extensions, ensuring safety.
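
A compact sketch of the pattern; the listing domain is simplified, and a plain list stands in for the RxJava stream used at Etsy:

```kotlin
// Sealed hierarchy: the compiler knows every possible state lives in this file.
sealed class ListingState {
    object Loading : ListingState()
    data class Success(val titles: List<String>) : ListingState()
    data class Error(val message: String) : ListingState()
}

fun render(state: ListingState): String = when (state) {
    is ListingState.Loading -> "spinner"
    is ListingState.Success -> "showing ${state.titles.size} listings"
    is ListingState.Error -> "retry: ${state.message}"
    // No else branch: the compiler enforces exhaustiveness over the sealed hierarchy.
}

fun main() {
    val events: List<ListingState> = listOf(
        ListingState.Loading,
        ListingState.Success(listOf("Mug", "Scarf")),
        ListingState.Error("timeout")
    )
    // filterIsInstance narrows the stream to one state type, as in the talk.
    events.filterIsInstance<ListingState.Success>().forEach { println(render(it)) }
}
```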

Analysis of Innovations and Features

Sealed classes innovate by guaranteeing exhaustiveness in when, unlike Java’s instanceof, which risks missing branches. Kotlin’s final-by-default classes eliminate Liskov substitution issues, avoiding polymorphic pitfalls. RxJava integration enables reactive updates, though requires careful ordering. Compared to Java, sealed classes simplify state logic without forced commonality, though complex hierarchies demand discipline.

Implications and Consequences

Cousins’ approach implies safer, more maintainable state management, critical for e-commerce apps. It reduces bugs from unhandled states, enhancing user experience. Consequences include a shift from polymorphic designs, though developers must adapt to sealed class constraints. The pattern encourages adoption in reactive systems.

Conclusion

Cousins’ use of sealed classes redefines state handling at Etsy, leveraging Kotlin’s type safety to create robust, readable mobile architectures.

Links

[KotlinConf2018] Implementing Raft with Coroutines and Ktor: Andrii Rodionov’s Distributed Systems Approach

Lecturer

Andrii Rodionov, a Ph.D. in computer science, is an associate professor at National Technical University and a software engineer at Wix. He leads JUG UA, organizes JavaDay UA, and co-organizes Kyiv Kotlin events. Relevant links: Wix Engineering Blog (publications); LinkedIn Profile (professional page).

Abstract

This article analyzes Andrii Rodionov’s implementation of the Raft consensus protocol using Kotlin coroutines and Ktor. Set in distributed systems, it examines leader election, log replication, and fault tolerance. The analysis highlights innovations in asynchronous communication, with implications for scalable, fault-tolerant key-value stores.

Introduction and Context

Andrii Rodionov presented at KotlinConf 2018 on implementing Raft, a consensus protocol used in systems like Docker Swarm. Distributed systems face consensus challenges; Raft ensures agreement via leader election and log replication. Rodionov’s in-memory key-value store demo leveraged Kotlin’s coroutines and Ktor for lightweight networking, set against the need for robust, asynchronous distributed architectures.

Methodological Approaches to Raft Implementation

Rodionov used coroutines for non-blocking node communication, with async for leader election and channel for log replication. Ktor handled HTTP-based node interactions, replacing heavier JavaNet. The demo showcased a cluster tolerating node failures: Servers transition from follower to candidate to leader, propagating logs via POST requests. Timeouts triggered elections, ensuring fault tolerance.
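
A hedged sketch of two of the building blocks described above, an election-timeout coroutine and Ktor endpoints for vote and heartbeat messages; the route names, state handling, and modern Ktor 2.x packages are assumptions rather than the speaker’s code:

```kotlin
import io.ktor.server.application.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import io.ktor.server.response.*
import io.ktor.server.routing.*
import kotlinx.coroutines.*

enum class Role { FOLLOWER, CANDIDATE, LEADER }

class Node {
    @Volatile var role = Role.FOLLOWER
    @Volatile var lastHeartbeat = System.currentTimeMillis()

    // If no heartbeat arrives within the timeout, become a candidate.
    suspend fun electionLoop(timeoutMs: Long = 300) {
        while (true) {
            delay(timeoutMs)
            if (System.currentTimeMillis() - lastHeartbeat > timeoutMs && role == Role.FOLLOWER) {
                role = Role.CANDIDATE   // a real node would now request votes from peers
                println("election timeout: became candidate")
            }
        }
    }
}

fun main() {
    val node = Node()
    // Background election-timeout loop, independent of the HTTP server.
    CoroutineScope(Dispatchers.Default).launch { node.electionLoop() }

    embeddedServer(Netty, port = 8080) {
        routing {
            post("/requestVote") {
                // A real implementation would compare terms and log indices here.
                call.respondText("vote granted")
            }
            post("/appendEntries") {
                node.lastHeartbeat = System.currentTimeMillis()
                call.respondText("ok")
            }
        }
    }.start(wait = true)
}
```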

Analysis of Innovations and Features

Coroutines innovate Raft’s asynchronous tasks, simplifying state machines compared to Java’s thread-heavy approaches. Ktor’s fast startup and lightweight routing outperform JavaNet, enabling efficient cluster communication. The demo’s fault tolerance—handling node crashes—demonstrates robustness. Limitations include coroutine complexity for novices and Ktor’s relative immaturity versus established frameworks.

Implications and Consequences

Rodionov’s implementation implies easier development of distributed systems, with coroutines reducing concurrency boilerplate. Ktor’s efficiency suits production clusters. Consequences include broader Kotlin adoption in systems like Consul, though mastering coroutines requires investment. The demo’s open-source nature invites community enhancements.

Conclusion

Rodionov’s Raft implementation showcases Kotlin’s strengths in distributed systems, offering a scalable, fault-tolerant model for modern consensus-driven applications.

Links

[KotlinConf2018] Performant Multiplatform Serialization in Kotlin: Eric Cochran’s Approach to Code Sharing

Lecturer

Eric Cochran is an Android developer at Pinterest, focusing on performance across the app stack. He contributes to open-source projects, notably the Moshi JSON library. Relevant links: Pinterest Engineering Blog (publications); LinkedIn Profile (professional page).

Abstract

This article analyzes Eric Cochran’s exploration of Kotlin Serialization for multiplatform projects, emphasizing its role in enhancing code reuse across platforms. Set in the context of Pinterest’s performance-driven Android development, it examines methodologies for integrating serialization with data formats and frameworks. The analysis highlights innovations in type safety and performance, with implications for cross-platform scalability and library evolution.

Introduction and Context

Eric Cochran presented at KotlinConf 2018, focusing on Kotlin Serialization’s potential to unify code in multiplatform environments. As an Android developer at Pinterest, Cochran’s work on serialization formats like Moshi informed his advocacy for Kotlin’s experimental library. The context is the growing need for shared logic in apps targeting JVM, JS, and Native, where serialization ensures seamless data handling across diverse runtimes.

Methodological Approaches to Serialization

Cochran outlined Kotlin Serialization’s setup: annotate data classes with @Serializable to generate compile-time adapters, supporting JSON, Protobuf, and CBOR. Integration with frameworks like OkHttp or Ktor involves custom serializers for complex types. He demonstrated parsing dynamic JSON structures, emphasizing compile-time safety over Moshi’s runtime reflection. Performance optimizations included minimizing allocations and leveraging inline classes. Cochran also contrasted Moshi’s factory-based API with Kotlin Serialization, noting the former’s JVM-centric limitations versus the latter’s multiplatform readiness.
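
A minimal sketch of that setup with kotlinx.serialization; the Pin class is invented for illustration, not Pinterest’s model:

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.decodeFromString
import kotlinx.serialization.encodeToString
import kotlinx.serialization.json.Json

// The serializer is generated at compile time by the serialization plugin,
// so no reflection is involved on any target platform.
@Serializable
data class Pin(val id: Long, val title: String, val tags: List<String> = emptyList())

fun main() {
    val pin = Pin(1, "Kotlin at scale", listOf("kotlin", "multiplatform"))

    val json = Json.encodeToString(pin)             // {"id":1,"title":...}
    val parsed = Json.decodeFromString<Pin>(json)   // back to a typed object

    println(json)
    println(parsed == pin)                          // data class equality: true
}
```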

Analysis of Innovations and Features

Kotlin Serialization innovates with compile-time code generation, avoiding reflection’s overhead, unlike Moshi’s Java type reliance. It supports multiple formats, enhancing flexibility compared to JSON-centric libraries. Inline classes reduce boxing, boosting performance. Limitations include poor dynamic type handling and manual serializer implementation for custom cases. Compared to Moshi, it offers broader platform support but lacks mature metadata APIs.

Implications and Consequences

The library implies greater code sharing in multiplatform apps, reducing duplication and maintenance. Its performance focus suits high-throughput systems like Pinterest’s. Consequences include a shift toward compile-time solutions, though experimental status requires caution. Future integration with Okio’s multiplatform efforts could resolve reflection issues, broadening adoption.

Conclusion

Cochran’s insights position Kotlin Serialization as a cornerstone for multiplatform data handling, offering a performant, type-safe alternative that promises to reshape cross-platform development.

Links

[KotlinConf2018] Fostering Collaborative Learning: Maria Neumayer and Amal Kakaiya’s Approach to Team-Based Kotlin Adoption

Lecturers

Maria Neumayer is an Android developer at Deliveroo, specializing in UI since 2010. Originally from Austria, she has worked in London at Citymapper, Path, Saffron Digital, and Rummble. Amal Kakaiya, also an Android engineer at Deliveroo, has coded professionally since 2012. A Glasgow native, he is a triathlete based in East London. Relevant links: Deliveroo Tech Blog (publications); Maria Neumayer’s LinkedIn; Amal Kakaiya’s LinkedIn (professional pages).

Abstract

This article examines Maria Neumayer and Amal Kakaiya’s insights on adopting Kotlin collaboratively within Deliveroo’s Android team. Set against the backdrop of transitioning to Kotlin in production, it explores methodologies like dedicated learning hours and enhanced code reviews. The analysis highlights innovations in fostering openness, combating imposter syndrome, and improving engineering culture, with implications for team dynamics and code quality.

Introduction and Context

At KotlinConf 2018, Maria Neumayer and Amal Kakaiya shared their team’s journey of adopting Kotlin for Deliveroo’s consumer Android app. About one and a half years prior, the team embraced Kotlin, recognizing its learning curve as an opportunity for collective growth. This narrative unfolds in a context where individual learning styles vary, yet collaborative approaches can unify teams, enhance code quality, and nurture a culture of inquiry and knowledge-sharing.

Methodological Approaches to Team Learning

The team implemented structured learning strategies. They allocated weekly Kotlin hours for hands-on practice, encouraging experimentation with features like coroutines. Code reviews shifted from mere correctness checks to learning platforms, where developers shared insights on Kotlin idioms. Pair programming and mob sessions facilitated real-time knowledge exchange, while attending cross-disciplinary talks (e.g., backend conferences) broadened perspectives. They also created forums like “Kotlin Era” to discuss and upskill, ensuring inclusivity.

Analysis of Innovations and Features

The innovation lies in treating learning as a team endeavor, not an individual task. Structured Kotlin hours fostered experimentation, reducing fear of failure. Code reviews as learning tools encouraged constructive feedback, leveraging Kotlin’s concise syntax to highlight best practices. Cross-disciplinary exposure added diverse insights, unlike traditional siloed learning. Compared to solo learning, this approach mitigated imposter syndrome by normalizing questions. Challenges included balancing learning with delivery and ensuring all team members engaged equally.

Implications and Consequences

This collaborative model implies stronger team cohesion and faster Kotlin adoption. By sharing knowledge, teams produce idiomatic, maintainable code, enhancing app quality. The cultural shift toward openness reduces psychological barriers, fostering inclusivity. Consequences include improved processes, though maintaining momentum requires sustained effort and leadership support.

Conclusion

Neumayer and Kakaiya’s approach demonstrates that collaborative learning accelerates Kotlin adoption while strengthening engineering culture. By learning together, teams create not only better code but also a supportive, innovative environment.

Links

[KotlinConf2018] Reflections on Kotlin’s Future: Insights from the KotlinConf 2018 Closing Panel

Lecturers

The panel featured JetBrains and community experts, including Kotlin developers and contributors like Jake Wharton and Venkat Subramaniam. Relevant links: JetBrains Blog (publications); Jake Wharton’s LinkedIn; Venkat Subramaniam’s LinkedIn (professional pages).

Abstract

This article synthesizes the KotlinConf 2018 Closing Panel’s discussions on Kotlin’s roadmap, features, and community growth. Contextualized in Kotlin’s rapid adoption, it examines questions on version 1.3, multiplatform, and concurrency models. The analysis highlights innovations like coroutines, with implications for accessibility, tooling, and future development.

Introduction and Context

The KotlinConf 2018 Closing Panel convened experts to reflect on Kotlin’s trajectory post-version 1.2. Topics ranged from release timelines to Kotlin/Native’s concurrency model and beginner accessibility. Set against Kotlin’s appeal to Java developers and its expanding multiplatform scope, the panel addressed community concerns and future directions, emphasizing JetBrains’ commitment to a robust ecosystem.

Methodological Approaches to Panel Discussion

Panelists addressed audience queries systematically. On version 1.3, they outlined stabilization goals, followed by post-1.3 focus on multiplatform libraries. Kotlin/Native’s distinct memory model was justified for safety, contrasting JVM threads. For beginners, they recommended community resources like Kotlin Slack. Coroutines were compared to RxJava, favoring simplicity for sequential tasks. Dokka improvements and GPU programming were acknowledged as future explorations.

Analysis of Innovations and Features

Kotlin 1.3 introduced stable coroutines, enhancing asynchronous programming versus RxJava’s complexity. Kotlin/Native’s concurrency model avoids shared mutable state, unlike iOS or JVM, ensuring safety but requiring adaptation. Multiplatform libraries promise code reuse, though Angular integration remains unexplored. The panel emphasized restraint in using Kotlin’s vast features to maintain readability, addressing its steep learning curve.

Implications and Consequences

The panel’s insights imply Kotlin’s evolution toward a versatile, beginner-friendly language. Coroutines simplify concurrency, but Native’s model may slow adoption. Enhanced tooling like Dokka and potential GPU support could broaden applications. Consequences include a growing community, though developers must balance feature richness with clarity to avoid complexity.

Conclusion

The KotlinConf 2018 Closing Panel illuminated Kotlin’s path as a multiplatform powerhouse, balancing innovation with accessibility, poised for continued growth with community feedback shaping its future.

Links