Posts Tagged ‘GoogleWorkspace’
[KotlinConf2024] Kotlin Multiplatform Powers Google Workspace
At KotlinConf2024, Jason Parachoniak, a Google Workspace engineer, detailed Google’s shift from a Java-based multiplatform system to Kotlin Multiplatform (KMP), starting with Google Docs. For over a decade, Workspace has relied on shared code for consistency across platforms, like Gmail’s synchronization layer. Jason shared how KMP enhances this approach, leveraging Kotlin’s ecosystem for better performance and native interop. The talk highlighted lessons from the migration, focusing on build efficiency, runtime latency, and memory challenges, offering insights for large-scale KMP adoption.
Why Kotlin Multiplatform for Workspace
Google Workspace has long used multiplatform code to ensure consistency, such as identical email drafts across devices in Gmail or uniform document models in Docs. Jason explained that their Java-based system, using transpilers like J2ObjC, was effective but complex. KMP offers a modern alternative, allowing developers to write Kotlin code that compiles to native platforms, improving runtime performance and ecosystem integration. By targeting business logic—everything beyond the UI—Workspace ensures native-feel apps while sharing critical functionality, aligning with user expectations for productivity tools.
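To make the shared-business-logic split concrete, here is a minimal KMP sketch, not Workspace code and with all names hypothetical: common code holds a small document model, while expect/actual declarations cover the one platform-specific detail.

```kotlin
// commonMain: shared business logic compiled for Android (JVM) and iOS (native).
// Illustrative only; the classes and functions are hypothetical, not Workspace code.

// A platform-specific detail declared in common code...
expect fun currentTimeMillis(): Long

// ...and shared logic that behaves identically on every platform.
data class DocumentEdit(val offset: Int, val inserted: String, val timestampMillis: Long)

class DocumentModel(private var text: String = "") {
    private val edits = mutableListOf<DocumentEdit>()

    fun insert(offset: Int, inserted: String) {
        require(offset in 0..text.length) { "offset out of range" }
        text = text.substring(0, offset) + inserted + text.substring(offset)
        edits += DocumentEdit(offset, inserted, currentTimeMillis())
    }

    fun text(): String = text
    fun editCount(): Int = edits.size
}

// androidMain:
// actual fun currentTimeMillis(): Long = System.currentTimeMillis()

// iosMain:
// actual fun currentTimeMillis(): Long =
//     (platform.Foundation.NSDate().timeIntervalSince1970 * 1000).toLong()
```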
Google Docs: The Migration Testbed
The migration began with Google Docs, chosen for its heavily annotated codebase, which tracks build performance, latency, and memory usage. Jason described how Docs is rolling out on KMP, providing metrics to refine the Kotlin compiler and runtime. This controlled environment allowed Google to compare KMP against their legacy system, ensuring parity before expanding to other apps. Collaboration with JetBrains and the Android team has been key, with iterative improvements driven by real-world data, setting a foundation for broader Workspace adoption.
Tackling Build Performance
Build performance posed challenges, because Google’s Bazel-like build system effectively performs clean builds rather than Gradle-style incremental ones. Jason recounted a roughly 10-minute increase in build time after a Kotlin Native update changed how LLVM bitcode is generated and optimized. The change improved binary size and runtime speed, but it slowed builds. Profiling traced the regression to a slow LLVM pass that had already been fixed in a newer LLVM release. Google patched LLVM temporarily, reducing build times from 30 minutes to 8, and is working with JetBrains to update Kotlin Native’s LLVM version, prioritizing stability alongside the K2 compiler rollout.
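For readers coming from Gradle, the sketch below shows where Kotlin Native compilation sits in a typical KMP build. It is an illustrative build.gradle.kts, not Google’s Bazel-based setup, and the framework name and versions are assumptions; linking the release binaries for the iOS targets is the step that runs LLVM optimization passes like the one discussed above.

```kotlin
// build.gradle.kts for a hypothetical shared module (illustrative, not Workspace's build).
plugins {
    kotlin("multiplatform") version "2.0.0"
}

kotlin {
    jvm() // shared logic consumed on the JVM/Android side of the build

    // Each iOS target produces a framework; linking release binaries invokes
    // the Kotlin Native backend and LLVM's optimization passes.
    listOf(iosArm64(), iosSimulatorArm64()).forEach { target ->
        target.binaries.framework {
            baseName = "SharedDocs" // hypothetical framework name
            isStatic = true
        }
    }

    sourceSets {
        commonMain.dependencies {
            implementation("org.jetbrains.kotlinx:kotlinx-coroutines-core:1.8.1")
        }
    }
}
```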
Optimizing Runtime Latency
Runtime latency, critical for Workspace apps, required Kotlin Native garbage collection (GC) tweaks. Jason noted that JetBrains proactively adjusted GC before receiving Google’s metrics, but further heuristics were needed as latency issues emerged. String handling in the interop layer also caused bottlenecks, addressed with temporary workarounds. Google is designing long-term fixes with JetBrains, ensuring smooth performance across platforms. These efforts highlight KMP’s potential for high-performance apps, provided runtime challenges are systematically resolved through collaboration.
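One pattern behind those string bottlenecks: when a shared framework is consumed from Swift, Kotlin strings are bridged to NSString at the boundary, so chatty hot paths pay a conversion on every call. A common mitigation, sketched below with hypothetical names rather than Workspace’s actual fix, is to batch or cache what crosses the boundary.

```kotlin
// Shared Kotlin class exposed to iOS; illustrative only.
class SpellCheckBridge(private val words: List<String>) {

    // Chatty: each call returns a fresh String that is bridged to NSString on the
    // Swift side, so a tight loop pays one conversion per word.
    fun wordAt(index: Int): String = words[index]

    // Batched: cross the interop boundary once and let the caller split natively.
    private val joinedCache: String by lazy { words.joinToString("\n") }
    fun allWordsJoined(): String = joinedCache
}
```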
Addressing Memory Usage
Memory usage spikes came as a surprise, particularly in the move from iOS 15 to iOS 16. Jason explained that iOS 16’s security-driven constant pool remapping marked Kotlin Native’s vtables as dirty, consuming megabytes of additional RAM. To diagnose such issues, Google built a heap dump tool that generates HPROF files compatible with IntelliJ’s Java heap analysis. The tool is being upstreamed to Kotlin Native’s runtime, enhancing debugging capabilities. These insights are guiding Google’s memory optimization strategy, ensuring KMP meets Workspace’s stringent performance requirements as the migration expands.
[GoogleIO2024] What’s New in Google Cloud and Google Workspace: Innovations for Developers
Google Cloud and Workspace offer a comprehensive suite of tools designed to simplify software development and enhance productivity. Richard Seroter’s overview showcased recent advancements, emphasizing infrastructure, AI capabilities, and integrations that empower creators to build efficiently and scalably.
AI Infrastructure and Model Advancements
Richard began with Google Cloud’s vertically integrated AI stack, from foundational infrastructure like TPUs and GPUs to accessible services for model building and deployment. The Model Garden stands out as a hub for discovering over 130 first-party and third-party models, facilitating experimentation.
Gemini models, including 1.5 Pro and Flash, provide multimodal reasoning with expanded context windows—up to two million tokens—enabling complex tasks like video analysis. Vertex AI streamlines customization through techniques like RAG and fine-tuning, supported by tools such as Gemini Code Assist for code generation and debugging.
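As a rough sketch of what calling a Gemini model through Vertex AI looks like from Kotlin, using the Vertex AI Java SDK; the project ID, region, and model name are placeholders, and package names can vary between SDK versions:

```kotlin
import com.google.cloud.vertexai.VertexAI
import com.google.cloud.vertexai.generativeai.GenerativeModel
import com.google.cloud.vertexai.generativeai.ResponseHandler

fun main() {
    // Placeholders: substitute your own Google Cloud project and region.
    VertexAI("my-gcp-project", "us-central1").use { vertexAi ->
        val model = GenerativeModel("gemini-1.5-flash-001", vertexAi)
        val response = model.generateContent("Summarize this week's release notes in three bullets.")
        println(ResponseHandler.getText(response))
    }
}
```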
Agent Builder introduces no-code interfaces for creating conversational agents, integrating with databases and APIs. Security features, including watermarking and red teaming, ensure responsible deployment. Recent updates, as of May 2024, include Gemini 1.5 Flash for low-latency applications.
Data Management and Analytics Enhancements
BigQuery’s evolution incorporates AI for natural language querying, simplifying data exploration. Gemini in BigQuery generates insights and visualizations, while BigQuery Studio unifies workflows for data engineering and ML.
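Alongside the natural-language features, programmatic access still goes through the standard client libraries. A minimal Kotlin sketch with the BigQuery Java client, querying a public dataset (swap in your own project and tables):

```kotlin
import com.google.cloud.bigquery.BigQueryOptions
import com.google.cloud.bigquery.QueryJobConfiguration

fun main() {
    // Uses Application Default Credentials for authentication.
    val bigquery = BigQueryOptions.getDefaultInstance().service

    val query = """
        SELECT name, SUM(number) AS total
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        GROUP BY name
        ORDER BY total DESC
        LIMIT 5
    """.trimIndent()

    val config = QueryJobConfiguration.newBuilder(query).build()
    for (row in bigquery.query(config).iterateAll()) {
        println("${row.get("name").stringValue}: ${row.get("total").longValue}")
    }
}
```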
AlloyDB AI embeds vector search for semantic querying, enhancing RAG applications. Data governance tools like Dataplex ensure secure, compliant data handling across hybrid environments.
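To give a feel for semantic querying against AlloyDB’s PostgreSQL-compatible surface, here is a hedged JDBC sketch from Kotlin; the table, columns, and connection details are hypothetical, and it assumes the pgvector extension and the PostgreSQL JDBC driver.

```kotlin
import java.sql.DriverManager

// Returns the titles of the documents closest to the given embedding (hypothetical schema).
fun findSimilarDocs(queryEmbedding: FloatArray): List<String> {
    val url = "jdbc:postgresql://10.0.0.5:5432/docs" // placeholder AlloyDB endpoint
    DriverManager.getConnection(url, "app_user", System.getenv("DB_PASSWORD")).use { conn ->
        // pgvector's cosine-distance operator (<=>); the bound parameter is cast to a vector.
        val sql = """
            SELECT title
            FROM documents
            ORDER BY embedding <=> ?::vector
            LIMIT 5
        """.trimIndent()
        conn.prepareStatement(sql).use { stmt ->
            stmt.setString(1, queryEmbedding.joinToString(",", "[", "]"))
            val rs = stmt.executeQuery()
            val titles = mutableListOf<String>()
            while (rs.next()) titles += rs.getString("title")
            return titles
        }
    }
}
```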
Spanner’s dual-region configurations and interleaved tables optimize global, low-latency operations. These features, updated in 2024, support scalable, AI-ready data infrastructures.
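Interleaving physically co-locates child rows with their parent row, which is what makes those low-latency reads possible. Below is a hedged sketch of the DDL with a hypothetical schema, applied through the Spanner Java client; the project, instance, and database IDs are placeholders.

```kotlin
import com.google.cloud.spanner.SpannerOptions

fun main() {
    // Hypothetical schema: Documents rows are stored next to their parent Users row.
    val ddl = listOf(
        """
        CREATE TABLE Users (
            UserId INT64 NOT NULL,
            Email  STRING(256)
        ) PRIMARY KEY (UserId)
        """.trimIndent(),
        """
        CREATE TABLE Documents (
            UserId     INT64 NOT NULL,
            DocumentId INT64 NOT NULL,
            Title      STRING(1024)
        ) PRIMARY KEY (UserId, DocumentId),
          INTERLEAVE IN PARENT Users ON DELETE CASCADE
        """.trimIndent()
    )

    val spanner = SpannerOptions.newBuilder().setProjectId("my-gcp-project").build().service
    try {
        spanner.databaseAdminClient
            .updateDatabaseDdl("my-instance", "my-database", ddl, null)
            .get() // block until the schema change completes
    } finally {
        spanner.close()
    }
}
```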
Application Development and Security Tools
Firebase’s Genkit framework aids in building AI-powered apps, with integrations for observability and deployment. Artifact Registry’s vulnerability scanning bolsters security.
Cloud Run’s CPU allocation during requests improves efficiency for bursty workloads. GKE’s Autopilot mode automates cluster management, reducing operational overhead.
Security enhancements include Confidential Space for sensitive data processing and AI-driven threat detection in Security Command Center. These 2024 updates prioritize secure, performant app development.
Workspace Integrations and Productivity Boosts
Workspace APIs enable embedding features like smart chips and add-ons into custom applications. New REST APIs for Chat and Meet facilitate notifications and event management.
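As a small end-to-end example of the kind of integration these APIs enable, posting a notification into a Google Chat space through an incoming webhook takes a few lines of Kotlin; the webhook URL below is a placeholder copied from the space’s webhook settings.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Placeholder: use the real incoming-webhook URL from your Chat space's settings.
    val webhookUrl = "https://chat.googleapis.com/v1/spaces/AAAA1234/messages?key=...&token=..."

    // Incoming webhooks accept a simple JSON message body.
    val payload = """{"text": "Nightly build finished: all checks passed."}"""

    val request = HttpRequest.newBuilder(URI.create(webhookUrl))
        .header("Content-Type", "application/json; charset=UTF-8")
        .POST(HttpRequest.BodyPublishers.ofString(payload))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println("Chat webhook responded with HTTP ${response.statusCode()}")
}
```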
Conversational agents via Dialogflow enhance user interactions. These tools, expanded in 2024, foster seamless productivity ecosystems.