Posts Tagged ‘IBM’
[DevoxxUK2024] Processing XML with Kafka Connect by Dale Lane
Dale Lane, a seasoned developer at IBM with a deep focus on event-driven architectures, delivered a compelling session at DevoxxUK2024, unveiling a powerful Kafka Connect plugin designed to streamline XML data processing. With extensive experience in Apache Kafka and Flink, Dale addressed the challenges of integrating XML data into Kafka pipelines, a task often complicated by XML's mismatch with the structured formats, such as Avro and JSON, that dominate the Kafka ecosystem. His presentation offers practical solutions for developers seeking to bridge external systems with Kafka, whether by transforming XML into more manageable formats or by generating XML outputs for legacy systems. Through clear examples, Dale illustrates how this open-source plugin enhances flexibility and efficiency in Kafka Connect pipelines, empowering developers to handle diverse data integration scenarios with ease.
Understanding Kafka Connect Pipelines
Dale begins by demystifying Kafka Connect, a robust framework for moving data between Kafka and external systems. He outlines two primary pipeline types: source pipelines, which import data from external systems into Kafka, and sink pipelines, which export Kafka data to external destinations. A source pipeline typically involves a connector to fetch data, optional transformations to modify or filter it, and a converter to serialize the data into formats like Avro or JSON for Kafka topics. Conversely, a sink pipeline starts with a converter to deserialize Kafka data, followed by transformations and a connector to deliver it to an external system. This foundational explanation sets the stage for understanding where and how XML processing fits into these workflows, ensuring developers grasp the pipeline’s modular structure before diving into specific use cases.
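To make these stages concrete, the sketch below assembles a source pipeline configuration as a Java map. The stage keys (connector.class, transforms, value.converter) are standard Kafka Connect property names; the connector class is a placeholder, and the transformation shown is one of Kafka's built-in ones, chosen purely for illustration.

```java
import java.util.Map;

public class SourcePipelineSketch {
    public static void main(String[] args) {
        Map<String, String> config = Map.of(
            // 1. Connector: fetches data from the external system (placeholder class).
            "connector.class", "com.example.ExampleSourceConnector",
            "tasks.max", "1",
            // 2. Transformations: optionally modify or filter records in flight;
            //    here, a built-in transform drops a sensitive field.
            "transforms", "dropSecrets",
            "transforms.dropSecrets.type",
                "org.apache.kafka.connect.transforms.ReplaceField$Value",
            "transforms.dropSecrets.exclude", "password",
            // 3. Converter: serializes records into the on-topic format.
            "value.converter", "org.apache.kafka.connect.json.JsonConverter",
            "value.converter.schemas.enable", "false"
        );
        config.forEach((key, value) -> System.out.println(key + " = " + value));
    }
}
```

A sink pipeline reuses the same property names, but the stages run in reverse: the converter deserializes records from the topic before the transformations and the connector take over.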
Converting XML for Kafka Integration
A common challenge Dale addresses is integrating XML data from external systems, such as IBM MQ or XML-based web services, into Kafka pipelines that favor structured formats. He introduces his Kafka Connect XML plugin, available on GitHub under the Apache license, as a solution that parses XML into structured records early in the pipeline. For instance, paired with an IBM MQ source connector, the plugin can transform XML documents from a message queue into a generic structured format, allowing subsequent transformations and serialization into JSON or Avro. Dale demonstrates this with a weather API that returns XML strings, showing how the plugin converts these into structured objects for further processing, making them compatible with Kafka tools that struggle with raw XML. This approach significantly enhances the usability of external data within the Kafka ecosystem.
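To picture what the plugin's output looks like downstream, the sketch below builds the kind of structured record a parsed weather document might become, using Kafka Connect's data API. The field names and values are invented for illustration; only the SchemaBuilder and Struct APIs are the real Kafka Connect ones.

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

public class WeatherRecordSketch {
    public static void main(String[] args) {
        // Hypothetical shape for an XML document such as:
        // <weather><city>London</city><temperature>17.5</temperature>...</weather>
        Schema weatherSchema = SchemaBuilder.struct().name("weather")
            .field("city", Schema.STRING_SCHEMA)
            .field("temperatureCelsius", Schema.FLOAT64_SCHEMA)
            .field("humidityPercent", Schema.INT32_SCHEMA)
            .build();

        // A converter like the XML plugin would populate a Struct along these
        // lines from the raw XML string, so downstream transformations and
        // converters can treat it like any other structured Connect record.
        Struct record = new Struct(weatherSchema)
            .put("city", "London")
            .put("temperatureCelsius", 17.5)
            .put("humidityPercent", 62);

        System.out.println(record);
    }
}
```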
Generating XML Outputs from Kafka
For scenarios where external systems require XML, Dale showcases the plugin’s ability to convert Kafka’s JSON or Avro messages into XML strings within a sink pipeline. He provides an example using a Kafka topic with JSON messages destined for an IBM MQ system, where the plugin, integrated as part of the sink connector, transforms structured data into XML before delivery. Another case involves an HTTP sink connector posting to an XML-based web service, such as an XML-RPC API. Here, the pipeline deserializes JSON, applies transformations to align with the API’s payload requirements, and uses the plugin to produce an XML string. This flexibility ensures seamless communication with legacy systems, bridging modern Kafka workflows with traditional XML-based infrastructure.
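A hedged sketch of such a sink pipeline follows, again as a Java map of Kafka Connect properties. The connector and transformation class names are placeholders standing in for the actual MQ sink connector and the XML plugin's classes; only the JsonConverter reference and the property names themselves are standard Kafka Connect.

```java
import java.util.Map;

public class SinkPipelineSketch {
    public static void main(String[] args) {
        Map<String, String> config = Map.of(
            // Sink connector delivering to the external system (placeholder class).
            "connector.class", "com.example.mq.ExampleMQSinkConnector",
            "topics", "orders",
            // First, deserialize the structured JSON records from the topic...
            "value.converter", "org.apache.kafka.connect.json.JsonConverter",
            "value.converter.schemas.enable", "false",
            // ...then a transformation step (placeholder class) renders each
            // structured record as an XML string just before delivery.
            "transforms", "toXml",
            "transforms.toXml.type", "com.example.xml.ExampleXmlTransformation"
        );
        config.forEach((key, value) -> System.out.println(key + " = " + value));
    }
}
```

The ordering mirrors Dale's description: the converter deserializes the JSON records from the topic, transformations align the payload, and the XML step runs last so the connector hands a finished XML string to the external system.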
Enhancing Pipelines with Schema Support
Dale emphasizes the plugin’s schema handling capabilities, which add robustness to XML processing. In source pipelines, the plugin can reference an external XSD schema to validate and structure XML data, and can then be paired with an Avro converter that registers schemas with a schema registry, ensuring compatibility with Kafka’s schema-driven ecosystem. In sink pipelines, enabling schema inclusion generates an XSD alongside the XML output, providing a clear description of the data’s structure. Dale illustrates this with a stock price connector, where enabling schema support produces XML events with accompanying XSDs, enhancing interoperability. This feature is particularly valuable for maintaining data integrity across systems, making the plugin a versatile tool for complex integration tasks.
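The schema-aware variant of a source pipeline might be configured along the following lines. The Avro converter and schema registry keys are the usual Confluent ones; the transformation class and the XSD option are hypothetical placeholders for whatever the plugin actually names them.

```java
import java.util.Map;

public class SchemaAwareSourceSketch {
    public static void main(String[] args) {
        Map<String, String> config = Map.of(
            "connector.class", "com.example.ExampleSourceConnector",          // placeholder
            "transforms", "xmlToStruct",
            "transforms.xmlToStruct.type",
                "com.example.xml.ExampleXmlTransformation",                   // placeholder
            "transforms.xmlToStruct.xsd.location", "file:/schemas/stock.xsd", // hypothetical option
            // Pairing with an Avro converter registers the derived schema
            // with a schema registry as records are written to the topic.
            "value.converter", "io.confluent.connect.avro.AvroConverter",
            "value.converter.schema.registry.url", "http://localhost:8081"
        );
        config.forEach((key, value) -> System.out.println(key + " = " + value));
    }
}
```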
[DevoxxUS2017] Eclipse OMR: A Modern, Open-Source Toolkit for Building Language Runtimes by Daryl Maier
At DevoxxUS2017, Daryl Maier, a Senior Software Developer at IBM, introduced Eclipse OMR, an open-source toolkit for building high-performance language runtimes. With two decades of experience in compiler development, Daryl shared how OMR repurposes components of IBM’s J9 Java Virtual Machine to support diverse dynamic languages without imposing Java semantics. His session highlighted OMR’s potential to democratize runtime technology, fostering innovation across language ecosystems. This post explores the core themes of Daryl’s presentation, emphasizing OMR’s role in advancing runtime development.
Unlocking JVM Technology with OMR
Daryl Maier opened by detailing the Eclipse OMR project, which extracts core components of the J9 JVM, such as its compiler and garbage collector, for broader use. Unlike building languages atop Java, OMR provides modular, high-performance tools for creating custom runtimes. Daryl’s examples showcased OMR’s flexibility in supporting languages beyond Java, drawing from his work at IBM’s Canada Lab to illustrate its potential for diverse applications.
Compiler and Runtime Innovations
Transitioning to technical specifics, Daryl explored OMR’s compiler technology, designed for just-in-time (JIT) compilation in dynamic environments. He contrasted OMR with LLVM, noting its lightweight footprint and optimization for runtime performance. Daryl highlighted OMR’s garbage collection and code generation capabilities, which enable efficient, scalable runtimes. His insights underscored OMR’s suitability for dynamic languages, offering developers robust tools without the overhead of traditional compilers.
Active Development and Use Cases
Daryl discussed active OMR projects, including integrations with existing runtimes to enhance debuggability and performance. He referenced a colleague’s upcoming demo on OMR’s tooling interfaces, illustrating practical applications. Drawing from IBM’s extensive runtime expertise, Daryl showcased how OMR supports innovative use cases, from scripting languages to domain-specific runtimes, encouraging developers to leverage its modular architecture.
Engaging the Developer Community
Concluding, Daryl invited developers to contribute to Eclipse OMR, emphasizing its open-source ethos. He highlighted collaboration opportunities, noting contact points with project co-leads Mark and Charlie. Daryl’s call to action, rooted in IBM’s commitment to open-source innovation, encouraged attendees to explore OMR’s GitHub repository and participate in shaping the future of language runtimes.
[DevoxxUS2017] New Computer Architectures: Explore Quantum Computers & SyNAPSE Neuromorphic Chips by Peter Waggett
At DevoxxUS2017, Dr. Peter Waggett, Director of IBM’s Emerging Technology group at the Hursley Laboratory, delivered a thought-provoking session on next-generation computer architectures, focusing on quantum computers and IBM’s TrueNorth neuromorphic chip. With a background in radio astronomy and extensive research in cognitive computing, Peter explored how these technologies address the growing demand for processing power in a smarter, interconnected world. This post delves into the core themes of Peter’s presentation, highlighting the potential of these innovative architectures.
Quantum Computing: A New Frontier
Peter Waggett introduced quantum computing, explaining its potential to solve complex problems beyond the reach of classical systems. He described how quantum computers manipulate atomic spins using MRI-like systems, leveraging quantum entanglement and superposition. Drawing from his work at IBM, Peter highlighted ongoing research to make quantum computing accessible, emphasizing its role in advancing fields like cryptography and material science, despite challenges like helium shortages impacting hardware.
TrueNorth: Brain-Inspired Computing
Delving into neuromorphic computing, Peter showcased IBM’s TrueNorth chip, a brain-inspired architecture with 1 million neurons and 256 million synapses, consuming just 73mW. Unlike traditional processors, TrueNorth challenges conventions like exact data representation and synchronicity, enabling low-power sensory perception for IoT and mobile applications. Peter’s examples illustrated TrueNorth’s scalability, positioning it as a cornerstone of IBM’s cognitive hardware ecosystem for transformative applications.
Addressing Scalability and Efficiency
Peter discussed the scalability of new architectures, comparing TrueNorth’s energy efficiency to traditional compute fabrics. He highlighted how neuromorphic chips optimize for error tolerance and energy-frequency trade-offs, ideal for IoT’s sensory demands. His insights, grounded in IBM’s client-focused projects, underscored the need for innovative designs to meet the computational needs of a connected planet, from smart cities to autonomous devices.
Building a Developer Community
Concluding, Peter emphasized the importance of fostering a developer community to advance these technologies. He encouraged collaboration through IBM’s research initiatives, noting the need for skilled engineers to tackle challenges like helium scarcity and system design. Peter’s vision for accessible platforms, inspired by his radio astronomy background, invited developers to explore quantum and neuromorphic computing, driving innovation in cognitive systems.
[DevoxxFR2014] Browser IDEs and Why You Don’t Like Them: A Deep Dive into Cloud-Based Development Challenges and Opportunities
Lecturer
Ken Walker, a seasoned software engineer at IBM in Ottawa, Canada, leads the Orion project at the Eclipse Foundation. With extensive experience in software tooling, Walker has been instrumental in advancing web-based development environments. His work focuses on bridging the gap between traditional desktop IDEs and emerging cloud-based solutions, emphasizing accessibility and collaboration. As a key contributor to the Eclipse ecosystem, he leverages IBM’s long-standing involvement in open-source initiatives, including the Eclipse Foundation’s formation in 2004, to drive innovation in developer tools.
Abstract
The transition to cloud-based development environments has promised seamless collaboration, instant access, and reduced setup overhead, yet browser-based Integrated Development Environments (IDEs) like Orion face skepticism from developers accustomed to robust desktop tools such as IntelliJ IDEA or Visual Studio. This lecture, delivered at Devoxx France 2014, explores the reasons behind this resistance, dissecting the technical and usability shortcomings of browser IDEs while highlighting their unique strengths. Through a detailed comparison of desktop and cloud-based development workflows, Ken Walker examines performance bottlenecks, integration challenges, and user experience gaps that deter adoption. He also showcases Orion’s innovative features, such as real-time collaboration and cloud deployment integration, to demonstrate its potential. The discussion concludes with a forward-looking perspective on how evolving web technologies could make browser IDEs indispensable, offering insights for developers considering hybrid workflows in modern software engineering.
The Evolution of IDEs and the Cloud Paradigm Shift
Integrated Development Environments have evolved significantly since the 1980s, when tools like Turbo Pascal provided basic editing and compilation for single languages. The late 1990s and early 2000s introduced cross-platform IDEs like NetBeans and Eclipse, which embraced modular architectures to support diverse languages and tools. These desktop IDEs excelled in performance, leveraging local hardware for fast code completion, debugging, and refactoring. However, the rise of cloud computing in the late 2000s sparked a shift toward browser-based IDEs, promising accessibility across devices, automatic updates, and collaborative features akin to Google Docs.
Ken Walker highlights that this shift has not been universally embraced. Developers often find browser IDEs lacking in responsiveness, particularly for tasks like code analysis or large-scale refactoring. This stems from browser sandboxing, which restricts access to local resources, and the inherent limitations of JavaScript execution compared to native applications. For instance, real-time syntax highlighting in a browser IDE may lag when processing thousands of lines, whereas desktop tools like IntelliJ leverage multithreading and local caching for near-instantaneous feedback.
Integration with local development environments poses another challenge. Desktop IDEs seamlessly interact with local file systems, Git clients, and build tools like Maven. In contrast, browser IDEs rely on cloud storage or WebSocket-based synchronization, which can introduce latency or data consistency issues during network disruptions. Walker cites user feedback from the Eclipse community, noting that developers often struggle with configuring browser IDEs to replicate the seamless toolchains of desktop counterparts.
Why Developers Resist Browser IDEs
Walker delves into specific pain points that fuel developer skepticism. One major issue is the lack of feature parity with desktop IDEs. Advanced debugging, a cornerstone of development, is less robust in browser environments. For example, Orion’s debugging relies on remote sessions, which can falter over unstable connections, making it difficult to step through code or inspect complex object states. In contrast, tools like Visual Studio offer graphical debuggers with real-time memory visualization, which browser IDEs struggle to replicate due to browser API constraints.
User experience gaps further compound resistance. Keyboard shortcuts, critical for productivity, often conflict with browser defaults (e.g., Ctrl+S for saving vs. browser save-page functionality), requiring developers to relearn bindings or configure overrides, which vary across browsers like Chrome, Firefox, and Safari. Touch-based devices exacerbate usability issues, as precise cursor placement or multi-line editing becomes cumbersome without mouse input, particularly on tablets.
Collaboration, a touted benefit of browser IDEs, can also disappoint. While real-time editing is possible, poorly handled concurrent changes lead to merge conflicts, especially in large teams. Orion’s Git integration supports basic workflows like commits and pulls, but complex operations like rebasing or resolving merge conflicts lack the intuitive interfaces of desktop tools. Walker acknowledges these issues but argues that they reflect growing pains rather than inherent flaws, as web technologies continue to mature.
Orion’s Strengths and Innovations
Despite these challenges, Orion offers compelling advantages that desktop IDEs struggle to match. Its cloud-native design enables instant project sharing: developers can fork a GitHub repository, edit code in the browser, and push changes without local setup. This lowers barriers for open-source contributors and simplifies onboarding for distributed teams. For example, a developer can share a workspace URL, allowing colleagues to edit code or review changes in real time, a feature that requires additional plugins in desktop IDEs.
Orion integrates seamlessly with cloud platforms like Heroku and AWS, enabling direct deployment from the browser. This streamlines workflows for web developers, who can preview applications without leaving the IDE. Walker demonstrates a live example where a JavaScript application is edited, tested, and deployed to a cloud server in under a minute, showcasing the potential for rapid prototyping.
Recent advancements in web technologies bolster Orion’s capabilities. WebAssembly enables computationally intensive tasks like code analysis to run efficiently in browsers, narrowing the performance gap with native tools. Service workers provide offline support, caching code to allow editing during network outages. Orion’s plugin architecture further enhances flexibility, allowing developers to add custom tools, such as live CSS previews or integration with CI/CD pipelines like Jenkins.
Comparing Desktop and Cloud Workflows
Desktop IDEs excel in performance and integration. IntelliJ IDEA, for instance, uses indexed codebases for instant autocomplete and refactoring across millions of lines. Local Git clients provide robust version control, and native debuggers offer granular control. However, these tools require significant setup—installing Java, configuring plugins, and ensuring compatibility across operating systems—which can hinder collaboration in distributed teams.
Browser IDEs prioritize accessibility. Orion requires only a browser, eliminating installation barriers and ensuring consistency across devices. For educational settings or hackathons, this is transformative: participants can start coding instantly without worrying about Java versions or environment variables. Walker cites Orion’s use in Eclipse community workshops, where novices and experts collaborate seamlessly on shared projects.
The trade-off lies in complexity. Desktop IDEs handle large, monolithic codebases better, while browser IDEs shine for web-focused or lightweight projects. Walker proposes a hybrid model: use browser IDEs for quick edits, prototyping, or collaborative tasks, and desktop IDEs for heavy-duty development like systems programming or enterprise applications.
Future Directions for Browser IDEs
Emerging web standards promise to address current limitations. WebGPU, for instance, will enable hardware-accelerated graphics, improving performance for tasks like code visualization. Progressive Web Apps (PWAs) enhance offline capabilities, making browser IDEs viable in low-connectivity environments. Integration with AI-driven tools, such as GitHub Copilot, could provide intelligent code suggestions, further closing the gap with desktop IDEs.
Walker envisions browser IDEs evolving into primary tools as browser performance approaches native levels. Projects like CodeSandbox and Replit, which emerged post-2014, validate this trajectory, offering robust cloud IDEs with growing adoption. Orion’s open-source nature ensures community-driven enhancements, with plugins for languages like Python and Go expanding its scope.
Conclusion: A Balanced Perspective on Cloud Development
Browser IDEs like Orion represent a paradigm shift, offering unmatched accessibility and collaboration but facing hurdles in performance and integration. While developer resistance is understandable given the maturity of desktop tools, rapid advancements in web technologies suggest a convergence of capabilities. By adopting a hybrid approach—leveraging browser IDEs for lightweight tasks and desktop IDEs for complex projects—developers can maximize productivity. Walker’s talk at DevoxxFR2014 underscores the potential for browser IDEs to reshape development, encouraging the audience to explore tools like Orion while acknowledging areas for improvement.
[DevoxxBE2012] Building Modular Applications with Enterprise OSGi
At DevoxxBE2012, Tim Ward and Holly Cummins, both seasoned experts in OSGi and enterprise technologies, delivered a comprehensive exploration into leveraging Enterprise OSGi for constructing modular applications. Tim, affiliated with Paremus and a key contributor to Apache Aries, alongside Holly from IBM, who focuses on WebSphere and performance tooling, guided attendees through the intricacies of transforming traditional Java EE setups into dynamic, OSGi-based systems. Their session bridged familiar concepts like WAR files and dependency injection with OSGi’s modularity, addressing common pain points in class path management.
They initiated the discussion by outlining Enterprise OSGi’s origins and relevance. Emerging in 2010, this specification enhances OSGi’s core, which has powered embedded systems and IDEs like Eclipse for over a decade, to better suit server-side enterprise needs. Tim and Holly emphasized how it integrates seamlessly with application servers, enabling features such as web applications, database interactions, and transaction management within an OSGi framework.
A key theme was the modularity crisis in large Java EE projects. They illustrated how sprawling WAR files, often larger than the 7MB core of Tomcat itself, accumulate unnecessary libraries, leading to class path hell. By contrasting tangled dependency graphs with OSGi’s structured approach, they demonstrated how explicit module definitions simplify maintenance and foster cleaner architectures.
Embracing OSGi Modularity Basics
Delving deeper, Tim and Holly explained OSGi’s bundle model, where each bundle—a JAR with added metadata—declares imports and exports via manifests. This enforces visibility rules, preventing accidental dependencies and promoting intentional design. They highlighted how bundles resolve dynamically, allowing runtime adaptability without redeployments.
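A minimal sketch of this model from the Java side follows, assuming a hypothetical GreeterService. The visibility rules live in the bundle's MANIFEST.MF, summarized in the comment with real OSGi header names but illustrative values; the activator publishes the service for other bundles to consume.

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// Hypothetical service interface; in a real bundle this would live in an
// exported package such as com.example.greeter.api.
interface GreeterService {
    String greet(String name);
}

// The bundle's manifest declares its visibility rules, for example:
//   Bundle-SymbolicName: com.example.greeter
//   Export-Package: com.example.greeter.api
//   Import-Package: org.osgi.framework
public class GreeterActivator implements BundleActivator {

    private ServiceRegistration<GreeterService> registration;

    @Override
    public void start(BundleContext context) {
        // Publish the service in the OSGi service registry; other bundles
        // discover it by interface, not by accident of the class path.
        registration = context.registerService(
                GreeterService.class, name -> "Hello, " + name, null);
    }

    @Override
    public void stop(BundleContext context) {
        // Withdraw the service when the bundle stops; consumers see it go.
        registration.unregister();
    }
}
```

Because consumers bind to the exported interface through the service registry rather than to whatever happens to be on the class path, the dependency stays explicit and visible to the framework.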
The duo addressed common misconceptions, asserting that OSGi simplifies both complex and straightforward tasks. For instance, they showcased how dependency injection via Blueprint or Spring works effortlessly in OSGi, maintaining developer familiarity while adding modularity benefits.
They also touched on third-party library integration, noting challenges with non-OSGi JARs but solutions through tools that convert them into bundles. This ensures compatibility, reducing the bloat from redundant inclusions like JavaMail APIs.
Transitioning from WAR to WAB
A pivotal segment focused on evolving WAR files into Web Application Bundles (WABs). Tim and Holly demonstrated this migration, starting with a standard WAR and incorporating OSGi manifests to define it as a bundle. This shift enables deployment in OSGi containers like Apache Karaf or WebSphere Liberty, preserving servlet functionality while gaining modularity.
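In code terms the migration is nearly a no-op, as the sketch below suggests: the servlet is carried over unchanged, and the WAB character comes entirely from manifest metadata. The header names in the comment are from the OSGi Enterprise specification; the values are illustrative.

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// A plain servlet carried over from the WAR. What turns the WAR into a
// Web Application Bundle is metadata, not code: roughly these
// MANIFEST.MF headers.
//   Bundle-SymbolicName: com.example.shop.web
//   Bundle-Version: 1.0.0
//   Web-ContextPath: /shop
//   Import-Package: javax.servlet.http
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("text/plain");
        resp.getWriter().println("Served from a Web Application Bundle");
    }
}
```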
They illustrated error handling advantages: OSGi fails fast on missing dependencies, unlike runtime surprises in traditional setups. Through live examples, they showed bundles starting only when requirements are met, enhancing reliability.
Furthermore, they explored dynamism, where services can be added or removed at runtime, updating applications without downtime. This transparency in remoting and service interactions aligns with Java EE goals but executes more fluidly in OSGi.
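The sketch below illustrates that dynamism from the consumer's side, reusing the hypothetical GreeterService from the earlier sketch together with OSGi's standard ServiceTracker utility: the service may appear or vanish at any moment, and the client tolerates both.

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.util.tracker.ServiceTracker;

public class GreeterClientActivator implements BundleActivator {

    private ServiceTracker<GreeterService, GreeterService> tracker;

    @Override
    public void start(BundleContext context) {
        // The tracker watches the service registry: it sees the service
        // appear when the providing bundle starts and vanish when it stops,
        // with no restart of this consuming bundle required.
        tracker = new ServiceTracker<>(context, GreeterService.class, null);
        tracker.open();

        GreeterService greeter = tracker.getService();
        if (greeter != null) {          // the service may be absent right now
            System.out.println(greeter.greet("OSGi"));
        } else {
            System.out.println("GreeterService not available yet");
        }
    }

    @Override
    public void stop(BundleContext context) {
        tracker.close();
    }
}
```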
Handling Dependencies and Repositories
Tim and Holly then examined dependency management, advocating explicit declarations to avoid hidden assumptions. They introduced bundle repositories, akin to Maven but tailored for OSGi, which automatically provision required bundles. This centralizes library control, aiding compliance in regulated environments.
In demonstrations, they deployed applications across servers like Liberty and Karaf, resolving dependencies on-the-fly. For instance, adding Joda Time via a repository revived a stalled bundle, showcasing practical modularity.
They stressed architectural enforcement: OSGi’s rules prevent poor practices, but good design remains essential. Tools like Eclipse plugins aid in visualizing and managing these structures.
Demonstrating Dynamism and Best Practices
The session culminated in hands-on demos, where a simple web app evolved into a dynamic OSGi system. Starting with a basic servlet, they integrated services for runtime changes, like toggling UI elements without restarts.
Tim and Holly concluded by reinforcing OSGi’s power in enforcing scalable, maintainable systems. They recommended resources, including their book “Enterprise OSGi in Action,” for further learning. Their presentation underscored how Enterprise OSGi not only resolves class path issues but elevates enterprise development to new levels of flexibility and efficiency.
[DevoxxFR2012] Jazz Platform: Fostering Collaborative Software Development Through Integrated Tools
Lecturer
Florent Benoit leads the OW2 EasyBeans open-source project and contributes significantly to the OW2 JOnAS application server. An expert in OSGi and Java EE, he provides architectural guidance on major Bull projects. A member of the Java EE 6 expert group for the EJB 3.1 specification, Florent holds a Master’s in Computer Engineering from Joseph Fourier University, Grenoble, and speaks at open-source conferences like JavaOne and Solutions Linux. Alexis Gaches specializes in automating software development lifecycles. Having joined the Jazz movement in 2008, he architects Jazz solutions for IBM Rational, collaborating with French enterprises on agile practices for application management.
Abstract
This article assesses Florent Benoit and Alexis Gaches’s overview of IBM’s Jazz platform, aimed at streamlining collaborative software development from requirements to deployment. It dissects tools for requirements management, architecture modeling, implementation, building, testing, and project oversight. Positioned as a response to fragmented processes, the analysis reviews integration mechanisms, open-source alignments, and deployment options. Through demonstrations, it evaluates benefits for agility, traceability, and efficiency, alongside implications for organizational adoption and tool interoperability in diverse environments.
Rationale and Architecture of Jazz Platform
Jazz addresses silos in development by promoting unified collaboration. Florent outlines its genesis as an effort to improve processes across every lifecycle stage: requirements, design, coding, builds, tests, and project management. The core philosophy is that tools should interconnect, enabling traceability from user stories to code commits.
The architecture leverages Eclipse for IDE integration, with Rational Team Concert (RTC) as its hub; RTC provides source control, work items, and builds through the Jazz Team Server. Open Services for Lifecycle Collaboration (OSLC) standardizes these integrations, allowing third-party tools such as Jira to link in.
Alexis emphasizes agility: iterative planning and metric dashboards reduce manual handoffs.
Key Tools and Functionalities
Requirements Composer manages specifications and links them to work items, while Quality Manager handles testing and integrates with RTC for defect tracking.
Implementation happens in Eclipse with RTC plugins for code management, with SVN and Git supported via bridges. Builds are automated through Ant or Jenkins, with traceability back to changesets.
Demonstrations walk through scenarios from story creation to code delivery, highlighting real-time updates and approvals.
Deployment options span on-premise installations and the cloud-hosted JazzHub, with free tiers for small teams and academia.
Integration with Open-Source and Legacy Systems
Jazz embraces open source through its Eclipse foundations and OSLC-based extensibility. Migrations from ClearCase or SVN use connectors that preserve history.
The challenges are largely cultural: shifting toward transparency and climbing tool learning curves. The benefits are reduced cycle times and improved quality through automated traceability.
Future Directions and Community Engagement
IBM develops the platform in the open on jazz.net and invites contributions, while the academic JazzHub program fosters education.
The implications are twofold: Jazz enhances enterprise agility, but it demands organizational commitment. For global teams it bridges geographies; for startups, the free tiers lower barriers to entry.
Jazz exemplifies integrated application lifecycle management, driving efficient, collaborative delivery.