
Archive for the ‘General’ Category

[DevoxxFR2013] Security for Enterprises in a Cloudy and Mobile World

Lecturer

Ludovic Poitou serves as Product Manager at ForgeRock, overseeing directory products, and holds the position of General Manager for ForgeRock France. With a background in open-source Java and LDAP, he previously worked at Sun Microsystems as a developer and architect for directory solutions, later engaging in community management.

Abstract

Ludovic Poitou examines evolving enterprise security demands amid mobile proliferation, social networks, and cloud computing. Centering on identity management, he analyzes ForgeRock’s Open Identity Stack—an open-source Java solution—detailing standards like OAuth, OpenID Connect, and SCIM. The discussion evaluates impacts on information systems infrastructure and application architecture, advocating adaptive strategies for secure access in hybrid environments.

Shifting Paradigms: Mobile, Cloud, and Social Influences on Security

Poitou identifies three transformative trends reshaping information security: ubiquitous mobile devices, pervasive social platforms, and cloud services adoption. These necessitate reevaluating traditional perimeters, as data flows beyond firewalls to diverse endpoints.

Mobile introduces BYOD challenges—personal devices accessing corporate resources—demanding granular controls. Cloud shifts storage and processing externally, requiring federated trust. Social networks amplify identity federation needs for seamless yet secure interactions.

At the core lies identity management: provisioning, authentication, authorization, and storage across lifecycles. ForgeRock, emerging post-Sun acquisition, builds on open-source projects like OpenDJ (LDAP server) to deliver comprehensive solutions.

Core Components of Open Identity Stack: Directory, Access, and Federation

ForgeRock’s stack comprises OpenDJ for LDAP-based storage, OpenAM for access management, and OpenIDM for identity administration. OpenDJ handles scalable directories; OpenAM manages SSO, federation; OpenIDM orchestrates provisioning.

Poitou highlights Java foundations enabling portability. Development centers in Grenoble support global operations.

This modular approach allows tailored deployments, integrating with existing systems while supporting modern protocols.

Emerging Standards: OAuth, OpenID Connect, and SCIM for Interoperability

Addressing federation, Poitou details OAuth 2.0 for delegated authorization: clients obtain access tokens without handling user credentials. Grant variants include the authorization code flow for server-side web apps and the implicit flow for in-browser clients.

OpenID Connect layers identity atop OAuth 2.0, providing ID tokens (JWTs) that carry user claims. This enables authenticated API access and profile sharing.

SCIM standardizes user and group provisioning via REST, simplifying cloud integrations. Poitou contrasts it with LDAP’s generality, noting SCIM’s user-centric focus.
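As an illustration of the REST flavor, here is a minimal sketch that builds a SCIM 2.0 user payload and prepares the provisioning request. The endpoint URL is a placeholder, and the JDK 11 HTTP client stands in for whatever client a real deployment would use:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ScimProvisioningSketch {

    // Build a minimal SCIM 2.0 user payload (core schema from RFC 7643)
    static String scimUser(String userName, String givenName, String familyName) {
        return """
            {
              "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
              "userName": "%s",
              "name": {"givenName": "%s", "familyName": "%s"},
              "active": true
            }""".formatted(userName, givenName, familyName);
    }

    public static void main(String[] args) {
        String payload = scimUser("ldap-migrated-user", "Ada", "Lovelace");
        // POST to the provider's /Users endpoint ("idp.example.com" is a placeholder)
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://idp.example.com/scim/v2/Users"))
                .header("Content-Type", "application/scim+json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        System.out.println(request.method() + " " + request.uri());
        System.out.println(payload);
    }
}
```

The payload mirrors what an identity server would accept on its provisioning endpoint; compared with an LDAP add operation, the attribute names are fixed by the SCIM user schema rather than by a deployment-specific directory schema.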

Code illustration (conceptual OAuth flow):

// Client requests a token from the authorization server
HttpResponse response = client.execute(new HttpPost("token_endpoint"));

// Server validates the request, then issues a signed JWT (Nimbus-style API)
JWTClaimsSet claims = new JWTClaimsSet.Builder()
    .subject(userId)
    .expirationTime(new Date(System.currentTimeMillis() + 3600_000))
    .build();
SignedJWT signedJWT = new SignedJWT(header, claims);
signedJWT.sign(signer); // e.g. an HMAC or RSA signer

These standards facilitate secure, standardized exchanges.

Architectural Implications: Token-Based Authorization and Device Management

Traditional session cookies falter in mobile/cloud; tokens prevail. Applications validate JWTs statelessly, reducing server load.
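A minimal sketch of such stateless validation, using only the JDK’s HMAC support to stand in for a full JWT library (the claim set and shared key are illustrative):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class StatelessTokenCheck {

    // Base64url-encoded HMAC-SHA256, as used for JWS "HS256" signatures
    static String sign(String data, byte[] key) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return Base64.getUrlEncoder().withoutPadding()
                .encodeToString(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
    }

    // Issue a compact JWT: base64url(header).base64url(payload).signature
    static String issue(String subject, byte[] key) throws Exception {
        Base64.Encoder b64 = Base64.getUrlEncoder().withoutPadding();
        String header = b64.encodeToString("{\"alg\":\"HS256\",\"typ\":\"JWT\"}"
                .getBytes(StandardCharsets.UTF_8));
        String payload = b64.encodeToString(("{\"sub\":\"" + subject + "\"}")
                .getBytes(StandardCharsets.UTF_8));
        return header + "." + payload + "." + sign(header + "." + payload, key);
    }

    // Stateless check: recompute the signature; no server-side session needed
    static boolean isValid(String token, byte[] key) throws Exception {
        String[] parts = token.split("\\.");
        return parts.length == 3 && parts[2].equals(sign(parts[0] + "." + parts[1], key));
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "a-shared-secret-of-sufficient-length".getBytes(StandardCharsets.UTF_8);
        String token = issue("alice", key);
        System.out.println(isValid(token, key));              // true
        System.out.println(isValid(token + "x", key));        // false: tampered
    }
}
```

Because the application only needs the key (or, with asymmetric signatures, the public key) to verify a token, any node can accept requests without a shared session store, which is exactly the property that makes tokens attractive for mobile and cloud deployments.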

Poitou discusses administrative token generation—pre-authorizing apps/devices without logins. OpenAM supports this for seamless access.

Infrastructure evolves: decouple authentication from apps via gateways. Hybrid models blend on-premise directories with cloud federation.

Challenges include token revocation, scope management. Solutions involve introspection endpoints, short-lived tokens.

Practical Deployment and Future Considerations

ForgeRock’s stack deploys flexibly—on-premise, cloud, hybrid. OpenDJ scales horizontally; OpenAM clusters for high availability.

Poitou stresses user-centric policies: dynamic authorizations based on context (location, device).

Emerging: UMA for resource owner control. Standards mature via IETF, OpenID Foundation.

Enterprises must adapt architectures for agility, ensuring compliance amid fluidity.


[DevoxxFR2013] Java EE 7 in Detail

Lecturer

David Delabassee is a Principal Product Manager in Oracle’s GlassFish team. Previously at Sun for a decade, he focused on end-to-end Java, related technologies, and tools. Based in Belgium, he contributes to Devoxx Belgium’s steering committee.

Abstract

David Delabassee’s overview details Java EE 7’s innovations, emphasizing developer simplicity and HTML5 support. Covering WebSockets, JSON-P, JAX-RS 2, JMS 2, concurrency, caching, and batch processing, he demonstrates features via GlassFish. The analysis explores alignments with modern needs like cloud and modularity, implications for productivity, and forward compatibility.

Evolution and Key Themes: Simplifying Development and Embracing Modern Web

Delabassee notes Java EE 6’s (2009) popularity, with widespread server adoption. Java EE 7, nearing finalization, builds on this via JCP, comprising 13 updated, 4 new specs.

Themes: ease of development (defaults, pruning), web enhancements (HTML5 via WebSockets), alignment with trends (cloud, multi-tenancy). Pruning removes outdated techs like EJB CMP; new APIs address gaps.

GlassFish 4, the reference implementation, enables early testing. Delabassee demos features, stressing community feedback.

Core API Enhancements: WebSockets, JSON, and REST Improvements

WebSocket (JSR 356): Enables full-duplex, bidirectional communication over a single TCP connection. Annotate endpoints (@ServerEndpoint) and handle messages (@OnMessage).

@ServerEndpoint("/echo")
public class EchoEndpoint {
    @OnMessage
    public void echo(String message, Session session) throws IOException {
        session.getBasicRemote().sendText(message);
    }
}

JSON-P (JSR 353): Parsing/processing API with streaming, object models. Complements JAX-RS for RESTful services.
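A sketch of both JSON-P styles, assuming a JSR 353 implementation (such as the EE 7 reference implementation) on the classpath:

```java
import javax.json.Json;
import javax.json.JsonObject;
import javax.json.stream.JsonParser;
import java.io.StringReader;

// Object-model API: build a JSON object in memory
JsonObject tweet = Json.createObjectBuilder()
        .add("user", "devoxx")
        .add("text", "Java EE 7 is coming")
        .build();

// Streaming API: pull parse events one at a time, without building a tree
JsonParser parser = Json.createParser(new StringReader(tweet.toString()));
while (parser.hasNext()) {
    JsonParser.Event event = parser.next(); // START_OBJECT, KEY_NAME, VALUE_STRING, ...
}
```

The streaming API keeps memory bounded for large documents, while the object model is more convenient for small payloads such as REST responses.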

JAX-RS 2 (JSR 339): Client API, filters/interceptors, async support. Client example:

Client client = ClientBuilder.newClient();
WebTarget target = client.target("http://example.com");
Response response = target.request().get();

These foster efficient, modern web apps.

Messaging and Concurrency: JMS 2 and Utilities for EE

JMS 2 simplifies messaging: annotation-based injection (@JMSConnectionFactory) and a streamlined API for sending and receiving.

public class MessageSender {

    @Inject
    private JMSContext context;

    @Resource(lookup = "myQueue")
    private Queue queue;

    public void send(String message) {
        context.send(queue, message);
    }
}

Concurrency Utilities (JSR 236): Managed executors, scheduled tasks in EE context. Propagate context to threads, avoiding direct Thread creation.
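A sketch of the injection pattern (the JNDI name is the platform default defined by EE 7; the surrounding bean is illustrative):

```java
import javax.annotation.Resource;
import javax.ejb.Stateless;
import javax.enterprise.concurrent.ManagedExecutorService;

@Stateless
public class ReportService {

    // Injected by the container; submitted tasks run on managed threads
    // and inherit the EE context (naming, security, class loader)
    @Resource(lookup = "java:comp/DefaultManagedExecutorService")
    private ManagedExecutorService executor;

    public void generateAsync() {
        executor.submit(() -> {
            // long-running work, without hand-created Threads
        });
    }
}
```

Hand-created threads are invisible to the container and lose the EE context, which is why the spec routes all concurrency through these managed resources.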

Batch Applications (JSR 352): Framework for chunk/step processing, job management. XML-defined jobs with readers, processors, writers.
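A job definition along these lines illustrates the XML structure (the artifact `ref` names are placeholders for batch artifacts a project would supply):

```xml
<job id="importJob" xmlns="http://xmlns.jcp.org/xml/ns/javaee" version="1.0">
  <step id="loadStep">
    <chunk item-count="10">
      <reader ref="fileItemReader"/>
      <processor ref="lineItemProcessor"/>
      <writer ref="dbItemWriter"/>
    </chunk>
  </step>
</job>
```

Each chunk of ten items is read, processed, and written inside one transaction, giving checkpoint/restart semantics for free.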

Additional Features and Future Outlook: Caching, CDI, and Java EE 8

Though JCache (JSR 107) was deferred, it enables standardized distributed caching and remains usable alongside EE 7.

CDI 1.1 adds enhancements: @Vetoed to exclude classes from bean discovery, and global activation of alternatives.

Java EE 8 plans: modularity, cloud (PaaS/SaaS), further HTML5. Community shapes via surveys.

Delabassee urges Adopt-a-JSR participation for influence.

Implications for Enterprise Development: Productivity and Adaptability

Java EE 7 boosts productivity via simplifications, aligns with web/cloud via new APIs. Demos show practical integration, like WebSocket chats or batch jobs.

Challenges: a learning curve for the new features; the benefits outweigh it, yielding robust, scalable apps.

Going forward, EE 7 paves the way for EE 8’s evolutions, ensuring Java’s enterprise relevance.


[DevoxxBE2012] What’s New in Groovy 2.0?

Guillaume Laforge, the Groovy Project Lead and a key figure in its development since its inception, provided an extensive overview of Groovy’s advancements. Guillaume, employed by the SpringSource division of VMware at the time, highlighted how Groovy enhances developer efficiency and runtime speed with each iteration. He began by recapping essential elements from Groovy 1.8 before delving into the innovations of version 2.0, emphasizing its role as a versatile language on the JVM.

Guillaume underscored Groovy’s appeal as a scripting alternative to Java, offering dynamic capabilities while allowing modular usage for those not requiring full dynamism. He illustrated this with examples of seamless integration, such as embedding Groovy scripts in Java applications for flexible configurations. This approach reduces boilerplate and fosters rapid prototyping without sacrificing compatibility.

Transitioning to performance, Guillaume discussed optimizations in method invocation and arithmetic operations, which contribute to faster execution. He also touched on library enhancements, like improved date handling and JSON support, which streamline common tasks in enterprise environments.

A significant portion focused on modularity in Groovy 2.0, where the core is split into smaller jars, enabling selective inclusion of features like XML processing or SQL support. This granularity aids in lightweight deployments, particularly in constrained settings.

Static Type Checking for Reliability

Guillaume elaborated on static type checking, a flagship feature allowing early error detection without runtime overhead. He demonstrated annotating classes with @TypeChecked to enforce type safety, catching mismatches in assignments or method calls at compile time. This is particularly beneficial for large codebases, where dynamic typing might introduce subtle bugs.
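A small sketch of the feature (the class and the rejected assignment are illustrative):

```groovy
import groovy.transform.TypeChecked

@TypeChecked
class Greeter {
    String greet(String name) {
        // With @TypeChecked, the following line would be rejected at
        // compile time instead of failing at runtime:
        // int length = name
        return "Hello, " + name.toUpperCase()
    }
}
```

Without the annotation, the bad assignment would only surface when the method is actually invoked; with it, the compiler reports the mismatch immediately.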

He addressed extensions for domain-specific languages, ensuring type inference works even in complex scenarios like builder patterns. Guillaume showed how this integrates with IDEs for better code completion and refactoring support.

Static Compilation for Performance

Another cornerstone, static compilation via @CompileStatic, generates bytecode akin to Java’s, bypassing dynamic dispatch for speed gains. Guillaume benchmarked scenarios where this yields up to tenfold improvements, ideal for performance-critical sections.
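A sketch of the annotation in use (the method is illustrative; any recursion-heavy hot spot benefits similarly):

```groovy
import groovy.transform.CompileStatic

@CompileStatic
class Fibonacci {
    // Compiled to Java-like bytecode: the recursive calls and the integer
    // arithmetic bypass Groovy's dynamic dispatch entirely
    static int fib(int n) {
        n < 2 ? n : fib(n - 1) + fib(n - 2)
    }
}
```

Annotating only the performance-critical class keeps the rest of the codebase free to use dynamic features, which is the hybrid approach Guillaume described.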

He clarified that dynamic features remain available selectively, allowing hybrid approaches. This flexibility positions Groovy as a bridge between scripting ease and compiled efficiency.

InvokeDynamic Integration and Future Directions

Guillaume explored JDK7’s invokedynamic support, optimizing dynamic calls for better throughput. He presented metrics showing substantial gains in invocation-heavy code, aligning Groovy closer to Java’s performance.

Looking ahead, he previewed Groovy 2.1 enhancements, including refined type checking for DSLs and complete invokedynamic coverage. For Groovy 3.0, a revamped meta-object protocol and Java 8 lambda compatibility were on the horizon, with Groovy 4.0 adopting ANTLR4 for parsing.

In Q&A, Guillaume addressed migration paths and community contributions, reinforcing Groovy’s evolution as responsive to user needs.

His session portrayed Groovy as maturing into a robust, adaptable toolset for modern JVM development, balancing dynamism with rigor.


“Apache Maven Dependency Management” by Jonathan Lalou was published by Packt

Abstract

I am glad and proud to announce the publication of “Apache Maven Dependency Management”, by Packt.

Direct link: https://www.packtpub.com/apache-maven-dependency-management/book

Alternate locations: Amazon.com, Amazon.co.uk, Barnes & Noble.

On this occasion, I’d like to thank the whole Packt team for enabling me to achieve this project.

What you will learn from this book

  • Learn how to use profiles, POM, parent POM, and modules
  • Increase build-speed and decrease archive size
  • Set, rationalize, and exclude transitive dependencies
  • Optimize your POM and its dependencies
  • Migrate projects to Maven including projects with exotic dependencies

In Detail

Managing dependencies in a multi-module project is difficult. In a multi-module project, libraries depend on one another transitively. Maven handles this by reading the project files (POMs) of dependencies to work out their interrelations and other related information. Gaining an understanding of project dependencies will allow you to fully utilize Maven and use it to your advantage.

Aiming to give you a clear understanding of Maven’s functionality, this book focuses on specific case studies that shed light on highly useful Maven features which are often disregarded. The content of this book will help you to replace homebrew processes with more automated solutions.

This practical guide focuses on the variety of problems and issues which occur during the conception and development phase, with the aim of making dependency management as effortless and painless as possible. Throughout the course of this book, you will learn how to migrate from non-Maven projects to Maven, learn Maven best practices, and how to simplify the management of multiple projects. The book emphasizes the importance of projects as well as identifying and fixing potential conflicts before they become issues. The later sections of the book introduce you to the methods that you can use to increase your team’s productivity. This book is the perfect guide to help make you into a proud software craftsman.

Approach

An easy-to-follow, tutorial-based guide with chapters progressing from basic to advanced dependency management.

Who this book is for

If you are working with Java or Java EE projects and you want to take advantage of Maven dependency management, then this book is ideal for you. This book is also particularly useful if you are a developer or an architect. You should be well versed with Maven and its basic functionalities if you wish to get the most out of this book.

Table of Contents

  • Preface
  • Chapter 1: Basic Dependency Management
  • Chapter 2: Dependency Mechanism and Scopes
  • Chapter 3: Dependency Designation (advanced)
  • Chapter 4: Migration of Dependencies to Apache Maven
  • Chapter 5: Tools within Your IDE
  • Chapter 6: Release and Distribute
  • Appendix: Useful Public Repositories
  • Index

[DevoxxFR2013] JCP & Adopt a JSR Workshop

Lecturer

Patrick Curran chairs the Java Community Process (JCP), overseeing membership, processes, and Executive Committee. With over 20 years in software, including 15 at Sun, he led Java Conformance Engineering and chaired related councils. Active in W3C and OASIS.

Arun Gupta directs Developer Advocacy at Red Hat, focusing on JBoss Middleware. A Java EE founding member at Sun, he drove global adoption; at Oracle, he launched Java EE 7.

Mike Seghers, an IT consultant since 2001, specializes in Java enterprise web apps using frameworks like Spring, JSF. Experienced in RIA and iOS, he engages developer communities.

Abstract

Patrick Curran, Arun Gupta, and Mike Seghers’s workshop guides attendees through joining the Java Community Process (JCP) and participating in Adopt-a-JSR. They explain membership, transparency, and tools for JUG involvement such as hackathons. Focusing on Java EE 8, the session analyzes collaboration benefits, demonstrating practical contributions to standards evolution.

Understanding JCP: Membership and Participation Pathways

Curran outlines JCP membership: free for individuals via jcp.org, requiring agreements; paid for corporations/non-profits ($2,000-$5,000). Java User Groups join as associates, nominating representatives.

Adopt-a-JSR encourages JUGs to engage JSRs: review specs, test implementations, provide feedback. This democratizes development, ensuring community input.

Gupta details Java EE 8 focus: HTML5, cloud, modularity. Adopt-a-JSR aids via mailing lists, issue trackers, wikis.

Practical Engagement: Tools and Initiatives for Collaboration

Tools include mailing lists for discussions, JIRA for bugs, GitHub for code. JUGs organize hack days, building samples.

Seghers demos the Belgian JUG’s app: a game for urban travelers built with JSF, EJB, and JPA. The source is on GitHub and integrates WebSockets.

This hands-on approach educates, uncovers issues early.

Case Studies: Global Adopt-a-JSR Impact

Examples: London JUG’s multiple JSR contributions; SouJava’s CDI focus; Morocco JUG’s hackathons. Chennai JUG built apps; Egypt JUG presented at conferences.

These illustrate visibility, skill-building, influence on standards.

Broader Implications: Enhancing Transparency and Community

JCP 2.8 mandates open Expert Groups, encouraging participation. Adopt-a-JSR amplifies this, benefiting platforms via diverse input.

Curran urges minimal commitments: feedback, testing. Gupta highlights launch opportunities.

Workshop fosters collaborative ecosystem, strengthening Java’s future.


[DevoxxFR2013] Live Coding: A WOA Application in 50 Minutes

Lecturer

Guillaume Bort co-founded Zenexity, specializing in Web Oriented Architecture. Previously a J2EE expert, he developed web frameworks for large enterprises, creating Play Framework to prioritize simplicity. He leads Play’s development.

Sadek Drobi, Zenexity’s CTO, focuses on enterprise applications, bridging problem and solution domains. As a programming languages expert, he contributes to Play Framework’s core team.

Abstract

Guillaume Bort and Sadek Drobi’s live coding demonstrates building a Web Oriented Architecture (WOA) application using Play Framework and Scala. Consuming Twitter’s API, they handle JSON, integrate MongoDB, and stream real-time data via Server-Sent Events to an HTML5 interface. The session analyzes reactive programming, asynchronous handling, and scalability, showcasing Play’s efficiency for modern web apps.

Setting Up the Foundation: Play Framework and Twitter API Integration

Bort and Drobi initiate with Play Framework, creating a project via activator. They configure routes for homepage and stream endpoints, using Scala’s async for non-blocking I/O.

Consuming Twitter’s search API: construct URLs with keywords like “DevoxxFR”, include entities for images. Use WS (WebService) for HTTP requests, parsing JSON responses.

They extract tweet data: user, text, images. Handle pagination with since_id for subsequent queries, building a stream of results.
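A hedged sketch of that search call with Play’s WS client (the endpoint and field names follow Twitter’s old search API used at the time; the `sinceId` handling is an assumption):

```scala
import play.api.libs.ws.WS
import play.api.libs.json.JsValue
import play.api.libs.concurrent.Execution.Implicits._
import scala.concurrent.Future

// Query Twitter's search API for a keyword, optionally resuming after since_id
def search(keyword: String, sinceId: Option[String]): Future[Seq[JsValue]] =
  WS.url("http://search.twitter.com/search.json")
    .withQueryString(
      Seq(Some("q" -> keyword),
          Some("include_entities" -> "true"),
          sinceId.map("since_id" -> _)).flatten: _*)
    .get()
    .map(response => (response.json \ "results").as[Seq[JsValue]])
```

The call is fully asynchronous: no thread blocks while waiting for Twitter, which is what lets a single Play instance multiplex many concurrent streams.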

This setup leverages Play’s stateless design, ideal for scalability.

Building Reactive Streams: Enumerators and Asynchronous Processing

To create a continuous stream, they employ Enumerators and Iteratees. Poll Twitter periodically (e.g., every 5 seconds), yielding new tweets.

Code uses concurrent scheduling:

val tweets: Enumerator[Tweet] = Enumerator.generateM {
  Future {
    // Fetch and parse tweets
    Some(newTweets)
  }
}

Flatten to a single stream. Handle errors with recover, ensuring resilience.

This reactive approach processes data as it arrives, avoiding blocking and enabling real-time updates.

Persisting Data: Integrating MongoDB with ReactiveMongo

For storage, integrate ReactiveMongo: asynchronous, non-blocking driver. Define Tweet case class, insert via JSONCollection.

val collection = db.collection[JSONCollection]("tweets")
collection.insert(tweetJson)

Query for latest since_id. Use find with sort/take for efficient retrieval.

This maintains asynchrony, aligning with WOA’s distributed nature.

Streaming to Clients: Server-Sent Events and HTML5 Interface

Output as EventSource: chunked response with JSON events.

Ok.chunked(tweets &> EventSource()).as("text/event-stream")

Client-side: JavaScript EventSource listens, appending images with animations.

Handle dynamics: form submission triggers stream, updating UI.

This enables push updates, enhancing interactivity without WebSockets.

Optimizing for Scalability: Backpressure and Error Handling

Address overload: use onBackpressureBuffer to queue, drop, or fail. Custom strategies compress or ignore excess.

Play’s Akka integration aids actor-based concurrency.

Implications: Builds resilient, scalable apps handling high loads gracefully.

Collaborative Development: Live Insights and Community Resources

Session highlights rapid prototyping: zero slides, GitHub commits for following.

Drobi comments on concepts like futures unification in Scala 2.10, inter-library interoperability.

They encourage exploring Play docs, plugins for extensions.

This methodology fosters understanding of reactive paradigms, preparing for distributed systems.


[DevoxxBE2012] Why & How: JSON Validation with JSON Schema and Jackson

Stephane Rondal, co-founder of Arexo Consulting and a Java EE expert, introduced JSON Schema validation using Jackson. Stephane, with decades in software architecture, explained JSON’s ubiquity in web 2.0, mobile, and RESTful services, yet noted its lack of structural validation compared to XML.

He advocated JSON Schema for defining constraints like types, formats, and cardinalities, ensuring data integrity. Benefits include self-documenting APIs, early error detection, and improved interoperability.

Drawbacks: added complexity, performance overhead, and an evolving standard (draft 3 at the time, since superseded by later drafts).

Stephane demonstrated schema creation for documents with headers and items, specifying required fields and enums.

Using Jackson’s JsonSchema module, he validated instances, catching issues like type mismatches.

Implementing Validation in Practice

Stephane integrated validation post-parsing, using ObjectMapper and JsonNode for tree traversal. Tests showed valid/invalid responses, with errors reported clearly.
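A sketch of that flow, assuming Jackson together with the fge json-schema-validator library he recommended (file names are illustrative):

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.fge.jsonschema.core.report.ProcessingReport;
import com.github.fge.jsonschema.main.JsonSchema;
import com.github.fge.jsonschema.main.JsonSchemaFactory;
import java.io.File;

public class SchemaCheck {
    public static void main(String[] args) throws Exception {
        // Parse both schema and instance into Jackson trees
        ObjectMapper mapper = new ObjectMapper();
        JsonNode schemaNode = mapper.readTree(new File("document-schema.json"));
        JsonNode instance = mapper.readTree(new File("document.json"));

        // Compile the schema once, then validate any number of instances
        JsonSchema schema = JsonSchemaFactory.byDefault().getJsonSchema(schemaNode);
        ProcessingReport report = schema.validate(instance);
        System.out.println(report.isSuccess());
        report.forEach(System.out::println); // each message pinpoints a violation
    }
}
```

Compiling the schema once and reusing it amortizes the overhead Stephane mentioned, which matters when validating every incoming request of a REST service.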

He recommended the Jackson-based validator for its maintenance and spec adherence.


[DevoxxBE2012] Back to the Future: Taking Arduino Back to Its Java Roots to Move Forward

James Caska, creator of VirtualBreadboard and Muvium, presented on revitalizing Arduino by reconnecting it to its Java origins. James, focused on bridging software and hardware, argued that Arduino’s evolution from Processing has led to fragmentation, and his Muvium V18’O plugin restores Java compatibility for enhanced development.

He traced Arduino’s lineage: from Processing (Java-based), forking to Wiring, then Arduino, and MPIDE. This divergence created “Not-Quite-Java” (NQJ), limiting features like objects and exceptions.

Muvium integrates an Ahead-Of-Time compiler, USB HID programmer, and emulator into Processing, supporting full Java. Benefits include ecosystem ties, teaching suitability, dynamic features, and emulation for testing.

James demonstrated installation, emulation of circuits, and code execution on emulated Arduino, showing seamless virtual-to-real transitions.

He emphasized Java’s advantages for complex projects, with threading and libraries expanding Arduino’s scope.

Historical Context and Evolution Challenges

James outlined Processing’s artistic roots, evolving to Wiring for hardware, then Arduino’s accessibility focus. Forks caused incompatibilities, straying from Java’s power.

Muvium reintroduces Java, compiling to bytecode for microcontrollers like PIC, with potential AVR/ARM ports.

Practical Demonstration and Features

In demos, James showed VBB emulating breadboards, integrating Muvium for Java coding. Features like garbage collection and inheritance simplify sharing.

Emulation tests code virtually, ideal for education and collaboration.

Future Expansions and Community Call

James discussed multicore support leveraging Java threads, and FPGA integrations for smaller footprints.

He invited contributions to Frappuccino libraries, broadening Arduino’s appeal to Java developers.

James’s talk positioned Muvium as a forward step, merging Arduino’s simplicity with Java’s robustness.


How to compile both Java classes and Groovy scripts with Maven?

Case

Your project includes both Java classes and Groovy scripts. You would like to build all of them at the same time with Maven: this must be possible, because, after all, Groovy scripts are run on a Java Virtual Machine.

Solution

In your pom.xml, configure the maven-compiler-plugin as follows:
[xml] <plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.0</version>
  <configuration>
    <compilerId>groovy-eclipse-compiler</compilerId>
    <source>1.6</source>
    <target>1.6</target>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-eclipse-compiler</artifactId>
      <version>2.8.0-01</version>
    </dependency>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-eclipse-batch</artifactId>
      <version>2.1.5-03</version>
    </dependency>
  </dependencies>
</plugin>[/xml]
With this setup, the default compiler (which cannot compile Groovy scripts alongside Java sources) is replaced with the Eclipse compiler (which can).

How to Disable NFC on a Bank Card?

Disclaimer: everything that follows in this article is presented for informational purposes only. Any action you take is at your own risk and may damage your credit card.

Case

The major French banks have rolled out NFC, or “contactless payment” as the marketing people call it. New credit cards, apparently all those issued since late 2012, come equipped with NFC. What is the problem? Quite simply that NFC was deployed on bank cards without being secured… This is a resounding failure on the part of the card companies (Visa, MasterCard, etc.), which proved unable to bring a minimum of technical and functional rigor to their project to ensure basic security: nobody is asking for a bank vault, just that a random passenger on the metro cannot skim the card numbers of fellow travelers.
More details on the NFC flaws in these links:

For my part, after playing with code freely downloadable from the net (such as readnfccc) on my phone (a basic Nexus S with an NFC “reader”), my decision was clear, firm, and immediate: no NFC on my card!

So I informed my banker of my refusal a few months ago. Alas, when my card was renewed, the new one did include NFC. I battled with my banker, and I even started the process of closing my account. However, even if it is deactivated (Boursorama, for example, offers this), the NFC module remains physically present on every bank’s cards. I was therefore quite annoyed, and I refused to activate my card for several months.

Three options were open to me: surrender unconditionally, live off checks and cash, or do the bankers’ job myself and secure my own card: geek power!

Coil, Induction… The Physics of NFC

In short: the bank card contains an NFC chip (I would even say RFID, though I am not certain), connected to a flat electric coil that is also embedded in the card. When the card is exposed to a magnetic field, a current is created by induction (the Lorentz force, F = q v × B, if memory serves).

Consequently, to disable NFC, it suffices to destroy the NFC chip, or to break the induction by cutting the coil.

Removing NFC

Chip

The first solution is mentioned by several American sites: on US cards (where “contactless payment” is called “PayWave”), there is only a magnetic stripe, and the location of the NFC chip is visible to the naked eye by tilting the card: the card’s surface is slightly recessed there. You then just destroy or remove the NFC chip with a box cutter, a compass point, or any other sharp object.

On a French card this is not possible: our credit cards carry a chip (used for payments at merchants) in addition to the magnetic stripe (used for cash withdrawals at branches). And the NFC chip is physically stacked on the card’s own chip… So, short of working under a microscope, it is impossible to destroy the NFC chip without making the card unusable for purchases.

Coil

That leaves the coil as the weak point.

The first step is to trace the path of the coil inside the card and identify a spot to cut. To do this, go into complete darkness and use a powerful light source, such as a torch. The Nexus S flash is not perfect, but it gives good results.
The following photo gives an idea of the coil’s path on a dummy Visa card issued by BNP Paribas:

Path of the NFC coil (Google+ link)

Warning! The circuits vary between banks, card issuers, and even card types (Electron, classic, Gold, Black).

You therefore need to cut the circuit at one point, anywhere along it, while taking care not to touch the magnetic stripe or the card’s chip.

The following image shows (in bold red) the incision I made in the card with an ordinary pair of scissors: on the right-hand side of the card, 10 mm wide, parallel to the long edge, and 38 mm up from the bottom edge.

Incision in the NFC coil of a credit card (Google+ link)

As a reminder, a standard credit card follows the ISO/IEC 7810 ID-1 format and measures 85.6 mm by 54 mm.

Conclusion

After this surgical operation (let’s not exaggerate), I was able to verify that NFC was completely disabled, while cash withdrawals and ordinary merchant payments, i.e. with physical contact and PIN entry, continued to work in France. I have not yet tested in the USA or in India :-D, but I am hopeful there will be no problem.

Anyone of a paranoid bent regarding IT security can also wrap their bank card in an airtight sheet of aluminum foil, which acts as a Faraday cage.

That said, I remain stunned by Visa and MasterCard’s enormous failure in rolling out NFC. Such a breach of elementary security rules is appalling. How can anyone claim to be developing commerce while leaving such security holes wide open?

Likewise, I remain stunned by how lightly the banks treat this NFC scandal: it is the responsibility of retail banks to provide payment methods with a minimum of security. At the very least, one would have expected the banks to offer the choice of a card with or without NFC, rather than imposing a more than questionable choice by default. It was not up to me to secure my card; it was up to my bank to provide me with one that was sufficiently armored. Moreover, I doubt that the proverbial Mme Michu could carry out the same operations I did.

In short: a red card for every institution involved in distributing NFC cards!