Posts Tagged ‘TDD’

[DevoxxUK2025] Software Excellence in Large Orgs through Technical Coaching

Emily Bache, a seasoned technical coach, shared her expertise at DevoxxUK2025 on fostering software excellence in large organizations through technical coaching. Drawing on DORA research, which correlates high-quality code with faster delivery and better organizational outcomes, Emily emphasized practices like test-driven development (TDD) and refactoring to maintain code quality. She introduced technical coaching as a vital role, involving short, interactive learning hours and ensemble programming to build developer skills. Her talk, enriched with a refactoring demo and insights from Hartman’s proficiency taxonomy, offered a roadmap for organizations to reduce technical debt and enhance team performance.

The Importance of Code Quality

Emily began by referencing DORA research, which highlights capabilities like test automation, code maintainability, and small-batch development as predictors of high-performing teams. She cited a study by Adam Tornhill and Markus Borg showing that poor-quality code can increase development time by up to 124%, with worst-case scenarios taking nine times longer. Technical debt, or “cruft,” slows feature delivery and makes schedules unpredictable. Practices like TDD, refactoring, pair programming, and clean architecture are essential to maintain code quality, ensuring software remains flexible and cost-effective to modify over time.

Technical Coaching as a Solution

In large organizations, Emily noted a gap in technical leadership, with architects often focused on high-level design and teams lacking dedicated tech leads. Technical coaches bridge this gap, working part-time across teams to teach skills and foster a quality culture. Unlike code reviews, which reinforce existing knowledge, coaching proactively builds skills through hands-on training. Emily’s approach involves collaborating with architects and tech leads, aligning with organizational goals while addressing low-level design practices like TDD and refactoring, which are often neglected but critical for maintainable code.

Learning Hours for Skill Development

Emily’s learning hours are short, interactive sessions inspired by Sharon Bowman’s training techniques. Developers work in pairs on exercises, such as refactoring katas (e.g., Tennis Refactoring Kata), to practice skills like extracting methods and naming conventions. A demo showcased decomposing a complex method into readable, well-named functions, emphasizing deterministic refactoring tools over AI assistants, which excel at writing new code but struggle with refactoring. These sessions teach vocabulary for discussing code quality and provide checklists for applying skills, ensuring developers can immediately use what they learn.
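To give a concrete flavor of such a session (an illustrative sketch, not code from Emily’s demo), here is a tiny before/after extract-method refactoring in the spirit of the Tennis kata, where naming the extracted condition makes the code self-describing:

[java]public class TennisScore {
    private int player1Points;
    private int player2Points;

    // before extraction: the intent of the boolean expression stays implicit
    public String scoreBefore() {
        if (player1Points == player2Points && player1Points >= 3) {
            return "Deuce";
        }
        return player1Points + "-" + player2Points;
    }

    // after extraction: the condition now carries its domain name
    public String scoreAfter() {
        if (isDeuce()) {
            return "Deuce";
        }
        return player1Points + "-" + player2Points;
    }

    private boolean isDeuce() {
        return player1Points == player2Points && player1Points >= 3;
    }
}[/java]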

Ensemble Programming for Real-World Application

Ensemble programming brings teams together to work on production code under a coach’s guidance. Unlike toy exercises, these sessions tackle real, complex problems, allowing developers to apply TDD and refactoring in context. Emily highlighted the collaborative nature of ensembles, where senior developers mentor juniors, fostering team learning. By addressing production code, coaches ensure skills translate to actual work, bridging the gap between training and practice. This approach helps teams internalize techniques like small-batch development and clean design, improving code quality incrementally.

Hartman’s Proficiency Taxonomy

Emily introduced Hartman’s proficiency taxonomy to explain skill acquisition, contrasting it with Bloom’s thinking-focused taxonomy. The stages—familiarity, comprehension, conscious effort, conscious action, proficiency, and expertise—map the journey from knowing a skill exists to applying it fluently in production. Learning hours help developers move from familiarity to conscious effort with exercises and feedback, while ensembles push them toward proficiency by applying skills to real code. Coaches tailor interventions based on a team’s proficiency level, ensuring steady progress toward mastery.

Getting Started with Technical Coaching

Emily encouraged organizations to adopt technical coaching, ideally led by tech leads with management support to allocate time for mentoring. She shared resources from her Samman Coaching website, including kata descriptions and learning hour guides, available through her nonprofit society for technical coaches. For mixed-experience teams, she pairs senior developers with juniors to foster mentoring, turning diversity into a strength. Her book, Technical Coaching with the Samman Method, and monthly online meetups provide further support for aspiring coaches, aiming to spread best practices and elevate code quality across organizations.


Training: Test Driven Development and Unit Testing Cooking Book

Last Tuesday I led a training session about Test Driven Development (TDD) in general, and unit tests in particular. I presented a series of best practices related to unit tests, as a set of “recipes”.
The target audience is developers, whether Java coders or not. The exercises have no interest from a Java perspective; they are intended to teach the main concepts and tricks of unit testing.

The exercises are the following (a short sketch of the trick behind the last item follows the list):

  • basics
  • methods within methods
  • layers and mocks
  • private methods
  • exceptions
  • loggers
  • new Date()
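
To give an idea of these exercises’ flavor, here is a minimal sketch of the classic trick behind the last item: hiding new Date() behind an injectable clock, so that a test can control time. The class and names are illustrative, not taken from the training material itself:

[java]import java.util.Date;

public class InvoiceStamper {
    // the seam: tests replace this clock, production code uses the default one
    interface Clock {
        Date now();
    }

    private final Clock clock;

    public InvoiceStamper(Clock clock) {
        this.clock = clock;
    }

    public InvoiceStamper() {
        this(new Clock() {
            public Date now() {
                return new Date(); // real time in production
            }
        });
    }

    public String stamp(String invoiceId) {
        return invoiceId + "@" + clock.now().getTime();
    }
}

// in a test, fix the clock to a known instant:
// InvoiceStamper stamper = new InvoiceStamper(new Clock() {
//     public Date now() { return new Date(0L); }
// });
// assertEquals("INV-1@0", stamper.stamp("INV-1"));[/java]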

The presentation’s original OpenOffice file is hosted on Google Docs: Test Driven Development and Unit Testing Cooking Book

The source code of the exercises and corrections is available at this link.

How to export Oracle DB content to DBUnit XML flatfiles?

Case

From an Agile and TDD viewpoint, performing unit tests on DAOs is a requirement. Sometimes, instead of using DBUnit datasets “out of the box”, the developer needs to test against actual data. In the same vein, when a bug appears in production, isolating and reproducing the issue is a smart way to investigate and, along the way, fix it.
So, how do you export actual data from an Oracle DB (or even MySQL, Sybase, DB2, etc.) to a DBUnit dataset as a flat XML file?

Here is a Runtime Test I wrote on this subject:

Fix

Spring

Edit the following Spring context file, setting the login, password, etc.
[xml]<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd">

    <!-- don't forget to write this, otherwise the application will miss the driver class name, and therefore the test will fail -->
    <bean id="driverClassForName" class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
        <property name="targetClass" value="java.lang.Class"/>
        <property name="targetMethod" value="forName"/>
        <property name="arguments">
            <list>
                <value>oracle.jdbc.driver.OracleDriver</value>
            </list>
        </property>
    </bean>

    <bean id="connexion" class="org.springframework.beans.factory.config.MethodInvokingFactoryBean"
          depends-on="driverClassForName">
        <property name="targetClass" value="java.sql.DriverManager"/>
        <property name="targetMethod" value="getConnection"/>
        <property name="arguments">
            <list>
                <value>jdbc:oracle:thin:@host:1234:SCHEMA</value>
                <value>myLogin</value>
                <value>myPassword</value>
            </list>
        </property>
    </bean>

    <bean id="databaseConnection" class="org.dbunit.database.DatabaseConnection">
        <constructor-arg ref="connexion"/>
    </bean>

    <bean id="queryDataSet" class="org.dbunit.database.QueryDataSet">
        <constructor-arg ref="databaseConnection"/>
    </bean>
</beans>[/xml]

The bean driverClassForName does not appear to be used; yet, if Class.forName("oracle.jdbc.driver.OracleDriver") is never called, the test will raise an exception.
To ensure driverClassForName is created before the bean connexion, I added the attribute depends-on="driverClassForName". The other beans will be created after connexion, since Spring deduces the required creation order from the explicit dependency tree.

Java

[java]import org.dbunit.database.QueryDataSet;
import org.dbunit.dataset.DataSetException;
import org.dbunit.dataset.xml.FlatXmlDataSet;
import org.junit.Before;
import org.junit.Test;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

import java.io.FileOutputStream;
import java.io.IOException;

import static org.junit.Assert.assertNotNull;

// note: the class must not extend JUnit 3's TestCase, since it relies on JUnit 4 annotations
public class Oracle2DBUnitExtractor {
    private QueryDataSet queryDataSet;

    @Before
    public void setUp() throws Exception {
        // load the Spring context described above
        final ApplicationContext applicationContext = new ClassPathXmlApplicationContext(
                "lalou/jonathan/Oracle2DBUnitExtractor-applicationContext.xml");
        assertNotNull(applicationContext);

        queryDataSet = (QueryDataSet) applicationContext.getBean("queryDataSet");
    }

    @Test
    public void testExportTablesInFile() throws DataSetException, IOException {
        // add all the needed tables; take care to declare them in the right order,
        // so that you don't run into dependency issues, such as ones related to foreign keys
        queryDataSet.addTable("MYTABLE");
        queryDataSet.addTable("MYOTHERTABLE");
        queryDataSet.addTable("YETANOTHERTABLE");

        // destination XML file into which data is extracted
        FlatXmlDataSet.write(queryDataSet, new FileOutputStream("myProject/src/test/runtime/lalou/jonathan/output-dataset.xml"));
    }
}[/java]
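
Once the flat XML file is generated, it can be reloaded as a regular DBUnit dataset to seed the schema before a DAO test. Here is a minimal sketch, assuming DBUnit 2.4+ (which provides FlatXmlDataSetBuilder) and reusing the databaseConnection bean declared above:

[java]import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;

import java.io.File;

public class DatasetSeeder {
    // seed the test schema from the previously exported dataset
    public void seedDatabase(final IDatabaseConnection databaseConnection) throws Exception {
        final IDataSet dataSet = new FlatXmlDataSetBuilder()
                .build(new File("myProject/src/test/runtime/lalou/jonathan/output-dataset.xml"));
        // wipe the tables, then insert the dataset rows
        DatabaseOperation.CLEAN_INSERT.execute(databaseConnection, dataSet);
    }
}[/java]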

How to unit test Logger calls?

Case

Sometimes you need to test that Log4J loggers are called with the right parameters. How can you perform these tests from within JUnit?

Let’s take an example: how can we test this simple class and method?

[java]import org.apache.log4j.Logger;

public class ClassWithLogger {
    private static final Logger LOGGER = Logger.getLogger(ClassWithLogger.class);

    public void printMessage(Integer foo) {
        LOGGER.warn("this is the message#" + foo);
    }
}[/java]

Example

Define an almost empty Log4J appender, such as:

[java]import org.apache.log4j.Appender;
import org.apache.log4j.Layout;
import org.apache.log4j.spi.ErrorHandler;
import org.apache.log4j.spi.Filter;
import org.apache.log4j.spi.LoggingEvent;

import java.util.ArrayList;

public class TestAimedAppender extends ArrayList<String> implements Appender {
    private final Class<?> clazz;

    public TestAimedAppender(Class<?> clazz) {
        super();
        this.clazz = clazz;
    }

    @Override
    public void addFilter(Filter newFilter) {
    }

    @Override
    public Filter getFilter() {
        return null;
    }

    @Override
    public void clearFilters() {
    }

    @Override
    public void close() {
    }

    @Override
    public void doAppend(LoggingEvent event) {
        // store the rendered message, so that the test can assert on it
        add(event.getRenderedMessage());
    }

    @Override
    public String getName() {
        return "TestAppender for " + clazz.getSimpleName();
    }

    @Override
    public void setErrorHandler(ErrorHandler errorHandler) {
    }

    @Override
    public ErrorHandler getErrorHandler() {
        return null;
    }

    @Override
    public void setLayout(Layout layout) {
    }

    @Override
    public Layout getLayout() {
        return null;
    }

    @Override
    public void setName(String name) {
    }

    @Override
    public boolean requiresLayout() {
        return false;
    }
}[/java]

Then create a TestCase with two fields:

[java]import org.apache.log4j.Logger;
import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.*;

public class ClassWithLoggerUnitTest {
    private ClassWithLogger classWithLogger;
    private TestAimedAppender appender;
}[/java]

In the setup, remove all appenders, create an instance of our appender, and add it to the logger related to the class under test:

[java]@Before
public void setUp() throws Exception {
    final Logger classWithLoggerLogger = Logger.getLogger(ClassWithLogger.class);
    classWithLoggerLogger.removeAllAppenders();
    appender = new TestAimedAppender(ClassWithLogger.class);
    classWithLoggerLogger.addAppender(appender);
    appender.clear();

    classWithLogger = new ClassWithLogger();
}[/java]

Then write the following test. The code is documented:

[java]@Test
public void testPrintMessage() throws Exception {
    final String expectedMessage = "this is the message#18";

    // empty the appender
    appender.clear();
    // check it is actually empty before any call to the tested class
    assertTrue(appender.isEmpty());

    // call to the tested class
    classWithLogger.printMessage(18);

    // check the appender is no longer empty
    assertFalse(appender.isEmpty());
    assertEquals(1, appender.size());
    // check the content of the appender
    assertEquals(expectedMessage, appender.get(0));
}[/java]

Conclusion

This basic example shows how to perform tests on loggers without overriding the original code or using mocks. Of course, you can improve this basic example, for instance by discriminating on the log level (INFO, WARN, ERROR, etc.), using generics, or adding any other fantasy ;-). A sketch of the log-level variant follows.
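
Here is one way to do it (an illustration, not part of the original example): extend Log4J’s AppenderSkeleton, which requires far less boilerplate than implementing Appender, and keep whole LoggingEvents so that tests can assert on the level as well as the message:

[java]import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

import java.util.ArrayList;
import java.util.List;

public class LevelAwareAppender extends AppenderSkeleton {
    private final List<LoggingEvent> events = new ArrayList<LoggingEvent>();

    @Override
    protected void append(LoggingEvent event) {
        events.add(event); // keeps level, rendered message, throwable, etc.
    }

    public void close() {
    }

    public boolean requiresLayout() {
        return false;
    }

    public List<LoggingEvent> getEvents() {
        return events;
    }
}

// in a test:
// assertEquals(Level.WARN, appender.getEvents().get(0).getLevel());
// assertEquals("this is the message#18", appender.getEvents().get(0).getRenderedMessage());[/java]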

No source code is available for type org.junit.Assert; did you forget to inherit a required module?

Case

You run a GWT application with a service layer. Those services are tested through unit tests, which may use EasyMock, among other frameworks. Of course, you declared the related JARs, such as JUnit, with a <scope>test</scope> in your pom.xml.

Yet, when you run the GWT application within a light Jetty container, you get the following message:

Compiling module lalou.jonathan.gwt.client.MyModule

Validating newly compiled units

[java][ERROR] Errors in ‘file:/C:/eclipse/workspace/…/test/unit/lalou/jonathan/gwt/client//MyServiceUnitTest.java’
[ERROR] Line 26: No source code is available for type org.easymock.MockControl; did you forget to inherit a required module?
[ERROR] Line 76: No source code is available for type org.junit.Assert; did you forget to inherit a required module?[/java]

Fix

Since Maven2 and GWT scopes are fully independent, you have to modify your *.gwt.xml. Replace:

[xml]<source path='client'/>[/xml]

with:

[xml]<source path='client' excludes="**/*UnitTest.java,**/*RuntimeTest.java"/>[/xml]
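
For context, here is a sketch of where this line sits in a complete module file; the inherited module and entry point are illustrative and depend on your application:

[xml]<?xml version="1.0" encoding="UTF-8"?>
<!-- illustrative module file: adapt the inherits and entry-point to your application -->
<module>
    <inherits name="com.google.gwt.user.User"/>
    <!-- exclude test classes from the GWT compiler's source path -->
    <source path='client' excludes="**/*UnitTest.java,**/*RuntimeTest.java"/>
    <entry-point class="lalou.jonathan.gwt.client.MyEntryPoint"/>
</module>[/xml]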

NB: Never forget that Google teams work with Ant, and not with Maven!