
31 August 2015

Empowering Testers

Developers need to be able to test
I strongly believe that we (software developers) have to be able to test our work. How else can we make sure that "it works"? At the very least we write unit tests for the code we just created. The whole idea of Test Driven Development is based on finding, prioritising and running unit (and sometimes acceptance) tests against our code. Even people who do not like TDD put a strong focus on automated tests. Paul Gerrard recently put it nicely: "if they can't test they aren't developers". Unfortunately not all developers are able to do so. I often meet young developers who have no experience in deriving test cases from requirements or in finding more than the most obvious test scenarios. They write tests because they want to make sure their programs work, but their tests are not focused, testing too much or nothing at all. More skill and experience are needed.

Need testers be able to code?
When developers need to be able to test, do testers need to be able to code? This is an old debate. As a hard-core coder I believe that testers need to be able to code. Period. For example, at one of my previous employers the testers were asked to automate their tests using Rational Functional Tester. They had to write Java code to drive the test engine, and the code they wrote was abysmal. I do not blame them; nobody had taught them how to do it. But the code they wrote was important for the project's success, so it had better have at least some quality. What was needed were Test Automation Engineers. As test automation code is much simpler than regular code (there is no need for explicit conditionals or loops in test cases), I concede that these testers (the engineers) do not have to know as much about coding as developers do. They do not have to know Design Patterns or SOLID principles, but they need at least to know about naming, duplication and, to a certain extent, abstraction.

Manual testing is immoral
Uncle Bob explained years ago why manual testing is a bad thing. He probably meant that manual checking is immoral, because the testing community likes to distinguish between the aspects of the testing process that machines can do (checking) and those that only skilled humans can do (testing). As there are more than 130 types of software testing, many of them can only be done by a skilled human, e.g. exploratory testing. I do not know how many testers do these kinds of interesting things, but most of what I see is either (immoral) manual checking or test automation engineering, which is a kind of coding.

Austrian Testing Board
Earlier this year I was invited to the Austrian Testing Board to give a presentation about Deliberate Practice. While Deliberate Practice is popular with developers, e.g. Coding Dojos or Code Retreats, it can be applied to all kinds of learning. For example there are Testing Dojos and Testautomation (Code) Retreats. My presentation was well received and I enjoyed a heated discussion afterwards because - of course - I had provoked the "testers" a bit ;-) One of the attendees, Gerd Weishaar, was surprised to hear about my approach: doing TDD, automating all tests (because "immorality") and running only a few integrated tests (because of the Test Pyramid and the Scam). I also told him about my hatred of UI test automation tools because of their brittleness. As VP Product Management at Tricentis, a company focusing on (end-to-end) software test automation, he had a very different view than mine: almost no developers writing unit tests, and groups of testers working offshore, doing manual testing. We immediately became friends. ;-)

Empowering Testers
Gerd's vision was to enable the testers who are not able to code, empowering them to do their job - to get something tested - without any technical knowledge. Despite our contradictory views I got curious: Would it be possible to test, for example, a web application without any technical knowledge? How much technical background is necessary to understand the application at all? How much automation can be achieved without being able to write a script? And most interestingly, how much support can be put into a testing tool just by making it usable for testers? How far can we go?

Tricentis Tosca Testsuite
The tool Gerd was talking about was Tosca, a functional end-to-end testing tool. I had heard about it but never used it myself. Gerd encouraged me to have a look at it and provided me with online material, but I failed to allocate time for that. In the end he invited me to one of their monthly instructor-led trainings, free of charge. Although I had never planned to dedicate three full days (!) to checking out Tosca, I accepted his invitation, because I find well-prepared instructor-led training an efficient way to get started with a new topic. The training was tough: three full days of working with the tool on various exercises. The trainer and the other participants knew a lot about testing and we had great discussions. (If you need to work with Tosca in your job I strongly encourage you to take their instructor-led training. Afterwards you will be productive using the Tosca Testsuite.)

Small Disclaimer
I attended the three-day training free of charge. Gerd asked for feedback and a blog post about it in return; this is that blog post. I liked the trainer and in the end I liked Tosca, so I am positively biased. Still, all the open questions from the beginning of the article remain.

Training
I had two goals for the training: first I wanted to learn more about the mind-set of "classic testers", and second I wanted to check Gerd's idea of empowering non-technical/non-coder testers. Tosca is very powerful. It includes tools for project planning, risk management and test case design as well as for manual and automated testing. It took me three days to get an idea of what the tool is capable of. I do not know how much more time one would need without the development background that I have. I am used to this kind of tool, after all Eclipse or IDEA are not easy to use either. So training is necessary to use Tosca. Maybe three days are enough to explain it to someone who has no idea about coding, maybe five days are needed. Anything less than ten years will do. As the tool only offers a fixed set of combinations, it is possible to get familiar with it fast.

Context Sensitivity
Tosca is a professional tool. The user interface is well done and it supports power users by providing tab ordering and consistent keyboard short-cuts. Unfortunately some of its many options and functions are hidden deep in dialogues or require you to type in certain text combinations. This is not usable in the general sense. I guess developers are used to writing text, after all that is the code we write day after day. Still we expect context-sensitive help and code completion, something that non-coders need even more. So there is some work to be done in a few places, but that can be fixed easily, and Gerd told me they are planning to improve this.

Tosca Test Case Design
The coolest thing in Tosca is its Test Case Design functionality. It enables the formulation of abstract test cases, working with happy paths, boundary values and negative cases, to create a conceptual model of all the tests necessary to cover the given functionality. While this modelling does not involve any coding-related skills, it definitely requires skilled test professionals with analytical thinking and a high level of abstraction. As I consider these important coding skills, test case design requires at least someone "like a coder".

Tosca Test Automation
While it is possible to make test case organisation or execution easy to use, the real problem is always the automation. I cannot imagine a non-coder being able to create reasonable test case automation. To hide complexity, Tosca's automated test cases are constructed from building blocks which hide the real automation code, e.g. pressing a button on a web page. It might be possible to create an automated test case just by arranging and configuring the existing automation steps in the right order, but usually that is not the case, because new automation steps have to be created on the fly. Also the test automation allows for conditionals, repetitions (loops) and templates (macros), which are typical coding constructs. If someone has to work with assignments, conditionals and repetitions, she is already a programmer.

How far can we go?
The underlying question is whether it is possible to lower the requirements for technical work by providing smarter user interfaces that support the way we work. I hope so. Enabling people to get some tests automated without coding knowledge is a good example of this. Even if we just make the tools a bit more usable, our users will feel the difference. Tosca contains some interesting approaches in this regard. Reducing complexity lets users work on a higher level of abstraction. Unfortunately abstractions are sometimes leaky, and suddenly we are back in the low-level world and have to type "magic" text into plain text fields. For now there are still areas in Tosca where you need to be a coder and have to stick to software development principles to get good results. But I like Gerd's vision and I am curious how far they will go to empower non-technical testers.

20 April 2015

Maven Integration Tests in Extra Source Folder

On one of my current projects we want to separate the fast unit tests from the slow running integration and acceptance tests. Using Maven this is not possible out of the box, because Maven only supports two source folders, main and test. How can we add another source and resource folder, e.g. it (for integration test)? Let's assume a project layout like this:
project
|-- pom.xml
`-- src
    |-- main
        `-- java
    |-- test
        `-- java
            `-- UnitTest.java
    `-- it
        `-- java
            `-- IntegrationIT.java
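For illustration, IntegrationIT.java is an ordinary JUnit test; nothing in the class itself marks it as an integration test, only the class name and the folder do. A minimal sketch (JUnit 4 assumed, test content made up):

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class IntegrationIT {

    @Test
    public void exercisesTheRealInfrastructure() {
        // slow test, e.g. talking to a database or a deployed application
        assertTrue(true);
    }
}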
We use the regular Maven Surefire plugin to execute our unit tests, i.e. all the tests in the src/test/java folder. The plugin definition in the pom.xml is as expected.
        <plugin>
            <!-- run the regular tests -->
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.18</version>
        </plugin>
And we use the Maven Failsafe plugin to execute the integration tests. If you do not know Failsafe: it is much like the Surefire plugin, but has different defaults and usually runs during the integration-test phase.
        <plugin>
            <!-- run the integration tests -->
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-failsafe-plugin</artifactId>
            <version>2.18.1</version>
            <executions>
                <execution>
                    <goals>
                        <goal>integration-test</goal>
                        <goal>verify</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
By default the Failsafe plugin does not look for class names ending with Test but for class names ending with IT, which makes it possible to mix both kinds of tests in the same src/test/java folder - but we do not want that. The "trick" to get another source folder for integration tests is to use the Build Helper Maven Plugin and add the folder as an additional test source.
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>1.9.1</version>
            <executions>
                <execution>
                    <id>add-integration-test-source-as-test-sources</id>
                    <phase>generate-test-sources</phase>
                    <goals>
                        <goal>add-test-source</goal>
                    </goals>
                    <configuration>
                        <sources>
                            <source>src/it/java</source>
                        </sources>
                    </configuration>
                </execution>
            </executions>
        </plugin>
Now src/it/java is added as a test source as well, as can be seen during Maven execution. After the compile phase Maven logs
[INFO] [build-helper:add-test-source {execution: add-integration-test-source-as-test-sources}]
[INFO] Test Source directory: .\src\it\java added.
There is still only one test source for Maven, but at least we have two folders in the file system. All test sources get merged and during the test compile phase we see
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] Compiling 2 source files to .\target\test-classes
showing that all classes in both src/test/java and src/it/java are compiled at the same time. This is important to know because class names must not clash, and the tests still need different naming conventions like *Test and *IT. Now mvn test will only execute the fast unit tests and can be rerun many times during development.
[INFO] [surefire:test {execution: default-test}]
[INFO] Surefire report directory: .\target\surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running UnitTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.01 sec - in UnitTest

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
The integration tests are only executed when we ask for them by running mvn verify.
[INFO] [failsafe:integration-test {execution: default}]
[INFO] Failsafe report directory: .\target\failsafe-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running IntegrationIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in IntegrationIT

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0

[INFO] [failsafe:verify {execution: default}]
[INFO] Failsafe report directory: .\target\failsafe-reports
Now we can run the slow tests before we check in code, or a few times a day after major changes.

8 August 2011

Maven Plugin Harness Woes

Last year I figured out how to use the Maven Plugin Harness and started using it in my Maven plugin projects. Recently I started using DEV@cloud, a new service which offers Jenkins as well as Maven and source repositories. CloudBees, the company behind DEV@cloud, offers a free subscription with reduced capabilities which is more than enough for small projects. I set up all my projects there in no time, but had serious problems with the integration tests.

Using a local Maven repository other than ~/.m2/repository
You don't have to use the default repository location; it's possible to define your own in the user's settings.xml or even in the global settings. But I guess most people just use the default. On the other hand, in an environment like DEV@cloud, the builds of different users must be separated from each other. So CloudBees decided that each Jenkins job has its own Maven repository inside the job's workspace. That is good, because the repository is deleted together with the project.

Problem
The Testing Harness embeds Maven, i.e. it forks a new Maven instance, but it fails to relay the modified settings to this new process. During the execution of the integration test a new local repository is created and the original local one is used as a remote one (called "local-as-remote"). But without any hints Maven uses ~/.m2/repository, so the true local repository is not found and all needed artefacts are downloaded again. This takes a lot of time (and wastes bandwidth). Dependencies that exist only in the local repository, e.g. snapshots of dependent projects, are not found and the integration test fails.

Solution
RepositoryTool.findLocalRepositoryDirectory() uses an instance of MavenSettingsBuilder to get the settings. Its only implementing class is DefaultMavenSettingsBuilder, which tries to determine the repository location from the value of the system property maven.repo.local, then reads the user settings and in the end falls back to ~/.m2/repository. The solution is to set the maven.repo.local system property whenever the local repository is not under ~/.m2/repository: add -Dmaven.repo.local=.repository to the Goals and Options field of the Jenkins job configuration.
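For example, the Goals and Options field of such a Jenkins job could then read something like this (the goals themselves are just an example):

clean verify -Dmaven.repo.local=.repository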

Using additional dependencies while building integration test projects
After the plugin under test is built and installed into the new local repository, the Maven Plugin Harness runs Maven against the integration test projects inside the src/test/resources/it folder. The approach I described last year forks Maven with the POM, properties and goals defined by the JUnit test method.

Problem
The integration tests know the location of the new local repository (because it is set explicitly) and are able to access the plugin under test. But they know nothing about the local-as-remote repository; they can only access the artefacts which have been "downloaded" from the local-as-remote repository during the build of the plugin under test. So the problem is similar to the previous one, but occurs only when an integration test project needs additional artefacts. For example a global ruleset Maven module might consist of XML ruleset configuration files. Its test module depends on the Checkstyle plugin and executes it using the newly built rulesets. So the object under test (the rules XML) is tested indirectly through the invocation of Checkstyle, but the ruleset module itself does not depend on Checkstyle.

Solution
All POMs used during the integration test have to be "mangled", not just the POM of the plugin under test. The method manglePomForTestModule(pom) is defined in the ProjectTool, but it is protected and not accessible. So I copied it to AbstractPluginITCase and applied it to the integration test POMs.

Using settings other than ~/.m2/settings.xml
If you need artefacts from repositories other than Maven Central, you usually add these repositories to your settings.xml and then refer to the file in the Jenkins job configuration. Behind the scenes Jenkins calls Maven with the parameter -s custom_settings.xml.

Problem
Similar to the repository location, the path of the custom settings is not propagated to the embedded Maven, so it uses the default settings. This causes no problems as long as all needed artefacts are either in the local-as-remote repository or can be downloaded from Maven Central. But for example a global ruleset might contain some Macker Architecture Rules. The snapshot of the Macker Maven Plugin is deployed by another build job into the CloudBees snapshot repository. The test module depends on this Macker plugin and runs it using the newly built rulesets.

Solution
AbstractPluginITCase calls BuildTool's createBasicInvocationRequest() to get an InvocationRequest and subsequently executes this request. Using a system property the InvocationRequest can be customised:
// ALT_USER_SETTINGS_XML_LOCATION holds the name of the system property
// pointing to the alternative settings file, here
// "org.apache.maven.user-settings" (see below)
if (System.getProperty(ALT_USER_SETTINGS_XML_LOCATION) != null) {
   File settings =
      new File(System.getProperty(ALT_USER_SETTINGS_XML_LOCATION));
   if (settings.exists()) {
      request.setUserSettingsFile(settings);
   }
}
Then the value of the used system property is added to the field for Jenkins' Goals and Options: -s custom_settings.xml -Dorg.apache.maven.user-settings=custom_settings.xml.

Alas, I'm not a Maven expert and it took me quite some time to solve these problems. They are not specific to CloudBees but result from using non-default settings. Other plugins that fork Maven have similar problems.

5 September 2010

Maven Plugin Testing Tools

Obviously your MOJOs (read: Maven plugins) need to be tested as thoroughly as all your other classes. Here are some details on using the Maven Plugin Testing Tools and how I used them for the Macker Maven Plugin.

Unit Testing
To unit test a MOJO, create a JUnit test case that extends AbstractMojoTestCase. It comes with the maven-plugin-testing-harness:
<dependency>
  <groupId>org.apache.maven.plugin-testing</groupId>
  <artifactId>maven-plugin-testing-harness</artifactId>
  <version>1.2</version>
  <scope>test</scope>
</dependency>
Note that version 1.2 of the maven-plugin-testing-harness is still Maven 2.0 compatible, as will be version 1.3. The alpha release of version 2.0 is already based on Maven 3. In the test case the MOJO is initialised with a pom fragment and executed. The basic usage is well documented in the Maven Plugin Harness documentation. The usual folder structure for plugin unit tests is
.
 |-- pom.xml
 \-- src
     |-- main
     \-- test
         |-- java
         |   \-- MojoTest.java
         \-- resources
             \-- unit
                 |-- test-configuration1
                 |   \-- resources for this test if any
                 |-- test-configuration1-plugin-config.xml
                 \-- test-configuration2-plugin-config.xml
The class MojoTest contains methods testConfiguration1() and testConfiguration2(). There are several Maven projects using this approach, just do a code search.
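A test method following this layout might look roughly like the sketch below. MyMojo and the goal name "mygoal" are placeholders for the plugin under test; the pattern of loading the pom fragment, looking up the MOJO and executing it is the one described in the Maven Plugin Harness documentation.

import java.io.File;

import org.apache.maven.plugin.testing.AbstractMojoTestCase;

public class MojoTest extends AbstractMojoTestCase {

    public void testConfiguration1() throws Exception {
        // load the pom fragment that configures this test
        File pom = getTestFile(
            "src/test/resources/unit/test-configuration1-plugin-config.xml");
        // look up the configured MOJO and run it
        MyMojo mojo = (MyMojo) lookupMojo("mygoal", pom);
        assertNotNull(mojo);
        mojo.execute();
    }
}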

Stubs
If your MOJO needs more complex parameters, e.g. a MavenProject or an ArtifactRepository, these have to be provided as stubs. Stubs are simple implementations of real Maven objects, e.g.
public class org.apache.maven.plugin.testing.stubs.ArtifactStub
  implements org.apache.maven.artifact.Artifact
and have to be configured in the pom fragment (test-configuration1-plugin-config.xml) as described in the Maven Plugin Testing Harness Cookbook. See also MackerMojoTest in the Macker Maven Plugin for an extensive example. There are several stubs to simulate Maven objects such as ArtifactHandler or ArtifactResolver. Creating stubs gets cumbersome when you need more of Maven's internals. Every object and every method has to be stubbed out.
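As a rough sketch (the plugin artifactId and the parameter name are hypothetical), such a pom fragment assigns a stub to a MOJO parameter via the implementation attribute:

<project>
  <build>
    <plugins>
      <plugin>
        <artifactId>my-maven-plugin</artifactId>
        <configuration>
          <!-- fill the MOJO's "project" parameter with a stub -->
          <project implementation="org.apache.maven.plugin.testing.stubs.MavenProjectStub"/>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>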

Integration Testing
Integration testing is done with the maven-plugin-testing-tools:
<dependency>
  <groupId>org.apache.maven.plugin-testing</groupId>
  <artifactId>maven-plugin-testing-harness</artifactId>
  <version>1.2</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.apache.maven.plugin-testing</groupId>
  <artifactId>maven-plugin-testing-tools</artifactId>
  <version>1.2</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.codehaus.plexus</groupId>
  <artifactId>plexus-utils</artifactId>
  <version>1.5.6</version>
  <scope>test</scope>
</dependency>
Note that the testing tools 1.2 specifically need plexus-utils version 1.5.6. The usual folder structure for plugin integration tests is
.
 |-- pom.xml
 \-- src
     |-- main
     \-- test
         |-- java
         |   \-- MojoIT.java
         \-- resources
             \-- it
                 |-- test-case1
                 |   |-- pom.xml
                 |   \-- main
                 |       \-- classes and data for this test
                 \-- test-case2
The class MojoIT contains the test methods testCase1() and testCase2(). Folder test-case1 contains a full Maven module that uses the MOJO under test in some way.
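Such a test case is an ordinary Maven module that invokes the plugin under test; a minimal test-case1/pom.xml might look like the following sketch (group, artifact and goal names are made up):

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.its</groupId>
  <artifactId>test-case1</artifactId>
  <version>1.0-SNAPSHOT</version>
  <build>
    <plugins>
      <plugin>
        <!-- the plugin under test, installed into the test repository -->
        <groupId>com.example.maven.plugins</groupId>
        <artifactId>my-maven-plugin</artifactId>
        <version>1.0-SNAPSHOT</version>
        <executions>
          <execution>
            <goals>
              <goal>mygoal</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>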

The test uses PluginTestTool, BuildTool and the other tools from maven-plugin-testing-tools to create a local repository, install the Maven module under test into it and execute the Maven build of test-case1/pom.xml. The Plugin Testing Tools site provides more detail about this process and the various tools.

Setup
The Plugin Testing Tools do not provide an abstract test case. Each test has to create its own AbstractPluginITCase. A good example is the AbstractEclipsePluginIT of the Eclipse Plugin. It contains methods to build the module, execute poms against it and verify created artefacts. As far as I know this is the only example available. AbstractOuncePluginITCase is a modified copy, as is AbstractMackerPluginITCase.

Execution
Integration tests should be executed in the integration-test phase.
<build>
  <plugins>
    <plugin>
      <artifactId>maven-surefire-plugin</artifactId>
      <executions>
        <execution>
          <phase>integration-test</phase>
          <goals>
            <goal>test</goal>
          </goals>
          <configuration>
            <includes>
              <include>**/*IT.java</include>
            </includes>
            <excludes>
              <exclude>specified only to override config
                       from default execution</exclude>
            </excludes>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
The PluginTestTool modifies the pom.xml to skip all tests, so the tests are not invoked again when the module is built by the integration test.

Help Mojo Workaround
Unfortunately there is a problem with the above approach. It works fine for modules which do not contain Maven Plugins. But it fails during integration test preparation when the help-mojo of maven-plugin-plugin is executed. Fortunately the guys from the Eclipse Plugin found a workaround:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-plugin-plugin</artifactId>
  <!-- lock down to old version as newer version aborts
       build upon no mojos as required during ITs -->
  <version>2.4.3</version>
  <executions>
    <!-- disable execution, makes IT preparation using
         maven-plugin-testing-tools fail (see
         target/test-build-logs/setup.build.log) -->
    <execution>
      <id>help-mojo</id>
      <configuration>
        <extractors>
          <extractor />
        </extractors>
      </configuration>
    </execution>
  </executions>
</plugin>
But now the plugin can't be built from scratch any more. So they used a profile to run the integration tests and disable help-mojo execution.
<profiles>
  <profile>
    <id>run-its</id>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-surefire-plugin</artifactId>
          ...
        </plugin>
        <plugin>
          <artifactId>maven-plugin-plugin</artifactId>
          ...
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
Wrap-up
So far so good. The BuildTool has to activate the profile run-its (as it has to skip help-mojo execution). This could be done by setting a certain property, let's call it ProjectTool:packageProjectArtifact. Then the profile would only be activated during integration test preparation.
<profiles>
  <profile>
    <id>run-its</id>
    ...
    <activation>
      <property>
        <name>ProjectTool:packageProjectArtifact</name>
      </property>
    </activation>
  </profile>
</profiles>
I've submitted a patch for that, but in the meantime I had to copy the BuildTool into my own plugin, ugh. (I was working towards a clean solution throughout this post, but in the end it all got messed up.) The whole plugin testing can be seen in action in the Macker Maven Plugin.

Acknowledgement
Experimenting with the Maven Plugin Testing Tools was part of a System One Research Day. Thank you System One for supporting Open Source :-)