Software Testing

Please Note: Work in Progress

This page describes the testing approach for the services and solutions developed at the MPDL.

=Module Tests= are unit tests based on knowledge of the internal program design and code. They should be written by the developers. The following test types are suggested:
 * Interface tests should check all methods in the module's interface. [required]
 * Class name convention: src/test/java/<Interface>Test.java
 ** Example: src/test/java/CitationStyleHandlerInterfaceTest.java
 * The test classes should be explicitly declared in the module pom.xml, in the maven-surefire-plugin configuration, e.g.:

  <plugin>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
      <skip>false</skip>
      <includes>
        <include>**/CitationStyleHandlerInterfaceTest.java</include>
        <include>**/TestCitationStylesSubstantial.java</include>
      </includes>
    </configuration>
  </plugin>

 * The existence of the required interface test class can be checked later by parsing the pom.xml (e.g. with a command-line utility).


 * All other unit tests for the module should be placed in a separate test class. [optional]
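As a sketch of the convention above: an interface test covers every method of the module interface with at least one assertion. The CitationStyleHandler interface, its methods and the in-memory stub below are assumptions for illustration only; the real test would be a JUnit class under src/test/java, run by the maven-surefire-plugin.

```java
// Hypothetical sketch of an interface test; the interface and its methods
// are illustrative, not the actual CitationStyleHandler API.
public class CitationStyleHandlerInterfaceTest {

    interface CitationStyleHandler {
        String[] explainStyles();                        // list available citation styles
        String getOutput(String style, String itemXml);  // render an item in a style
    }

    // Every interface method gets at least one covering test.
    static void testExplainStyles(CitationStyleHandler h) {
        String[] styles = h.explainStyles();
        if (styles == null || styles.length == 0)
            throw new AssertionError("no citation styles available");
    }

    static void testGetOutput(CitationStyleHandler h) {
        String out = h.getOutput("APA", "<item/>");
        if (out == null || out.isEmpty())
            throw new AssertionError("empty citation output");
    }

    public static String run() {
        // Trivial in-memory stub so the sketch is runnable on its own.
        CitationStyleHandler stub = new CitationStyleHandler() {
            public String[] explainStyles() { return new String[] { "APA" }; }
            public String getOutput(String style, String itemXml) {
                return "<citation style='" + style + "'/>";
            }
        };
        testExplainStyles(stub);
        testGetOutput(stub);
        return "ok";
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```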

Open Questions

 * How to define the scope of parameter values for tested interfaces? (e.g. for dataacquisition: test all fetching formats for all sources, or is one exemplary source enough?) Proposal:
 ** For enumerated parameters:
 *** Pass all values in a loop
 *** Pass a value outside the scope and check the exception handling
 ** For non-regular parameters, like XML input:
 *** Check well-formed/malformed XML
 *** Check the validity of the output XML against the current XML Schema (validation service?)
 *** Check the output substantially
 * Test exception handling for the different types of errors (e.g. for the levels: fatal, error, warning)?
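The proposal above can be sketched as follows: loop over all enumerated values, provoke an exception with an out-of-scope value, and check XML well-formedness with the standard parser. The doFetch stand-in and the FORMATS scope are assumptions, not the actual dataacquisition interface.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;

// Sketch of parameter-scope testing; doFetch and FORMATS are hypothetical
// stand-ins for the real service interface and its enumerated format scope.
public class ParameterScopeTest {
    static final String[] FORMATS = { "oai_dc", "apa" };  // assumed enumerated scope

    // Stand-in implementation of the tested method.
    static String doFetch(String source, String id, String format) {
        for (String f : FORMATS) {
            if (f.equals(format)) return "<record format='" + format + "'/>";
        }
        throw new IllegalArgumentException("unknown format: " + format);
    }

    static boolean isWellFormed(String xml) {
        try {
            DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
            return true;
        } catch (Exception e) { return false; }
    }

    public static boolean run() {
        for (String f : FORMATS) {                        // pass all enumerated values in a loop
            if (!isWellFormed(doFetch("arxiv", "arxiv:123", f))) return false;
        }
        try {                                             // out-of-scope value must raise
            doFetch("arxiv", "arxiv:123", "no-such-format");
            return false;
        } catch (IllegalArgumentException expected) { }
        return !isWellFormed("<broken>");                 // malformed XML is detected
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```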

=Integration Tests= are used for combined parts of an application to determine whether they function together correctly. Integration tests exist only for services which have dependencies on other services, and they should be called in the JBoss context. Integration tests should be performed for:
 * 1) HTTP interfaces (REST, unapi, others)
 ** Different tools already exist which we can use (as a Maven dependency or Java code generator).
 ** Tests should be generic; therefore we could introduce a httpInterface.xml which provides the links for the interfaces to test (instead of hard-coding them in the source). Testing a new interface should then only require entering its URL in this httpInterface.xml.
 ** Example in Python
 * 2) EJB interfaces

Location: Currently, the integration tests are implemented in _ears.
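The httpInterface.xml idea can be sketched as a generic loop over interface URLs that checks the response code. Here the URL list is inlined and a local stub server (the JDK's built-in com.sun.net.httpserver) stands in for the deployed services, so the sketch is self-contained; in practice the URLs would be read from httpInterface.xml and point at the JBoss instance.

```java
import com.sun.net.httpserver.HttpServer;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

// Generic HTTP interface check: every listed URL must answer with 200.
public class HttpInterfaceTest {

    static int responseCode(String url) throws Exception {
        HttpURLConnection c = (HttpURLConnection) new URL(url).openConnection();
        c.setRequestMethod("GET");
        return c.getResponseCode();
    }

    public static boolean run() {
        try {
            // Local stub server standing in for the deployed service.
            HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
            server.createContext("/unapi", ex -> {
                ex.sendResponseHeaders(200, -1);          // 200 OK, no body
                ex.close();
            });
            server.start();
            try {
                String base = "http://localhost:" + server.getAddress().getPort();
                // In the real test this list would be read from httpInterface.xml.
                String[] urls = { base + "/unapi" };
                for (String u : urls) {
                    if (responseCode(u) != 200) return false;
                }
                return true;
            } finally { server.stop(0); }
        } catch (Exception e) { return false; }
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```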

Test Process

 * Check whether the service can communicate with the required other services (e.g. response code 200 for HTTP, assertNotNull for EJB) [required]
 * Check whether the outcome of a method is the specified, desired one (e.g. valid eSciDoc XML). [optional]
 * Test input should cover both valid and invalid parameters
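The optional outcome check above can be sketched as schema validation of a service response with the JDK's javax.xml.validation API. The inline schema and responses are stand-ins; the real test would validate against the current eSciDoc item schema.

```java
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.SchemaFactory;
import java.io.StringReader;

// Outcome check sketch: validate a service response against an XML schema.
// The inline schema is a minimal stand-in for the real eSciDoc schema.
public class OutcomeValidationTest {
    static final String XSD =
        "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>"
        + "<xs:element name='item'/></xs:schema>";

    public static boolean isValid(String xml) {
        try {
            SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(new StreamSource(new StringReader(XSD)))
                .newValidator()
                .validate(new StreamSource(new StringReader(xml)));
            return true;
        } catch (Exception e) { return false; }
    }

    public static void main(String[] args) {
        // A declared element validates; an undeclared one does not.
        System.out.println(isValid("<item/>") && !isValid("<wrong/>"));
    }
}
```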

Open Questions

 * What is the essence of the integration tests:
 ** testing the service's ability to invoke related services in the JBoss context, plus testing the basic service functionality itself (calling the service interfaces with only one set of parameters: does the service work or not)
 ** the tests in the previous item, plus entire substantial service testing
 * An integration test service for each project (common_services, pubman, etc.) vs./in addition to _ears.

Nice to have: a hierarchy of our services as a list or graph. The following parameters can be presented on the graph edges:
 * type of dependency (REST/SOAP/EJB)
 * internal/external to the escidoc project
 * required/optional
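Such a service graph could be represented as a plain edge list carrying the three parameters above. The service names and edges below are purely illustrative, not the actual dependency hierarchy.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the proposed service hierarchy as a graph: each edge carries the
// dependency type, the internal/external flag and the required/optional flag.
public class ServiceGraph {
    enum Kind { REST, SOAP, EJB }

    static class Edge {
        final String from, to; final Kind kind; final boolean internal, required;
        Edge(String from, String to, Kind kind, boolean internal, boolean required) {
            this.from = from; this.to = to; this.kind = kind;
            this.internal = internal; this.required = required;
        }
    }

    public static List<Edge> edges() {
        List<Edge> g = new ArrayList<>();
        // Illustrative edges only, not the real dependency list.
        g.add(new Edge("dataacquisition", "transformation", Kind.EJB, true, true));
        g.add(new Edge("pubman", "citationstyles", Kind.REST, true, false));
        return g;
    }

    public static void main(String[] args) {
        for (Edge e : edges())
            System.out.println(e.from + " -> " + e.to + " [" + e.kind
                + (e.required ? ", required" : ", optional") + "]");
    }
}
```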

=Functional Tests= Black-box testing coupled with the functional requirements of a service/solution. They are performed as:
 * 1) automated functional tests on solution level
 ** According to the functionality descriptions in CoLab. To be defined together with SVM. See an example of requirements and a [[Media:PubMan_QA_Tests.xls|testing plan]].
 * 2) manual functional tests of solutions by SVM
 * 3) automated GUI tests (Selenium, HtmlUnit) on solution level
 ** To be defined with the GUI group according to the test scenarios in JIRA, as for PubMan
 * 4) manual GUI/browser tests by the GUI group

Location: _logic_ear

Open Questions
=Performance Tests= Testing of services/solutions under heavy load: heavy repetition of certain actions or inputs, input of huge XMLs or fulltexts, large search queries, etc.

Location: _logic_ear and/or test framework

Open Questions

 * Performance tests should be defined in the requirements documentation or in the QA/test plans.
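A minimal load-test sketch of the idea above: heavy repetition of one action from several threads, counting failures and measuring elapsed time. The tested action here is a stub computation; in practice it would be an HTTP request or a large search query against the deployed service.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Load-test sketch: repeat an action from a thread pool, fail fast on errors,
// return elapsed milliseconds (or -1 on failure/timeout).
public class LoadTest {

    static boolean action() {                  // stand-in for the real request
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 1000; i++) sb.append('x');
        return sb.length() == 1000;
    }

    public static long run(int threads, int repetitions) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        AtomicInteger failures = new AtomicInteger();
        long start = System.nanoTime();
        for (int i = 0; i < repetitions; i++)
            pool.submit(() -> { if (!action()) failures.incrementAndGet(); });
        pool.shutdown();
        try {
            if (!pool.awaitTermination(1, TimeUnit.MINUTES)) return -1;
        } catch (InterruptedException e) { return -1; }
        if (failures.get() > 0) return -1;
        return (System.nanoTime() - start) / 1_000_000;   // elapsed milliseconds
    }

    public static void main(String[] args) {
        System.out.println(run(4, 1000) + " ms for 1000 actions");
    }
}
```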

=System Tests= Black-box testing based on the overall requirements specification; it covers all combined parts of the system. It does not cover integration testing, which is done to ensure that the software works well with the other software products with which it has to be integrated.

In the escidoc context, the system tests are the complete bundle of the functional tests, which should prove the system functionality as a whole.

Location: ?

Open Questions
=Bug Testing= Critical (regularly occurring) bugs should be identified together with SVM and covered with an appropriate test (Selenium or JUnit), depending on their nature.
 * 1) Check 'now and again' bugs for test candidates
 * 2) Specify criteria for new bugs becoming test candidates

Location: Module where the bug took place.
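A bug regression test pins the exact reproduction of an identified bug in a test named after the issue, so the module's test suite catches any recurrence. The bug number, method and input below are hypothetical examples, not a real issue.

```java
// Hypothetical regression test for an identified bug; in the real module this
// would be a JUnit class named after the JIRA issue.
public class Bug123RegressionTest {

    // Stand-in for the method that previously broke on mixed-case input.
    static String normalizeIdentifier(String id) {
        return id == null ? "" : id.trim().toLowerCase();
    }

    public static boolean run() {
        // The exact input that triggered the bug is kept verbatim.
        return normalizeIdentifier("  ArXiv:123 ").equals("arxiv:123");
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```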

Open Questions
=Test Resources=
 * 1) Common test resources
 ** Should be created together with SVM (an item with full metadata, previously critical items, etc.)
 ** Should be stored in one place (which one?)
 * 2) Specific test resources
 ** To be defined individually by each module and saved in

=Schedule=
 * 1) Identify existing tests
 * 2) Create appropriate, generic test resources
 * 3) Enhance module tests according to specification
 * 4) Implement integration tests according to specification
 * 5) Check and specify what system tests are necessary for which solution
 * 6) Identify and implement bug tests

=Test Process on the Example of DAAS= According to the current specification on this page, a full testing process of the Dataacquisition Service should look as follows:
 * 1) Module test of all interface methods (e.g. doFetch("arxiv", "arxiv:123", "oai_dc"), explainSources) with validation of the outcome.
 * 2) Integration test
 ** unapi, e.g. HTTP GET: http://localhost:8080/dataacquisition/download/unapi?id=bmc:1472-6890-9-1&format=AJP_application/odt
 ** EJB, e.g. doFetch("arxiv", "arxiv:123", "apa") (here DAAS needs to talk to the transformation service)
 * 3) System test, e.g. a Selenium test of a user logging in, going to the import page, choosing a source, entering an identifier and saving the item.

=Tools=
 * 1) RESTClient is an easy, Java-based open source REST testing tool
 * 2) soapUI is the leading open source tool for SOAP and REST testing; it comes with plug-ins for the following tools/IDEs:
 ** Maven 1.X/2.X plugin
 ** NetBeans 5.5/6.0
 ** IntelliJ IDEA 6+
 ** Eclipse 3.2+
 ** JBossWS/JBossIDE 2.0.0+
 * 3) Selenium is an open source software testing framework for Web UI testing
 ** Maven plugin
 * 4) PushToTest TestMaker is a complete open source test automation platform that is appropriate for:
 ** Application testing – avoid downtime; qualify patches, updates, hardware changes
 ** Integration testing – surface performance issues when services call services
 ** Regression testing – surface functionality issues before customers do
 ** Tool & library testing – optimize performance at the object level
 ** XML optimization – improve performance and scalability – AJAX, SOAP, REST
 ** Performance testing – better forecast CPU, network and memory needs to meet SLAs

=References=
 * PubMan test scenarios in JIRA
 * SOA and Integration Testing