Software Testing


Please Note: Work in Progress


This page describes the testing approach for the services and solutions developed at the MPDL.

Module Tests

Module tests are unit tests based on knowledge of the internal program design and code. They should be written by the developers. The following test types are suggested:

  • Interface Tests should check all methods in the module's interface (a sketch follows this list). [required]
    • Class Name convention:
src/test/java/<name_of_module_interface>InterfaceTest.java
example: src/test/java/CitationStyleHandlerInterfaceTest.java
  • The class should be explicitly declared in the module pom.xml
...
<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <skip>false</skip>
    <includes>
      <include>**/CitationStyleHandlerInterfaceTest.java</include>
      <!-- possible other test classes -->
      <include>**/TestCitationStylesSubstantial.java</include>
    </includes>
  </configuration>
</plugin>
...
The existence of the required interface test class can later be checked by parsing the pom.xml (e.g. with a command-line utility).
  • All other unit tests for the module should be placed in a separate test class. [optional]
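
A minimal sketch of what such an interface test class could look like, following the naming convention above. The explainStyles() method and the inline stub are assumptions, since the real CitationStyleHandler methods are not listed on this page:

import org.junit.Test;
import static org.junit.Assert.*;

// Sketch only: one test per method of the real module interface is the goal.
public class CitationStyleHandlerInterfaceTest {

    interface CitationStyleHandler {
        String explainStyles(); // assumed method name
    }

    // In the real test this would be the module's implementation class.
    private final CitationStyleHandler handler = new CitationStyleHandler() {
        public String explainStyles() {
            return "<styles/>"; // dummy payload
        }
    };

    @Test
    public void testExplainStyles() {
        // [required] every method of the module interface gets a test
        assertNotNull(handler.explainStyles());
    }
}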


Open Questions

  • How to define the scope of parameter values of tested interfaces? (e.g. for dataacquisition: test all fetching formats for all sources, or is one exemplary source enough?)
    Proposal (a JUnit sketch follows this list):
    • For enumerated parameters
      • Pass all parameters in a loop
      • Pass a parameter which is not in the scope and check the exception handling
    • For non-regular parameters, like XML input
      • Check well-formed/malformed XML
      • Check the validity of the output XML against the current XML Schema (validation service?)
      • Check the output substantially
    • Test exception handling for the different types of errors (e.g. for the levels: fatal, error, warning)?
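
A hedged JUnit sketch of the proposal for enumerated parameters; the FetchingService interface, its format list and the inline stub are invented for illustration:

import org.junit.Test;
import static org.junit.Assert.*;

public class EnumeratedParameterTest {

    interface FetchingService {
        String fetch(String format);
    }

    // Hypothetical scope of the enumerated parameter.
    private static final String[] FORMATS = { "oai_dc", "apa", "bibtex" };

    // Stand-in implementation so the sketch compiles; in a real module
    // test this would be the module's service class.
    private final FetchingService service = new FetchingService() {
        public String fetch(String format) {
            for (String f : FORMATS) {
                if (f.equals(format)) {
                    return "<record/>"; // dummy payload
                }
            }
            throw new IllegalArgumentException("unsupported format: " + format);
        }
    };

    @Test
    public void allEnumeratedValuesAreAccepted() {
        // Pass all parameters in a loop.
        for (String format : FORMATS) {
            assertNotNull(service.fetch(format));
        }
    }

    @Test(expected = IllegalArgumentException.class)
    public void valueOutsideScopeIsRejected() {
        // Pass a parameter which is not in the scope; the expected
        // exception checks the exception handling.
        service.fetch("no_such_format");
    }
}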

Integration Tests

Integration tests exercise combined parts of an application to determine whether they function together correctly. They are only available for services which have dependencies on other services and should be called in the JBoss context. Integration tests should be performed for:

  1. HTTP interfaces (REST, unapi, other)
    • Different tools already exist which we can use (as a mvn dependency or Java code generator).
    • Tests should be generic; we could therefore introduce a httpInterface.xml which provides the links for the interfaces to test (instead of hard-coding them in the source). Testing a new interface should then only require adding its URL to this httpInterface.xml (a sketch follows below).
    • Example in python
  2. EJB interfaces

Location: Currently, the integration tests are implemented in _ears.
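
A sketch of what such a generic HTTP test could look like; the name, location and structure of httpInterface.xml are assumptions and still open for discussion:

import java.io.File;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.xml.parsers.DocumentBuilderFactory;
import org.junit.Test;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import static org.junit.Assert.*;

public class HttpInterfaceTest {

    @Test
    public void allConfiguredInterfacesRespond() throws Exception {
        // Hypothetical configuration file listing the URLs to test, e.g.
        // <interfaces><interface url="http://localhost:8080/..."/></interfaces>
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("src/test/resources/httpInterface.xml"));
        NodeList interfaces = doc.getElementsByTagName("interface");
        for (int i = 0; i < interfaces.getLength(); i++) {
            String url = ((Element) interfaces.item(i)).getAttribute("url");
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod("GET");
            // A new interface only needs a new entry in httpInterface.xml.
            assertEquals("unexpected status for " + url,
                    200, conn.getResponseCode());
            conn.disconnect();
        }
    }
}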

Test Process

  • Check if the service can communicate with the required other services (e.g. response code 200 for HTTP, assertNotNull for EJB; an EJB sketch follows this list) [required]
  • Check if the outcome of a method is the specified, desired one (e.g. valid eSciDoc XML). [optional]
  • Test inputs should cover valid and invalid parameters
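
For the EJB case, the communication check could be a plain JNDI lookup followed by assertNotNull; the JNDI name below is a placeholder:

import javax.naming.InitialContext;
import org.junit.Test;
import static org.junit.Assert.*;

public class EjbCommunicationTest {

    @Test
    public void dependentServiceIsReachable() throws Exception {
        // Runs in the JBoss context; the JNDI name is a placeholder and
        // must match the real deployment.
        InitialContext ctx = new InitialContext();
        Object service = ctx.lookup("CitationStyleHandler/remote");
        assertNotNull(service);
    }
}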

Open Questions

  • What is the essence of integration tests:
    • testing the service's ability to invoke related services in the JBoss context + testing the basic service functionality itself (calling the service interfaces with only one set of parameters: service works/doesn't work)
    • the tests in the previous item + entire substantial service testing
  • An Integration Test Service for each project (common_services, pubman, etc.) vs./in addition to _ears.

Nice to have: a hierarchy of our services as a list or graph. The following parameters could be presented on the graph edges:

  • type of dependency (REST/SOAP/EJB)
  • internal/external for escidoc project
  • required/optional

Functional Tests

Black-box testing coupled with the functional requirements of the service/solution. It is carried out as:

  1. automated functional tests on solution level
    • According to the functionalities described in CoLab. To be defined together with SVM. See the example of requirements and testing plan.
  2. manual functional tests of solutions by SVM
  3. automated GUI tests (Selenium, HTMLUnit) on solution level
    • To be defined with the GUI group according to the test scenarios in JIRA, like for PubMan
  4. manual GUI/browser tests by the GUI group

Location: <solution>_logic_ear

Open Questions

Performance Tests

Testing of services/solutions under heavy load: heavy repetition of certain actions or inputs, input of huge XMLs or fulltexts, large search queries, etc.
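
A minimal load-test sketch in plain Java against an assumed URL; in practice one of the dedicated tools listed under Tools below would be used:

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SimpleLoadTest {

    public static void main(String[] args) throws Exception {
        final String url = "http://localhost:8080/search"; // assumption
        final int threads = 20;
        final int requestsPerThread = 50;
        final AtomicInteger failures = new AtomicInteger();

        // Heavy repetition of one action from several threads.
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        long start = System.currentTimeMillis();
        for (int t = 0; t < threads; t++) {
            pool.execute(new Runnable() {
                public void run() {
                    for (int i = 0; i < requestsPerThread; i++) {
                        try {
                            HttpURLConnection c = (HttpURLConnection)
                                    new URL(url).openConnection();
                            if (c.getResponseCode() != 200) {
                                failures.incrementAndGet();
                            }
                            c.disconnect();
                        } catch (Exception e) {
                            failures.incrementAndGet();
                        }
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
        long elapsed = System.currentTimeMillis() - start;
        System.out.println(threads * requestsPerThread + " requests in "
                + elapsed + " ms, " + failures.get() + " failures");
    }
}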

Location: <solution>_logic_ear and/or test framework

Open Questions

  • Performance tests should be defined in the requirements documentation, QA documents, or test plans.

System Tests

Black-box testing based on the overall requirements specification; it covers all combined parts of the system. It does not replace integration testing, which ensures that the software works together with the other software products it has to be integrated with.

In the eSciDoc context, the system tests are the complete bundle of the Functional Tests, which should prove the system functionality as a whole.

Location: ?

Open Questions

Bug Testing

Critical (regularly occurring) bugs should be identified together with SVM and covered by an appropriate test (Selenium or JUnit), depending on the nature of the bug (a sketch follows below).

  1. Check 'now and again' bugs for test candidates
  2. Specify criteria for new bugs becoming test candidates

Location: The module where the bug occurred.
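
A sketch of what such a JUnit bug test could look like; the JIRA id, input and stand-in method are placeholders, the real values come from the bug report:

import org.junit.Test;
import static org.junit.Assert.*;

// Hypothetical regression test for a fixed bug, named after its issue id.
public class BugPUBMAN123RegressionTest {

    @Test
    public void umlautsSurviveFormatting() {
        // Reproduce the once-critical input from the bug report and
        // assert the fixed behaviour, so the bug cannot silently return.
        String formatted = format("M\u00fcller");
        assertTrue(formatted.contains("M\u00fcller"));
    }

    private String format(String s) {
        return s; // stand-in for the method the bug lived in
    }
}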

Open Questions

Test Resources

  1. Common Test Resources
    • Should be created together with SVM (an item with full metadata, previously critical items, etc.)
    • Should be stored in one place (which one?)
  2. Specific Test Resources
    • To be defined individually by each module and saved in src/test/resources
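
Since everything in src/test/resources ends up on the test classpath, a module test can load such a resource directly; the file name below is an example:

import java.io.InputStream;
import org.junit.Test;
import static org.junit.Assert.*;

public class TestResourceTest {

    @Test
    public void resourceIsOnTestClasspath() {
        // "testItem_fullMd.xml" is an example name; any file placed in
        // src/test/resources is available on the test classpath.
        InputStream in = getClass().getResourceAsStream("/testItem_fullMd.xml");
        assertNotNull(in);
    }
}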

Schedule

  1. Identify existing tests
  2. Create appropriate, generic test resources
  3. Enhance module tests according to specification
  4. Implement integration tests according to specification
  5. Check and specify what system tests are necessary for which solution
  6. Identify and implement bug tests

Test Process on example of DAAS

According to the current specification on this page, a full testing process for the Dataacquisition Service would look as follows:

  1. Module test of all interface methods (e.g. doFetch("arxiv", "arxiv:123", "oai_dc"), explainSources()) with validation of the outcome.
  2. Integration Test
    1. unapi e.g. http get: http://localhost:8080/dataacquisition/download/unapi?id=bmc:1472-6890-9-1&format=AJP_application/odt
    2. ejb e.g. doFetch("arxiv", "arxiv:123", "apa") (here DAAS needs to talk to transformation service)
  3. System test, e.g. a Selenium test of a user logging in, going to the import page, choosing a source, entering an identifier, and saving the item (sketched below).
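
A sketch of step 3 with Selenium WebDriver; the URL, element ids and texts are placeholders for the real PubMan pages:

import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.support.ui.Select;
import static org.junit.Assert.*;

public class ImportItemSystemTest {

    @Test
    public void userCanImportAnItem() {
        // All URLs, element locators and texts below are placeholders;
        // the real values have to be taken from the solution's pages.
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost:8080/pubman");
            // 1. log in
            driver.findElement(By.id("username")).sendKeys("testuser");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("loginButton")).click();
            // 2. go to the import page, choose a source, enter an identifier
            driver.findElement(By.linkText("Import")).click();
            new Select(driver.findElement(By.id("source")))
                    .selectByVisibleText("arXiv");
            driver.findElement(By.id("identifier")).sendKeys("arxiv:123");
            // 3. save the item and check the result
            driver.findElement(By.id("saveButton")).click();
            assertTrue(driver.getPageSource().contains("Item saved"));
        } finally {
            driver.quit();
        }
    }
}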

Tools

  1. RESTClient is a simple Java-based open source tool for REST testing
  2. soapUI is the leading open source tool for SOAP and REST testing; it comes with plug-ins for the following tools/IDEs:
    • Maven 1.X/2.X plugin
    • NetBeans 5.5/6.0
    • IntelliJ IDEA 6+
    • Eclipse 3.2+
    • JBossWS/JBossIDE 2.0.0+
  3. Selenium is an open source software testing framework for web UI testing
    • Maven plugin
  4. PushToTest TestMaker is a complete open source test automation platform that is appropriate for:
    • Application Testing – Avoid Downtime, Qualify Patches, Updates, Hardware Changes
    • Integration Testing – Surface Performance Issues When Services Call Services
    • Regression Testing – Surface Functionality Issues Before Customers Do
    • Tool & Library Testing – Optimize Performance at the Object Level
    • XML Optimization – Improve performance and scalability – AJAX, SOAP, REST
    • Performance Testing – Better forecast CPU, Network, Memory needs to meet SLAs
