Have questions? Stuck? Please check our FAQ for some common questions and answers.

This wiki documents the current development version of ONOS (master). Refer to the Wiki Archives for documentation for all previous versions of ONOS.


A few quick tips to get started contributing to OnosSystemTest:

1) Subscribe to the Testing mailing list <onos-test@onosproject.org>. All mailing lists can be found here.

2) Sign up for an account on onosproject.org so you can push test code to gerrit.onosproject.org.

3) If you are not familiar with the Gerrit workflow, see this quick tutorial: Gerrit Workflow for System Test Development.

4) Learn about the TestON Framework for authoring test cases here: OnosSystemTest/TestON Tutorial

Fundamentals of authoring a test case

Because we target running all tests automatically in a CI environment, there are several fundamental principles to adhere to when writing test cases and driver files:

  1. Portability - the test case should run in other, similar test environments; this also makes it easier to merge community-contributed tests into our production environment;
  2. Stability - the test should take into account variations in the environment (e.g. the response time of starting ONOS on VMs vs. on bare metal), so that it passes or fails consistently;
  3. Clarity - make every effort to keep the Python test cases easy to understand and follow;
  4. Debuggability - use logging liberally, catch all exceptions, and make sure test failures never go silently.

To achieve these objectives, it is essential to have a good understanding of the Jenkins/TestON interactions and abstractions involved in running a test case. The following diagram illustrates them.



TestON Scripting General Guidelines:

  1. Prerequisites and testbed environment settings - set up manually and/or with the CI framework, e.g. through Jenkins jobs:
    1. (First time setup only) TestStations:
      • should have an "sdn" account (password: rocks) to run the TestON CLI, so that all hosts in the test infrastructure have the same sdn/rocks credentials
      • should be able to log on to all nodes specified in the cell. See this guide to set up ssh keys.
    2. (First time setup only) "ONOS Bench" and cells: set up per "ONOS from Scratch"
    3. (First time setup only) Mininet (OCN) host: set up per "Mininet Walkthrough"
      Note: it is possible to run the "TestStation", "ONOS Bench" and "Mininet" on the same host, with the .topo file set up accordingly.
    4. Clean up TestON and Mininet before each test run
    5. Set the OnosSystemTest version, and git pull "OnosSystemTest" from gerrit.onosproject.org
    6. Set the ONOS version, and git pull ONOS from gerrit.onosproject.org
    7. Compile/build ONOS
    8. Set up ONOS JVM-related configurations if the defaults are not desirable
    9. Run the test cases - see the scripting guidelines below
    10. (Optional) Run post-test tasks, such as data storage, result publishing, etc.
    11. (Optional) Teardown: ONOS uninstall and Mininet cleanup.
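The cleanup/pull/build steps (4-7) above can be sketched as a shell fragment such as a Jenkins job might run. The repository paths, the branch name, and the build step are assumptions; adjust them to your environment and ONOS version:

```shell
#!/usr/bin/env bash
# Hypothetical pre-test refresh, sketching steps 4-7 above.
# ONOS_DIR, TEST_DIR, and BRANCH are assumptions, not fixed paths.
set -e

ONOS_DIR=~/onos
TEST_DIR=~/OnosSystemTest
BRANCH=master          # the "set version" step: pick the branch/tag to test

# 4) Clean up Mininet state left over from a previous run
sudo mn -c || true

# 5) Update OnosSystemTest from gerrit.onosproject.org
cd "$TEST_DIR"
git checkout "$BRANCH"
git pull

# 6) Update ONOS from gerrit.onosproject.org
cd "$ONOS_DIR"
git checkout "$BRANCH"
git pull

# 7) Compile/build ONOS - the exact build command depends on the
#    ONOS version (e.g. Maven on older releases, Buck/Bazel on newer ones)
```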

  2. OnosSystemTest/TestON scripts should conform to the following:
    1. Test suite naming - see Test Plans.
    2. README file
      • Should describe the test topology to run the default test case.
      • Should explain the main goal of the test
      • Provide any additional requirements needed to run the test
    3. testname.params
      • Use env variable names to reference components - no static IPs
      • Move any hardcoded values into this file - e.g. sleep times, counters, file names, etc.
      • Any modifications needed to tune the test should be made in this file
    4. testname.topo
      • Use env variable names to reference components - no static IPs
      • Leave out passwords from the login information - password-less login should be set up
    5. testname.py
      • Log the ONOS version/commit
      • Set the prompt and terminal type
      • Handle test case dependencies
      • Explicitly create and set the ONOS cell
      • Explicitly activate bundles/apps
      • Configure test-case-specific apps for non-default values - cell apps should be specified in the .params file
      • Store test dependencies in the Dependency folder located in the test folder - Mininet topologies and helper functions should all be in the Dependency folder
      • Log the relevant config information in the test log - logging is cheap; be as verbose as possible to help the debugging process
      • Avoid static references to paths and files - put any static references in the .params file
      • Check and log a summary of ONOS exceptions, errors, and warnings - ideally after each test case
      • Handle test results - write them to the test log and a /tmp file for post-test processing - try to assert on every test result so that it can be shown on the wiki
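The .params-driven, assert-on-everything pattern above can be sketched in plain Python. Real TestON tests read main.params and use TestON's log and utilities objects; the dict, logger, and check_node_count helper here are illustrative stand-ins:

```python
# Hypothetical sketch of the testname.params / testname.py pattern.
# "params" stands in for the parsed .params file; "log" for TestON's logger.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("SAMPLEtest")

# All tunables live in one place, mirroring testname.params:
# no hardcoded sleep times, counters, or file names in the test body.
params = {
    "SLEEP": {"startup": 5},
    "RETRY": {"count": 3},
    "ENV": {"cellName": "SAMPLE"},
}

def check_node_count(actual, expected):
    """Assert on a step's result and log it, so failures never go silently."""
    result = actual == expected
    if result:
        log.info("PASS: node count is %d", expected)
    else:
        log.error("FAIL: expected %d nodes, got %d", expected, actual)
    return result

# Every value comes from params; every step's outcome feeds an assertion.
retries = int(params["RETRY"]["count"])
assert retries == 3
assert check_node_count(3, 3) is True
```

A real test would feed check_node_count's return value into TestON's result handling so it appears in the wiki summary, rather than using a bare assert.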
