

Overview

The purpose of AutoQA install test automation is to simplify testing, reduce test execution time and improve efficiency. The AutoQA install test project should satisfy the following requirements:

  • The system should be easy for both testers and developers to use
  • Provide clear documentation for customizing and creating new tests
  • Support test execution using existing Fedora infrastructure services, but not require them
  • Test results should be easy to verify

Problem Space

The Fedora installer is a complicated application that often requires significant setup time to test properly and efficiently. Installer failures typically come from the following areas:

Image Sanity

  1. ISO file size is too large (or small)
  2. Invalid SHA256 checksum
  3. Invalid implanted ISO md5sum
  4. Install environment: anaconda has specific application, library and config file format requirements
  5. Version checks
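
Most of these sanity checks are easy to script. The sketch below is only an illustration (the ISO path, expected checksum and size limit are placeholders); it verifies the file size, the SHA256 checksum, and the implanted media checksum using the checkisomd5 utility from the isomd5sum package.

    #!/usr/bin/env python
    # Hypothetical ISO sanity checks: size, SHA256 checksum, implanted md5sum.
    import hashlib
    import os
    import subprocess

    ISO = "Fedora-13-x86_64-DVD.iso"    # placeholder path
    EXPECTED_SHA256 = "..."             # value from the published CHECKSUM file
    MAX_SIZE = 4700372992               # single-layer DVD capacity, in bytes

    def sha256sum(path):
        digest = hashlib.sha256()
        with open(path, "rb") as iso:
            for chunk in iter(lambda: iso.read(1024 * 1024), b""):
                digest.update(chunk)
        return digest.hexdigest()

    assert os.path.getsize(ISO) <= MAX_SIZE, "ISO is larger than the target media"
    assert sha256sum(ISO) == EXPECTED_SHA256, "SHA256 checksum mismatch"
    # checkisomd5 exits 0 when the implanted md5sum verifies successfully
    assert subprocess.call(["checkisomd5", ISO]) == 0, "implanted md5sum check failed"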

Boot Methods

  1. Boot media improperly built (PXE, boot.iso, CD/DVD, efidisk.img)
  2. Installer fails to boot as a KVM guest
  3. Installer fails to boot as a Xen guest

Install Source

  1. Unable to detect install.img media
  2. Unable to transition to the stage 2 installer

Kickstart Delivery

  1. The ks.cfg file could not be obtained from the specified location (http, ftp, nfs, hd, initrd)
  2. Install fails to proceed in accordance with the directives in the ks.cfg file
  3. Install improperly sets up networking based on command-line and kickstart network parameters (e.g. booting with ksdevice=eth1 while ks.cfg specifies eth2)
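
For reference, the delivery method is selected with the ks= installer boot option. The host names and paths below are placeholders, but these are the forms the kickstart delivery tests would cycle through:

    ks=hd:sdb1:/ks.cfg
    ks=http://testserver.example.com/ks/dvd-default.ks
    ks=nfs:testserver.example.com:/export/ks/dvd-default.ks
    ks=file:/ks.cfg
    ks=cdrom:/ks.cfg

The ksdevice= boot option picks the interface used to fetch a network kickstart, which is what the eth1/eth2 mismatch case above exercises.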

User Interface

  1. X driver problems while transitioning to graphical install
  2. Screen corruption during text-mode install
  3. VNC fails to start
  4. Serial console redirection improperly set up

Storage Devices

  1. Failure to detect existing storage device(s)
  2. Failure to clear stale data from existing devices
  3. Unable to add iSCSI volumes

Partitioning

  1. Failure detecting existing partition scheme (lvm, mdraid, dmraid, luks)
  2. Failure when attempting to resize existing partitions
  3. Failures while attempting to re-use existing partitions
  4. Improperly clearing stale information from disks
  5. Unable to consistently resize an existing filesystem
  6. General failures while attempting to manually partition a system

Package Repository

  1. Unable to read metadata from package repositories (http, ftp, nfs, media)
  2. Failures while adding or modifying existing package repositories

Package Set

  1. Network timeout while retrieving packages
  2. Dependency problems while resolving package list
  3. File conflicts during package install
  4. Package order and install errors in install.log
  5. Improperly formatted comps.xml data

Boot loader configuration

  1. Unable to properly detect other operating systems
  2. Failure while setting up chainloader for another OS

Upgrade system

  1. Failure to detect previously installed systems
  2. Errors while attempting to update bootloader configuration during upgrade
  3. Package upgrade errors in upgrade.log

Recovery

  1. Rescue mode fails to detect existing installations (lvm, raid, luks)
  2. Rescue mode fails to establish networking
  3. Problems saving traceback information (local disk, bugzilla, remote server)
  4. Anaconda unable to download and use updates.img (from install source, local media or URL)
  5. Unable to transition to debug mode

Proposed Solution

First, in order to provide a consistent and documented test approach, the existing Fedora Install test plan [1] will be revisited. The test plan will be adjusted to ensure proper test coverage for the failure scenarios listed above. Existing test cases will be reviewed for accuracy. New test cases will be created using the Template:QA/Test_Case template. Finally, the test plan will be adjusted to match the improved Fedora Release Criteria [2]. This includes adjusting the test case priority to match milestone criteria.

Next, in order to reduce setup and execution time, improve efficiency and provide test results on a more consistent basis, a subset of test cases will be chosen for automation. Tests will be written in Python and will be developed and executed on a system supporting KVM virtualization. Test scripts will be responsible for preparing a virtual install environment, initiating a kickstart install and validating the results. Once an initial batch of tests exists, they will be formally integrated into the AutoQA project.
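
As a rough illustration of that flow (not the final AutoQA code; the guest name, DVD tree and kickstart URL are placeholders), a driver script might prepare a KVM guest with virt-install, start an unattended kickstart install, and treat a cleanly completed install as the first pass/fail signal:

    #!/usr/bin/env python
    # Hypothetical outline of a KVM-based kickstart install test.
    import subprocess

    GUEST = "autoqa-dvd-default"
    DVD_TREE = "/srv/autoqa/f13-dvd"                              # loop-mounted DVD contents (assumed)
    KS_URL = "http://testserver.example.com/ks/dvd-default.ks"    # assumed kickstart location

    # Prepare the virtual environment and start the kickstart install.
    # --wait=-1 blocks until the guest powers off; --noreboot leaves it shut down.
    rc = subprocess.call([
        "virt-install", "--name", GUEST, "--ram", "1024",
        "--disk", "path=/var/lib/libvirt/images/%s.img,size=10" % GUEST,
        "--location", DVD_TREE,
        "--extra-args", "ks=%s console=ttyS0" % KS_URL,
        "--nographics", "--noreboot", "--wait=-1",
    ])
    if rc != 0:
        raise SystemExit("install did not complete cleanly (virt-install exit %d)" % rc)

    # Further validation (booting the guest, inspecting /root/install.log,
    # checking the partition table, ...) would follow here.
    subprocess.call(["virsh", "domstate", GUEST])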

Lastly, a method will be developed for collecting test results into a single test result matrix. Results may be posted to the wiki directly, or a custom TurboGears application may be needed to display results [3]. The results will be easily accessible to testers and the installer development team.

Scope

The project will be divided into several phases.

Phase#1 - proof of concept

  • Pass: Revise Fedora Install test plan to ensure adequate test coverage exists for the failure scenarios listed above
  • Pass: Select a small, but representative, subset of test cases from the install test plan to automate
  • Pass: Create Python scripts to prepare KVM-based virtual environments for testing, initiate kickstart installs, and validate results
  • Pass: Investigate methods for leveraging GUI automation to aid in automating applicable test cases

Phase#2 - implementation

  • In progress: Identify and automate the remainder of test cases from the install test plan
  • Identify anaconda's built-in unit tests and integrate them into the automated tests
  • Identify test event triggers which will be used to automatically initiate testing.
  • Identify test cases where GUI automation will be required

Phase#3 - integration

  • Create appropriate control files and test wrappers to allow for scheduling tests through AutoQA (see Writing_AutoQA_Tests; a minimal sketch follows this list)
  • Develop or update AutoQA test event hooks to accommodate new test events (see Writing_AutoQA_Hooks)
  • Implement initial test result dashboard intended to eventually replace the wiki test matrix. The dashboard will also support FAS user test result submission. This will likely rely on
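
To make the first bullet above concrete: AutoQA schedules tests through Autotest, so each test needs a control file plus a small test wrapper. The sketch below only shows the generic autotest shape, with hypothetical names and arguments; the exact variables and argument passing AutoQA expects are described in Writing_AutoQA_Tests.

    # control (hypothetical): lets Autotest/AutoQA schedule the install test
    AUTHOR = "Fedora QA"
    NAME = "dvd_install"
    TIME = "LONG"
    TEST_TYPE = "CLIENT"
    DOC = """Run a KVM-based DVD kickstart install and report the result."""

    job.run_test('dvd_install', ks_url='http://testserver.example.com/ks/dvd-default.ks')

The matching wrapper is an autotest client test class whose run_once() method calls the install driver and raises TestFail on error:

    # dvd_install.py (hypothetical wrapper)
    from autotest_lib.client.bin import test
    from autotest_lib.client.common_lib import error
    from autoqa_install import run_kickstart_install   # hypothetical helper module

    class dvd_install(test.test):
        version = 1

        def run_once(self, ks_url=None):
            # The helper would drive virt-install as sketched in the Proposed
            # Solution section and return True on a completed install.
            if not run_kickstart_install(ks_url):
                raise error.TestFail("kickstart install did not complete")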

Active Ingredients

  • AutoQA must be packaged and available in Fedora infrastructure
  • A valid backend test harness is packaged and available in Fedora infrastructure (AutoQA currently uses Autotest)
  • Knowledge of TurboGears or TurboGears2
  • A new AutoQA test watcher may be required to initiate automated tests whenever CD and DVD ISO images are available for testing (post-iso-build)
  • Adding installer unit test execution and results display is dependent on the availability and environmental requirements of installer unit tests

Roadmap

Step 1- Identify all the methods by which we initiate testing (COMPLETED)

  • URL install source available
  • DVD image(s) available
  • CD image(s) available

Step 2- Identify test drivers that trigger off of those events (COMPLETED)

  • URL install source available
    • url_install.py
  • DVD image(s) available
    • iso_sanity.py
    • dvd_install.py
  • CD image(s) available
    • iso_sanity.py
    • cd_install.py

Step 3- Review test possibilities for each test driver (INPROGRESS)

  • Possible dvd_install.py tests (a sketch enumerating these combinations follows this list)
    • different kickstart delivery methods (ks=)
      • ks=hd: (default)
      • ks=http:
      • ks=nfs:
      • ks=file:
      • ks=cdrom:
      • ks=floppy
    • Different package sets (%packages)
      • @core
      • @core + @base (aka Minimal)
      • --default (aka Default)
    • Different install.img sources (stage2=)
      • local DVD (default)
      • Remote install sources
    • Different package sources (repo=)
      • local DVD (default)
      • Remote repositories (url, nfs)
    • Partitioning scenarios
      • autopart
      • autopart + luks
      • RAID partitioning tests
      • LVM partitioning tests
      • specific root filesystem types (ext2, ext3, ext4, btrfs, xfs, ...)
    • Different block device tests
      • Hardware RAID
      • BIOS RAID
      • iSCSI
      • SATA, PATA, SCSI, VIO
    • Display methods
      • graphical
      • text-mode
      • VNC graphical
      • cmdline
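
The combinations above multiply quickly, so one way to feed Step 4 is to enumerate them programmatically and then prune; a rough sketch (not the actual dvd_install.py) is shown below.

    # Hypothetical enumeration of dvd_install.py variants from the matrix above.
    import itertools

    ks_delivery = ["hd", "http", "nfs", "file", "cdrom", "floppy"]
    package_sets = ["@core", "@core @base", "--default"]
    partitioning = ["autopart", "autopart+luks", "raid", "lvm", "ext4-root"]
    display = ["graphical", "text", "vnc", "cmdline"]

    variants = list(itertools.product(ks_delivery, package_sets, partitioning, display))
    print("%d possible dvd_install variants" % len(variants))

    # Running every combination is impractical; a first pass might pin the
    # display mode to text and vary one dimension at a time.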

Step 4- Prioritize test possibilities for each test driver

Step 5- Write test drivers using requirements gathered in steps #3-#4

Step 6- Write kickstart files to exercise the tests identified in step #4
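
For example, a kickstart exercising the "autopart + luks" and "Minimal (@core + @base)" cells of the step #3 matrix might look roughly like this (the root password and passphrase are obvious placeholders):

    # dvd-minimal-autopart-luks.ks (hypothetical example)
    install
    text
    lang en_US.UTF-8
    keyboard us
    timezone --utc America/New_York
    rootpw testpassword
    network --bootproto=dhcp
    bootloader --location=mbr
    zerombr
    clearpart --all --initlabel
    autopart --encrypted --passphrase=testpassphrase
    reboot

    %packages
    @core
    @base
    %end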

Results

Discussion Points

  • List unresolved or active discussion

Comments?

To leave a comment, use the Talk page for this proposal.

Owner

Fedora QA team

References

Other documentation:

  1. General libvirt home - http://libvirt.org
  2. Examples on using libvirt and cobbler to aid in semi-automated install testing - http://jlaska.livejournal.com/tag/cobbler
  3. General Autotest information
  4. KVM is using autotest
  5. GUI automation using dogtail - https://fedorahosted.org/dogtail
  6. GUI automation using autotest step files - http://www.linux-kvm.org/page/KVM-Autotest/Steps