This page provides a high-level roadmap for implementing the Is_anaconda_broken_proposal project. More detailed tasks can be found in the AutoQA Trac roadmap. The following steps define the methods by which testing will be initiated.


First, in order to provide a consistent and documented test approach, the existing Fedora Install test plan [1] will be revisited. It will be updated to ensure proper test coverage for the failure scenarios listed above, and existing test cases will be reviewed for accuracy. New test cases will be created using the Template:QA/Test_Case template. Finally, the test plan will be aligned with the improved Fedora Release Criteria [2], including adjusting test case priorities to match the milestone criteria.

Next, in order to reduce setup and execution time, improve efficiency, and provide test results on a more consistent basis, a subset of test cases will be chosen for automation. Tests will be written in Python and will be developed and executed on a system supporting KVM virtualization. Test scripts will be responsible for preparing a virtual install environment, initiating a kickstart install, and validating the results. Once an initial batch of tests exists, they will be formally integrated into the AutoQA project.
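As a rough sketch of the intended approach (not the project's actual scripts), the example below drives virt-install from Python to start an unattended kickstart install in a KVM guest. The tree URL, kickstart URL, guest name and disk path are placeholders.

<pre>
#!/usr/bin/env python
"""Sketch only: start an unattended kickstart install in a KVM guest.
Assumes a libvirt/KVM host with virt-install available; the tree and
kickstart URLs, guest name and disk path below are placeholders."""
import subprocess

TREE_URL = "http://example.com/fedora/development/x86_64/os/"   # placeholder
KS_URL = "http://example.com/kickstarts/minimal.ks"             # placeholder

def start_kickstart_install(name, disk_path, disk_gb=8, ram_mb=1024):
    """Create a guest and boot it straight into a kickstart install."""
    cmd = [
        "virt-install",
        "--name", name,
        "--ram", str(ram_mb),
        "--disk", "path=%s,size=%d" % (disk_path, disk_gb),
        "--location", TREE_URL,
        # Hand the kickstart to anaconda on the kernel command line.
        "--extra-args", "ks=%s console=ttyS0" % KS_URL,
        "--nographics",
        "--noreboot",      # stop after the install so results can be inspected
        "--wait", "120",   # give up after two hours
    ]
    return subprocess.call(cmd)

if __name__ == "__main__":
    rc = start_kickstart_install("autoqa-dvd-default",
                                 "/var/lib/libvirt/images/autoqa-dvd-default.img")
    print("virt-install exited with %d" % rc)
</pre>

Result validation (for example, checking the virt-install exit status and then booting and inspecting the installed guest) would be layered on top of the same pattern.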

Finally, a method will be developed for collecting test results into a single test result matrix. Results may be posted directly to the wiki, or a custom TurboGears application may be needed to display them [3]. The results will be easily accessible to testers and the installer development team.
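As an illustration of the wiki option, the fragment below (a sketch only; the result tuple layout and test names are made up here) renders per-test results as a MediaWiki table using the wiki's result template.

<pre>
"""Sketch: collect per-test results and render them as a MediaWiki table
that could be pasted into the results page.  The (name, status, comment)
format is illustrative, not the final schema."""

def results_to_wiki_table(results):
    """results: list of (test_name, status, comment) tuples."""
    lines = ['{| class="wikitable"', "! Test case !! Result !! Comment"]
    for name, status, comment in results:
        lines.append("|-")
        lines.append("| %s || {{result|%s}} || %s" % (name, status, comment))
    lines.append("|}")
    return "\n".join(lines)

print(results_to_wiki_table([
    ("DVD.iso Installation / boot methods", "pass", ""),
    ("Live.iso Installation / mediakit ISO size", "fail", "oversize image"),
]))
</pre>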


The project will be divided into several phases.

Phase#1 - proof of concept

  • Pass: Revise the Fedora Install test plan to ensure adequate test coverage exists for the failure scenarios listed above
  • Pass: Select a small but representative subset of test cases from the install test plan to automate
    The following test cases are selected:
    • Rawhide Acceptance Test Plan [[1]]
    • DVD.iso Installation
    • Boot.iso/Netinst.iso Installation
    • Live.iso Installation
    • upgrade an existing system
    • system with basic video driver
    • Rescue installed system
    • Memory test [[2]]
  • In progress: Create Python scripts to prepare KVM-based virtual environments for testing, initiate kickstart installs, and validate results (a sanity-check sketch follows this list)
    • Pass: Check the sanity of the virtualization environment
    • In progress: virt-install from an installation tree and from ISO media (the tree and ISO cases are now split)
    • Pass: Print out test results
  • In progress: Investigate methods for leveraging GUI automation to aid in automating applicable test cases
    • Pass: Open Virt Viewer for the guest
    • Pass: Input text into the GUI
    • Close Virt Viewer for the guest
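The environment sanity check mentioned above could look roughly like the following; the exact checks in the project scripts may differ, and the list of required tools here is an assumption.

<pre>
"""Sketch of pre-test sanity checks, assuming a libvirt/KVM host."""
import os
import subprocess

def host_is_sane():
    """Return True if this host looks able to run the KVM-based tests."""
    devnull = open(os.devnull, "w")
    checks = []

    # Hardware virtualization must be exposed to qemu-kvm.
    checks.append(("kvm device", os.path.exists("/dev/kvm")))

    # libvirtd must answer on the system connection.
    rc = subprocess.call(["virsh", "-c", "qemu:///system", "version"],
                         stdout=devnull, stderr=devnull)
    checks.append(("libvirtd", rc == 0))

    # virt-install and virt-viewer are needed to create and watch guests.
    for tool in ("virt-install", "virt-viewer"):
        rc = subprocess.call(["which", tool], stdout=devnull, stderr=devnull)
        checks.append((tool, rc == 0))

    devnull.close()
    for name, ok in checks:
        print("%-12s %s" % (name, "OK" if ok else "MISSING"))
    return all(ok for _, ok in checks)
</pre>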

Phase#2 - implementation

Implement the selected test cases

  • Pass: Rawhide Acceptance Test Plan [[3]]
    • Pass: repodata validity [[4]]
    • Pass: comps.xml validity [[5]]
    • Pass: Core package dependency closure [[6]]
    • Pass: Core package existence [[7]]
    • Pass: installer image existence [[8]]
    • Pass: Kernel boot [[9]]
    • Pass: Anaconda loader fetching stage2 [[10]]
    • Pass: Anaconda stage2 disk probe [[11]]
    • Pass: Anaconda package install [[12]]
  • In progress: DVD.iso Installation
    • In progress: mediakit ISO size [[13]]
    • In progress: mediakit ISO checksums [[14]] (a checksum sketch follows this list)
    • In progress: mediakit repoclosure [[15]]
    • In progress: mediakit file conflicts [[16]]
    • In progress: boot methods [[17]]
    • In progress: install source [[18]]
  • Boot.iso/Netinst.iso Installation
    • mediakit ISO size[[19]]
    • mediakit ISO checksums[[20]]
    • boot methods[[21]]
    • install source[[22]]
  • Live.iso Installation
    • mediakit ISO size [[23]]
    • mediakit ISO checksums [[24]]
  • upgrade an existing system [[25]]
    • perform a default installation of the previous release
    • install the current release
  • system with basic video driver[[26]]
  • Rescue installed system[[27]]
  • Memory test [[28]]
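For the mediakit ISO checksum cases, a minimal verification routine might look like the sketch below. The path and digest are placeholders; Fedora publishes SHA256 sums in a *-CHECKSUM file alongside the images.

<pre>
"""Sketch: verify the published checksum of a media kit ISO."""
import hashlib

def sha256sum(path, blocksize=1024 * 1024):
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(blocksize), b""):
            digest.update(block)
    return digest.hexdigest()

def verify_iso(iso_path, expected_sha256):
    actual = sha256sum(iso_path)
    ok = actual == expected_sha256.lower()
    print("%s: %s" % (iso_path, "OK" if ok else "MISMATCH (got %s)" % actual))
    return ok

# Example (placeholder values):
# verify_iso("/srv/isos/Fedora-DVD-x86_64.iso", "aabbcc...")
</pre>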

Automate the remainder of the test cases from the install test plan

  • Rawhide Acceptance Test Plan
    • Anaconda bootloader setup[[29]]
    • X startup/basic display configuration[[30]]
    • X basic input handling[[31]]
    • basic network connectivity[[32]]
    • yum update functionality[[33]]
  • DVD installation
    • additional http repository[[34]]
    • additional ftp repository[[35]]
    • additional mirrorlist repository[[36]]
    • additional nfs repository[[37]]
  • Boot.iso/netinst.iso installation
    • http repository[[38]]
  • Live.iso installation
    • install source live image[[39]]

Implement the general test cases

  • Anaconda user interface graphical [[40]]
  • Anaconda user interface basic video driver[[41]]
  • Anaconda user interface text [[42]]
  • Anaconda user interface VNC [[43]]
  • Anaconda user interface cmdline [[44]]
  • parse different repository types: cdrom, http, ftp (anonymous, non-anonymous), nfs, nfsiso, hard drive [[45]] [[46]]
  • parse different kickstart delivery methods: http, file, hard drive, nfs [[47]] [[48]] [[49]] [[50]]
  • different package selections: default, minimal [[51]] [[52]]
  • different partitioning: autopart, autopart encrypted, autopart shrink install, autopart use free space, ext4 on native device, ext3 on native device, no swap, software RAID (see the kickstart sketch after this list)
  • rescue mode
    • updates.img via URL / installation source / local media [[53]] [[54]] [[55]]
    • Anaconda save traceback to remote system / Bugzilla / disk / debug mode [[56]] [[57]] [[58]] [[59]]
  • upgrade
    • new boot loader[[60]]
    • skip boot loader[[61]]
    • update boot loader[[62]]
    • encrypted root[[63]]
    • skip boot loader text mode[[64]]
    • update boot loader text mode[[65]]
  • preupgrade
    • preupgrade[[66]]
    • preupgrade from older release[[67]]
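The kickstart sketch referenced above gives a flavour of how the install-source, package-selection, partitioning and UI-mode variations could be generated. The commands follow standard kickstart syntax, but the example.com values, passwords, sizes and the choice of variants are illustrative only.

<pre>
"""Sketch: build kickstart variants for the matrix of install sources,
package selections and partitioning schemes listed above."""

INSTALL_SOURCES = {
    "cdrom":     "cdrom",
    "http":      "url --url=http://example.com/fedora/os/",
    "nfs":       "nfs --server=nfs.example.com --dir=/fedora/os",
    "harddrive": "harddrive --partition=sdb1 --dir=/isos",
}

PACKAGE_SELECTIONS = {
    "default": "%packages\n%end",
    "minimal": "%packages --nobase\n@core\n%end",
}

PARTITIONING = {
    "autopart":           "clearpart --all --initlabel\nautopart",
    "autopart_encrypted": "clearpart --all --initlabel\nautopart --encrypted --passphrase=testpass",
    "ext4_native":        "clearpart --all --initlabel\npart / --fstype=ext4 --size=6000\npart swap --size=512",
}

def build_kickstart(source, packages, parts, ui="text"):
    """Assemble one kickstart; ui is graphical, text, cmdline or vnc."""
    return "\n".join([
        ui,
        INSTALL_SOURCES[source],
        "lang en_US.UTF-8",
        "keyboard us",
        "rootpw testpass",
        "timezone US/Eastern",
        "bootloader --location=mbr",
        PARTITIONING[parts],
        PACKAGE_SELECTIONS[packages],
        "reboot",
    ])

print(build_kickstart("http", "minimal", "autopart"))
</pre>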



Phase#3 - integration

  • Identify test event triggers which will be used to automatically initiate testing
  • Create appropriate control files and test wrappers to allow for scheduling tests through AutoQA (see Writing_AutoQA_Tests and the control file sketch below)
  • Develop or update AutoQA test event hooks to accommodate new test events (see Writing_AutoQA_Hooks)
  • Implement an initial test result dashboard intended to eventually replace the wiki test matrix. The dashboard will also support FAS user test result submission. This will likely rely on the custom TurboGears result application noted above [3].
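For the control files, a minimal autotest-style sketch might look like the following; the test name, arguments and metadata values are placeholders rather than the project's actual wrappers, and Writing_AutoQA_Tests remains the authoritative reference for the layout.

<pre>
# Hypothetical control file sketch for an install test wrapper; field
# names follow the usual autotest/AutoQA layout, but the test name and
# arguments are placeholders.  'job' is provided by the harness at run time.
AUTHOR = "Fedora QA"
TIME = "LONG"
NAME = "install_dvd_default"
DOC = """
Kickstart a default DVD install in a KVM guest and report the result.
"""
TEST_TYPE = "client"

# 'tree' and 'kickstart' would normally be filled in by the AutoQA hook
# that triggered this run (for example, a post-tree-compose event).
job.run_test('install_dvd_default',
             tree='http://example.com/development/x86_64/os/',
             kickstart='minimal.ks')
</pre>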