From Fedora Project Wiki

 
==Phase#2 - implementation==
===Implement the selected test cases===
* {{result|pass|[[QA:Rawhide_Acceptance_Test_Plan]]}}
** {{result|pass|system sanity}}
** {{result|pass|[[QA:Anaconda_storage_probe_test_case]]}}
** {{result|pass|[[QA:Anaconda_package_install_test_case]]}}
* {{result|inprogress|RATS Installation}}
** {{result|pass|system sanity}}
** {{result|pass|parse arguments}}
** {{result|pass|prepare kickstart}}
*** {{result|pass|[[QA/TestCases/KickstartKsHttpServerKsCfg]]}}
*** {{result|pass|[[QA/TestCases/KickstartKsFilePathKsCfg]]}}
*** {{result|pass|[[QA:Comps_Validity_Test_Case]]}}
*** {{result|pass|[[QA:Core_package_dependency_closure_test_case]]}}
*** {{result|none|[[QA:Core_package_existence_test_case]]}}
** {{result|pass|[[QA:Installer_image_presence_test_case]]}}
** {{result|pass|[[QA:Kernel_simple_boot_test_case]]}}
** {{result|pass|[[QA:Anaconda_storage_probe_test_case]]}}
** {{result|pass|[[QA:Anaconda_package_install_test_case]]}}
** {{result|pass|Iterate different ksfiles, e.g. fedora, updates, updates-testing}}
** {{result|none|Close virtio serial port}}
* {{result|inprogress|DVD.iso Installation}}
** {{result|pass|system sanity}}
*** {{result|none|repo on cdrom}}
*** {{result|none|repo on hd}}
** {{result|none|download remote media to local, since checksums and repoclosure do not work with remote media}}
** {{result|pass|[[QA:Testcase_Mediakit_ISO_Size]]}}
** {{result|pass|[[QA:Testcase_Mediakit_ISO_Checksums]]}}
** {{result|none|[[QA/TestCases/BootMethodsDvd]]}}
** {{result|none|[[QA/TestCases/InstallSourceDvd]]}}
** {{result|inprogress|Iterate all ksfiles}}
** {{result|none|Close virtio serial port}}
* {{result|none|Hard Drive Installation}}
** {{result|none|system sanity}}
** {{result|none|parse arguments (kickstart, repo, architecture, image)}}
** {{result|none|prepare kickstart}}
*** {{result|none|create physical disk on guest to store image}}
*** {{result|none|create kickstart disk on guest if kickstart is local}}
*** {{result|none|[[QA/TestCases/KickstartKsHttpServerKsCfg|kickstart on server (ks=http|ftp|nfs)]]}}
*** {{result|none|[[QA/TestCases/KickstartKsFilePathKsCfg|kickstart ks=file://path]]}}
**** {{result|none|compose kickstart into initrd}}
*** {{result|none|[[QA/TestCases/KickstartKsHdDevicePathKsCfg|kickstart on hard drive (ks=hd:)]]}}
*** {{result|none|kickstart on cdrom (ks=cdrom)}}
** {{result|none|prepare repo}}
*** {{result|none|repo on http, ftp, nfs, nfsiso server}}
*** {{result|none|repo on cdrom}}
*** {{result|none|repo on hd}}
** {{result|none|[[QA:Testcase_Mediakit_ISO_Size]]}}
** {{result|none|[[QA:Testcase_Mediakit_ISO_Checksums]]}}
** {{result|none|[[QA:Testcase_Mediakit_Repoclosure]]}}
** {{result|none|[[QA:Testcase_Mediakit_FileConflicts]]}}
** create a function to report mediakit sanity status
** {{result|none|[[QA/TestCases/BootMethodsDvd]]}}
** {{result|none|[[QA/TestCases/InstallSourceDvd]]}}
** {{result|none|Close virtio serial port}}
* Boot.iso/Netinst.iso Installation
** system sanity
* Develop or update [[AutoQA]] test event hooks to accommodate new test events (see [[Writing_AutoQA_Hooks]])
* Implement initial test result dashboard intended to eventually replace the wiki test matrix.  The dashboard will also support FAS user test result submission.  This will likely rely on
== References ==
<references/>

Latest revision as of 07:02, 1 June 2012

This page provides a high-level roadmap for implementing the Is_anaconda_broken_proposal project. More detailed tasks can be found in the autoqa TRAC roadmap. We follow these steps to define the methods by which we initiate testing.


First, in order to provide a consistent and documented test approach, the existing Fedora Install test plan [1] will be revisited. The test plan will be adjusted to ensure proper test coverage for the failure scenarios listed above. Existing test cases will be reviewed for accuracy. New test cases will be created using the Template:QA/Test_Case template. Finally, the test plan will be adjusted to match the improved Fedora Release Criteria [2]. This includes adjusting the test case priority to match milestone criteria.

Next, in order to reduce setup/execution time, improve efficiency, and provide test results on a more consistent basis, a subset of test cases will be chosen for automation. Tests will be written in Python and will be developed and executed on a system supporting KVM virtualization. Test scripts will be responsible for preparing a virtual install environment, initiating a kickstart install, and validating the results. Once an initial batch of tests exists, they will be formally integrated into the AutoQA project.
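As a rough illustration of what such a script does, the sketch below builds a virt-install command line for an unattended kickstart install of a guest. The helper name, option values, and URLs are hypothetical examples, not taken from the actual test scripts.

```python
import shlex

def build_virt_install_cmd(name, tree_url, ks_url, ram=2048, disk_size=10):
    """Build a virt-install command line for an unattended kickstart
    install from an installation tree.

    Illustrative helper: the option values below are examples, not the
    project's real configuration."""
    return [
        "virt-install",
        "--name", name,
        "--ram", str(ram),
        "--disk", f"size={disk_size}",
        "--location", tree_url,
        # Pass the kickstart URL to anaconda via kernel boot arguments
        "--extra-args", f"ks={ks_url} console=ttyS0",
        "--graphics", "none",
        "--noautoconsole",
    ]

# Show the command that a test driver would execute
print(shlex.join(build_virt_install_cmd(
    "rawhide-test",
    "http://example.org/fedora/development/rawhide/x86_64/os/",
    "http://example.org/ks/minimal.cfg")))
```

A real driver would hand this list to `subprocess`, wait for the install to finish, and then inspect the guest to validate the results.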

Last, a method will be developed for collecting test results into a single test result matrix. Results may be posted to the wiki directly, or a custom TurboGears application may be needed to display results [3]. The results will be easily accessible for testers and the installer development team.
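For example, collected results could be rendered back into the {{result|...}} template markup used in the matrix on this page. The helper below is an illustrative sketch, not part of AutoQA.

```python
def results_to_wikitext(results):
    """Render a result dict ({test case: status}) as wiki list items
    using the {{result|status|link}} template convention.

    Illustrative helper; the real reporting code may differ."""
    lines = []
    for case, status in sorted(results.items()):
        # Quadruple braces in the f-string emit literal {{ and }}
        lines.append(f"* {{{{result|{status}|[[{case}]]}}}}")
    return "\n".join(lines)

print(results_to_wikitext({
    "QA:Kernel_simple_boot_test_case": "pass",
    "QA:Anaconda_storage_probe_test_case": "pass",
}))
```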


The project will be divided into several phases.

Phase#1 - proof of concept

  • Pass: Revise Fedora Install test plan to ensure adequate test coverage exists for the failure scenarios listed above
  • Pass: Select a small, but representative, subset of test cases from the install test plan to automate
    The following test cases are selected:
  • Pass: Create Python scripts to prepare KVM-based virtual environments for testing, initiate kickstart installs, and validate results
    • Pass: Check the virtualization environment system sanity
    • Pass: virt-install tree and ISO media
    • Pass: Print out test results
  • Pass: Create Python scripts to parse different parameters
  • Investigate methods for leveraging GUI automation to aid in automating applicable test cases
    • Pass: Open the Virt Viewer of the guest
    • Pass: Input text into the GUI
    • Close the Virt Viewer of the guest
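The "parse different parameters" step above can be sketched with argparse. The option names follow the parameters the test plan mentions (kickstart, repo, architecture, image), but the exact flags and defaults are assumptions, not the scripts' real interface.

```python
import argparse

def make_parser():
    """Argument parser for an install-test driver (illustrative;
    the actual scripts' options may differ)."""
    p = argparse.ArgumentParser(description="anaconda install test driver")
    p.add_argument("--kickstart", required=True,
                   help="kickstart file or URL")
    p.add_argument("--repo", required=True,
                   help="install tree or repo URL")
    p.add_argument("--arch", default="x86_64",
                   help="guest architecture")
    p.add_argument("--image",
                   help="boot image (DVD.iso/boot.iso), if any")
    return p

args = make_parser().parse_args(
    ["--kickstart", "ks.cfg", "--repo", "http://example.org/os/"])
print(args.kickstart, args.repo, args.arch)
```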

Phase#2 - implementation

Implement the selected test cases

Automate the remainder of test cases from the install test plan

Implement the general test cases

Create a kickstart database to cover test cases that can be covered with kickstart
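An entry in such a kickstart database might look like the following minimal unattended-install kickstart. This is an illustrative sketch: the URL, password, and package set are placeholders, and real entries would vary partitioning, repos, and packages per test case.

```
# Minimal unattended-install kickstart (placeholder values)
text
lang en_US.UTF-8
keyboard us
timezone America/New_York
rootpw --plaintext testpass
bootloader --location=mbr
clearpart --all --initlabel
autopart
url --url=http://example.org/fedora/development/rawhide/x86_64/os/
%packages
@core
%end
reboot
```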

Phase#3 - integration

  • Identify test event triggers which will be used to automatically initiate testing
  • Create appropriate control files and test wrappers to allow for scheduling tests through AutoQA (see Writing_AutoQA_Tests)
  • Develop or update AutoQA test event hooks to accommodate new test events (see Writing_AutoQA_Hooks)
  • Implement initial test result dashboard intended to eventually replace the wiki test matrix. The dashboard will also support FAS user test result submission. This will likely rely on

References