This page provides a high-level roadmap for implementing the [[Is_anaconda_broken_proposal]] project. More detailed tasks can be found in the [https://fedorahosted.org/autoqa/milestone/Automate%20installation%20test%20plan autoqa TRAC roadmap]. We follow these steps to define the methods by which we initiate testing:
First, in order to provide a consistent and documented test approach, the existing Fedora Install test plan <ref>[[QA:Fedora_13_Install_Test_Plan]]</ref> will be revisited. The test plan will be adjusted to ensure proper test coverage for the failure scenarios listed above. Existing test cases will be reviewed for accuracy. New test cases will be created using the [[Template:QA/Test_Case]] template. Finally, the test plan will be adjusted to match the improved Fedora Release Criteria <ref>[[Fedora Release Criteria]]</ref>. This includes adjusting the test case priority to match milestone criteria.
Next, in order to reduce the setup/execution time, improve efficiency and to provide test results on a more consistent basis, a subset of test cases will be chosen for automation. Tests will be written in python and will be developed and executed on a system supporting KVM virtualization. Test scripts will be responsible for preparing a virtual install environment, initiating a kickstart install and validating the results. Once an initial batch of tests exists, they will be formally integrated into the [[AutoQA]] project.
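The flow described in the paragraph above (prepare a guest, start a kickstart install, wait for the result) might be sketched like this. This is a minimal illustration rather than the project's actual scripts; the guest name, tree URL, and kickstart URL are placeholders:

```python
# Sketch of driving an unattended KVM kickstart install with virt-install.
# All names and URLs are illustrative placeholders, not the project's real code.
import subprocess


def build_virt_install_cmd(name, tree_url, ks_url, ram_mb=1024, disk_gb=8):
    """Assemble a virt-install command line for an unattended kickstart install."""
    return [
        "virt-install",
        "--name", name,
        "--ram", str(ram_mb),
        "--disk", "size=%d" % disk_gb,
        "--location", tree_url,                           # install tree (http/ftp/nfs)
        "--extra-args", "ks=%s console=ttyS0" % ks_url,   # kickstart + serial console
        "--noautoconsole",
        "--wait", "-1",                                   # block until install finishes
    ]


if __name__ == "__main__":
    cmd = build_virt_install_cmd(
        "f13-dvd-test",
        "http://example.org/fedora/releases/13/Fedora/x86_64/os/",
        "http://example.org/ks/minimal.cfg",
    )
    print(" ".join(cmd))
    # subprocess.call(cmd)  # uncomment on a host with KVM and virt-install available
```

Validation of the install (for example, checking that the guest reboots into the installed system) would follow the `virt-install` call.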
Last, a method will be developed for collecting test results into a single test result matrix. Results may be posted to the wiki directly, or a custom TurboGears application may be needed to display results <ref>For a similar project see the [http://jlaska.fedorapeople.org/irb.png screenshot] of ''is rawhide broken'', the [http://git.fedorahosted.org/git/?p=autoqa.git;a=tree;f=front-ends/israwhidebroken;hb=HEAD source code] and the [[QA:Rawhide_Acceptance_Test_Plan]].</ref>. The results will be easily accessible for testers and the installer development team.
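If results are posted to the wiki directly, collecting per-test verdicts into a single matrix could be as simple as rendering a wiki table. A hypothetical sketch (the test names and verdicts are made up):

```python
# Illustrative sketch: render a {test_name: verdict} mapping as a MediaWiki table.
# Not the project's actual reporting code.
def render_result_matrix(results):
    """Render {test_name: 'pass'/'fail'/...} as MediaWiki table markup."""
    lines = ['{| class="wikitable"', "! Test !! Result"]
    for name, verdict in sorted(results.items()):
        lines.append("|-")
        lines.append("| %s || {{result|%s}}" % (name, verdict))
    lines.append("|}")
    return "\n".join(lines)


if __name__ == "__main__":
    print(render_result_matrix({"dvd-install": "pass", "rescue-mode": "fail"}))
```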
<!-- Describe the scope of any project work or properties that will be affected by the proposal -->
The project will be divided into several phases.
==Phase#1 - proof of concept==
* {{result|pass|Revise Fedora Install test plan to ensure adequate test coverage exists for failure scenarios listed above}}
* {{result|pass|Select a small, but representative, subset of test cases from the install test plan to automate}} The following test cases are selected:
** [[QA:Rawhide_Acceptance_Test_Plan]]
** DVD.iso Installation
** Boot.iso/Netinst.iso Installation
** Live.iso Installation
** upgrade an existing system
** system with basic video driver
** Rescue installed system
** [[QA:Testcase_Memtest86]]
* {{result|pass|Create python scripts to prepare KVM-based virtual environments for testing, initiate kickstart installs, and validate results}}
** {{result|pass|Check the virtualization environment system sanity}}
** {{result|pass|virt-install tree and ISO media}}
** {{result|pass|Print out test results}}
* {{result|pass|Create python scripts to parse different parameters}}
** {{result|pass|parse different repositories: cdrom, http, ftp (anonymous, non-anonymous), nfs, nfsiso, hard drive, [[QA:Testcase_Ftp_Repository]], [[QA:Testcase_Nfs_Repository]]}}
** {{result|pass|parse different kickstart delivery: http, file, hard drive, nfs [[QA/TestCases/KickstartKsHttpServerKsCfg]], [[QA/TestCases/KickstartKsFilePathKsCfg]], [[QA/TestCases/KickstartKsHdDevicePathKsCfg]], [[QA/TestCases/KickstartKsNfsServerPathKsCfg]]}}
* Investigate methods for leveraging GUI automation to aid in automating applicable test cases
** {{result|pass|Open Virt Viewer of the guest}}
** {{result|pass|Input text into the GUI}}
** Close the Virt Viewer of the guest
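The parameter-parsing work listed above amounts to recognizing a handful of repository and kickstart delivery schemes. A hypothetical helper might dispatch on the URL scheme; the `classify_source` function and `KNOWN_SCHEMES` set below are assumptions for illustration, not the project's real code:

```python
# Illustrative sketch of classifying a repo or kickstart location by URL scheme,
# mirroring the delivery methods listed above (http, ftp, nfs, nfsiso, hd, cdrom, file).
from urllib.parse import urlparse

KNOWN_SCHEMES = {"http", "ftp", "nfs", "nfsiso", "hd", "cdrom", "file"}


def classify_source(url):
    """Return the delivery method for a repo/kickstart URL; bare paths mean local files."""
    scheme = urlparse(url).scheme or "file"
    if scheme not in KNOWN_SCHEMES:
        raise ValueError("unsupported source: %s" % url)
    return scheme


if __name__ == "__main__":
    for src in ("http://server/repo", "nfs://server/export", "/var/tmp/ks.cfg"):
        print(src, "->", classify_source(src))
```

A dispatch table keyed on the returned scheme could then select the right preparation routine (mount an NFS export, attach a cdrom, copy a file onto a kickstart disk, and so on).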
==Phase#2 - implementation==
===Implement the selected test cases===
* {{result|pass|[[QA:Rawhide_Acceptance_Test_Plan]]}}
** {{result|pass|system sanity}}
** {{result|pass|parse arguments}}
** {{result|pass|Tree Compose Sanity}}
*** {{result|pass|[[QA:Repodata_validity_test_case]]}}
*** {{result|pass|[[QA:Comps_Validity_Test_Case]]}}
*** {{result|pass|[[QA:Core_package_dependency_closure_test_case]]}}
*** {{result|pass|[[QA:Core_package_existence_test_case]]}}
** {{result|pass|[[QA:Installer_image_presence_test_case]]}}
** {{result|pass|[[QA:Kernel_simple_boot_test_case]]}}
** {{result|pass|[[QA:Anaconda_stage2_fetch_test_case]]}}
** {{result|pass|[[QA:Anaconda_storage_probe_test_case]]}}
** {{result|pass|[[QA:Anaconda_package_install_test_case]]}}
* {{result|inprogress|RATS Installation}}
** {{result|pass|system sanity}}
** {{result|pass|parse arguments}}
** {{result|pass|prepare kickstart}}
*** {{result|pass|[[QA/TestCases/KickstartKsHttpServerKsCfg]]}}
*** {{result|pass|[[QA/TestCases/KickstartKsFilePathKsCfg]]}}
*** {{result|none|[[QA/TestCases/KickstartKsHdDevicePathKsCfg]]}}
*** {{result|none|kickstart on cdrom (ks=cdrom)}}
** {{result|pass|Tree Compose Sanity}}
*** {{result|pass|[[QA:Repodata_validity_test_case]]}}
*** {{result|pass|[[QA:Comps_Validity_Test_Case]]}}
*** {{result|pass|[[QA:Core_package_dependency_closure_test_case]]}}
*** {{result|none|[[QA:Core_package_existence_test_case]]}}
** {{result|pass|[[QA:Installer_image_presence_test_case]]}}
** {{result|pass|[[QA:Kernel_simple_boot_test_case]]}}
** {{result|pass|[[QA:Anaconda_stage2_fetch_test_case]]}}
** {{result|pass|[[QA:Anaconda_storage_probe_test_case]]}}
** {{result|pass|[[QA:Anaconda_package_install_test_case]]}}
** {{result|pass|Iterate different ksfiles, e.g. fedora, updates, updates-testing}}
** {{result|none|Close virtio serial port}}
* {{result|inprogress|DVD.iso Installation}}
** {{result|pass|system sanity}}
** {{result|pass|parse arguments (kickstart, repo, architecture are parsed)}}
** {{result|pass|prepare kickstart}}
*** {{result|pass|create kickstart disk on guest if kickstart is local}}
*** {{result|pass|[[QA/TestCases/KickstartKsHttpServerKsCfg|kickstart on server (ks=http|ftp|nfs)]]}}
*** {{result|none|[[QA/TestCases/KickstartKsFilePathKsCfg|kickstart ks=file://path]]}}
**** {{result|none|compose kickstart to initrd}}
*** {{result|none|[[QA/TestCases/KickstartKsHdDevicePathKsCfg|kickstart on hard drive (ks=hd:)]]}}
*** {{result|none|kickstart on cdrom (ks=cdrom)}}
** {{result|inprogress|prepare repo}}
*** {{result|pass|repo on http, ftp, nfs, nfsiso server}}
*** {{result|none|repo on cdrom}}
*** {{result|none|repo on hd}}
** {{result|none|download remote media to local, since checksums and repoclosure do not work with remote media}}
** {{result|pass|[[QA:Testcase_Mediakit_ISO_Size]]}}
** {{result|pass|[[QA:Testcase_Mediakit_ISO_Checksums]]}}
** {{result|pass|[[QA:Testcase_Mediakit_Repoclosure]]}}
** {{result|pass|[[QA:Testcase_Mediakit_FileConflicts]]}}
** create a function to report mediakit sanity status
** {{result|none|[[QA/TestCases/BootMethodsDvd]]}}
** {{result|none|[[QA/TestCases/InstallSourceDvd]]}}
** {{result|inprogress|Iterate all ksfiles}}
** {{result|none|Close virtio serial port}}
* {{result|none|Hard Drive Installation}}
** {{result|none|system sanity}}
** {{result|none|parse arguments (kickstart, repo, architecture, image)}}
** {{result|none|prepare kickstart}}
*** {{result|none|create physical disk on guest to store image}}
*** {{result|none|create kickstart disk on guest if kickstart is local}}
*** {{result|none|[[QA/TestCases/KickstartKsHttpServerKsCfg|kickstart on server (ks=http|ftp|nfs)]]}}
*** {{result|none|[[QA/TestCases/KickstartKsFilePathKsCfg|kickstart ks=file://path]]}}
**** {{result|none|compose kickstart to initrd}}
*** {{result|none|[[QA/TestCases/KickstartKsHdDevicePathKsCfg|kickstart on hard drive (ks=hd:)]]}}
*** {{result|none|kickstart on cdrom (ks=cdrom)}}
** {{result|none|prepare repo}}
*** {{result|none|repo on http, ftp, nfs, nfsiso server}}
*** {{result|none|repo on cdrom}}
*** {{result|none|repo on hd}}
** {{result|none|[[QA:Testcase_Mediakit_ISO_Size]]}}
** {{result|none|[[QA:Testcase_Mediakit_ISO_Checksums]]}}
** {{result|none|[[QA:Testcase_Mediakit_Repoclosure]]}}
** {{result|none|[[QA:Testcase_Mediakit_FileConflicts]]}}
** create a function to report mediakit sanity status
** {{result|none|[[QA/TestCases/BootMethodsDvd]]}}
** {{result|none|[[QA/TestCases/InstallSourceDvd]]}}
** {{result|none|Close virtio serial port}}
* Boot.iso/Netinst.iso Installation
** system sanity
** [[QA:Testcase_Mediakit_ISO_Size]]
** [[QA:Testcase_Mediakit_ISO_Checksums]]
** [[QA/TestCases/BootMethodsBootIso]]
** [[QA/TestCases/InstallSourceBootIso]]
* Live.iso Installation
** system sanity
** [[QA:Testcase_Mediakit_ISO_Size]]
** [[QA:Testcase_Mediakit_ISO_Checksums]]
* [[QA:Testcase_Anaconda_Upgrade_New_Bootloader|upgrade an existing system]]
** perform a default installation of the previous release
** install the current release
* [[QA:Testcase_Anaconda_User_Interface_Basic_Video_Driver]]
* [[QA:Testcase_Anaconda_rescue_mode]]
* [[QA:Testcase_Memtest86|Memory test - memtest86]]
===Automate remainder of test cases from the install test plan===
* Rawhide Acceptance Test Plan
** [[QA:Anaconda_bootloader_setup_test_case]]
** [[QA:X_basic_display_test_case]]
** [[QA:X_basic_input_handling_test_case]]
** [[QA:Network_basic_test_case]]
** [[QA:Yum_simple_update_test_case]]
* DVD installation
** [[QA:Testcase_Additional_Http_Repository]]
** [[QA:Testcase_Additional_Ftp_Repository]]
** [[QA:Testcase_Additional_Mirrorlist_Repository]]
** [[QA:Testcase_Additional_NFS_Repository]]
* Boot.iso/netinst.iso installation
** [[QA:Testcase_Http_Repository]]
* Live.iso installation
** [[QA:TestCases/Install_Source_Live_Image]]
===Implement the general test cases===
* [[QA:Testcase_Anaconda_User_Interface_Cmdline]]
* [[QA:Testcase_Anaconda_updates.img_via_URL]], [[QA:Testcase_Anaconda_updates.img_via_installation_source]], [[QA:Testcase_Anaconda_updates.img_via_local_media]]
* [[QA:Testcase_Anaconda_save_traceback_to_remote_system]], [[QA:Testcase_Anaconda_save_traceback_to_bugzilla]], [[QA:Testcase_Anaconda_save_traceback_to_disk]], [[QA:Testcase_Anaconda_traceback_debug_mode]]
* upgrade
** [[QA:Testcase_Anaconda_Upgrade_New_Bootloader]]
** [[QA:Testcase_Anaconda_Upgrade_Skip_Bootloader]]
** [[QA:Testcase_Anaconda_Upgrade_Update_Bootloader]]
** [[QA:Testcase_Anaconda_Upgrade_Encrypted_Root]]
** [[QA:Testcase_Anaconda_Upgrade_Skip_Bootloader_Text_Mode]]
** [[QA:Testcase_Anaconda_Upgrade_Update_Bootloader_Text_Mode]]
* preupgrade
** [[QA:Testcase_Preupgrade]]
** [[QA:Testcase_Preupgrade_from_older_release]]
===Create kickstart database to cover test cases that can be covered with kickstart===
* [[QA:Testcase_Anaconda_User_Interface_Graphical]]
* [[QA:Testcase_Anaconda_User_Interface_Text]]
* [[QA:Testcase_Anaconda_User_Interface_VNC]]
* different package selections: [[QA/TestCases/PackageSetsDefaultPackageInstall]], [[QA/TestCases/PackageSetsMinimalPackageInstall]]
* different partitioning: [[QA:Testcase_Anaconda_autopart_install]], [[QA:Testcase_Anaconda_autopart_(encrypted)_install]], [[QA:Testcase_Anaconda_autopart_(shrink)_install]], [[QA:Testcase_Anaconda_autopart_(use_free_space)_install]], [[QA/TestCases/PartitioningExt4OnNativeDevice]], [[QA/TestCases/PartitioningExt3OnNativeDevice]], [[QA/TestCases/PartitioningXFSOnNativeDevice]], [[QA/TestCases/PartitioningBtrFSOnNativeDevice]], [[QA/TestCases/PartitioningNoSwap]], [[QA:Testcase_Partitioning_On_Software_RAID]]
* [[QA:Testcase_Anaconda_rescue_mode]]
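A kickstart database like the one proposed above could be a library of named fragments that get composed into complete ks.cfg files, one per test-case combination. The fragment names and contents below are illustrative placeholders, not the project's actual library:

```python
# Sketch of a "kickstart database": compose a ks.cfg from reusable fragments.
# Fragment names and contents are made up for illustration only.
FRAGMENTS = {
    "ui-graphical": "graphical",
    "ui-text": "text",
    "part-autopart": "autopart",
    "part-ext4": "part / --fstype=ext4 --size=8000",
    "pkgs-minimal": "%packages --nobase\n%end",
}

# Boilerplate shared by every generated kickstart.
COMMON = "lang en_US.UTF-8\nkeyboard us\nrootpw testpass\nreboot"


def compose_kickstart(*fragment_names):
    """Join the common header with the selected fragments into ks.cfg text."""
    body = "\n".join(FRAGMENTS[n] for n in fragment_names)
    return COMMON + "\n" + body + "\n"
```

Iterating over fragment combinations (UI mode x partitioning x package set) would then cover the matrix of test cases listed in this section without hand-writing one kickstart per case.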
==Phase#3 - integration==
* Identify test event triggers which will be used to automatically initiate testing
* Create appropriate control files and test wrappers to allow for scheduling tests through AutoQA (see [[Writing_AutoQA_Tests]])
* Develop or update [[AutoQA]] test event hooks to accommodate new test events (see [[Writing_AutoQA_Hooks]])
* Implement initial test result dashboard intended to eventually replace the wiki test matrix. The dashboard will also support FAS user test result submission. This will likely rely on
==References==
<references/>
Latest revision as of 07:02, 1 June 2012