ExtendedAudioTesting


Summary

Extend both manual and automated audio test cases to test a range of use cases and hardware configurations on a regular basis.

Release Note

Extended audio testing is available. Certain tests are conducted automatically by the System Testing application, but extended tests are available for users who want to help ensure that a wide variety of audio hardware works correctly with Ubuntu.

Rationale

Functioning audio is crucial to the Ubuntu desktop experience. Users should be able to expect sound to work properly from install onward. Testing audio in a variety of configurations across a range of hardware will help us to ensure that our sound support meets user expectations.

User stories

  • George wants to assist with the Ubuntu development effort but has no development or testing experience. He runs Checkbox and performs some simple audio tests. He is pleased that his efforts will help indicate which hardware needs further development work in Ubuntu.
  • Harriet has a wide array of audio hardware. Some of this hardware is not properly supported by Ubuntu. She is able to run Checkbox in an extended mode and report bugs on nonfunctioning hardware.
  • Ronald runs multiple automated tests on a wide variety of systems on a daily basis. He wants to be able to tell when a system is having audio problems and file appropriate bugs. He can run Checkbox in an automated fashion across these machines and see which ones have simple audio issues.

Assumptions

  • Automated testing:
    • Hardware can be appropriately configured to perform loopback tests.
    • External sound sources are available for testing internal microphones.
  • Manual testing:
    • A wide array of hardware exists and can be tested by the user community.

Design

Automated Testing

Tested systems will have hardware configuration changes made to enable automated testing. This will consist of simply connecting the default audio output to the default audio input with a patch cable. This will allow automated tests to produce a sound on the audio output and listen for it on the input. Automated Checkbox tests will be created relying on this configuration to test audio.

Manual Testing

A series of manual Checkbox tests will be created to test audio and added to checkbox-extras. The test dependencies will be designed so as to reduce redundant tests.

Testing will start at the top of the audio stack -- if the most complex audio test cases pass, then it can be assumed that the lower-level parts of the audio stack are functioning properly. Users will also be asked what hardware they have available and will be prompted to test only that hardware.

Implementation

Automated Testing

A Checkbox test or suite will be written to generate a sound on the audio output and listen for it on the input. This MUST generate a pass/fail value as follows:

  • Silence is detected, as indicated by a recorded audio file of sufficiently small size: FAIL
  • Audio is detected, as indicated by a recorded audio file of sufficiently large size: PASS
  • No audio hardware is detected:
    • This MAY generate a SKIP value in normal system audio testing, as there is currently no way to indicate expected hardware existence on a per-machine basis.
    • This SHOULD generate a FAIL value on hardware known to have audio hardware installed (e.g. OEM hardware). Ideally, Checkbox would be made aware of expected hardware; for now this can be covered via Extended Manual Testing as described below.

The size threshold of the file indicating silence or recorded audio will be determined during implementation.

For the purposes of this specification, it will be sufficient to determine if audio or silence is detected by the Checkbox test. In the future, more advanced analysis of received audio may be performed to test for the correctness of the generated/received audio.
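The pass/fail decision described above can be sketched as a small function. This is a minimal illustration, not the Checkbox implementation: the function name, the hardware flags, and the byte threshold are all hypothetical placeholders (the spec leaves the threshold to be determined during implementation).

```python
# Hypothetical silence threshold in bytes; the real value will be
# determined during implementation, as noted in the spec.
SILENCE_THRESHOLD_BYTES = 4096


def classify_loopback_result(recording_size, audio_hw_detected, hw_expected=False):
    """Return PASS/FAIL/SKIP for one automated loopback run.

    recording_size    -- size in bytes of the recorded audio file
    audio_hw_detected -- True if audio hardware appears in the hardware list
    hw_expected       -- True on systems known to have audio hardware (e.g. OEM)
    """
    if not audio_hw_detected:
        # FAIL where hardware is known to exist; otherwise SKIP, since
        # expected hardware cannot yet be declared per machine.
        return "FAIL" if hw_expected else "SKIP"
    if recording_size < SILENCE_THRESHOLD_BYTES:
        return "FAIL"  # silence detected: recording is too small
    return "PASS"      # audio detected: recording is large enough
```

For example, a loopback run that records a 100 kB file on a machine with detected audio hardware would classify as PASS, while the same recording size on a machine with no detected hardware would classify as SKIP (or FAIL, if the hardware is known to exist).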

Manual Testing

Default Tests

The current Checkbox test, which simply produces a sound and asks whether the user heard it, will remain in place as the default audio test presented during a standard Checkbox run.

Extended Tests

An additional suite of Checkbox tests will be written to test audio and added to the checkbox-extras package.

These tests MUST present the user with a list of detected audio hardware and ask the user to confirm their existence. The tests SHOULD ask the user if any audio hardware exists in the system that was not automatically detected. The tests SHOULD also prompt the user to select (via check boxes or similar mechanism) what hardware they possess that cannot be automatically detected, such as microphones, headphones, non-connected USB audio devices, etc.

For each detected piece of hardware, the suite MUST execute a series of tests. These tests SHOULD start at the 'top' of the audio stack; i.e. tests should start with the most complex case. If the initial case succeeds, it can be assumed that all lower-level tests would also succeed, and thus they SHOULD NOT be presented to the user.

As an example, in order to test the internal microphone and speakers of a system, Checkbox might ask the user to speak into the microphone and then play back the recorded audio. If the user indicates that he or she heard the proper recorded audio, it can be assumed that both the internal microphone and speakers are working properly and no more tests for these devices need to be executed. If, however, the user indicates that the recorded audio was improper or not heard, Checkbox might begin by testing the speakers and then testing various aspects of the microphone.

Finally, for each selected piece of hardware that cannot be automatically detected, Checkbox SHOULD perform a series of tests. These tests MAY vary depending on the hardware and SHOULD strive to exercise as much of the hardware's capabilities as possible. These tests also SHOULD start at the top of the relevant parts of the audio stack as described above.
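The top-down ordering described above can be sketched as follows. This is an illustrative sketch only: the test names and the `run` callback are hypothetical, and the real Checkbox test dependencies would express this ordering declaratively rather than in code.

```python
def run_top_down(tests, run):
    """Run tests ordered most complex first; run(name) -> bool.

    If a test passes, all lower-level tests are assumed to pass and are
    not presented to the user. If it fails, descend to the next test.
    """
    results = {}
    for i, name in enumerate(tests):
        if run(name):
            results[name] = "PASS"
            # A pass at this level implies the rest of the stack works.
            for lower in tests[i + 1:]:
                results[lower] = "PASS (implied)"
            break
        results[name] = "FAIL"
    return results
```

For the internal microphone/speaker example above, the list might be something like `["record-and-playback", "speaker-playback", "microphone-record"]`: if the combined record-and-playback test passes, the speaker and microphone tests are never shown to the user.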

Test/Demo Plan

Testing will be performed by the Checkbox team executing the test scripts during development. After this, the automated tests will be rolled out to the certification testing environment. As this is a controlled environment with tests run and analysed daily, it will provide sufficient validation for including these tests in the next release of Checkbox.

Future work

The proposed automated audio testing is a large leap forward from its current state but still fairly basic. Future specifications should develop more in-depth testing procedures.

In the future additional hardware configuration changes will be considered. One possibility is producing a detectable sound in the environment containing the tested systems which can be picked up by internal microphones.

BoF agenda and discussion

Make sure the key bits from here are integrated in the spec:

Desired Testing

  • verify that all inputs and outputs (both digital and analog) work for each piece of tested hardware
    • test that plugging in headphones mutes speakers, plugging mic mutes internal mic, etc.
  • test each audio layer independently
    • test that driver works
  • tests will likely require manual parts as well as automatic

Baseline Testing

  • test recording
    • easiest support for automation is looping mic <-> headphones
    • can we find a level changer so that the headphone output is detected as a mic?
  • What can users test?
    • We should focus on top-down testing
      • start with gstreamer pipeline first
      • if it works from the highest level, don't go down through the rest of the stack -- assume it works
    • ask if they have hardware to test... headphones, mics, etc.
      • if they have the right hardware, ask them to plug it in and give a test
      • make sure we don't ask them to plug in something their hardware doesn't support
        • e.g. don't ask them to plug in a line-level input if they don't have one
      • all sorts of hardware that can be tested
        • USB sound devices
        • bluetooth devices (hard to setup / configure in Ubuntu)
        • 1/8" jacks, etc.


CategorySpec

QATeam/Specs/ExtendedAudioTesting (last edited 2009-06-12 13:24:13 by cpc4-oxfd8-0-0-cust39)