This testing plan applies to [ insert project name ] on the CommCare platform. Each CommCare build should follow the testing plan below before live data collection is deployed, to ensure data quality and to confirm that the objectives and purpose of the data collection are met.
Step 1: Form Testing Team
Mobile Data Collection Lead (the person actively involved in building the application):
Each build must have a testing team consisting of a minimum of 2 individuals. At least 1 member of each testing team must meet the following criteria:
- Based in the field (where the deployment will take place)
- Have intimate knowledge of the project / survey tool
- Did NOT actively build the survey
- IO based
Step 2: Test
The testing team will test the following:
FRONT-END (user facing)
- Ensure skip patterns are correct (using the paper copy as a reference)
- Ensure the application flows logically and that each question is clear and unambiguous
- Questions and answer options are clearly written and can be easily understood by both the enumerator and the respondent; response types (text, number, multiple choice) are clearly labeled
- Are there any misspelled words or poor grammar?
- Do questions and answer options fit on the screen for all questions?
- Were question groups created to put them on the same screen, where relevant?
  - Example: if a question has an ‘other (specify)’ response option, the field to enter the other value should appear on the same screen rather than on the next page, to avoid confusion.
- Translation: if the application is translated into another language, a speaker of that language who did not develop the application should check the survey for language accuracy
- Ensure each response has the correct labels. For example, ‘Yes’ and ‘No’ are the labels and ‘1’ and ‘0’ are the coded values.
Calculation fields (if applicable)
- Ensure hidden calculations are working correctly by entering test data, exporting the data, and manually checking the calculations
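The manual check above can be partly automated. Below is a minimal sketch in Python, assuming a hypothetical export where `score_q1`–`score_q3` are question IDs and `total_score` is the hidden calculation; your real CommCare export will have different column names.

```python
import csv
import io

# Hypothetical export: column names are placeholders for your form's
# question IDs; substitute the columns from your real data export.
export_csv = """score_q1,score_q2,score_q3,total_score
1,0,1,2
1,1,1,3
0,0,1,1
"""

rows = list(csv.DictReader(io.StringIO(export_csv)))
for i, row in enumerate(rows, start=1):
    # Recompute the hidden calculation by hand and compare it to the export.
    expected = sum(int(row[q]) for q in ("score_q1", "score_q2", "score_q3"))
    actual = int(row["total_score"])
    status = "OK" if expected == actual else "MISMATCH"
    print(f"row {i}: expected={expected} exported={actual} {status}")
```

Any `MISMATCH` line points to a hidden calculation that needs to be fixed in the application before deployment.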
Labels & Codes
- Question IDs are used to identify unique questions
- Response options are consistently coded (No=0, Yes=1, Do not know=-888) throughout the application
- Make sure all validation rules marked in the paper survey are correctly built. For example, if the possible range of a numeric response is 0-5, make sure there is a validation rule stating that the response is >=0 and <=5
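Coding consistency and validation rules can also be spot-checked against a test export. The sketch below assumes hypothetical columns `has_net` (a yes/no question) and `num_children` (a numeric question with a 0–5 range); it reuses the coding convention above (No=0, Yes=1, Do not know=-888).

```python
import csv
import io

# Hypothetical exported test data; replace with your real export
# and your form's actual question IDs.
export_csv = """has_net,num_children
1,3
0,0
-888,5
"""

ALLOWED_CODES = {0, 1, -888}   # consistent coding: No=0, Yes=1, Do not know=-888
NUM_RANGE = (0, 5)             # validation rule from the paper survey: 0 <= value <= 5

issues = []
for i, row in enumerate(csv.DictReader(io.StringIO(export_csv)), start=1):
    if int(row["has_net"]) not in ALLOWED_CODES:
        issues.append(f"row {i}: has_net={row['has_net']} is not a valid code")
    if not (NUM_RANGE[0] <= int(row["num_children"]) <= NUM_RANGE[1]):
        issues.append(f"row {i}: num_children={row['num_children']} is out of range")

print(issues if issues else "all rows pass")
```

If this check flags out-of-range values, the corresponding validation rule is missing or wrong in the application, since validated forms should never export such values.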
Step 3: Feedback
Provide consolidated written feedback to the Mobile Data Collection Lead for updates.
Step 4: Iterate
Repeat Steps 2–3 based on the feedback and the resulting updates from the Mobile Data Collection Lead.
Step 5: Data Analysis
Once the front-end and back-end of the application (Step 2) have been sufficiently tested to the satisfaction of the Mobile Data Collection Lead and the testing team, a preliminary data analysis should be conducted using the following protocol:
- Using a test account (not a demo account) complete and sync a minimum of 5 surveys
- Export the completed and synced data
- Complete a high level analysis of the exported data to ensure that responses can be analyzed and scored
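A high-level analysis of the export can be as simple as confirming the submission count, tabulating a key variable, and computing a summary statistic. Below is a minimal sketch, assuming a hypothetical export with `respondent_id`, `consent`, and `age` columns; your real export's columns will differ.

```python
import csv
import io
from collections import Counter

# Hypothetical export of 5 test submissions; replace with your
# real CommCare data export.
export_csv = """respondent_id,consent,age
r1,1,34
r2,1,29
r3,0,
r4,1,41
r5,1,37
"""

rows = list(csv.DictReader(io.StringIO(export_csv)))

# 1. Confirm all test submissions synced and exported (minimum of 5).
print(f"submissions: {len(rows)}")

# 2. Tabulate a key variable to check it can be scored.
consent_counts = Counter(r["consent"] for r in rows)
print(f"consent breakdown: {dict(consent_counts)}")

# 3. Compute a summary statistic, skipping blanks from skip patterns.
ages = [int(r["age"]) for r in rows if r["age"]]
print(f"mean age (consented only): {sum(ages) / len(ages):.1f}")
```

If a variable cannot be tabulated or averaged at this stage (for example, numbers exported as free text, or blanks where values were expected), the form needs another iteration before live deployment.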
Download the issues log here
Once the testing plan is established, all issues should be tracked via the issues log, which lets you track and manage them easily and effectively. Best practice is to keep a single copy on a cloud platform such as OneDrive so that multiple people can edit the same file and see and track all issues.