These QA guidelines for researchers are designed to help you run internal testing on your Surveys, Assignments, and Settings before submitting them for ilumivu audit.
Testing the survey
1. Create an initial draft of your survey
- Use this draft to test out different question types
- If multiple professional account users will be creating surveys, use this draft to grant editing permissions to more than one account.
- During live data collection, survey editing permissions should be limited to one account.
2. Test the working copy on both an Android device and an iPhone (if your participants will be using both)
- Use two different participant test accounts to do this. Do not reuse mobile codes between Android and iPhone devices; the app processes data differently on the two operating systems, and reusing codes can cause errors.
- Make a note of the phone model and OS version of each test device (e.g. a Samsung Galaxy running Android 9.1).
3. On the test accounts, give the following user permission:
- "Can Backstep". This will help when testing dependencies/skip logic.
4. Once the survey has been taken and the answers uploaded, verify that the submitted survey data is accurately recorded in the Flat Data File
- NOTE: Response options for Multiple Selection and Matrix questions only appear in the Flat Data File once they have been selected. If no respondent has selected an option, its column will not appear in the Flat Data File. A scripted spot-check of the exported file is sketched at the end of this list.
5. Verify that the dependencies work as you expect
- Check that the right questions show up under the right conditions. If they do not, please check how you have coded the dependencies.
6. If you observe any issues with your survey, please refer to this troubleshooting guide: Common Survey Building Errors
7. Once you've confirmed that the survey is working as intended in the app, make a clone of your survey to assign to participants.
- Information on cloning a survey is in this article: Icon Guide - Survey Editor
- On the clone, restrict editing permissions to the minimum number of professional accounts you need. This will help protect the survey from accidental changes.
- Click the "Refresh References" tool to update all dependencies to the new survey
- Verify this by clicking the "Check Dependencies" icon
- Click the "Resequence" tool to update all survey IDs
- Verify this by clicking the "Check Dependencies" icon
- Perform steps 2 through 7 with the cloned survey on both an Android and an iPhone test device
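If you would rather spot-check the exported Flat Data File with a script than by eye, the sketch below shows one way to do it in Python with pandas. It is only a sketch: the file name (flat_data_file.csv) and the column names (participant_id, Q1, Q2, Q3_*) are placeholders, not the actual export schema, so substitute the names from your own study. The checks mirror steps 4 and 5: expected columns are present, absent Multiple Selection/Matrix option columns are treated as "never selected" rather than as errors, and a simple skip-logic rule holds in the data.

```python
import pandas as pd

# Placeholder file and column names -- replace with the ones from your own export.
df = pd.read_csv("flat_data_file.csv")

# Step 4: columns you expect to see for every test submission.
expected = ["participant_id", "Q1", "Q2"]
missing = [col for col in expected if col not in df.columns]
print("Missing expected columns:", missing or "none")

# Multiple Selection / Matrix options only get a column once someone has selected them,
# so treat an absent option column as "never selected" rather than as an error.
option_cols = [c for c in df.columns if c.startswith("Q3_")]
print("Q3 options selected at least once:", option_cols or "none")

# Step 5: skip-logic sanity check. If Q2 should only be asked when Q1 == "Yes",
# every row where Q1 is not "Yes" should have a blank Q2.
if {"Q1", "Q2"}.issubset(df.columns):
    violations = df[(df["Q1"] != "Yes") & df["Q2"].notna()]
    print("Rows where Q2 was answered despite Q1 != 'Yes':", len(violations))
```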
Testing survey assignments
1. Prepare the device for testing:
- Verify that notifications are turned on for the phone.
- Instructions available in Phone Settings for mEMA
- Ensure that battery saving optimizations and other app-killing features are disabled.
- Instructions available in Phone Settings for mEMA
- Instructions in steps 10 and 11 of Not Receiving Alerts - Quick Check
2. Create a schedule for two test accounts. This should be the same schedule that will be assigned to participants
3. On the test accounts, give the following user permissions:
- "See All Instances". This will help to test that the schedule has been correctly downloaded and verify which scheduled instances are being completed
- "Delete Instances". This will allow the tester to delete previously completed instances, which may be helpful.
4. Download the schedules to the test devices
- Verify that the schedule downloaded correctly by going to the "My Assessments" screen and tapping the clipboard icon next to the survey title, or the survey title itself. This takes you to the "Schedule" screen when "See All Instances" is enabled.
5. Take scheduled surveys throughout the day (a scripted check of completed instances is sketched at the end of this list).
- NOTE: When taking surveys, leave the app open in the background. Closing the app communicates to the phone's operating system that the app's background processes (such as sending notifications) are low priority and can be stopped to preserve battery.
6. If you observe any issues with your schedule, please refer to this Troubleshooting Guide
- NOTE: It's important to know that the testing process is different from the participant experience. Testing the same survey multiple times in short windows increases the risk of partial uploads, exiting unfinished surveys, overlapping assignments, and other situations that can cause partial or missing data and errors. Good testing practice is to clear the app's cache and storage frequently and re-sync mobile codes to simulate a participant's experience of using a fresh account. While testing you may see odd issues that a participant won't see. If an issue is consistent, try clearing cache and storage and then following the troubleshooting guide. If this does not resolve it, please submit a Help Desk ticket.
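If you also export the instance-level data after a day of testing, a short script can confirm that each test account completed its scheduled instances and flag any completed long after their scheduled time, which can be a sign of missed notifications. Again, this is a sketch under assumptions: the file name (instance_data.csv) and the column names (participant_id, scheduled_time, completed_time) are placeholders and should be replaced with the fields in your own export.

```python
import pandas as pd

# Placeholder file and column names -- replace with the ones from your own export.
instances = pd.read_csv("instance_data.csv", parse_dates=["scheduled_time", "completed_time"])

# How many scheduled instances did each test account complete?
summary = (
    instances
    .assign(completed=instances["completed_time"].notna())
    .groupby("participant_id")["completed"]
    .agg(["sum", "count"])
    .rename(columns={"sum": "completed", "count": "scheduled"})
)
print(summary)

# Flag instances completed more than 30 minutes after their scheduled time,
# which may indicate delayed or missed notifications on the test device.
late = instances[
    instances["completed_time"].notna()
    & ((instances["completed_time"] - instances["scheduled_time"]) > pd.Timedelta(minutes=30))
]
print("Instances completed more than 30 minutes late:", len(late))
```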