Since the last update on the Digital Bridge project, we’ve completed our data collection and begun analysis. Throughout the project, several people have expressed interest in our use of audio diaries as part of our methods. This post gets into the nitty gritty of our process and shares our initial lessons learned and impressions, in the hopes that it’s useful to other researchers or anyone interested in methods.
Methods
We collected data from 15 program participants and four case managers. The program participants supplied audio diaries and participated in interviews with the researchers. Case managers also contributed audio diaries and participated in a focus group. We collected data during the fall of 2020 from residents in the Seattle area. At the time, the area had strict restrictions on in-person meetings and gatherings. In response, we designed our data collection to be 100% remote.
Previous research has shown that audio diaries can illuminate people’s everyday experiences even when collected over a short period of time (Palen & Salzman, 2002; Williamson et al., 2012). After considering collection methods that used voice memos or a dedicated app, we decided to collect diaries via voicemail, since all participants knew and used that technology. Diaries have the potential to provide insights into participants’ smaller daily successes and frustrations and to reveal whether any trends emerged over time.
Digital Bridge program participants were asked to leave audio diaries for seven consecutive days and then to participate in a phone interview. Participants received small compensation, loaded onto a Visa gift card, for each voicemail left and for participating in the interview. The audio diary prompt asked participants to tell us how they used technology that day (including phones, computers, and the Internet), whether they could do the things they needed to do, and what else would have helped them accomplish their tasks. The interview followed up on themes in the voicemails, with additional questions about their experiences in the program and their employment and technology needs and goals.
As mentioned previously, since all participants were clients of Seattle Jobs Initiative (SJI), we relied heavily on SJI staff to recruit. However, we wanted to ensure that participation remained confidential. The University of Washington (UW) team created a sign-up form in Qualtrics, and SJI program staff or a case manager would text it to participants along with information about the study. After someone signed up, the UW team called the potential participant to give more details, obtain verbal consent, and answer questions. We then followed up with an email summarizing the study procedures. We gave participants the option of receiving voicemail reminders via text message or email; the majority chose text message. We then conducted the interviews over the phone. SJI also sent our study information via email, and the UW team promoted the study during program orientations, but text messages were by far the most effective way of recruiting.
In total, 15 individuals signed up and participated, and all 15 left at least one voicemail. We averaged five voicemails per person, and five people left all seven. Two people left voicemails but did not respond to follow-up inquiries about participating in an interview. Two other people enrolled but ended up not participating. We collected data in October and November, with the majority of participation occurring in November.
After deciding to incorporate the case manager perspective, we also wanted a sense of their daily experiences running the program. We asked case managers to leave voicemails for one work week (five days) and then participate in a focus group. Four case managers left audio diaries, and three participated in the focus group. Their voicemail prompt mirrored the one given to program participants. The focus group used a participatory design format to gather case manager perspectives on the structure of the program and what changes could be made.
Initial lessons learned and impressions
We still need to do a deeper analysis of the diaries, but we did find them useful during the interviews. Sometimes people described instances in the diaries that did not come up in the interviews, and we were able to reference the diary and discuss those instances. Some of the audio diaries captured particular moments in time in a way that an interview may not, such as two participants talking about reading the news about election results. One downside to using voicemail to capture diaries was the three-minute limit; overall it wasn’t an issue, but a few people did get cut off. We also missed in-person interviews. It was sometimes difficult to read cues from each other over the phone, and we had a few difficulties connecting virtually.
On the project management side, tracking the voicemails and sending daily diary reminders took a significant amount of time. Participants in the SJI program received their computers on a staggered basis, so we were recruiting and managing data at the same time. Additionally, since the gift card compensation depended on the number of voicemails left, we had to place each gift card order separately and send it via mail or email.
Our methods likely influenced our sample of program participants. Agreeing to participate in over a week’s worth of activities takes time, and from our conversations with participants and their case managers, people were stretched for time: taking care of children, participating in training, looking for work, and navigating social service systems for benefits, almost entirely remotely. Additionally, many of the participants were English language learners, and a project conducted entirely in English may have kept some from enrolling. We were unable to enroll one participant because they did not understand our consent procedure. If we use this method with English language learners in the future, we would change the procedure, possibly using group interviews and a translator. SJI did collect more general program participant data using a survey. In our further analysis, we’ll compare our sample to the overall program data to better understand who we talked to and how they may be similar to or different from the larger pool of participants.
Next steps
After completing our analysis, we plan to publish process and outcomes evaluation reports on the Digital Bridge program with SJI and the City of Seattle during the first quarter of 2021. We plan on presenting our joint findings to policymakers and the nonprofit community locally in the Seattle area. The UW team will also explore academic publishing opportunities.
References
Palen, L., & Salzman, M. (2002). Voice-mail diary studies for naturalistic data capture under mobile conditions. CSCW ’02: Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work, 87-95. https://doi.org/10.1145/587078.587092
Williamson, I., Leeming, D., Lyttle, S., & Johnson, S. (2012). ‘It should be the most natural thing in the world’: Exploring first-time mothers’ breastfeeding difficulties in the UK using audio-diaries and interviews. Maternal & Child Nutrition, 8, 434-447. https://doi.org/10.1111/j.1740-8709.2011.00328.x