Transcript from one-on-one interview
Wednesday, May 22, 2019
10:16 AM – 11:44 AM
Location: Research office of interlocutor (Nairobi, Kenya)
Participant: Non-Kenyan woman with kinship ties to the country who resides in Nairobi and has been working at this research organization since early 2017.
Discussion conducted entirely in English. Audio recording was uploaded to Otter.ai, which produced the initial rough transcription. Angela Okune then listened through and edited for accuracy before redacting according to the interlocutor's requests.
Total discussion time: 1hr 28 mins
Angela Okune 0:01
So today is Wednesday, May 22. And it's 10:16 AM here. So I just showed you this artifact as a way to start to think about data and, you know, the state of data, especially around digitizing and archiving it. And so as a first piece of feedback, I was just curious: would you feel comfortable sharing this data with anyone, or with people in the organization?
Angela Okune 0:33
What do you think? You can choose.
For people in the organization, yes, I think this would be really good for knowledge management in the organization. In terms of people outside, that usually depends on the contracts you have with clients, so, whether they're willing to share data. And what I think is interesting here, so you can... and this is one of the questions I had, so you put down "anonymized male", but is there potential to later on have a version where you have everything there? But then another version, okay, I can see you've got it there... where you then start blanking out things so that other people know it's been anonymized, but they can't see that text. So a more automated way of doing it. So do, like, replace search term, name, whatever it is, and then you replace that with "anonymized". And then that version you can share with others.
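The search-and-replace redaction described here could be sketched roughly as follows; the names, labels, and placeholder format are invented for illustration and are not the platform's actual behavior:

```python
import re

def redact(text, terms):
    """Replace each sensitive term with a visible placeholder so
    readers know the text was anonymized but cannot recover it."""
    for term, label in terms.items():
        text = re.sub(re.escape(term), f"[ANONYMIZED {label}]",
                      text, flags=re.IGNORECASE)
    return text

# Keep the full transcript internally; share only the redacted version.
raw = "Interview with John Mwangi of the Ministry of Health."
shared = redact(raw, {"John Mwangi": "NAME",
                      "Ministry of Health": "MINISTRY"})
# shared == "Interview with [ANONYMIZED NAME] of the [ANONYMIZED MINISTRY]."
```

This keeps the "two versions" idea intact: the raw text never leaves the internal copy, and the shared copy visibly marks what was removed.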
Angela Okune 0:37
Like having basically two versions of it. Yeah. Yeah, that's a great idea.
And, and you could see yourself potentially sharing the anonymized version outside and the non-anonymized version internally. Cool. Yeah. In terms of clients, how open do you think most of the clients that you folks work with would be?
Hmmm... it's a tricky one. So it depends if they are a public organization. I imagine that they'd be a bit more open to having some of this shown, but I think there may be certain questions that they don't want people to know that we're asking. So the way I see is this, I think it's going to involve different contracting or when you, or when you start thinking through the results, you'd have to go to them and say we want to share this with the public for XYZ reasons. Here are the transcripts or the general questions that we're covering. Are there certain questions that you wouldn't want us to share? So it would involve a separate conversation with them? I don't see why we can't have those conversations if one of the things we're trying to do is promote sharing knowledge. So if at the beginning of contracting a client we say that this is one of our aims, and this is something we're trying to do, then you're already set up to have that conversation but I just think you can't share it as it is, you would have to have that conversation just in case.
Angela Okune 2:40
Right. And right now, I know there is some sort of clause around...I don't know if it's open science more broadly, or data specifically...
Angela Okune 2:49
So there is a clause right now as it is or...?
Those are mainly for the labs project. So they tend to be around academics rather than people from the private or public sector. So with qual, I don't think there's any clause at the moment.
Angela Okune 3:02
And since most of the projects that you folks do have a qual component initially and then have like the quant/labs side of it later...
But even with those ones, you don't always share the data. So the data... the Open Science ones, the ones that explicitly have it in their contract are ones that have come in just for the lab. So they're not through the advisory arm, they're through the academic arm.
Angela Okune 3:26
Got it. And do you think the organization as a whole would be open to adding this kind of data clause?
Angela Okune 3:34
Or like a data conversation into the whole process?
I think they'd be open to it, but then ... I've had conversations where for certain clients, you know, that's not going to go down well, so then you would have to think through carefully which ones you pitched it to, because you don't want it to be a case of "we're going to share this data now [inaudible] you can't do the work."
Angela Okune 3:53
Right. Do you have a sense of what kinds of organizations wouldn't want to share?
Private organizations. For example, if you are working with... I'm looking at companies outside to use as examples...like [REDACTED ORG NAME]... or someone like that where they have competitors that...and they don't necessarily want people to know that they're doing this type of research on their customers, then they wouldn't want the data to be shared.
Angela Okune 4:20
Okay, yeah. And so management, funders... Do you think participants would need to be part of the decision-making about whether or not to share the data?
So in the consent forms, we say that we will be sharing your data with the team, so people that are involved in the project, and that it could be used for presentations, etc., with a wide audience. Personally, I don't think many participants understand what that entails. So you could have something in the consent form that explains that their data will be shared, but to what extent they understand that is another question and an ethical thing to consider. So do you show them what it looks like and they make a decision based on that? Or do you just tell them this will be shared on a platform and they don't know what that looks like...
Angela Okune 5:06
Mmmh hmmm. And what do you think would be...
I would want to show them. The platform.
Angela Okune 5:12
You said it would be good to show it to them?
Mmmh hmmm. Yeah.
Angela Okune 5:15
What do you think concerns would be if they saw it? I mean... based on your own experience...
I guess one is who can access the platform? And then can they be identified? Even...I'm just looking at the one that you have there...even though you have got a lot of it anonymized, we still know this person is 27 years old and not married, and has been working in a church for six years. So depending on how many people fit those criteria, you may be able to, to limit it down to a person. Exactly.
Angela Okune 5:46
Yeah. I mean, I grappled with this and I think it's still an ongoing point of discussion of like, how much do you anonymize qualitative data?
Angela Okune 5:56
Do you think for example, location needs to be anonymized? Or...
So internally I would want all of this information. So even when I can see anonymized male, anonymized ministry...I would probably want to know that information so that I know that if we've gone to this person more than once with different research projects, or if I want to do something on a particular ministry, I know how to identify that. So that would be for internal use. But for external use then I think...this is tricky because the location that you have there could be of interest to someone, so taking it away is a loss of... so I don't know whether you could... hmmm... this is a very tricky one because the more information you give, the easier it is to identify that person, but then the less information you give them...
Angela Okune 6:44
the less value it is for the researcher.
Angela Okune 6:46
Yeah. Yeah. I mean, based...you haven't read through this transcript...
Angela Okune 6:55
...but based on what you get a sense of...what kinds of researchers do you think...let's say a researcher similar to your level of expertise and literacy, etc., etc. ...do you think they would find value in this...just as it is currently?
Angela Okune 7:19
Could they reuse it for their own research?
Let's have a look through.
Yeah, feel free to read it too if you want. It's actually interesting. [Both laugh] [brief pause] Okay, so I can see there are points that they would probably be interested in. But now, this... and I kind of see the point you made earlier, that for me, if I just had a huge body of interviews from different religious leaders, it would be a lot to trawl through in order to get to the questions that I'm interested in. So it could be... though this involves more work... but it could be interesting to have either a summary at the top, where you have some of the demographic information, so if you're after certain types of people... I know you've got this, like, religious position held, etc., but some more information there could be useful. And then I would... so, you could either have the questions in bold or have questions by themes: if the first part is involvement, how they became involved in the church, then you've got lots of responses after that. And then you would then move on to the next thing. So that if I really wanted to understand how people became a part of the church, I could just go straight to that section rather than having to read through the whole script.
Angela Okune 8:42
Maybe then even have one at the beginning. So if you have...similar to your tags here, but if at the top it's, here's the demographics, and then it goes into the key questions or themes that are covered in this interview, and then you have the list of questions there. Then you know, whether that's an interview that is relevant for you to read through.
Angela Okune 9:03
Yeah. The other option might be to have as a separate artifact, the interview guide.
Angela Okune 9:08
So then you actually see, without the responses, you just see the questions that were asked, and then you would get a sense of, okay, well, I'm interested... and then, as you said, like making this more aesthetically navigable to then go straight to that question...
And I'm assuming, so when you went onto this, the page before had so you went into the artifacts page, and then there was the link to this...would it then be bundled by project?
Angela Okune 9:34
So you can create projects. And you can, I mean, you can have groups within groups, for example. So it's a really like flexible kind of...
Cuz what could be a nice idea then is, if you could have...so similar to how we have this description here... If you had the link so almost like a box or something that says, religious leaders in Uganda. Then you had a short description of what the project involves. Then you can go into that one. Then you have the interview guide at the top. So then you can read through the interview guide and see what's relevant to you. And then it goes into the transcripts.
Angela Okune 10:09
Yup. Yeah, I'm thinking...and then potentially if there's a final report...
Angela Okune 10:14
or any other relevant materials, the data...
[Laugh together] I always want to say scraping... Yeah, something like that, you know, any other materials that [inaudible]...
Yeah, I think that's a good idea. Yeah. But just having an overview page, because I'm imagining if you have like 20 different projects, and the title is not necessarily going to explain exactly what they're involved. So just having that overview then means that you can filter through based on tags. If you're looking at Uganda, then you can filter through Uganda, then you get all of the projects that are based in Uganda with the description, you click on to see the interview guide, then see if you want to read the transcripts based on that. That'd be quite useful.
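The filter-by-tag flow described here could look something like the sketch below; the catalog entries and field names are hypothetical, not the platform's actual data model:

```python
def filter_projects(projects, tag):
    """Return the projects whose tag list contains `tag`, case-insensitively."""
    wanted = tag.lower()
    return [p for p in projects if wanted in (t.lower() for t in p["tags"])]

# Hypothetical overview entries: a title, a short description, and tags.
projects = [
    {"title": "Religious Leaders",
     "description": "Interviews with religious leaders.",
     "tags": ["Uganda", "religion"]},
    {"title": "Market Vendors",
     "description": "Livelihoods study.",
     "tags": ["Kenya", "livelihoods"]},
]
uganda_projects = filter_projects(projects, "uganda")
# Each match would then be shown with its description and a link to
# the interview guide, before the reader opens any transcripts.
```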
Angela Okune 10:58
And so I mean, it sounds like I know you're thinking probably internal team?
No, I'm thinking like something more external.
Angela Okune 11:03
Oh really? Okay. So who else do you think would be interested in it?
So for example, if I knew we had this resource available to us, for external organizations, then when we're doing literature reviews of other studies that have been conducted, then that's an easier way for me to go through the literature and see the actual scripts rather than just outputs. So that's what I'm thinking other organizations could use in that way too.
Angela Okune 11:26
and do you think that that would? ... Do you think that looking at data as part of this kind of, almost, data literature review... How does that differ from, let's say, looking at, you know, paper journal articles or what have you that are... typically a literature review is largely secondary data, or secondary materials? How do you think being able to look at data as part of that initial design of a research project changes...
Interesting...so with the traditional literature review, you're relying on someone else's interpretation of the data. And you don't know what fed into their insights, or what their aims were. And so for example, if, like, using the example that you have of different religious leaders and their thoughts on whatever the question is, if my focus is on how do they join this particular organization, then my literature...the report that I write is going to be focused on that. But there could be other things that we explored in the interview that you miss out, because it wasn't your key aim. And perhaps another researcher coming to it could be really interested in this same population, but they don't have the same research question. So when you're reading through the literature, you're only getting the responses to that research question rather than other things that were explored in the interview. So I think it's useful in terms of...I don't think this gets rid of the need to look into the literature, but then this could supplement it because now you can read through an entire transcript and identify other themes that weren't picked up on in the reports and things that are relevant to you as well. And I think this is useful in terms of just sparking thoughts or interest in topics. So reading through this, if I was doing a project that was looking at the same group of people, I would be looking to see how they responded to such a question. So when you ask them something in a particular way, do they come across as uncomfortable? Or do they give full responses? And that gives you an indication of perhaps how you should be designing your instruments, and no one's going to put that in their report really: "When I said this particular thing, people were uncomfortable." But you can see that from the transcript.
Angela Okune 13:37
Mmmh hmmm. Yeah, definitely. I mean, even looking at here, this guy was initially asking about ministry activities. I mean, I think so maybe another question is what kinds of transcripts or what kinds of research projects would perhaps be best suited for this kind of approach to opening up data sets because I really enjoyed this this interview because it was open-ended enough that you could kind of, you know, I could see multiple uses or reuses of this data because there's this guy who's, you know, a minister, but he's talking about like a dance group that they have. And he's talking about, you know, "we had someone who came with his alcohol." And you know, I mean, there's a lot of different things that if you're just, let's say, reading a journal about religion, they're not going to I mean, you're not going to necessarily, you know, like, it might be harder to find versus like someone who's interested in performativity and dance... might have a completely different, you know, and...
but that's going to be the tricky thing of how does someone that's interested in dance find this? Because you're not going to tag "dancing" to this one, because it's a small component of the interview...
Angela Okune 14:41
Right. Or maybe you do. So I get that's the question of like, how can we potentially be using tags to even lay ideas...because maybe you know, I'm an anthropologist of data, but like, maybe I look in here and I'm like, "Oh, I see that they're really talking about youth" and like, I'm maybe not doing a youth analysis, but I could see how someone who's working with youth groups could be interested in this. And so even if it's not my tag or my categories, per se, maybe I put it there.
Who would tag this?
Angela Okune 15:08
So I think the researcher... So this is a question I could ask you. What do you think... do you think it would be nice to have this automatically uploading, let's say, after a particular embargo period is reached from the project close date, then, you know, things just get automated and go up to this? Or do you think it needs to be a manual process where the researcher or the PI has to manually do all this work?
I'm also wondering whether, depending on how easy it is to upload this, the person doing the transcription in the first place can upload it to this platform. So the researcher isn't looking in internal folders for these transcripts, or through things like Slack, but instead they go into a centralized place. And that forces people to use a centralized place. But otherwise, you're just going to have the same problem with knowledge management, where people use their own channels. They have Google Docs floating around and they're not... they're not going to really upload things onto this. So that could be one way of skipping that loop... by having the transcriber do it. And then if that's the case, then I'm thinking, who would tag this then? So it's either the researcher tags it, but that's kind of looking at a more thematic analysis where they're going through it in each paragraph and thinking through what the key themes are and then making note of that. I don't know... unless they're thinking of doing that in their own research, they're not going to do that as an aside... to tag things... unless you really buy them into the fact that this is a useful resource. So you may need someone externally to do this... to read through the transcripts and tag it themselves. But if you had the researchers doing this, I think it's not going to happen in terms of uploading it. Unless you make it compulsory, a compulsory component of a project.
Because right now people don't upload things onto Box which is our internal one. So if you can't get them to do that, and then you're adding another one, it's not going to happen. So what I see if you're showing them, "this is where they get the transcripts from," and it makes it easier to find transcripts, and then find other ones that's related, then they would utilize it but I don't think they're going to upload it themselves.
Angela Okune 17:22
So then in such a scenario, or like use case, then it would be the place for all materials. And so it would be geared towards the comprehensive.
Angela Okune 17:32
And then going back to the question I mentioned, where I'm grappling with "is it comprehensive or is it like a curated kind of snapshot?" It would need to be the comprehensive side because it's the place where you get all this material. Because in such a scenario, especially as you keep adding more and more material, then organization and kind of how you categorize becomes really key. Yeah. Do you think... and one of the issues is often that we all have these, like, what is your project GDP... like some acronyms, we end up having all these acronyms and we have no idea what they are, especially for someone who's not on the project. And so, like, even the titling of this, like religious leaders in Uganda. I was like, okay, how do I title this? You know, is it by the subject that's interviewed? And his subject position? Is it, you know... I mean, this was just a quite... more open-ended... So it included a lot of different, you know, what are your activities? What do you think about praying and...
Do you have...just to go back... so when you have the description. So this is kind of the description on the particular interview. But maybe it's you need an overview of the entire project. So in this one. So for example, the [REDACTED PROJECT NAME] one that you mentioned, the way we usually tag it would be [REDACTED PROJECT NAME]. But the reality is unless you're here at the time that that study is being conducted, you're not going to know what that is. So then you probably need some kind of description saying that this project deals with sexual health. So these are the topics that are covered. These are the different phases of the research and maybe just...so almost like an abstract for each project, that's probably really what you need. And then you're not relying on the titles because, again, you'll get a very mixed group of titles there.
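The per-project abstract suggested here might be a small structured record attached to each bundle; all of the field names below are assumptions drawn from the conversation, not an existing schema:

```python
from dataclasses import dataclass, field

@dataclass
class ProjectAbstract:
    """Project-level metadata, so cryptic titles (often client or
    product names) aren't the only handle for finding a study."""
    title: str                                    # internal or client-facing name
    description: str                              # plain-language abstract
    topics: list = field(default_factory=list)    # e.g. ["sexual health"]
    phases: list = field(default_factory=list)    # research phases covered
    country: str = ""                             # for tag-style filtering

abstract = ProjectAbstract(
    title="Religious Leaders in Uganda",
    description="Open-ended interviews with religious leaders on church "
                "involvement, ministry activities, and youth engagement.",
    topics=["religion", "youth"],
    country="Uganda",
)
```

The point of the record is exactly what the participant describes: a reader who wasn't at the organization when the study ran can still tell what the project was about without decoding its title.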
Angela Okune 18:30
Because often the titles of the projects here are the client's names.
They are rarely about the topic. It's just the client or what the product is yeah.
Angela Okune 19:29
Yeah. Okay. And so let's say you know, you're able to get things going internally, you have your comprehensive, you know, data sets here, in terms of sharing outside of the organization. Let's say even within your organization, maybe you don't feel comfortable sharing it with certain groups or certain people, what kinds of conditions would you imagine placing on sharing of the data, like would you require acknowledgement or money or...
so internally...I guess the key thing that people may worry about is how easy it is to download some of this data. So for example, if I had transcripts for a very sensitive topic, usually, for example, on Box, things are restricted to certain teams. So if you want to access things, maybe it's only available to projects division or associates. The data team will have access to certain things that are data driven, but they won't necessarily have access to more of the qualitative findings because they're not involved with that. But I don't 100% get the... those ones on Box, they're there for whatever reason, but we're not really told why. So I'm guessing if this was available to anybody in the organization, the concern would be what would those people be using it for? And will they be downloading this information and using it and sharing it elsewhere? So that could be one thing. If you restrict the ability to download then it also makes it less usable in some ways because you want to be able to download the... so maybe it's you can't download the transcripts, but you can download the interview guide. And that way it's harder to, to use the information elsewhere.
Angela Okune 21:18
Because there's different formats actually, that you can have artifacts in. And so like this is what we call like a text artifact. And so you can see this would be really hard to download...
Exactly, because you can't even do print screen, you'll just be doing a million print screens...
Angela Okune 21:30
Exactly, but you can have...you can upload PDFs, you can actually upload, you know, different kinds of images or what have you or video even and then I think the PDFs are somehow easier but actually, this is not meant to really encourage downloading because the idea is to actually keep contributing to it.
Exactly. So this version I can see you can't. Because there isn't an option of download the entire thing so I think this works quite well. But then if it was click onto it and then it opens up a PDF then that's separate on your computer as well.
Angela Okune 22:03
Yeah, because the idea is not to have a ton of people having separate versions.
Angela Okune 22:10
And so I'll show you in another session, this annotate idea, but that, you know, I mean annotate is pretty self explanatory...being able to actually add your question to this, let's say "raw material". So you could come in with your interest in sexual violence, sexual health, and you can like, you know, answer your questions, let's say, of this material through that lens. And then let's say in a year or two, somebody else has maybe a question on youth gangs, and they have their set of questions for their project, and they could come in and look at the same material and actually answer it in that way. Like, what is this interview talking about youth health, and you can at the bottom, then.. this starts to...this is the collaborative side of it...then you can actually see over the years how different people are analyzing the same material, and that in and of itself, can hopefully provoke interest, you know. And so that's the annotate function. But again, it's not...it's not about downloading and doing it in your own little corner. It's about how do we have a shared work space here together. [pause]
So can you have multiple... I know you said that multiple people can do it and annotate. But I'm trying to think of a use case in here internally would be if you had...so let's imagine, I have a project and I get the people that transcribe it to upload it on there. Then I guess I would want multiple people to have access to ... whoever's on the project to have access to that particular folder and be able to annotate it, I want to be able to see their... the analysis, so it can be a way of doing analysis directly on here.
Angela Okune 23:50
And it would be attributed to the individual so you could see exactly...
yeah, yeah. So you can do that?
Angela Okune 23:57
Okay. That would be good then! Then that replaces data stripping in theory and you're looking at the actual text. Can you upload things? If someone wants to do...use a data stripping template? Could they in theory upload that? And then...
Angela Okune 24:11
to fill it out?
Yeah, fill it out in platform.
Angela Okune 24:13
It would be trickier. I think actually, the annotation structure is actually basically the same thing because you're asking a question and then answering it.
Angela Okune 24:22
But in terms of the format...it doesn't... and so that's the thing. And this is the question I had of like, the relationship between qualitative and quantitative, because this still really doesn't, ummm... kind of allow for quant data and stuff. It doesn't allow for uploading of Excel, really. Um...And so, do you see that as a limitation? How do you see the relationship between like the quant data that the lab for example collects and how that would work and interface with something that's more qualitative?
Mmh. The lab one, in terms of what they upload, that would be all of the raw data, then the code so the [inaudible] files for analysis. And then maybe the outputs. So you probably couldn't have that there. It's very different. With this one, if it was the equivalent of that, then you would have the instrument, the transcripts. And then the tricky part now is the analysis. And so theirs is done separately, it's not done on the same platform, you would use another program, like STATA or R to do the analysis, but then just upload the analysis. But the easy thing with that one is, whenever you're typing in instructions, there, they're recorded. Whereas if I was doing analysis on this, or qualitative research it's not... not something in the background that's recording how I'm thinking or why I'm answering the question in this particular way. And so that's going to be the hard part in terms of you may capture so here we have the data stripping thing, and you could upload that onto here but you don't understand why a person has done it in this particular way. Why have they put certain questions or answers in these different boxes and what are their external thoughts, you're not going to capture that. And I think it's...I don't know whether people would write that up unless they're going to be using that in their particular research. There may be a couple of people that do. And maybe initially they will, but I think after a while, because everyone is so pressed for time, they won't, they won't spend time writing out their process.
Angela Okune 26:24
And do you think that could be a bridge? Like, I guess this is one of the concerns that a lot of qualitative researchers have about sharing of data is that it gets decontextualized.
Angela Okune 26:39
What would be your thoughts on that?
That could have been the case but also it draws the person that has done the interview, which [inaudible] then you're not just relying on the transcripts, you're relying on things that you've observed around you, and memories of the interview, and that informs how you interpret the text. And so that's missing. So I could say, in this interview, this person was really scared of side effects, for example, but you could read the transcript and not fully understand why I've said that, but it could be a conversation I had with them off...off the record. Or it could be when I asked those questions, maybe they did something that made me realize they're really scared, they could have panicked, but you're not going to capture that in the text, unless you're transcribing it with people's pauses and, and tones, etc. So the final output that I have doesn't necessarily match what someone else would read. And so without that full context, you would come up with a different conclusion altogether. I think then the easy thing with quantitative data is you can capture that. You don't need to know a person's thinking behind why they decided to segment the data in this particular way. You see each of the steps they've done and you know how they got that final result. With qual, you never know how that person got to the final results. And what type of analysis are they...are they grounding everything in. Um. Why are they looking at things in a particular way. Are there other questions that are guiding their thoughts that they have as things in the back of the head? Are they testing hypotheses? Or are they just looking for the key themes that arise? Are there other people that they're interacting with that are giving their points of views? Is it one person's reports? Or is it reports made up from multiple people's viewpoints? Was it a brainstorming session where you brought in other experts, and they give...their opinions were outside of the interviews that you did? 
You just don't have that captured. And I think it loses some of the understanding and richness of why someone would come up with this particular recommendation report.
So you can't control for that. [both laugh]
Angela Okune 28:08
I mean, after you're saying all that, one of the things I did think about is that even if you can attribute, like, let's say, Angela Okune said this thing or annotated in this way, right now it's not capturing: did Angela do the interview? Or did Angela
...Yeah. It's true. That would be good to capture. ... Especially as here, you often have someone else do the interview. And then even... okay, so three things come up. One, someone designs the instrument; that's one person who designed it. That could be useful. Two, who's the person that actually conducted the interview, and if there's any information on any characteristics of that person. And then three, who's the person that's interpreting the interview. Because looking at one where the same person is consistently doing all three will be very different from looking at one where there's three different people doing that. And in many ways, with the three separate people, I would assume that their analysis is going to be more similar to my analysis because they're not privy to all of the, the other internal information, whereas if it's the same person throughout then I would assume that there's more things going on outside of the script.
Angela Okune 29:58
Mmh hmm. And which version would you trust more? Like, let's say you come in and ...
...the same person throughout.
Angela Okune 30:07
So if you were let's say to assess this data set to determine whether this was quote unquote good quality or bad quality, what things would you look out for?
Oh gosh. Good quality, bad quality? In what way?
Angela Okune 30:27
I mean, even and this can be outside of this particular platform, but like what kinds of qualitative data ... like if you're looking at a transcript, you're reviewing someone's work from the field or whatever...what would make you feel confident in?
In interpreting it. Um. So usually, for me, ...how it's written. So this is gonna sound strange unless I give an example. But for example, here, you've got "aahhh... noooo... mmmhh...ummmm..."... So I'm seeing it as this person is transcribing exactly what they hear. I've seen other transcriptions here where you can tell that that's not what the person said they've summarized it in their own words, or they've tried to make it grammatically correct. And so it doesn't match up, you don't capture some of those details of this person is unsure at this point, and then they go on to it. Other ones were like the dot, dot, dot, etc. But there's other ones where sometimes you'll have in brackets "pause" and how long that pause is for. So if someone's sharing [pause] verbal things that you wouldn't necessarily want to record because they are not correct English, or they're also indicating pauses or laughter or other things then I trust that more than one that looks perfect, because I'm thinking they're summarizing it in their own words, and it's not a real transcript. Then the other thing I mean, this one is good because you have the respondent talking a lot, and the questions seem pretty open ended. So that's another thing that I will look out for is it one where the moderator seems to be taking up most of the written work or is it the respondent, if it's the moderator, then I'm assuming that's really just a quantitative interview, and they're controlling it. And there's probably some other dynamic going on then. Then also in terms of your summary at the beginning, ideally, and this has not happened here.
But in an ideal world, if there was more information on the person conducting it, or if there was some kind of reflective notes, where they may say, I am from this particular region, and I was interviewing this person from another region, and there's conflict between our regions, having that context would be really useful or we're from the same region so this person would feel more comfortable with me, that would be useful because especially in the Kenyan context, that is often something that could be quite [inaudible]. So if I conduct an interview, the response that I am going to get is going to be very different from a Kenyan conducting the interview. So even having that like if you just have anonymous female, you're not going to know that this is a... like where I'm from etc, of the age, my age whether this person does or doesn't match me. So I think that information would be really useful as well. Yeah.
Angela Okune 33:10
And if you were to...like for your own research purposes reuse this...would you? Would you feel comfortable and like in what scenario could you actually like reuse this for your own research work?
Hmmm... [inaudible] I almost can't see using it [pause] for the research venue. I almost see it as a [pause] foundation almost for the types of questions I would ask. I would be using this as background research. So this will inform me of the types of questions I should be looking into, what themes are emerging in these different areas. Maybe how to structure my own interview guide and using it as a literature review. I don't know if I would just rely on someone else's research only. It just depends on how much...how many you have. But I probably would feel a bit [pause] cautious on just using someone else's research because you don't have control over it. Unless they happen to ask all of the questions that I wanted. Then you may think at that point that it doesn't make sense to do it from scratch.
Angela Okune 34:21
Yeah. Well, when this thing is so saturated with all this research and and you're like, what new information is there out there anyway? Yeah, no, I mean...
But then it could reduce the amount of research that you do. So that's one good thing, it could supplement it. So rather than having to do...if someone said you have to do 50 individual interviews, for example, then maybe if this was rich enough, I may then think, actually, I'm just going to do a couple of interviews to test and see whether these, these still hold. And if I'm reaching the same conclusions, then I can just rely on this. I would almost use this. Here's my data. Let me go and do some interviews to see if there are any other insights that have been captured in the data. And if they have been captured in the data then I can trust this data more. But I don't think I would just use this only. Yeah.
Angela Okune 35:08
That's cool. Because even then, I mean, hopefully you would feel inclined to also upload that to the same data and then that would further [inaudible] the credibility. You know...
But then also I think you would then need to have instructions or guide on how to use this platform to encourage people to do things. So it could be if you're using...so a pop-up emerges when you go into the data "how are you going to be using this data?" And it could be "I'm going to be doing a literature review", "I'm going to use it to do my analysis from scratch" or "I'm going to use it as a guide to what my interview will look like" and then based on that it tells you, if you selected the last one "a guide for my instrument" it would then be "Okay, well if you finish your instrument can you please upload it." And then you almost have a form and have them fill in information on the form. So it could be "what things do you find useful", "were there any things that you changed about the data" so kind of generic questions that you would have for each of the scenarios. And that forces people to what, well doesn't force them, but prompts them to see that they should be contributing to it. Otherwise, when you look at this, you don't even think about uploading other things because it's not signaled to you that that is what's expected. And then you'll just do it on [inaudible].
Angela Okune 35:46
Definitely. Also in terms of using it, as you were saying, also for lit review. I think the norm is not there yet. But people are increasingly talking about citing of data. And so applying a DOI or applying an ARK and different ways to actually credential it. Do you think that's something that would be important to the organization, to you, to the clients, to different groups?
Definitely. So with most of, I would say with most of our literature reviews if not all, we have to cite where it's from as a source, but then also [pause] clients often look for whether it's from an academic paper, so it's not just "I've got this from"...I can't think of where..."Google or Wikipedia or something like that", it makes it look like it comes from a credible source. So I think that's a good point that you made there that you would need to link everything to a researcher and then the source you couldn't just have the information there.
Angela Okune 37:14
Definitely yeah. Okay. I mean, going back to the question of if money or acknowledgement would be conditions, I mean, what is the incentive for an organization like this one to...
Acknowledgement would be probably...
Angela Okune 37:34
So is that the biggest?
I think so. The money I don't think would be one. But because one of the key values I guess is increasing knowledge of, of different behaviors across especially in the global south, given that that is the main remit, you want to be able to share information, but I also think [inaudible] if people know where that information is coming from so that they know the source and that further promotes [REDACTED ORG NAME] etc. So I think acknowledgement would be a key thing. Otherwise you would have been, I don't know how you can control for that. You could, in theory, if I came from another organization, I can just pull this off as my own work. And there's no way of saying you have to acknowledge me in your reports, because I may not want you to be acknowledged. But I think that would be the one if you could have acknowledgement there then that would be the one the organization would want.
Angela Okune 38:26
Have you interacted [pause] a lot...enough with different research, other than this company...like other research companies and researchers to get a sense of whether they would feel more inclined to just copy and then pass it off for their own?
From my interactions, I don't think they would necessarily pass it off as their own work. But I think it could be...The research could be informing their thoughts. And because it's not explicitly we're using this data, and we're presenting it as our own, I think they may not realize that there's a need to acknowledge the source of their thinking... Yeah, I think that that would be the thing that would be missing. So I don't think people would necessarily say, "here's our transcripts. We did this ourselves." I don't I don't see that being the case. But I think if those transcripts did inform the interview guide that you put together or they acted as something you wanted to validate, you wouldn't necessarily say we did five interviews, but we used most of the data from this particular source. I think you just forget that that's where it came from almost. So I just think, yeah, I don't think people would pass it off as their own but I don't think people would necessarily acknowledge where it comes from.
Angela Okune 39:40
And that's partly about just creating different norms or new norms?
Mmmhh, but then there's a mixture of things. So one, it's, that you may not...it could almost be in your subconscious so that you don't really realize that you're using this information and how you're using it. Or you don't see that as being instrumental in your final output. So that's one element. But then the second element, I guess is you don't necessarily want a client to think that you're incapable of doing this yourself. So by citing another person, it raises the question of "why are we hiring you? Are you able to carry out this work to completion? Or do you need to revive another organization's knowledge in order to get to this stage?" And so I think when you have those two things together, it prevents you from from really acknowledging things. But it just depends on how established the company is as well. So if you're a very established company, and everyone trusts you to be the expert, then you're probably going to be more comfortable with acknowledging people because it's fine, everyone knows you're great. Whereas if you're starting off, and you have more of that insecurity, of "I'm in competition with these different people, we haven't yet found our niche. People don't necessarily know that we're the experts in this area," then you don't want to flag up other companies that could in theory be contracted instead of you. So I guess it depends on the company as well.
Angela Okune 40:42
Mmmh. Yeah, no, I think the market in Nairobi is quite competitive.
Angela Okune 41:02
And so...and there's so many research organizations, you know, that I think that that would be probably be a real consideration. Yeah. Like we don't want to...step on others...
...the whole...the new technical division around qualitative research and design...the reason why it doesn't have a name yet is because you don't want to call it a certain thing so that other people think you're now stepping in their territory. So it's meant to be an internal resource. But if people then see [REDACTED ORG NAME] is opening up a design department, then all the design or HCD firms will start thinking, are we going to be slowly creeping into their market and that's not the intention, but we're not going to have those conversations with them beforehand and say "just so you know, this is an internal resource. We're not going to be getting external projects and, and competing with you for this, but we've recognized that we need this internally," as I think the fact that we've had to think about labeling something. We can't label something an internal resource...that's indicative of the competition that does exist that you are wary of the fact that you want to be collaborative and work with each other. So you don't want to step on people's toes unnecessarily.
Angela Okune 41:47
And do you think that is...I mean, collaboration across these potential competitors, etc, etc? Do you think that that is kind of...I wouldn't say politically driven... but it's like...currently at a state where people feel like they can collaborate across even other private research firms, or do you think it's kind of like, we're going to go at it?
I think it depends. So...I guess here we feel more comfortable with collaborating because we think we're more distinct. So...we have a lab. So there's something that's different from others. So we're [inaudible] in evidence, but then also we have behavioral economics roots as well in terms of everything we do, we try to ground in BE theory. So because of that, there isn't a direct competitor. And that means if you wanted to partner with an HCD firm they don't feel like we're both in the same space, they can see [inaudible] between us. So we're focused on this, you're focused on this, together we can produce something better. Whereas I think if it was two HCD firms, for example, then you're not going to get that collaboration, because you're both going for the same types of clients. People that come to us, they don't necessarily want an HCD project. I'm using that as an example, because most people that I've worked with on my projects have been with HCD firms. They don't...they're not after that, they're after comparative testing, or some lab work or segmentation. And so they wouldn't go to that particular organization for that piece of work. So I think it's very easy for us to feel comfortable, collaborating with people because we see ourselves as being distinct. Whereas if there was another, BE firm, then that could be very different. I didn't think we would necessarily...I don't know, maybe we would... but I think that it would be more problematic in terms of how do you collaborate with them because you are now at direct competition with them.
Angela Okune 44:04
And how do you feel about like the role of the universities within all of this? I know that...like in terms of the students...actually the one resource going back to this, I was imagining this could be particularly helpful for students. Because there's still, you know, an expectation that students go to the field, but often students again, don't have a budget or are pressed for time. And so what kind of field work really emerges? And so being able for the qualitative research students to be able to come back to this because [inaudible]...to be able to kind of...especially hone your analytic skills by actually re-analyzing something else...and obviously going to the field but I like this idea that you were saying like going back to actually almost validating tests or improve, strengthen...what's already been done.
So I can see for students...[pause] I will be...I don't think they should be using this as ..."I'm just going to analyze this and not do the work for myself." Because from my experience, you learned a lot from actually going into the field and conducting the research. So you learn how you should speak to people, think [inaudible] your instrument, often, if you design an instrument, and then someone else does it, you don't realize how you should design instruments. The only way is to physically use the instrument yourself. You, you learn how to, to change things, but I think it'd be useful in a couple of ways: one, is the validation in terms of you don't have to have such a big sample size. You can use this in a way then also if you want to learn how to analyze data, then this could be a good way of really thinking through how you analyze things. Um, also in critically thinking through so even from our conversation, for example, thinking through what's good data, what's not good data? What other information would you need in order to trust this, that could be useful in terms of making someone think critically around what information they should be recording outside of the transcript. Because from that conversation, for example, having that conversation makes me think, okay, when I do an interview, I should be recording my profile, how I interact with this person so that someone else could understand that...whereas you would just think that is just fine otherwise. And then the final thing I guess that could help is, often, especially, if you have to do a dissertation or some kind of project, thinking through what you want your topic to be. That's the really hard thing is, knowing what those topics should be. So having some background research there can help hone in, focus on how you want to design things. So I definitely see it being useful for students, but I wouldn't want this to replace the practical skills that they would need to do as well.
Angela Okune 46:51
Definitely. And I think sometimes also when it's a format like this where it almost feels more official as if something is like set in stone, and it's not editable because, you know, like, the transcript is there. That is the truth. You know, there's less of an... like as you said, they won't be able to critique it or maybe even unless the teacher or research guy is really helping them to figure out, you know, poke holes in it. So I think it actually puts a lot of this onus on the faculty or teacher or whoever is teaching it. Yeah. Cool. I'm keeping an eye on time... I've got a few more questions. What do you think would have to be important? Sorry for the markups already, but components, we already talked about some of these... but important components of the data sharing platform. So you said being able to cite the dataset is key. Restricting access to this data set to authorized individuals only?
Yes and no. So the idea here would be when especially when using it internally to have internal access, but then being able to say what parts you'd want external people to be able to access so perhaps having an edited version that you could go through and just blackout parts that you didn't want people to see.
Angela Okune 48:08
Ensuring that someone accesses the entire data set and all corresponding materials together so that they don't take one data point out of context?
Angela Okune 48:17
Being able to see usage statistics on how many people have accessed this data?
That would be interesting. I don't know how much value it would have apart from... I mean, it depends if I was using this would I go back and check my usage statistics, maybe not but then maybe someone else using it... mmmh. but then this becomes the whole thing of if someone else starts validating this or seeing it as being good because those people have read through it then are people just choosing things based on numbers, or number of views rather than the quality of it. In theory, it could bias people towards certain datasets even though that's not good data they should be looking at. That could be a tricky one.
Angela Okune 49:00
A related one is being able to gather information about the people who have accessed or made use of the data...
[inaudible] the most interesting things together. So if you know who's accessed it then maybe you would know what it is being used for... if there are other topics outside of the topic that you're looking at. I can also imagine we would be interested in that in terms of...is it just Americans or Europeans who have access to the data versus are people in Kenya or in the Global South accessing it and that would be really exciting if we saw that it was used by a wide range of people rather than just certain [laughs] people. So I can see that being nice. But I can also see it being nice to have but not necessarily vital to everyone to see but I think someone should be looking at that and analyzing it, whether that's public knowledge is another thing, but I think that would be useful analysis outside of the platform.
Angela Okune 49:56
I know that a lot of umm... [pause] alt-metrics is what they're called...the kind of alternative metrics for judging impact, like the whole kind of [inaudible] journals of like having impact factors. And like this is just kind of increasing [inaudible] for judging what is more impactful...has kind of become an important aspect for a lot of academics to justify also the impact of their work. So I think this is kind of in line with that. But it's interesting that like, sorry, it doesn't feel like it has to justify the ...do you think it would be a key like let's say in donor reports, or like we have impacted...
But then how would you show impact? Because usually...
Angela Okune 50:39
how many people are accessing it...
But then is that impact? So I guess no... if it was showing impact in terms of how many people outside of the organization or a third of the clients have access to this information and have gained something from it. Whereas...
Unknown Speaker 50:54
[interruption by someone looking to use the room for a meeting]
...[responding to her] Yes, but we're finishing up.
So because then you're not showing impact in terms of how you've impacted your end users. So for example, if I was doing a project on sexual health targeted towards adolescent girls, the impact there would be how many people...what the results are from the study. That's how I would show impact. But in this case, I'm just showing impact in terms of how many people have read my study, and that that's a different type of impact and I wouldn't necessarily call that impact. That's just reach, I guess.
Angela Okune 51:36
Okay. Um, we've covered most of it. Uhhh...have any funding sources or existing clients actually required you guys to share data?
Yes. So some people want the transcripts. Umm... It's not always clear why they want the transcripts, but it could be because they want to be able to take quotes later on, or they just want it just in case they need to access it in the future and they need to have the information easily.
Angela Okune 52:03
But they keep it for themselves? They're not making it public?
Yeah. They are keeping it for themselves. Yeah.
Unknown Speaker 52:10
[recorder paused and we walk to another room]
Angela Okune 52:13
Okay, we're back after having been kicked out of the room.
Angela Okune 52:18
So yeah, funding sources. Okay. But that's just for themselves. They are not sharing it once they get it.
No, but I'm trying to think of [pause] there's been instances [inaudible] the case we've had the transcripts, where it could be a client has worked with another organization to do a piece of work. And then they may employ us to do the next phase of work. And then really to understand the qual research that was done beforehand, they [inaudible] provide us with the transcripts.
Angela Okune 52:26
Actually, I think I kind of remember that...wasn't that with one of the projects in India, where I think they had done like some research beforehand and then...
Yes, yes, yeah. Did we have transcripts on that one? Or was it just the report....?
Angela Okune 53:06
Okay. So in such an example, would looking at the transcripts that had already been done have been beneficial you think?
Definitely. So that's also the question of... we have a current project that's going on here...research has been conducted. And we have the results, but we don't know what informed those results, and when you look at the results... You're meant to design an intervention. So let's imagine if ummm ... I'm trying to think of a good one they were covering...They're trying to get people to...women who are pregnant to go and deliver in a clinic rather than relying on traditional birth attendants. So in that example, we, we have...the report tells us a couple of things that people like traditional birth attendants, because they can be in their home, they have the comforts of their home, and they don't necessarily like the clinic. We don't know why they don't like the clinic, and we just have that final... "they don't want to go to the clinic, they want to go to the TBA." And so for us designing the interventions, we don't know what we're designing for. There may be 10 reasons why they don't want to go to the clinic. Is it because it's not a nice space? Is it because it's too far away from them? Is it because they can't have anyone else in the room? Is it because they're forced to give birth in certain positions that they don't think are suitable. And so if we knew more around the why, then you could design better interventions. So someone has already conducted that research. So it's hard to face the client now, if they won't give us the transcripts because they don't have them, we need to go back and do qual research, because in their mind, it's been done, you can just work off the report. So it would be useful to have it in those cases, because you can identify gaps and know what things you should go research rather than starting from scratch.
Angela Okune 54:40
That is a really good use case of why we should share research data and why it's not just enough to have the final report. Yeah. Cool.
It just means that when it comes back to the whole analysis thing. When you only have a final report. And then you're assuming that that final report is detailed in nature. So I guess with academic qualitative reports, you have lots of the quotes from the users... I feel it's more...using the respondents words to write up the reports. Whereas if it's a recommendation report, then it could just be here are our thoughts. And you're given those high level things of people don't want to go to these places. But you don't...you don't have as much information. So really, some reports could be...have enough detail that in theory, you can gather insights from...but we're really relying on the purpose of that report, who it's targeted towards... and the person that's writing the report...are they giving that much detail or are they limited to having a couple of summary slides which...in which case, that report's not going to tell you much...
Angela Okune 55:49
Mmh, hmmm.. And even in such a case as in, they are picking and choosing which quote, even if they're block quotes...
Angela Okune 55:57
...probably to strengthen some point that they were trying to make and perhaps there were some kind of other alternative perspectives within the data. But...
you're not going to say that...
Angela Okune 56:06
...didn't make it into the report.
Angela Okune 56:10
Yeah, and I guess so...I guess going back to that question of comprehensive and do you need to have all the data for it to be valid or for it to be assessed?
I don't think you need all of the data, I think it's still useful, just having...something's better than nothing. But in an ideal world, if I could look at everything in its entirety, then I can judge it...if I can see the final output...the instrument that went into it, someone's thinking around the analysis... maybe how... any notes they have from analysis, and the transcript. Then if I want to access that information, I can access it and I can get more on it. Umm...I don't think it's necessarily... it still provides value, just having the transcripts alone but that just adds more richness in terms of how someone's thought through it.
Angela Okune 56:59
and have you ever, other than this example we just talked about, tried to explicitly look for, like, qualitative data that's out there?
Hmm... I haven't explicitly looked for transcripts or qualitative data cuz usually we're relying on... If you're searching, you're not going to see the transcripts, you're just going to rely on the final reports. You're...you end up looking for reports that cover the same topic to see whether they get the same findings, but I've never really seen cases where I can just find the qualitative data.
Angela Okune 57:32
Mmmh hmmm. Have you ever reached out to anyone to ask them for their data?
No. Oh no... [remembers an example] No but that wasn't qualitative data, no.
Angela Okune 57:41
That was quant data? And did they give it to you?
Mmmh hmmm. [affirmative]
Angela Okune 57:45
Was it like a private or public company?
So it was a professor who was doing a study on... for my dissertation I was doing a report on dancing as a mating ritual.
Angela Okune 57:59
He had... He had done. He had recently conducted a piece of research where he had these different... I don't even know what to call them, but, you know, different sorts of things on a person and they move so you can see their outline. And essentially, it proved that people that ... so there's a conversation that if you are perceived as a good dancer. So what does good dancing look like? Symmetry, coordination, etc. If you're seen in that way, then you're perceived as more attractive and you're usually a healthier person, etc. But people kind of ... it's actually a myth. So he had this study that was just a really brief write-up and so I wanted to get the actual data and more insights and then he sent me his more recent study that hadn't been published but he was working on the data from that. So that was useful, but...
Angela Okune 58:48
and it was quant data in SPSS or like in what format did he send it to you?
So he sent his draft report and then the data in SPSS, I think, yeah.
Angela Okune 58:59
Did you use it? Like you ran your own analysis on it?
So, I wasn't running analysis. It was just using that insight into the report as further evidence. I can't remember that part. [laughs]
Angela Okune 59:13
Something or other. That's very nice of him though.
Yeah, I was very surprised by that.
Angela Okune 59:19
Side note, my husband and I met on the dance floor. [both laugh] You can add that as a data point.
So that was the motivation! Okay, alright. Was that included in your research write-up?
So did I. [both laugh]
Angela Okune 59:33
Context for the selection of research topic.
Angela Okune 59:37
That's super funny. Okay, well on that note...Um, yeah, I think you know this this question though of even looking for data sets, cuz quant data sets, especially with increasing you know, portals like the World Bank stuff and government open data stuff you know, it's, it's available increasingly for quant stuff. Still, whether it's useful and like used, is a question. But in terms of the qual data, it's just like not something you can like find...
The other thing that's easy to find with quant are the interview guides. And it's hard to find qual interview guides. And that I would find really useful, because when we're designing studies here, for example, if I had one on sexual health, I would look through all of the different surveys that I can find online that cover it to see how they ask certain questions. Whereas if I want to do a qualitative version of that there is...it's hard for me to find those resources. So just having a guide on here are the types of questions you should ask or that had been asked in this space would be really useful as well.
Angela Okune 1:00:08
And in the quant ones that have been asked is that mostly like on dataverse, like, did they package it together? The guides or the questions...
It's not so much the guide, just the actual instrument that you can get a hold of. So it could be a government survey or ummm... so often it could be on a report. [Inaudible] found in the past, so if I'm looking through like the WHO report or something like that then it will have, here are the findings and then it would have a link to the interview guide. And then I can look through the whole interview guide, see how they like... see which questions skip? What types of questions they ask in different groups if there are certain risk questions that they have. Ummm, so for example, in our surveys, we often used the Nascott (?) risk questions, because those are the ones that HIV clinics use in order to tell whether someone's at risk of getting HIV. But if you don't have access to those things, then you're not going to include that in your survey, because you won't realize there's a standardized unit or component that you can use. So it's the report and links. So if you're reading through reports, you have been linked to the instrument, and I haven't seen that with qualitative reports where you have the link to the instrument.
Angela Okune 1:01:42
Mmm hmmm. Cool. Yeah, I think I know that [REDACTED PERSON NAME] is working on that. Yeah, I need to catch up with her on how that is going.
Ummm...okay. So after this whole conversation, what do you think benefits of digitally archiving qualitative research data are?
So the two key ones....So I kind of see it as one internally, knowledge management. So if I'm doing a project, I could find other ones that have been done. And I just see it more as... one, inspiration. So either to inspire more on what types of questions I should be asking, or to look through initially and see what things emerge. And then using that either to reduce my sample size and validate, or use that to create my own instruments and add some kind of foundation to, to my own analysis.
Angela Okune 1:02:36
Hmmm... People using the data in a way that wasn't intended, I guess. So if they don't understand the context, then could they take that information out? And so I know we've had conversations in the past around if you're segmenting people then a potential risk would be...we think it should be used in this particular way. But then someone could use it to discriminate against certain groups and say we're no longer giving you access to certain services. And so whilst the data in the project could be used for one purpose, someone else could take it for another purpose, that wasn't intended, and you have no control over that. And I guess the second thing then is for the respondents, again, you have some level of duty of care with them, because you're the person that's taking their consent, and they've agreed to your particular research study. So you've outlined the purpose of the research study, what their data is going to be used for. Later on, they could find their data being used for something completely different outside of what they would have expected. And there's no way of getting their consent for that as well.
Angela Okune 1:03:36
Yeah, I mean, going back to interacting then with the participants... You had mentioned, like going with the platform and being like, "Okay, this is where we're going to put it, would you be okay with that?" Do you think that's kind of...probably one of the best ways to make it explicit to them that like this would become public... a public dataset?
So if you had an example one...it could work...because we often do things using tablets, I can see how it would be easy for us to present that to a person. It could mean you may end up having a couple of things, I don't know if there are restrictions that you could have. Where it's either you could access this data, but you can't use it unless you seek permission from the person beforehand and explain the intent behind the research. That could be one way of getting around it. I think it's nice to show a person what it's going to be used for so that they can visualize it. But at the same time, that could act as a barrier. So then in that case, you may want to have an opt in/opt out option in the consent form where it could be if you're uncomfortable with it being shared on this platform, then tick this box, and then those...doesn't get shared, but then you need to indicate on the website that we interviewed 100 people; of these, 30 interviews have not been included because these people refused to have it uploaded so that you have an idea for someone reading it, how...how big the sample was and what's informing the report as well.
Angela Okune 1:04:56
Yeah, do you...what do you think? Would they be excited about it or would this be like another like, meh...
Probably depends on the people as well...
Angela Okune 1:05:06
I mean, if you could tell them like YOU can access this data.
Angela Okune 1:05:10
After this thing is done...
That could be interesting actually. I'm trying to think if there were any interviews that I've done where someone said to me "Can I have the data..." It's very rare. I can't even think of an instance where someone's wanted to see it. They may say, "Can I see what happened at the end? Or how will you be able to tell my name or who I am," but because we never really share the transcripts, it's very easy to say to them, no, you'll be anonymized. And the way we would indicate it would be - on your example - 27 year old male in Kampala -- so there's not going to be able to detect it to you. And it will be quotes and not the whole transcript. So it's easy for us to sell back to the person whereas if we said "everything's going to be online, we'll anonymize your name. But you can view." I don't know, I mean that could garner more interest. Maybe they would be interested in seeing that. But I really think it depends on the person and how tech savvy they are too.
Angela Okune 1:06:06
Yeah, I mean, I'm looking at this platform largely researcher to researcher, but I think the potential, I mean, because we never know like, there are researchers who, you know, are also being studied and like, you know, there's like, it's not so easy to segment actually... So the potential again, of it being there, allows...could allow for...
So unless you had it... One way that I could see [inaudible] people of being there... is similar to the consent form that you have. Either it's "we'll upload this to the platform, but beforehand, you can look through it and you can determine what you want and what you don't want in there." Because for example, the one that you showed me, which was a religious leader, in many ways they have...they may be more worried about how they're perceived, than if it's just an anonymous person that can't really be identified that lives in Kibera, for example. Then there's a difference there. So I also think the types of people you're interviewing are going to have an impact on their level of comfort with sharing the data. So if I'm going around politicians, they...there's more issues with the wider public knowing their thoughts on things so do they want their information to be shared. So I think it could be nice to show them the platform beforehand, especially in cases where you're interviewing people that may have more to lose or more concerned with their image or the public knowing their insights, showing them what it looks like. And before we upload this, we'll send you a version of the transcript if you want to see it, and then you can let us know if there are any parts that you do or don't want us to include. In which case, I think it would be important to have on the transcript -- because otherwise you won't know that was covered -- have on the transcript the question and then show that that was blocked out so that someone reading it understands that this particular question was asked, but this was seen as something that the person didn't want to discuss, which is interesting in itself because then you begin to realize sensitivities that people may have around certain topics.
Angela Okune 1:07:55
Totally. Yeah, I am...I think that would be... yeah. Because there's somebody who is interested in sharing their data, but it's about politicians.
Angela Okune 1:08:03
And so they don't want to put up the politicians at risk of being identified. So there's...so I guess ...and you work on maybe more sensitive topics as well than I do at least. And so also, I guess my sentiment is that this kind of platform would largely be for almost benign kind of topics, that feel less risky, if something went wrong, you know, that aren't really focused on like the hyper marginalized or the hyper stigmatized groups either that you know, if they were identified through triangulation, that something bad could happen to them, you know, what do you think?
No, I'm just trying to think. So, if it was a sexual health topic, because that's the one where it's really hard to find...
Angela Okune 1:08:47
data so you would really want that information...
but then the information could be really useful in terms of how to ask the question, but you're correct in jumping through our interviews... there could be...people do reveal things that if someone else found out could put them in an unsafe position, so it could be if they're having extramarital affairs, or if they have HIV or different things, and people around them don't know, they could be identified, then you don't want to be a case that people know. And then the thing that we often struggle with is, even by publishing our results, or letting other people know how to access the results, people could look through that survey. So one of the concerns, for example would be this. In our survey, when we're looking at adolescent girls, we don't explicitly say to the parents, all of the questions we'll ask. And we explain that we're going to be talking about their sexual health, and the health of other people around them, etc. But we don't say to them that in order to take part in this study we've already screened your daughter and we know that she's sexually active. Because we think that could be putting them at risk. And so when we have...when we publish the study, if we're including our sample criteria, and someone else reads that then they know that whoever is in this particular study is...has been identified as being sexually active. So we're very careful in terms of when we're screening to, to not say why a person is included or isn't included so that the other people around them can identify why they're being treated. But there's a risk of if all of this is now open and someone else accesses it, if you haven't thought through that potential risk, then you've now exposed this person...
Angela Okune 1:10:21
And if a parent somehow finds their way to seeing the portal, and then they're like, oh, yeah...
It doesn't even have to be the parent. It could be a neighbor who says, "Oh, I remember, these people came knocking on your door and they spoke to your daughter, you know, so the only way they can talk to her is if she's sexually active," then you've just exposed that. [pause] And so, it's a tricky one. And that would be the worry. So you'd have to make people think through potential risks of others knowing they participated and if there's a certain selection criteria, then what the risks could be with that. And then...
Angela Okune 1:10:34
Cuz that selection criteria would be in the instruments. So even in such a case, if we're like, "Okay, well, we're just going to put instruments and not datasets..."
But then...so we would have this...then we have two instruments. One, we have the screening instrument. And then the second one was the actual instrument that they used for the survey. So if I wasn't thinking through the potential risks of exposing this, I would just upload both. Whereas if I was thinking consciously of "if someone knows that the person that's taken part to have to go through this selection criteria," I would get rid of that screening instrument and just show the main instrument. So you don't know that they've been screened, I would maybe just say that they've been screened for a number of criteria, but for confidentiality purposes, we can't share what that screening criteria is.
Angela Okune 1:11:35
Or maybe you could have it such that if you're interested in the screener, email this this this and have it be on kind of a per request basis.
But then how do you know who's requesting it?
Angela Okune 1:11:46
But then they would have to email with like a description of what they're proposing to do with it...
Angela Okune 1:11:51
and like what, you know, ... to just develop that relationship in the same way that you emailed that prof...
Yeah, it can't be I email and then you send it.
Angela Okune 1:11:58
No, no, no, it would have to be, you know, for a legit reason, and then you would know who and you would have their details and you would have their profile and stuff. So that then at least you have a sense of and they may be, would be under a restriction so they couldn't repost it. Yeah, exactly.
Especially I guess if you have an explanation of why we can't do this and it makes it clear to them why they can't share that information. But I just think that...I don't know if everyone would think through that beforehand. Umm.. And [pause] and, yeah...you would almost need to have maybe examples...[pause]... present on the page that you get to think about before they upload data.
Angela Okune 1:12:40
Yeah. Yeah. Because I think, almost to a certain crazy magnitude, like these are the small cases that then nullify any benefits for sharing because people get so worried that like, you know, it could go so wrong and the unintended potential consequences.
Angela Okune 1:12:59
Because we can't...even we could be as cognizant and as mindful as possible but like, there are just some cases where we just won't know. And so that's usually been the rationale for just keeping a tight lid on it all, you know.
Which then, which then brings about to who has access to the data. So if it's not the general public or the person that's been involved in the research, then that poses less of a risk in some ways because they can't identify others around them that have taken part. So let's imagine if it was...someone's conducting research here. And I, I look at my own transcript, but I know that you've also interviewed 10 other people in the organization. I can in theory look through those 10 other interviews and identify their thoughts. And so the same thing could be done if you're doing it in communities, you...you're accessing your own transcript, but then you could be curious about how other people have answered this. And then you start looking at theirs and then it spirals into understanding or identifying people based on your knowledge of this researcher who has gone around to these households. So I know it's limited to these people only. Whereas if it's only really restricted to...if you're not sharing with the respondents that this is the data source, and they won't know how to access it so they don't have that, so then it becomes researchers or people that are conducting...the people that are conducting research know about the platform, and they're the ones that are accessing it. And in theory, there could be less of a risk there because they are looking at it through research lens rather than trying to understand what a person said...
Angela Okune 1:13:24
Muchene [Swahili for gossip]... what did one person say? And the other person say...He said, she said...Yeah...
Which then means you can't necessarily show a person the platform. Because then they can just find it themselves.
Or if you were showing it to them, then we'll just say, here's... or showing them a sample page that's not really showing them the website. So almost a dummy page. And then it's saying that researchers only have access to this platform, which is why they won't have access to it. But they can choose what goes on there and what doesn't go on there. So that could be one way around it. But then it's a shame because you don't get the full... sharing capabilities.
Angela Okune 1:15:06
Yeah. [pause] I mean, because as you know, my research is motivated by this idea of like, people who are researched feel as if their data is being taken from them. And so again, this question of like, well, how can we give back data to people... [pause] in a format you know, that makes sense...
So then it maybe it has to be...a game that maybe, maybe then you start so then it would have to be restrictions on certain things. So if it was a general topic that you don't really care about people knowing and if you say to a person so the consent I guess would be in order to take part in this study...someone, some.. sort of... here's your transcript, but note that in...anyone reading the transcript will know that you fall into this particular criteria. They may not be able to identify you, but... are you fine with this? And if they say yes, then it could be uploaded. For sensitive topics, where you think there's more risk involved, then it could be those things are restricted to the person only. So you know that there are transcripts, you can see the instrument. But if you want to see the transcripts, you have to email the researcher directly. And there's a disclaimer of why so it could be "there are certain conditions to take part in this study, for the safety of respondents, in order to access this, you have to email the researcher directly," so you don't tell them the exact conditions, but at least the person has an idea of why.
Angela Okune 1:16:23
And I think the researcher would determine whether...which one they would want that to be. Because it would be more work if someone like keeps getting emails, but also then they could be assured that they know who's accessing it, how it's...
So it could even be having a standardized response of: "this has been restricted for x reason," and then the researcher would then when uploading things be able to say: "open access for anyone" or "restricted access" and it would have this is the message that they will see and the person will email you directly and they select which of the two displays they would like and that way, yeah, they are thinking through it.
Angela Okune 1:16:56
I think that would make researchers feel much more comfortable uploading data.
Angela Okune 1:17:00
If they could know exactly who is using it and how they're using it. Yeah. Cool. How do you feel your thoughts reflect across the organization? Like, do you think that your colleagues here would share a lot of the sentiments you've expressed in this interview or do you feel like you're an outlier?
I think people would share the sentiment that it's useful, in terms of...often especially...So useful in two ways. One, internal resource. I think it would be very easy to get buy-in in that sense. "Here's the place where you can find this information." Because we struggle with it right now. So definitely there. In terms of externally, I think there will be some apprehension on who's seeing this and client perception and then whether things can or can't be shared. And apprehension almost lies in the fact that it's not explicit with many clients on what their wants are. So we may share some things based on our own value judgment but if you're sharing a lot of information then it requires further conversations and depending on where you are in the organization, you have a different level of comfort with that because you may not be the person that has direct access to the client to ask them and those details. So I think there would be some apprehension there. I also think there would be some apprehension in managing it so who's the person that's uploading the information...how much work do they have to do? Generally wouldn't want to do more work so unless they are really bought into this as something that I see a lot of value in they're not going to do it. I could see some people being like yes, I think this is great...I really think we need to expand the knowledge but other people are more like, "this is my project. I don't want anymore work, I'm already overrun with things." So there's kind of two...two groups but I think most people would say it's a really good idea, especially for internal management and I think people would also want to access it for external resources to help them with the literature review and thinking through...
Angela Okune 1:19:05
...And then I think when they would see that value...would then incentivize them to do it now, because it needs to be supported...
But I also don't know whether people would use the transcripts. I have a feeling that they wouldn't necessarily use the transcripts because people struggle with using our own transcripts. So I think the instruments may be the initial style of...you can see these instruments and the transcripts are there, if you want but I just have a feeling that people wouldn't necessarily read through all of them.
Angela Okune 1:19:29
If the format we're reading through them right now, there are not even transcripts there in the stripped format...then yeah...
Angela Okune 1:19:36
Exactly. And that's, I guess, my still ongoing question of comprehensive versus curation, because if it's comprehensive, it's just again, this saturation of information and people still aren't sure how to sift through information well, and so it would probably just lead to none of it being used anyway. Whereas if it's a curated format, then people feel like okay, well, it's one transcript and maybe it's been kind of blocked off in a certain way, you know, and it's a, it's a transcript to try and tell some kind of like narrative arc.
So you are saying have an example transcript rather than all the transcripts...
Angela Okune 1:20:12
Right. Of like, this is the kind of material we came up with. And then possibly, if somebody was interested in getting the full dump, then they could again, maybe do the email and get all of it.
So yeah, in an ideal world, you would almost have someone that's controlling it in some ways and you would have ... so this person would...for every single project that comes through there would be a template that they're filling in, that gives you the abstracts so they've read through the whole report, etc. all the findings there and then it links to the reports, then you would have the instruments and the thinking that went into the instruments. Were there any other sources that informed the instrument what went well, what didn't go so well with the instruments. So the change that you would make...and then example transcript with that annotation of why you've got this example transcript. What things are important and blocking it off and then, so you have that and then it could be if you have more interest, click onto this and you can access all of the transcripts. But to do that first part is going to be a lot of work. But that would be ideal. Then you could read through, understand everything, how they did the analysis, etc. And then if you want you can read the transcripts.
Angela Okune 1:21:21
Yeah, I mean, one of the things I was going to do was even, like, let's say this particular transcript with religious leader would like...find out so I kind of I don't, I think it's not captured, like who led that project, who did the interview, and then like, go back to them and then interview them about what wasn't captured per se. Because as you're saying, like so much of it is about contextualizing this data within its broader meaning. And I think that often can only happen when you go back and talk to that person. So there's like only a few examples where qualitative research has been able to like kind of be reused and they... I can send you the papers actually, but one example was where a lady who actually went back and like re-interviewed the researcher who somehow she had found had done this study from, like 15 years ago or something and went back to her and then like, was able to, like, get that context and then re-analyze that same data again. And I feel like...
I guess that's a question on acknowledgement and attribution.
...especially internally, so there are two ways. One, internally you would need the information on who did it. Because you would want to...it's easy for me to go to someone and say okay, why did you do this? Externally then, yeah, that's an interesting one of how do you identify the person? If there's still an organization that's another thing. Do people want to be individually identified as the person that did this project or do they just want it to be like, here's the organization name only?
Angela Okune 1:22:26
Like, does the organization have a policy of like, do they...like on a report, do you...
So on a report you would write the people that are involved, but then you may not have every single person that's involved in the project.
Yeah, usually. Or like the key people on the team that... [pause] Actually, no, I'm trying to think through...So it's often, you'd have the director, the lead associates. And then if there are other associates or analysts that's on the report. But you may not have people on the data team that have worked on it or the project leads that did the research, you won't have the field officer names, etc. So it's here are the people who managed the project, essentially and wrote the reports or came up with the instruments, but not the people that are conducting the research themselves.
Angela Okune 1:22:57
Pretty much the person who wrote the report?
And that's kind of an important, I mean, that is a lot of labor and like the people going out to do the field work, you know, and so like finding ways to also attribute and ensure that like, that is also kind of maintained within the kind of broader story about how the data was collected...
Let's see. So some of the big reports will have an acknowledgement page. But otherwise, it's just the contact details of the people's...if you want further information, then these are the people you contact. Which is why you just have the project management team because you can't contact a field officer to get information on the whole study. But that's interesting in terms of...because our reports are not...So the only ones I've seen the acknowledgement on, where it's everyone tends to be more of an academic type report, where it's in that type of format. Whereas if it's "here's a deck" and we're just giving the client recommendations, you don't necessarily acknowledge everyone, it's just let's just go straight into it.
Angela Okune 1:24:24
But in such, let's say like in an open data sharing kind of thing like this, it kind of matters. Because if Angela collected this interview, versus if a research assistant or a field, you know, a short term, one week consultant conducted this interview, really matters, no?
But then that means you then have to have other information there as well. So, using the example of one, you have a person that creates the instrument or the person that's managing the whole project. So the one that collects it, they create instrument and they're the one that analyzes, but then you have a team that goes and does the research. Umm.. so do you have this information in a summary of "this is how the research was conducted. It's run by a team, but then they outsource it to these people that are known within the organization. But are external hires so they're not...they're not internal staff, they're external staff, and it's name of the person," then do you describe how they ... you know what happened? How did that person come on board, were they trained? like there's so many different things that you would include, as well.
Angela Okune 1:25:25
The methodological aspects, which do matter, but usually aren't captured.
Angela Okune 1:25:32
Yeah. So if someone did a project with us. And we did qual, they wouldn't know who carried out the qual, it could be my project and I do the qual myself or it could be I get someone on the team to do it. Or it could be there's an external person that does it or I hire an external agency to do the qual work. You wouldn't necessarily know who that person is, why I've selected them and then what training they've undergone.
Angela Okune 1:25:58
And do clients ever care to know that information or funders?
Mmmh... no, the only time I've seen them care is if we're doing it in a country outside of Kenya. So if for example, with the [REDACTED PROJECT NAME] one, we have that in four countries. And so it's clear we don't have staff that can speak the languages in those four countries. So we had to hire external agencies, but we made it very explicit to them, this is what we're doing. We're hiring these people and to carry out the research for x y, z reason, we're going to be in country and during the training, you can come along and see the training and see how the team are being trained. But often you don't have a...that explicit...it's just that the clients were very hands on and thought it was important for them to know where the data was coming from as well.
Angela Okune 1:26:40
Are those common?
I don't think it's common because again, we don't usually hire external agencies unless we're doing something in a country that we don't have presence in. [pause]
Angela Okune 1:26:57
Yeah, there's a lot of background work that I think often just...
you don't see it...
Angela Okune 1:27:02
You don't see it. It looks very clean when you get the nice...even the transcript right like, which is still hard to find often. But even just getting that transcript, there's all of the other components and layers.
I still think even though... ideal end-case scenarios, you have all the information, there's still loads of value in having some information. And so it's easy to get trapped in "we need more" but getting that first stage of here's... here are the instruments, here are the transcripts if you need them, and then you'll see how people interact with them and what they're gathering from that will then inform whether you need more or what more people need. And it could be even having either interviews of people that are accessing the platform or having a survey that is just what more information would you want in order to trust this data more or what would...if you had this what would have helped you to really use this information...that would be useful in knowing what the next...the next thing would be. But I think it's very easy to kind of get caught up in the idea of...
Angela Okune 1:28:05
like, oh, we need more information. Oh we need to have more. Cool. Okay, well the interview is officially done. Sorry it went a little bit longer than...
AO: This discussion took place at the working place (research office) of the person interviewed located in Nairobi, Kenya on Wednesday, May 22, 2019 from 10:16 AM - 11:44 AM. The discussion was guided by an amended version of this set of questions, which I had prepared in advance. We did not follow the guide strictly. No refreshments were provided. For the first 45 minutes we were seated in a closed meeting room but because the room was booked for another meeting from 11 AM, we moved to a corner space within the office to finish the conversation. As per the interlocutor's wishes expressed during the consent form process, I anonymized all proper names mentioned in the interview including project names, organization names, personal names. At the time of conducting this interview, I had known the interlocutor for five months and we had been interacting on a regular basis regarding qualitative research.
This transcript is part of a broader essay ("Researching in/from Nairobi") on expectations, values and experiences of those producing qualitative research data in and about Nairobi as part of Angela Okune's dissertation project.
Angela Okune, "TRANSCRIPT: 190522_001 RESEARCHING IN/FROM NAIROBI", contributed by Angela Okune, Research Data Share, Platform for Experimental Collaborative Ethnography, last modified 20 March 2020, accessed 26 July 2021.