Asking the right questions is essential for uncovering insightful information. Abel Koury of Compelling Analytics helps refine the questions that drive your social enterprise forward.
After multiple interactions with nonprofits, Abel was inspired to combine his experience in research design and data analytics with his commitment to social justice. Nonprofits and smaller organizations don't always have the capacity to create great surveys, collect the proper data, and analyze it in ways that further develop the organization's services. Abel says organizations might be doing a wonderful job, but their data does not reflect the scope of their work.
Growing up in an “underprivileged town” with “economically disadvantaged immigrant parents” also shaped Abel’s perspective.
“I know firsthand how it is to grow up in poverty. I take that with me where I go, and I try to use that as a lens in my work.”–Abel Koury
Abel shared a rule of thumb for organizations, using the example of an organization whose goal is eliminating food insecurity. Focus first on articulating what you're trying to achieve, then sum up that end goal as a headline you would give a reporter. In this example, Abel suggested something like "36% of kids are now able to eat three meals a day." By focusing on the story you expect to tell, and where you want the program to be when you reach the stage of sharing it with reporters, you gain clarity. Organizations can then work backwards from that imagined headline.
“Before you even start the program, you’ve got to think about what do you want the data to look like? What do you want to collect? How are you going to show your impact?”–Abel Koury
Going deeper, Abel talked about deciding what to ask and what information should be presented. Questions can limit results when the research design is lacking from the beginning. This led to Abel's point that surveys should be created with the most diverse participant in mind; writing questions inclusive of the variety of perspectives within your audience takes attentive, conscious effort.
Defining a question's context is equally significant. Abel gave a great example in which people must choose between 1 and 5, with 1 = not very fit and 5 = very fit. These choices leave too much room for personal interpretation, miscommunication of the question, and flawed survey answers, unless fitness is defined through context. For example, 1 = not very fit (e.g., I couldn't run a mile) and 5 = very fit (e.g., I can run a mile with no problem).
As we continued, Abel got into the rules of thumb organizations can use to craft more effective surveys. He gave three solid rules any organization can apply. I could really connect, since at Wild Tiger Tees we are also still learning how to craft data collection, define measures, and amplify impact. Abel's work with Compelling Analytics is building the future of proven impact.
Adam: [00:00:00] Welcome to the People Helping People podcast, the podcast to inspire greater social change in the business world and give you ideas on how to take action.
I'm your host, Adam Morris. And today I am here with Dr. Abel Koury, a developmental psychologist and founder of Compelling Analytics, a company which really helps you to ask the right questions to get the answers you need to drive your nonprofit or social enterprise forward. So I'm very excited. We have a treat today. We're going to talk about something which has eluded me for a very long time, which is: how do you write a good survey?
And how do you get good information that you can actually use? We're going to get some really interesting practical information here, so I'm excited to dive in. So, Abel, welcome on the podcast.
Abel: [00:00:48] Thanks so much for having me.
Adam: [00:00:49] I'd love to hear a little bit about the journey that led up to compelling analytics.
Abel: [00:00:54] Yeah, that'd be a, a long story, but I will try to shorten it for you. So compelling analytics was really born out of the idea that I could take my knowledge of good research design and my background in data analytics and combine that with my social justice, you know, sort of bleeding heart.
And it was really to help nonprofits who have this sort of content knowledge, or smaller organizations that don't necessarily have the capacity, or maybe the expertise, to create, you know, good surveys, collect good data, and analyze it effectively, in order to continue to, you know, win grants and do really good evidence-based research.
So it was just kind of a long process of seeing that at different organizations I had worked at. There seems to be this need for collecting good data and then telling really good stories with it.
Adam: [00:01:43] Wonderful. So you started off and you got your doctorate in psychology,
Abel: [00:01:48] developmental psychology,
Adam: [00:01:49] Cool. And then you worked as a professor for a number of years.
So you've got to ask some really good questions and do some research into this.
Abel: [00:01:57] I did. I taught research methods for psychology undergraduate students, which was not their favorite course.
But we did talk about how to be a good consumer of research. So I spent many years honing the sort of craft of what does really good research look like, and how do you collect really good data.
Adam: [00:02:12] Very cool.
Adam: [00:02:14] How did you shift from that to looking at companies that are making a social impact?
Abel: [00:02:18] Oh, that's that's a great question.
I always say that I can't really separate the person that I am from the work that I do. So as a person who grew up in a rural, you know, small, underprivileged town, economically disadvantaged, with immigrant parents, in, like, the worst possible schools, I truly understand what it means to feel like you have to climb the social ladder or the educational ladder without much help at all.
And so because I know, you know, firsthand how it is to grow up in poverty, I really, I take that with me where I go and I try to use that as a lens in my work and also to, you know, to look at organizations that are trying to make things better for folks like me. And you know, like I feel drawn to that cause
Adam: [00:02:59] Very cool. Sounds like a very interesting story growing up and just kind of experiencing what it's like firsthand.
Abel: [00:03:07] Yeah, I you know, I mean, if there is a silver lining, I will say that I, I feel very much like I can help give a voice to folks who would not necessarily have a voice or a place at the table. And I, and because I can do that, I feel like I, you know, I try to use my privilege for good.
Adam: [00:03:21] Very cool. Awesome. And so now tell me a little bit, when did a compelling analytic start?
Abel: [00:03:26] Well, it started in my mind like 10 years ago, but it started as an actual, you know, something that you could cite or search for online in November of 2019. My website actually is going live today, after many painstaking hours figuring out Squarespace.
Thank you, Squarespace, for, you know, allowing me to learn how to make a website by myself.
Adam: [00:03:44] That's a great plug for Squarespace,
Abel: [00:03:45] Yeah, I know. I actually, I don't get paid by Squarespace. Squarespace, if you would like to pay me, that would be wonderful.
Adam: [00:03:50] Great. Cool. And tell me a little bit, so you, you mentioned that the difficulties that, that you see nonprofits are facing, but let's talk about that a little bit more specifically. So can you describe a typical scenario that like a nonprofit would struggle with?
Abel: [00:04:06] Absolutely. So I've worked across a variety of different sectors, either at the nonprofit in particular, or small organizations without, like, a bunch of capacity, or even for-profit enterprises where we were partnering with nonprofits.
And it seems like there are some core things that happen in nonprofits, because the hearts are huge and folks are trying to do so many things. Many times they're trying to do them under budget or understaffed. So, you know, without the big sort of capacity, or, like, a dedicated, for instance, data person, a person who actually has spent some time thinking about survey design and how to collect data.
So, thinking about the end in mind: a lot of times I would be either with nonprofits or working with them, and they had gotten kind of far down the line of this wonderful program that they knew was working. Like the children knew, everybody knew it was working.
But the data couldn't speak to it, because at the very forefront, you know, at the very beginning, before you even start the program, you've got to think about what do you want the data to look like? What do you want to collect? Like, how are you going to show your impact?
Oftentimes, you know, without this sort of capacity, or, you know, you're sort of under the gun, or you're being asked to do something you don't necessarily know how to do, you know, you get kind of far down the path. And that's what I kept seeing time and again, and I realized that's where I can make my impact.
Adam: [00:05:24] Very cool. And that seems like that's something that's not really a core expertise. Like people generally jump in with a passion for what they're trying to solve and they don't really think about how am I going to tell this story to the world so that I can get the help and support and resources and grants that I need.
Abel: [00:05:38] That's exactly right.
Adam: [00:05:39] Very cool. Now, looking at a nonprofit, what are some of the things that they really need to look at when they're trying to figure out how to collect this data?
Abel: [00:05:55] Well, I usually tell folks to do this one thing. I say, when you are done with your study, your program, whatever you're doing. Let's say that you put together a program to eliminate something like food insecurity, right?
So like, that's your end goal. I ask them to think: if there's a reporter who's asking you, tell me what you did here, what's your main finding? I ask them for that headline. If you can think about what you want your headline to be, this very pithy, one-sentence thing, like, you know, "36% of kids now are able to eat three meals a day."
That's your headline. If that's your headline, we know where we're going, right? So that's generally my first rule of thumb. I ask them to think about: don't think about where we are right now. Think about where you want to be in the story that you're going to tell the reporter.
Adam: [00:06:41] Great. And so then you start with that headline and you kind of work back from there of what data you need to collect.
Abel: [00:06:47] Exactly. Yep. Because then that informs me in terms of thinking about how they're thinking about their impact. Cause, like, you know, when you're so close to the problem, whatever you're trying to solve, you have so much knowledge that it actually gets in your way. And I use this analogy all the time, but it's like when you learn to read.
You can never look at letters on a page again and not read them, right? The automaticity of it gets in your way, and it's a beautiful thing, right? As you become an expert, you actually become more of a problem for yourself. So, by taking that, you know, here's-the-headline approach, here's that big aerial view.
You can have a person who's like me, or someone who was like me, and say, okay, that's awesome. I understand now how you're quantifying impact. Let's go further. Right? And then, yeah, we just keep on tracing all the way back to the very beginning. That's my strategy.
Adam: [00:07:36] Okay, cool. Now when you, when you're looking at this and breaking it down, what are different components that you use to actually measure the impact.
Abel: [00:07:45] Hmm. Do you mean like survey instruments, things like that?
Adam: [00:07:48] Like, do you do surveys? Do you have people put new systems in place for collecting data?
Like, what does this kind of realm look like in terms of how do I measure impact?
Abel: [00:07:59] Gotcha. Ah, that's a really good question. So it really should be driven in large part by the questions that you want to answer. So, just to continue with the food insecurity example: if you know that you want to say something about the number of kids or families that you want to reach, and you want to see if your program has actually eliminated hunger, there's a couple of ways you can do that, right? You can say something like, I'm going to collect how many meals these kids ate, right? And you could do that with a survey; that would be pretty simple. Or you can say, what does hunger mean?
I'm going to do targeted interviews and focus groups, you know what I mean? Like, there's a richness there. I mean, ideally, you know, you would have multiple ways of collecting data. But if you have to choose just a few, it should really be driven by your research question and, you know, what you really want to be able to say.
Adam: [00:08:47] And now you also do work helping people craft surveys which are a lot more effective.
Abel: [00:08:53] Yes. I actually came up with some rules of thumb that are based on some examples that I've seen, where folks are, just like me, doing their best job at creating surveys. And it seems like it would be really easy to just be like, I'm going to create a survey.
I'm going to put, like, you know, ten questions there, and it should be great. And the problem is that data is a powerful thing, but it is only as powerful as the research design. Like, if your research design is flawed initially, even if you have the best intentions, what you can say from that data is, right off the bat, going to be limited.
Adam: [00:09:26] Now, I've found this time and again. You know, I have no background in running surveys, and, you know, I'll attend a GiveBackHack where you're kind of pushed to put a survey out and get some feedback. And, you know, even with our social enterprise, Wild Tiger Tees, like, we're getting feedback on things, and sometimes I look at the questions and I'm like, what are we actually asking?
And I have no idea how to structure that. So I totally relate to this topic, which is like, okay, where do I even start?
Abel: [00:09:52] Yes. And I think it really can be overwhelming. My sort of general rule of thumb, like the very biggest one, is to write the survey from the point of view of the most diverse person who's going to take it.
Like, so, okay. A lot of the research, especially in psychology, will be done by cisgender, straight, white males. That's just how it has historically been, right? So, if that's your vantage point, you want to write the survey not for folks who are just like you, but for the most diverse example of, you know, who you could represent.
Right? I think that always allows you to capture a richer portrait of your audience. That's, like, the first kind of rule of thumb.
Adam: [00:10:36] That's really cool.
Abel: [00:10:37] Well, so, for a few different reasons. For me personally, there's a psychological concept called priming.
And so priming is basically like, if you and I talk today about food insecurity in this podcast, later on, you're going to be primed, so if anything has to do with food insecurity, you're going to be more likely to, like, see that. So a priming effect also occurs when you're taking a survey.
So if, right off the bat, I'm taking a survey, and the first question, a lot of folks will start with demographics, right? So they're like, okay, state your race. If, as soon as I get to that question, I see that there's only, like, you know, white, African American, Hispanic, and then, like, other, I already am viewing this survey with a different lens, which is that the person who created this survey is not really thinking about the richness of the participants, right? You should allow folks to have multiple options in terms of race and ethnicity, to select whichever ones speak to them, and also have a write-in response. So, that's why you write for the most diverse person.
Because when they take that survey, you want them to feel like they're valued. Right? If, right off the bat, you know, they're already seeing, this person hasn't even considered me, you know what I mean? Like, then I myself, I know that I'd answer differently than I would normally answer.
Adam: [00:11:52] So to get people to answer with their heart and feel like they're part of the survey, you have to make them feel like they're included.
Abel: [00:11:59] Exactly. Yes. Yeah, exactly.
Adam: [00:12:01] That is really cool. I never thought of that. This is great. This is why we're here.
Abel: [00:12:06] Exactly. Yeah. Yeah. No, and you can see I'm geeking out on this, like, so hardcore. Like, I think it's, I mean, it's really important, and, you know, it's a passion for me. It really is.
Adam: [00:12:15] That's very cool. Now, you said you had some other rules of thumb in terms of running a survey.
Abel: [00:12:21] I do. So, this one actually is a really simple one, and I've seen it, I actually just saw it this week, so it was really fresh in my mind. So one thing that folks will do is they'll say, here, mark your age.
And they'll give, you know, 18 to 21, 21 to 22, which would actually be problematic anyway because of the 21 being in two spots. But let's put that aside. When you give folks an age range, you're already collapsing those categories before you even, you know, start taking the data. And what I mean by that is, it's better, instead of giving someone a range of ages to select, to ask them to state their birthday. When you do that, you could, you know, do the calculation, so whatever today's date is minus their birthday, and get their actual age in number, like, days-wise. Right? That allows you to see if there is a real difference between folks who are 18 and those who are closer to 20, you know what I mean?
Like, there's not this arbitrary collapsing of ages, and it doesn't save you very much time in terms of data cleaning, really, to not allow someone to just tell you their birthday.
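The date arithmetic Abel describes here is straightforward to implement. A minimal sketch in Python (the function names and example dates are ours, for illustration only):

```python
from datetime import date

def age_in_days(birthdate: date, today: date) -> int:
    # Today's date minus the respondent's birthday, as Abel describes.
    return (today - birthdate).days

def age_in_years(birthdate: date, today: date) -> int:
    # Whole years, subtracting one if the birthday hasn't happened yet this year.
    had_birthday = (today.month, today.day) >= (birthdate.month, birthdate.day)
    return today.year - birthdate.year - (0 if had_birthday else 1)

# Two respondents who would collapse into the same "18-21" bucket
# but are actually two years apart:
survey_date = date(2019, 12, 1)
print(age_in_years(date(2001, 6, 15), survey_date))  # 18
print(age_in_years(date(1999, 6, 15), survey_date))  # 20
```

Collecting the birthday keeps the raw information, so you can always bucket ages later; collecting a bucket throws the detail away for good.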
Adam: [00:13:23] That is very cool.
Abel: [00:13:24] So it's small things.
Adam: [00:13:26] Now, when you're asking these questions that are about the person, is it usually better to ask at the beginning or at the end? Does it matter where in the survey you actually put these questions that are more about who you are, as opposed to the survey of what you're trying to get to?
Abel: [00:13:41] Yes. Thank you for saying that. That is so important. Ordering of the questions is incredibly important and something that's often overlooked. Generally speaking, it is best practice to put demographic stuff at the very end of your survey.
There's also work by these researchers, I think they're at Stanford: Claude Steele and Joshua Aronson. They had this idea, and it has turned out to be true time and again in terms of the research, which is stereotype threat. So if you are under stereotype threat, you know, if you think about some of your demographic characteristics, and you know that there are certain stereotypes associated with your group, you're more likely to answer, for instance, an academic achievement test potentially worse than you would've if you hadn't been thinking about your demographic characteristics. So it's good practice to just put those at the end.
Adam: [00:14:21] Cool. Very cool.
Abel: [00:14:24] Thanks.
Adam: [00:14:28] So, continuing on. How many rules, like, I didn't ask this at the beginning, how many rules of thumb do you have?
Abel: [00:14:35] I just have one left. And that is, oh, this one actually again came up this week, and so I jotted it down. Actually, is it okay if I give two more examples?
Adam: [00:14:42] Of course just ask.
Abel: [00:14:44] This one is really just giving someone an anchor. And what I mean by that is, I was reading a survey and it said something like, rate your level of fitness, and then it was one, not very fit; five, very fit. That, you know, on the face of it, seems like a fine question.
However, if I'm the participant, I don't know, like, what does it mean to be very unfit? So I would encourage you to say something like, one, very unfit, and then in parentheses something like, e.g., I couldn't run a mile right now if my life depended on it. You know what I mean? Something like that.
Right? Five, I can run a mile without any difficulty. And just give folks something to understand it. Otherwise, you have people who are selecting one who might actually be, you know, more towards a three, and then people who are selecting a three that are probably, you know, more towards a two. And so, you know, you help people make good choices, you know what I mean?
Like, you help them reflect their truest, truest selves.
Adam: [00:15:38] So it's like coming up with a rubric of sorts that says, here's what one means, here's what five means. So it's not just something that's a sliding scale.
Abel: [00:15:48] Yeah. You don't want folks to be answering your questions blind, for sure. You know what I mean? Like, without any sort of understanding of what you mean, right?
And then it forces you to think about, what do I mean, you know?
Adam: [00:15:59] Now, so this is a question we've been having at our Wild Tiger Tees. At the end of each work session, we give the youth feedback on a scale from one to five on kind of four dimensions: their communication, their teamwork, their effort, and, I forget the last one. We do this every week.
And what we found is, each of us running the sessions rates everyone differently, right?
And so we have no way of saying, here's, here's when you get a five, or when you get a a four, or when you get a one. How do you overcome that in a survey?
Abel: [00:16:47] So, help me understand that a little bit better. I'm sorry.
Adam: [00:16:51] Okay. So this piggybacks a little bit on what you said about giving that context for it. I guess, in the sense, we have different people who are surveying the youth that we work with. And it's like, how do we make sure that we give a consistent rating among all of us?
Abel: [00:17:08] Gotcha. Okay. So this is super common in research, for sure. We call it interrater reliability, right? So you want to make sure that your five and my five are the same. Right? So, generally speaking, you would basically practice: you would, you know, each look at a person's behavior.
Maybe you would do 10, something like that. And, you know, you would basically have all of you rate this person, explain why you rated them that way, and come up together with, like, this is what five behavior looks like, and be very specific about it. And basically you would just continue to do that over and over again with your percent of agreement.
Like, yes, you and I both rated this person, or, you know, 10 people, the exact same way. You want to get as close to 100% as you can, but generally speaking, you know, 80% agreement is acceptable in research. I think some people will even say 70%. But obviously, you want your numbers and my numbers, like you said, you want fives to mean the same thing across raters.
Yeah. And that's something that, I mean, it's tricky, but it's, I mean, it's everything, right? You know what I mean? Like, it's everything.
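The percent-of-agreement check Abel outlines is easy to compute. A minimal sketch in Python (the rater names and scores are invented for illustration):

```python
def percent_agreement(rater_a, rater_b):
    # Fraction of items two raters scored identically:
    # the simplest form of interrater reliability.
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same set of items")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Ten practice ratings on the 1-5 scale, one list per rater:
optimist = [5, 5, 4, 5, 3, 5, 4, 5, 5, 4]
skeptic  = [4, 5, 4, 4, 3, 5, 3, 5, 5, 4]
print(percent_agreement(optimist, skeptic))  # 0.7 -- below the ~80% benchmark
```

Raters would review the disagreeing items together, tighten the rubric, and re-rate until agreement reaches the benchmark.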
Adam: [00:18:11] Yeah, that makes a lot of sense. And I suffer from being very optimistic, so I tend to be like, yeah, five for everybody. And somebody else comes in and they're like, well, four is kind of my standard. And then five, if you're exceptional, and it's like, okay, so that makes a lot of sense. That actually just coming up with some case studies and collectively going over them and seeing where we fall and then kind of revising that so that we're all on the same page.
Abel: [00:18:35] Exactly. Yup. Yup. I think that that is a good exercise, you know, for you to really put this stuff into practice.
Adam: [00:18:40] No, I love it. Thank you. That's something that we've been talking about for a long time, trying to figure out. So...
Abel: [00:18:46] I think you're miles ahead in that you're thinking about that.
I think that's great.
Adam: [00:18:49] Cool. Now, you said you have one more rule of thumb for writing good surveys.
Abel: [00:18:56] This is the very last one. It's another example, and it's about being specific. So, if I can leave you with, you know, sort of the overall three: one is just thinking about the order of the questions, another is thinking about your audience, and then the third one is being specific.
So, I was working on a really massive project, and I was reading through the surveys. I was the analyst, so I didn't actually get to design the survey; I just had the data, you know, towards the end. And I noticed that they asked participants: how much did you spend of your stipend on groceries, on car payments, on whatever?
Right? But think about this: how much did you spend? Now, do we mean percentage, or do we mean dollars? So when I was looking at the data, it looked like some folks had thought about it as, how much of my stipend do I spend on this, as a percentage. And it was like, somebody would put one or five, and I'm like, is that $5? Is that 5%? You know what I mean? So, being really specific in terms of what you're really wanting folks to do. You know, how often do you do X? Or how many hours do you work? How many hours of work in a day? In a week? You know what I mean? Like, really, you want to be as specific as possible. So that's what I will leave you with.
Adam: [00:20:05] That is really cool, and that makes sense. It's something that you just don't think about when you're writing a survey question. Now, how do you balance that with being able to write a question that captures everything and is not too narrow? Right? So, like, if I'm writing a question, I think in my head, well, there's option A and there's option B, but I'm not even considering C, D, and E. Like, how do you get out of that to really, you know, capture things which might not fall into those specific options?
Abel: [00:20:37] That's awesome. You're asking all the right questions. So I feel like that's a two-pronged answer. The first one is, I always tell folks, pilot your questions first. You know, ask the most open-ended question.
And see what answer choices folks will give you. So if I want to know about, you know, again, like, food insecurity, just to continue on with that one, I want to, you know, pilot all the questions on the survey that I think I want to ask. Maybe I can give it to 10 or 20 people, and then I will let them write in their answer choices to see what are the possible scenarios that I haven't thought of.
Now, there's always a chance that you will not capture it in just your thinking. And so, in my opinion, you always leave a spot for folks to write in their responses. Now, yes, this becomes hard to clean if there's, like, thousands of participants, but there is a richness there, and there's a reason to allow them to be able to give you a very full and complete answer.
Adam: [00:21:32] Great. That is really cool. Always thinking of the background side.
Abel: [00:21:36] Of course.
Adam: [00:21:37] Okay, so great. Now that we've kind of gone over these tips, can we just review them one more time?
Abel: [00:21:44] Sure. The very first one is writing your survey for the most diverse person that you think will be taking it. So that's the sort of consider-your-audience viewpoint.
The second is considering the order of your questions. So remember, think about not priming your participant in a certain way by just, you know, putting demographics first or something like that, or asking them in a way that doesn't feel authentic to them. And then the third is: be specific.
So whatever you think that you're asking, you know, really make sure that you are asking it in the way that will give you the most specific data.
Adam: [00:22:16] Cool. Now another question for you is like, how do you find a diverse set of people to survey? So if you're in a crunch, you know, it's like, okay, I know my friends and my family and my network.
Are there ways to actually reach out and capture a broader population when you're doing a survey?
Abel: [00:22:32] Yeah, definitely. There is an approach called the snowball method. Now, the complication with this is that the sample that you select is going to be the only one that you can really generalize to, unless it's a random sample. So it becomes challenging. But if it didn't matter that you had sort of what's called a convenience sample, you could stick something on Facebook and say, hey, look, I have a survey.
This is the link to it. Oh, wait, maybe not Facebook, depending on what your views on it are; maybe just good old-fashioned email. But however you do it, you can actually do that and say, can you also send it to five more people? So that's called the snowball method. It can give you at least a glimpse into, you know, or some insights into, your data, if it doesn't matter that it's not a highly representative population.
The problem with that, obviously, is that, you know, there is some bias there, right? Like, if you only happen to know people who are the same age or, you know, education level, things like that, then of course that might skew your data a little bit.
But that's one method that you can actually employ.
Adam: [00:23:34] Got it. And now when you're doing this research at a university level, like how do you get a random sample to actually survey?
Abel: [00:23:44] Again, this is a complicated stuff here. It is, it is certainly not easy. I mean, a truly random sample is almost, it's not impossible, certainly, but it's, it's very hard to find because you would literally have to say, if I'm going to survey, I'm going to take a random survey of of OSU.
Let's just say, even, let's just say OSU is a big one, right? You would literally have to have it so that every single person who, okay. Let's see. We have to even make it smaller now, right? If our universe is OSU and we want to look at students there you know, so we've already kind of cut our universe down into the smaller population, but it would have to literally be, every single student at OSU was like basically put into like, imagine like drawing a number out of a hat.
So everyone is assigned a number, they're all put into a hat, and you would have to select a number from the hat. Let's say student A is now selected for our study. But for it to be truly random, we have to actually put student A back into the hat, so there is the probability of drawing them again, because every single person is supposed to have an equal chance of being drawn each time, probability-wise, to get a truly random sample. I'm sorry, this is so detailed, but you can imagine how that becomes really hard as the sample gets larger. So a truly random sample is very hard. A lot of the data that I analyze and have expertise in are these large-scale, nationally representative data sets that have been conducted by folks like the Institute for Education Statistics, Department of Education.
These folks have lots of people on the ground who are able to collect that sort of random sample. The average person is probably not going to have
a truly random sample, and that's actually okay if you know that going in. Of course, you want your sample to best represent the people you want to speak to.
And so I think that's a really important thing.
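In case it helps to see Abel's "numbers in a hat" procedure concretely, here is a minimal Python sketch of uniform sampling with replacement; the population and sample sizes are made up for illustration:

```python
import random

def draw_with_replacement(population, k, seed=None):
    """The 'numbers in a hat' procedure: every member has an equal
    chance on every draw, because each draw goes back into the hat."""
    rng = random.Random(seed)  # seed is only for reproducibility
    # random.choices() samples uniformly *with* replacement
    return rng.choices(population, k=k)

# Imagine the universe is every student at a large university.
students = [f"student_{i}" for i in range(50_000)]
sample = draw_with_replacement(students, k=400, seed=1)
print(len(sample))  # 400 respondents drawn uniformly at random
```

As Abel notes, doing this with real people (rather than a list in memory) is what makes a truly random sample so hard in practice: you need a complete roster of the population and a way to reach whoever is drawn.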
Adam: [00:25:32] Got it. So I guess that goes back to keeping your mind on what you're trying to achieve with the survey.
If you're trying to pull in information from a community that you're working in, then getting a sample from there is going to give you a starting point.
Abel: [00:25:47] Yeah. That's such a nice way of saying that without all the jargon. Yes, exactly. Yes.
Adam: [00:25:53] Cool. Now I'm just curious if there are any things that you've seen people do in surveys that are really bad for them or lead to really bad data.
Abel: [00:26:01] You know, one right off the bat, honestly: I was analyzing this data once for a job that I had.
And I was looking at the surveys, and I couldn't find the date anywhere. If you're creating a survey, you might not think, oh, it's important to put the date on there. But if I'm looking for how much this person changed in six months, and I don't actually know exactly when this was collected, that is a problem, right?
Or you have no way of matching up the survey with the person who took it. It's good practice to not have a name attached to a survey if the data is sensitive, but at least give them a number or something like that to correspond to. So it's those little things
that you might not think about, putting the date, putting a place for the participant's number, that really matter in the long run. In particular, not knowing when something was collected, that is a pretty bad one. Yeah.
I mean, because there's no way to correct it after the fact.
Adam: [00:26:57] And then a question on length. Is there any guideline on how long it should take somebody to do a survey? If it's a really long survey, does that degrade the quality of answers over time? Are there issues with just how much data you're trying to collect at once?
Abel: [00:27:16] Oh, I'm so glad you asked about this. This is actually on my website, and it's something that I really care a lot about. So when you create a survey, especially when you are taking the vantage point of just a researcher, right, and you're just creating the survey, you want to ask as many questions as possible, right?
Because you have the sample. And there is a real danger in that. Not only do you fatigue your participants, but, if I may: a lot of the research that I've done has been with folks who are living in poverty or economically disadvantaged. And we're studying them, we have them in our studies, because we care.
Right? But I always caution researchers, in particular when you're working with folks who are in your study because they are economically disadvantaged. Giving them a 15-page survey and saying, we'll give you a $10 gift card, I personally find to be offensive. You are asking folks to take of their time, and they're giving time that lots of them don't really have freely to give.
And then you're asking them all of these questions that you probably don't even need for that study. I always encourage researchers, again, to look at the end, say, where do you want to go? And then just ask those questions. I think it really shows your participants that you care about them as individuals and not just data points, right?
They're not just averages for you. They're real people filling out your survey. And so I think best practice is: the fewer questions you can ask to meet your needs, the better.
Adam: [00:28:51] All right. So, great, thank you so much for sharing all this information about how to create a good survey. Bringing this back to Compelling Analytics: where do you see your company going?
Abel: [00:29:23] Well, I am really optimistic and hopeful.
The first order of business for me is really making connections with folks who are doing really important work in Columbus. I've already had really awesome conversations with people in the nonprofit field who are doing things with folks who are LGBTQIA, who are trying to solve huge problems around childcare availability and accessibility, and food insecurity.
So I'm already partnering with folks, and we're thinking about how I can best help them meet their needs. For me, that's my very next step. Long-term, I'd love to be able to bring in folks to help me do this good work. In particular, I'd love to have Compelling Analytics be a spot where folks who are formerly incarcerated could come and learn data mining skills or good survey design.
Youth who are homeless, anyone in the LGBTQIA population, folks of color. I really want Compelling Analytics to be something that obviously grows into something much larger than me, and also really does the things that speak to my heart.
Adam: [00:30:28] Fantastic. Thank you very much for coming out and talking to me today. It's been really informative and very interesting.
Abel: [00:30:36] Thank you very much for having me. It is a true pleasure.
Adam: [00:30:39] Thank you very much.