Episode 5 | Why Evidence Matters

Welcome to EduCast 3000. It's the most transformative time in the history of education. So join us as we break down the fourth wall and reflect on what's happening. The good, the bad, and even the chaotic. Here are your hosts, Melissa Loble and Ryan Lufkin. Hello and welcome to this episode of the Instructure Cast. We are so glad to have an amazing guest join us this morning, directly from the team here at Instructure. She's going to talk to us about the importance of evidence in ed tech decision making. Welcome, Dr. Mary Styers. Thank you for having me. Excellent. Mary, you lead Instructure's research team, specifically focused on partnering with institutions, foundations, and ed tech companies to establish and grow evidence to enable evidence-based decision making. That's something that Melissa and I spend a lot of time talking about, data and everything. How did you become so passionate about research? How did you find your way into this role at Instructure? So my father, you know, before he retired, was a neuroscientist who focused his efforts on all types of applied research. And I was exposed to that from a young age, including the lab coats and petri dishes and conferences. And interestingly enough, I thought about pursuing biological psychology, actually. And then when I went to college, life had other plans for me. One of the big things was that I don't handle blood well, so I realized I couldn't proceed down that path. That could be a problem, yeah. Yeah, yeah. So I was still very interested in psychology. I had great experiences in high school. I had a fantastic high school AP teacher in psychology. And so I landed in developmental psychology, because I was very interested in all the different areas and facets of psychology and really thinking about it through a developmental lens. I pursued my master's and doctorate in developmental psychology. And then when I was in graduate school, I really focused heavily on child and adolescent resilience in the face of negative life events.
What a time to focus on that. That's amazing timing. That's true, I know. I know in light of everything going on, this was pre-COVID, but certainly, you know, very focused on child well-being. And I'm still very interested in that. But I was slightly discouraged when I was in grad school, because I didn't see at the time how I could translate my research to the real world. It felt very abstract. And I wanted to do something a little bit more than that. So I was working in the educational psychology department at NC State, and I taught several courses to future educators in educational psychology and adolescent development. And while there, you know, I realized that I wanted to be in an applied setting and work directly with educators and feel like I was having a positive impact. I didn't at the time see that pathway for me to be a teacher myself, but I really wanted to work with and help teachers. So after graduate school, I went to work for a small woman-owned research and evaluation company for over a decade. While there, I learned a lot about rigorous education research and program evaluation. I partnered with everyone from museum educators and researchers to professors and others focused on community colleges, and did a lot of work with different providers and large research organizations. And I learned a lot there about needs-based research and writing research findings directly for those practitioners and people who could use them and put them into practice. But after 11 years there, I found that I wanted to work more directly with educators and administrators. And that's when I found Learn Platform. At Learn Platform, we did get the opportunity to work directly with education administrators and educators to really empower them in conducting research, examining whether or not products were actually working for them in their local context. So I found that particularly valuable.
We were also helping them to kind of manage all these different ed tech tools, which I know we're going to talk about here in a little bit. You know, we've continued this work at Learn Platform by Instructure as we've been acquired and merged into the Instructure team, and expanded to work with ed tech providers through Evidence as a Service. And I've really found in the past five years of my professional career that having the opportunity to work in close collaboration with district administrators has been very humbling, because they've told me, we don't understand you, right? Sometimes you need to break that down a little bit more for us. And so what I thought were clear concepts weren't so clear. And it's really taught me to speak in a way that ensures research can actually be used by decision makers. And I've seen the same transition with these 13 incredible researchers that I work with on my research team as well. They're honest, critical, collaborative, but I've seen that same shift for them to really make sure that research is accessible. Really, at the end of the day, all of us are in it for the students, and that's why I care so much about evidence around learning and what is impactful, and really working directly with districts and helping them to unpack and understand that. That's amazing. Well, and for our listeners who may not know, Instructure acquired Learn Platform, which has been almost two years, I think, at this point. Correct, yes. It's an incredibly valuable piece now of the Instructure ecosystem, and so we're excited to have your team be part of that. Yeah, and your journey is steeped in research, Mary. And I love that. I love that, down to the lab coats when you were a kid. You've also talked a lot about your background and your passion for learning and passion for education. And we always like to ask our guests, just to get to know them a little better,
whether they have a favorite learning moment, and it could be you delivering one, it could be you being part of one, it could be witnessing one. Do you happen to have a favorite learning moment you could share with us? Yes, of course. And it's funny, because it's difficult for me to pick a single one. So I'm going to share two and give kind of shorter examples of each. Great. So, you know, I'll start off with high school. As I mentioned, I had a wonderful AP psychology teacher. Her name was Miss Faye Johnson. I think she is now retired from Montgomery County Public Schools, but she was incredible. You know, her passion just really shined through every day. And I think it made us all very curious and very interested in psychology. She helped make concepts that would otherwise be inaccessible accessible. So she talked a lot about biological psychology and broke some of those concepts down into ways that we as high schoolers could easily understand and apply and think about in our local settings. So, you know, I was particularly excited by her work and by the way that she presented things to us and made things accessible. And that's ultimately why I pursued psychology. In undergrad I originally thought about biological psychology, as I noted, but then realized that wouldn't work out for me, which is fine. I'm happy with the path that this led me to. I've been very excited to work directly with educators and students. But the other thing I'll share is that when I was in graduate school, I had the pleasure of working as a teaching assistant for Dr. Jason Osborne in educational psychology. And really, he introduced me to the concept of mastery learning, which I hadn't really experienced firsthand as a learner myself until then. So I was really impressed with his ability to kind of flip the script in education and treat tests actually as an opportunity for mastery, which I thought was incredible. So students could take a test multiple times.
I think it was up to four times, to really show that they had mastered the content. And I thought that was incredible, thanks to this huge item bank that he had developed. And I had another graduate school instructor, Dr. Rupert Nacoste. He was a social psychology instructor at NC State. He also challenged me in ways that I had never encountered before, you know, even through all of my K-12 and undergraduate career, but he gave me repeated opportunities to show mastery, even when I fell short, quite honestly. Up until that time working with them, the way I'd experienced education was really through this fixed mindset lens. So you either passed the test or you didn't, right? You either understood or you did not. But both of these experiences in graduate school really shaped my belief in this idea that everyone has potential, and really solidified this growth mindset for me, which has really influenced the way that I work with others and think about things myself. That's fascinating. Because I've been kind of obsessed with that shift towards mastery, because I think I was one of those kids with undiagnosed ADHD, and you struggle with falling behind the curve and thinking you're not smart, right? And then you realize people learn differently. Right. And so I hadn't really thought about the work that the teams are doing behind the scenes to actually make that shift. That's fascinating. Yeah, I love that, that focus on mastery. And if I can, I'm going to shift us a little bit and stay with that focus on mastery a bit. So as part of Learn Platform, and now Learn Platform by Instructure, you have over the years been able to lead a research project that gets released on an annual basis and talks about how technologies are being used. I would argue a fair amount of those are for mastery, but even more broadly, how technology tools are being used. And how do you unpack efficacy underneath that?
So for all of our listeners, we'll share this report. It's a really insightful report. There are some statistics in there that surprised me when I read it, and our listeners know I've been working in the ed tech ecosystem for 25 years now and am very passionate about it. I do a ton of work with 1EdTech. And I read some things in this report that were really surprising to me. So I'm curious, to get us going, would you mind just sharing where the idea for this report came from? How long have you been doing it? Give us the basics on the report, and then we want to dig in a little bit to some of the things that were really interesting in it. Of course. Yeah, happy to share. So about 10 years ago now, a group of educators, administrators, and researchers came together to help local and state education agencies manage their vast ed tech ecosystems. And we realized even back then that ed tech wasn't going away, you know, and that it was becoming increasingly difficult to manage. That's something that we frequently heard. So we realized that they needed support in understanding all the different ed tech tools that were out there, and understanding not only the usage of these different tools, but the evidence around them. So that's when Learn Platform was created, so that education administrators and educators could come in and understand which tools were being used. Are they safe from a data privacy and accessibility compliance perspective? It helped them to centralize all these different tool requests and manage their hundreds of ed tech applications and contracts. And then this tool also allowed educators and administrators to understand ed tech tool usage by their students and educators, to solicit teacher perceptions, and then also conduct these rapid, rigorous research studies, which is one of the things that drew me originally to this work.
So we've actually been examining ed tech tool usage for a very long time, which I think is particularly fortunate in light of the COVID years, right? And thinking about this massive shift. We had been wondering for a while about the quality of these different tools used in classrooms from privacy, interoperability, research, and content quality perspectives. So it was actually back in 2023 that we delivered our first evidence report, which expanded on our EdTech Top 40 report. The EdTech Top 40 report has been around, I think, for about eight years, but this is our second year of the evidence report. And one of the things I found very eye-opening, and there are several pieces, Melissa, as you noted, but from a research perspective: in 2023, for our first report, we found that only 24% of the most used ed tech tools in the United States actually had any kind of rigorous evidence that their tools were research- or evidence-based, which I found incredibly troubling. You know, but one of the things I'll note is that in just one year, we've seen a 12-point increase, to 36%. So that's been significant. I think that's largely due to companies starting to seek the Digital Promise research-based design certification, to districts conducting hundreds of rapid-cycle evaluations to understand how tools are working for them, and then also, you know, to work that our research team has done in partnership with ed tech providers across the country to see how their tools are working in partnership with districts. And so we keep that collaborative relationship going. And then a couple of other quick highlights, Melissa, because I wanted to touch on a couple here too. From a data privacy and interoperability perspective, we've seen more ed tech providers actually commit to student data privacy and interoperability through things like the Student Privacy Pledge and the Project Unicorn pledge over the past year.
So actually about half of tools have committed to student privacy, and over a third have committed to interoperability. That's a big word for a Monday morning. But we've yet to see the same increases in actual certifications. So there are these promises, but following through on action is where things are falling short: only about one in ten ed tech tools have received the 1EdTech and iKeepSafe certifications for data privacy, and approximately one in ten have received the 1EdTech and Project Unicorn certifications for interoperability, which really highlights, I think, the need to put more pressure on solution providers to do more, to not only make pledges but really follow through with actions. And I think the report gives educators and administrators a lot of things that they should be asking questions about. So thinking about: are these tools respectful of student and educator data privacy? Have these tools demonstrated compliance with local, state, and federal privacy regulations? Do these tools work seamlessly together? Do these tools provide quality, standards-aligned content? Have they done rigorous research that's been validated by a third party to be evidence-based and, more importantly, to have positive impacts on student learning? I think all of these questions are really just critical to answer and really important for anyone to dig into to better understand all the different ed tech solutions they're using. Yeah, that's such a great overview. Thank you. And some really great highlights that we want to tease out. I'll also mention for our listeners, you mentioned a bunch of different certifications and pledges. We'll make sure to link all of those in our show notes as well, because that was such a great collection of ways that you can go about evaluating tools, and, for tool providers even more, ways to think about how you're building your tools so they are the most effective in the classroom.
So we'll make sure to link all of those. Yeah. Well, I think a lot of times when people hear research, they think long-term research studies. I'm kind of fascinated by that rapid evaluation that you were talking about. How do you set that up and make sure that you're getting answers in a timely fashion and on an actionable timeline, right? And what is good evidence? Yeah. So with the rapid research piece, we follow all the same steps that other research typically uses. We're just really using it for decision-making. When you think about traditional research studies, those can take anywhere from a year to, I've seen, up to four years, or even sometimes ten, right, with these longitudinal research studies. But the problem is that that's not really helpful from a decision-making perspective. With this rapid research, we're working in direct partnership with districts. They're giving us the data that they already have, their usage data and their assessment data. And then we're rapidly analyzing that data and helping them to understand, is this product really moving the needle for their students, or is this set of products doing that? So it's approaching research in the same way; it's just really much more of a collaborative partnership. And then, thinking about good evidence, I feel like classrooms are complex, and ed tech is no different. I think there are a lot of things to consider from a good-evidence perspective. There have been some recent articles, I don't know if you saw, in the New York Times by Jessica Grose, which really suggest that ed tech tools don't improve student outcomes and call for rigorous reviews and oversight. And I really do agree in principle that not all evidence is created equal. And there's just too much out there to sift through, right? Melissa, you talked about me mentioning all these different resources to review. There's a lot, right?
But I think, in my opinion, a tool that has good or strong evidence has shown commitments and positive action when it comes to things like data privacy, interoperability, content quality, and research. From a research perspective, pardon my bias, all tools, even ones new to the market, should be research-based. And that's not difficult to do. It's really easy to clearly document how the product should help student outcomes in what we call a well-defined logic model or a program roadmap. And I would argue that even startups should have that ability and should ground their product in research, right? There's a ton of research out there done by just incredible folks at all these different universities and large research organizations, and we can't disregard that. We really should be incorporating that into our product development. But product providers shouldn't stop there. They should look to see if tools have a positive influence on student outcomes by conducting things like correlational, quasi-experimental, and experimental studies. And this research progression isn't anything new. In fact, the Every Student Succeeds Act outlines this evidence-based progression, from a foundational logic model up to ESSA Tier 3, which is a correlational study asking, do students who use this product have better outcomes? And then ESSA Tiers 2 and 1 are referred to as quasi-experimental and experimental, or randomized controlled trial, where we're asking essentially, do students who receive the product outperform those who don't? And the key difference between these top two tiers is whether or not you're randomizing students to receive the intervention. But circling all the way back, I think one of the things that I've learned from working in close partnership with districts and educators is that an ed tech tool may work in one setting and not in another. And there could be a number of reasons for that.
It could be implementation. It could be professional development or other things like that. But really, a single study, no matter how well done, will not prove that a product will have a positive impact everywhere, or in your district. You really need this preponderance of evidence. You need to understand, has there been research in a context similar to mine? But then also, has this research been done with my students? Have I looked at the data to see if it's moving the needle? And I think that's really interesting, because if you think about how districts are making decisions, they're just grabbing onto anything they can, I think, in some places, and not having that kind of access to research in a context similar to theirs, and really understanding what that outcome might be by adopting that tool. And that gets compounded by the fact that there are thousands upon thousands of tools that are reaching not just districts and administrators, but the teachers who go adopt them every day. I mean, there is so much out there. Another area that surprised me in the report was about the volume of tools. Would you mind sharing just a little bit about what the report surfaced around that volume? I mean, some of the numbers are astonishing. And were you surprised, as you all dug into that, by the volume of tools being used? Yes, happy to share. So, you know, in the 2022-23 school year, school districts were accessing an average of 2,591 tools during the school year, which is incredible. And on top of that, we did see that students and educators themselves were accessing 42 unique ed tech tools over the school year. I just think about my kids accessing 42 different tools, and that's just incredibly high. And I wish that I could say that I'm surprised by this, but I'm not. We've been tracking those numbers, as I mentioned, even before COVID. And we've seen this continued increase.
It certainly accelerated, right, post-COVID. But it started out that, you know, back in 2016, 2017, we were seeing 300 tools used across a district, which is really incredibly high when you think about it, even before, you know, these computers and everything were rolled out extensively across settings. But what has been surprising to me as well is that, spoiler alert, we're working on the next EdTech Top 40, and we've seen that even though there's this ESSER funding cliff coming, the use of so many different ed tech tools has actually not slowed down. It actually continues to grow. So we'll be releasing that this summer. Well, and we've seen just an explosion of new AI-based tools, right? That's one of the things that I think has been so interesting. Not since the dot-com boom do I think I've seen so many startups throwing ed tech tools out there, but it didn't start with AI. This has been around, like you mentioned, for a while. What is causing this, in your opinion, what do you think is causing this proliferation of tools? Yeah, that's a fantastic question. I think that there are a lot of reasons, right? I think that education is complex, and so, similarly, my answer here is a little bit complex, in my opinion. But I do think that educators find a lot of different tools online. You know, maybe they hear about them from their peers, or students may find tools and request them, or potentially, you know, parents have mentioned things. I've heard that as well. And I think, with funding for new materials being on the lower end, educators are forced to look around and search for ways to supplement their instruction. And there are a ton of free tools out there. But the problem is that we all know free isn't free. There are concerns there about data privacy, just to name one. But there are others. And we know also that we as a research community have done an exceptionally poor job of making research on the effectiveness of these tools accessible.
I may understand, when looking at a tool, that it has no evidence or very poor evidence, but many of these findings aren't translated for educators and practitioners to easily access, right? They may not have a lot of that information and that evidence around data privacy and interoperability from all these different groups. There are so many different groups, but I do believe that Learn Platform and the ISTE EdSurge product index, they recently have updated their index, are trying to kind of pull together all this information and provide information on not only research evidence, but also data privacy and interoperability, in a quick way for educators and administrators to go to and at least see, you know, do these tools that are being used meet these basic requirements when it comes to evidence? Yeah, that's one we'll definitely list in the show notes. Oh yeah. It's such a great resource. And I'm curious. I mean, this is just, again, such an interesting time for districts in particular to think about how they are building a student learning journey. As you talked through that, I just started to think about all the risks associated with that volume of tools. What have you seen, or what do you think, both on the risk side and then even on the positive side of all these tools? And you even mentioned, you know, thinking about your own children and all of the tools that they're accessing. Like, how do you reconcile this world and this volume of tools? Yeah, that's a great question. I think there are several concerns, but I do have pluses to this too, so I promise I will get to that. I think one is that it's difficult for administrators, we've talked about this earlier, it's difficult for administrators to understand what is being used, and then to even vet and understand the full body of evidence around all these different solutions.
And then additionally, I think there are real issues with an overload of multiple tools for ELA, multiple tools for math in a district, which can really create confusion and a lot of distraction. And we often work with districts, you know, who tell us, I have multiple ELA or math tools; which of these are actually really moving the needle and having a positive impact on student learning? So some of it is, is this product really worth my investment? And I think that's a fair question. You know, I worked with a charter school network a few years ago that was actually evaluating the effectiveness of an ed tech tool for their students in special education. They really wanted to see a positive impact because they were actually spending 25% of their annual budget on this tool. So they wanted to ensure that it was working for their students. I know I was shocked when that was pointed out. So I wanted to see that positive impact too. So we did that rapid research, Ryan, as I alluded to earlier. We looked at their data, and we found that students who did not use the tool, these students in special education who were just receiving support from their classroom teachers, actually ended up having much higher achievement than those who did use this really expensive tool. And it was actually, I think, a 14-percentile-point difference. Which is, in other words, not insignificant. No, it's not, because it means that if a student at the 50th percentile were to receive this intervention, they would actually perform at the 36th percentile. Right. And so that tool was actually not helping their students. It was much more effective for them to just use this teacher-instruction-based approach. And so they changed things there. But we've also seen positive impacts of ed tech. So I don't want to say that all ed tech solutions are bad. We've actually seen a lot of positive progress. We've actually worked with several online tutoring providers in the past several years post-COVID.
There's been a huge surge in interest in tutoring solutions. And we've seen that online tutoring solutions, when they're implemented well, can actually have very strong positive impacts on student GPAs and student performance on different achievement tests. And, kind of another spoiler alert here, as a follow-up to our evidence report, some people on my team have taken a look at the effectiveness of online tutoring solutions, because previous research has actually focused heavily on in-person tutoring. So we took a look at online tutoring and have some interesting results there that I'm excited to share out soon. I was encouraged by those results, so I will put that plug in there. And then one more thing, just from a continued positive lens: I think that increased competition in the market really could ultimately be a good thing, because we may start to see that the tools being used are ones that are actually better suited to the needs of learners across the country, and internationally too, right? They hopefully have a better user experience. We'd see higher quality content. We'd see greater compliance, while also really ensuring that these tools actually have positive impacts on student learning, additive really to the incredible impact that we know educators and administrators already have on students. I've always had a little concern about some of those conversations around the negative aspects of technology, because I do think that, you know, my kids are digitally native, right? They're 19 and 13 years old, and they are on their screens all the time. We need to meet students where they are. But I love the idea that, you know, this conversation is around, we need to do it effectively; not all tools are created equal. I think this is incredibly important. And you mentioned a little bit earlier that ESSER funds have dried up, yet we're not necessarily seeing a decline in the number of tools that are being used.
What can the industry do to move in the right direction there and limit those, or maybe not limit, maybe the word is choose, the most effective tools? Right. Yeah, I think nothing's really going to change until we start to see more demand from educators and administrators to ensure that these tools actually meet evidence standards that are set by these third parties. I think the 2024 evidence report really provides a great starting point for states and districts to start asking these questions, right, and then asking for proof. Do you have certifications by 1EdTech or Common Sense Education? Do you have certifications by Digital Promise or iKeepSafe or ISTE or Project Unicorn? There are so many, and even at Learn Platform by Instructure, we have different ESSA badges. We've also seen from districts and providers that districts and states are starting to require evidence in their RFP process, which I think is incredible. So actually, in many places, you can't sell into the district anymore unless you have that ESSA Tier 4 logic model. And in some places they say you need an ESSA Tier 2, or quasi-experimental, design in order to even be used in their district. And then I'll mention two others. In Florida and New Mexico, we've actually seen these states requiring evidence-based reading tools in K-12. They have initiatives called Just Read, Florida! and Move On When Reading. And in those, in order to get your foot in the door if you're a literacy tool, you need to show that you've had positive impacts on student learning through at least an ESSA Tier 3 or higher study. So I feel like ultimately we're all moving in the right direction, but there's still some room to grow, and we need greater demand for this evidence in different places. Gosh, I love this. And thinking about that advice, so much underlying that are collaborative partnerships in this space, right?
We're talking about different organizations that are coming out with standards and certifications. And I know a lot of those organizations work closely with districts, and work closely with the federal government or state governments. What role do partnerships play in this? And do you have advice for the broader ed tech industry on, like, how can we be more collaborative about surfacing evidence so that we can, you know, as educators, pick the best tools for our students? Yeah, I think this is a fantastic question. And it's one that we've all tried to figure out, right? Because we all work in our little silos, and it can be difficult. So I'm going to first tackle this through a research lens and then think a little bit more broadly, because, again, I'm a researcher, so I'm a little biased here. I think that as researchers, we have to do a little bit more to really meet districts and states where they are, right? I talked a little bit at the beginning about how research takes too long. We have these one-year and four-year-long studies, but we have thousands of ed tech tools out there that really need to be evaluated rigorously to see if they're actually helping students. And there's this proliferation of AI tools as well, and those will require even more rapid research. So we need to really be advancing that much faster. We also realize that research is expensive. One of the reasons I left my previous job was that I couldn't actually work directly with educators and administrators to see how products were working for them, because they couldn't afford us. And that's a problem, right? And I've seen, from a federal grant perspective, that sometimes funders will award millions of dollars for a single study. And I think that really needs to change. On top of that, research isn't always accessible. As researchers, we're trained in graduate school to write for other academics, for other researchers, right? But the issue is that that's not practical; that's not helping.
One story I'll share there is that we recently asked a researcher to unpack their findings for educators and administrators. The study was looking at the effectiveness of a tutoring solution, and the provider and the district partner had asked us to talk to the researcher about translating the results. And this researcher responded that the report isn't made for them, and so they refused to translate the findings, which was incredible to me. Yeah, it really is. And so my research team actually took the results and said, we will unpack it for you, had those conversations, and actually held a webinar to really unpack the results in a way that practitioners can understand. So I think as a research community, we need to do more. Realizing that there are so many different ed tech tools out there, we need to make research more cost effective while still maintaining rigor, and we need to really empower districts to understand how tools are working for them. And then there are things like standards. Melissa, I know that you're on the board of One Ed Tech and have talked a lot about the importance of common LTI standards, right? And I think that, similarly, we need to have common standards when it comes to evidence. There are a lot of different ones out there, and one of the things my team has been working on is looking at these ESSA badges and realizing that our interpretation of the ESSA standards for Tier 3 may differ from another group's. There's more alignment than there is disagreement, but we really need to be talking the same language and speaking from the same playbook. And then, from a broader lens, I've really been encouraged by the work of all these different foundations, organizations, districts, states, and providers across the ed tech ecosystem to really ensure that tools have strong and accessible evidence.
And it's really only through true cross-sector collaboration and communication that we can advance this. So I think collaboration, getting out of our silos, working on this together is the only way that we'll really advance this and help ensure that the tools people are using are actually impactful for learners at the end of the day. Yeah. At a time when the pace of change is faster than ever, resources seem slimmer than ever, and workloads for everyone seem bigger than ever, how do we make sure that evidence-based decision making isn't viewed as a nice-to-have that we can just gloss over, but really a must-have in these decisions? How do we support organizations in making those decisions? Yeah, or something that takes too much time, too, right? I can see people, I hate to say it this way, bringing it down to the very basics of, do I have time for that? Yeah, I think that's a great point, because as I mentioned, there are so many different places to go, and this 2024 evidence report really starts to highlight that. I think it matters because there's this massive amount of funds being spent on tools that could be harmful. And I don't want to stay negative. I think they could be very beneficial, but the problem is that we don't know, right? We haven't really been exploring it. There was a recent podcast, I don't know if you've heard of it, Emily Hanford's Sold a Story, which I think really highlights the detriment of not looking into this for so many years, and what we should be doing to really look into whether these tools are helpful. As I mentioned, too, the EdSurge Product Index is a great place, Learn Platform is a great place, but I'll just share a kind of personal example, too, about why this matters.
So as I mentioned, I'm a parent. I have two elementary school boys, and they are using personalized ed tech solutions daily for homework. Those solutions have changed year to year. They regularly come home with all types of new tools, it seems like each month. And as a parent, I start to question, and probably you all do too, right? Are these tools actually helping? Are they additive to what the teacher is providing? Are my kids growing? And one of the frustrating things I've seen is that even for these free tools that teachers may find, the provider sometimes will try to pass costs on to the parent, right? So I will see that you can get these extra incentives or special clothes for your avatar if your parents pay $5 a month. That is very troubling to me. We just need to do better at getting a handle on these ed tech tools. And I think it matters because we know that teachers really have the largest impact on student learning, and we should be viewing ed tech tools as additive to that. But if we're going to be spending all this money on them, we really need to ensure that they have strong evidence behind them. Yeah. I talk about my kids on the podcast a lot. I've got a freshman university student and a junior high school student, and I'm always amazed by the lack of communication around new tools. All of a sudden they've got a new tool they're trying to use, and they have questions that I don't know how to answer because I didn't know they were getting a new tool, right? So I think the transparency around that is really important. Well, as you were telling that personal story, my mind immediately went to equity, and how are you ensuring it? Not being a parent, and I'm open about that, there are things I just don't know about some of these tools and how to think about them or look at them.
And I never thought about that, that there would be additional things that could help a student that are unlocked for a fee. Not every parent, not every family can even afford to make those decisions. And that just creates a bigger divide around who gets access to what kind of tooling and in what settings. It makes my heart break, which leads me to my last question, and I know Ryan probably has one too, about change. And maybe this is putting your psychology hat on a little bit. There are clearly areas here where we just need to help institutions and educators understand, but then think more positively about change, right? We all get stuck in our ways, or we're afraid of change because we can't understand how it's actually going to have the right impact on what we're trying to accomplish. So, thinking again from your psychology perspective, knowing all of this as a researcher and knowing what's going on in the industry, how can we promote positive change around how tool use is being managed within districts across the US and even globally? Yeah, I think that's a great question. There are a lot of organizations that have thought long and hard about this, and I'm actually inspired by some of the work they're doing to try to change these things. The first I'll mention is the Southern Education Foundation. I don't know if you've heard about this outcomes-based contracting movement. They've actually been working with local districts and evaluating the impact of high-impact tutoring solutions, and they've been doing it, I think, for a couple of years now. Essentially, in these situations, districts have language in their contract that indicates that payment to the tutoring company is contingent on student performance.
So in other words, if students don't show those positive results, then the total district cost for the tool will be significantly lower, which I think is incredible. But the other piece that I think is important, too, is that it puts accountability on both sides. As part of this, the district also has to be committed and show that they've implemented the tool well, and then the provider has to be motivated to show those positive results. I really love that outcomes-based contracting movement and would love to see it rolled out to other areas, not just tutoring. On top of that, we recently partnered with ISTE on the EdSurge Product Index. I've talked about that a little bit on this episode, and I've really been excited by their approach to ensuring that anyone can visit their website and quickly see whether the different tools used in their classroom are certified by these different groups. I love that ISTE is trying to change this and make it much more accessible. And then third, Digital Promise and Evidence for ESSA, which is out of Johns Hopkins University. They've actually created certifications for ESSA evidence. So I've talked a little bit about our partners in this space, and the thing I love about both of these groups is that they really require rigorous research evidence in order to list solutions on their website. Now, we do know that the What Works Clearinghouse is the only federally funded group that reviews ESSA research evidence. But the problem is that the review cycle is a little sporadic, and so the most used ed tech tools are probably not going to be listed there. That's one of the reasons my team has taken a look at the What Works Clearinghouse standards and become certified in them, since they're federally required and approved. And we've ensured that the work we're doing is aligned to the What Works Clearinghouse standards, and we've validated over 100 research studies as a result of that as well.
And then finally, I wanted to share one quick other one. I've been impressed by the work of the Jacobs Foundation. I don't know if you all are familiar with them, but from an international lens, they've been laser focused on the importance of evidence for ed tech, and they work with countries across the world. They work with providers and educators and researchers and venture capitalists to really ensure that the demand for high-quality evidence continues to grow. I think they are strong investors and thought partners in this space, and they really do believe in the promise of ed tech. So really keeping that kind of positive mentality about it, right? Really wanting to ensure that it moves the needle and that it can be helpful for students across the world. Our team has been working with the Jacobs Foundation for the past several years to conduct ed tech effectiveness research, and we've also been working with UC Irvine, which is funded by the Jacobs Foundation as well through their CERES network, on different internships and collaborations in this space. I love that. It's all about surfacing really important insights and making them easily accessible and usable by practitioners. If we think about change that way, we can really start to move the needle in this space. Yeah. Mary, you've made a complex topic incredibly approachable, and I appreciate that because I don't necessarily have the pedagogical background that Melissa does. Talking to you, I just feel smarter having sat here and listened to you for a minute. But what aren't we asking? These conversations cover a lot of territory, but I always feel like we walk away going, gosh, did we miss something? Is there a gap? What else should we be talking about? Yeah. So I'll give one other bit of food for thought that I've been thinking about a lot.
And I've actually been talking with researchers about it a lot for the past couple of years. And that's this idea of the most rigorous research evidence, which relates to randomizing some students to receive an intervention that can potentially help, while leaving other students to receive nothing, for the sake of the most rigorous research and an ESSA Tier 1 design. And Melissa, to your point, that brings up serious equity concerns. Randomization is viewed as the gold standard of research, and we've talked about that for a while, but it doesn't always apply well in education. When we think about it, educators and administrators don't want to randomly assign a group of students who could benefit from something to receive nothing at all for an entire year, right? We end up leaving people out. I've had members of my research team come to me and say, I had the hardest call, because someone begged me to let their students use this tutoring solution, and I had to tell them no, right? And that's a real problem. So I feel like there's a better way. I really do. And I feel like it's all about still doing those Tier 2 and Tier 3 studies, where we're looking at whether these products are helping to move the needle for students, but without that randomization piece. We can still ask questions about how effective a product is, and whether it will work, without incorporating that piece that's really ethically fraught. Yeah, yeah, not leave anyone behind there. Yeah, that is so important. I've thought about this often, too. You want to give every student every chance. And the hope is, if you're purchasing these tools or using these free tools, that that's the point, right? That you're giving students more of a chance, every student more of a chance, to be successful. So I'd love to get into that randomization piece and try to figure out, as an industry, how can we think a little differently about that
you know, tier of research, so that we can give everybody every chance but still get the insights we need to make the right decisions. I love that. And I think that's an inspiring note to end on. Mary, thank you so much for joining us on this episode. It's been lovely to have you. Tons of resources to be linked. I think we may have set a record on the number of resources we'll be linking to in this podcast. I think so. It's just been such a pleasure to have you. And like I said, listeners, we'll be sharing all of this, including the research report Mary dug into. But thank you for being here, Mary. Thank you for having me. This was great. Thanks for listening to this episode of EduCast 3000. Don't forget to like, subscribe, and drop us a review on your favorite podcast player so you don't miss an episode. If you have a topic you'd like us to explore more, please email us at instructurecast@instructure.com, or you can drop us a line on any of the socials. You can find more contact info in the show notes. Thanks for listening, and we'll catch you on the next episode of EduCast 3000.


