Ryan Lufkin (00:01.945)
Hey there and welcome to EDUCAST 3000. I'm your co-host, Ryan Lufkin.
Melissa Loble (00:06.336)
And I'm your other co-host, Melissa Loble. And we are joined today not with just one guest, but two guests, and actually two really special guests. These are some of our colleagues here at Instructure who are specifically focused on unpacking research and understanding indicators of student performance, or, if we want to think more broadly, assessment perhaps, but it's so much more than that. And we're excited to pick their brains. There's a lot happening in the world today, as well as a lot of opportunity for how we as educators think differently about how we understand student success, learner success, and how we help each other drive toward that and meet our outcomes. So with no further ado, please welcome Russell Ligon.
He's a psychometrician here at Instructure, alongside Alexandra Lee, who's a senior researcher here at Instructure. Welcome, both.
Russell Ligon (01:10.705)
Thank you for inviting me. I'm happy to be here.
Alexandra Lee (01:13.727)
Yeah, super excited to be here and talking about how do we measure learning?
Ryan Lufkin (01:18.063)
So before we jump into today's topics, we'd love to hear a little bit more of your background and how you actually ended up in the world of assessment. So Russell, let's start with you.
Melissa Loble (01:18.687)
Love it.
Russell Ligon (01:28.827)
Sure, so my background is actually as a biologist. I have a PhD in evolutionary biology, but about five years ago I decided to make a career transition to one that would give me more flexibility, for personal and family reasons, and I switched into a role as a data scientist. I still wanted to work in a field where I was excited about contributing and giving back, something I could be proud of at the end of the day, so I was focused on green energy, biotechnology, and education. Luckily for me, I ended up at a small company called Learning Ovations, which was an education startup focused on improving childhood literacy.
So I was hired as a data scientist, but as many of you know, at a startup you have to wear many hats, and I got the opportunity to develop, become familiar with, and then put into practice psychometrics: basically, tools that would help me understand the assessments we were using, so I could better provide information to teachers to make customized, tailored instructional choices for their readers. And so that's how I got into assessment, education, and psychometrics.
Ryan Lufkin (02:45.145)
That's an incredible background. Also, by the way, I've always said that psychometrician is the best job title ever. There's just something about it: "I'm a psychometrician."
Russell Ligon (02:51.822)
It is.
It is, I love it. But I always have to like double check that I'm spelling it right whenever I have to write it down.
Ryan Lufkin (02:59.385)
Yeah, yeah. Al, how about you?
Alexandra Lee (03:04.149)
Yeah, I have a sort of winding path, but maybe more related to education than Russell's. So I really became interested in education research due to my own experiences as a classroom teacher. I was really lucky to get to teach in lots of different places. I taught in Thailand. I taught in Singapore. I moved back to the States and taught in the rural Mississippi Delta. And then I also taught in my hometown of Denver, Colorado.
And when I was teaching high school English in Mississippi, I really became interested in learning more about student motivation and how I, as a teacher, could better motivate my students. That ignited a curiosity in me that eventually led me to go back to school to pursue a PhD in educational psychology and educational technology at Michigan State University. And we are recording this on the first day of March Madness, so I have to say: go Green.
Ryan Lufkin (04:00.175)
Yeah, those games, those games start very shortly. Yes.
Melissa Loble (04:02.51)
Yes, go Spartans.
Alexandra Lee (04:04.129)
Yes, and maybe I'll kick myself later, but I'm hoping they go all the way. Yes.
Ryan Lufkin (04:08.675)
By the time this posts we'll know the answer, yeah.
Alexandra Lee (04:12.961)
And after completing my PhD, I became really interested in doing applied research in education. Similar to Russell, I wanted to do work that matters and directly impacts teachers and students in a positive way. So I started looking for jobs in industry, which moves at a fast pace and builds technology that's actually being used by teachers and students. And that's what led me to my current role as a researcher here at Instructure.
And I feel really lucky to get to do work that's meaningful to me and that I can relate to on a personal level, having been a teacher. So I really enjoy the work I get to do here, focused on assessment research and efficacy studies and those sorts of things. Assessment research is really just near and dear to my heart, since I taught in a state-tested subject. I saw firsthand the stress that teachers and students are under in those state-tested subjects, and I really wanted to make sure, as a teacher, that I was helping my students be successful in those end-of-course assessments so that they could go on to do great things: go on to college, go into career paths, apprenticeships, all of those different pathways.
Melissa Loble (05:29.934)
I love that, Al, and I can relate to the state-tested subject. I started my career also in the classroom; I taught high school in New York City. And there was not just one assessment, but two: you chose the level at which you were assessed. And as a new teacher, I remember the stress around having that be...

not only state-assessed, but the impact on students, because it was pathing them. I had ninth graders it was pathing already, that early, and it just added something in comparison to the classes I taught that weren't state-assessment courses. It made such a distinction in my experience, as well as, I know, the students'. Which kind of leads me to the next question we love to ask our guests: a favorite learning moment or teaching moment. It could be one where you were teaching your students. It could be one where you
Ryan Lufkin (06:01.615)
Wow.
Melissa Loble (06:24.024)
were learning something yourself, or something you've observed, or something you've worked on with all of the organizations you've helped do research for. Would you mind starting and sharing with us a favorite learning moment?
Alexandra Lee (06:36.327)
Yeah, so the learning moment that comes to mind for me is really related to how I try to approach teaching, and it was really impactful for me. It was not my favorite at the time it happened, but looking back now as, you know, a middle-aged lady, I can really appreciate how it was such an important moment for me, and it really transformed how I approach studying in school for the better.
Alexandra Lee (07:06.211)
So the moment is, you know, picture me as a freshman in high school, and I had my first experience failing a test. The test was on Greek mythology in English class, and it required me to study and memorize a lot of detailed information about gods and goddesses. I think a lot of folks have taken a similar test when they were in high school. And the problem for me was that, at that point in my life, I just didn't know how to study.

And my favorite teacher, Miss Scott, at George Washington High School in Denver, taught me how to study. The way she taught me how to study was by giving me my first F. And I needed that to happen to take it seriously. The thing was, she didn't just give me an F; she also gave me the opportunity to retake the test and to work with her to get help on learning how to study and prepare better.
And so the combination of her having really high expectations of me and not sort of passing me on that test when I didn't deserve to be passed on the first time and giving me an opportunity to learn and grow an important skill, studying for an exam effectively, is something that I'll be forever grateful to her for. And I really am grateful that I got an F on a test in my freshman year of high school.
Ryan Lufkin (08:25.063)
yeah, yeah.
Melissa Loble (08:25.912)
Mm-hmm.
Melissa Loble (08:33.176)
Yeah, man, I think we can all relate to that story. And so often when we ask our guests these questions, we get, you know, "I remember this because it really inspired me to be this." And it's like, no, no, no, there are points in our life where it's a wake-up call or, you know, a shake. I think that's so important.
Ryan Lufkin (08:37.38)
Yeah.
Ryan Lufkin (08:49.635)
Yeah. Actually, every time we ask this question on our podcast, I think of a different experience from my own life. But what an amazing teacher, to use that opportunity to actually change the underlying skill of studying, right? And focusing on that. That's amazing.
Melissa Loble (09:02.478)
Yeah. Yeah. All right, Russell, you're up.
Russell Ligon (09:06.033)
So, Al and I did not coordinate these answers, but I was thinking of some learning moments, because I know this is a thing that you love to ask your guests, and I'm happy to share it as well. My learning moment that I want to share is also one related to failure and learning how to study.
Melissa Loble (09:11.768)
Sure.
Russell Ligon (09:31.761)
except mine happened when I was in college; it was a bit later. So essentially, I was a good student and I got into a good college, but I did so without having had that moment that Al had as a freshman in high school, where I didn't really know how to study and build knowledge, as opposed to gathering facts. I went to Pomona College, I was a bio major, and basically I scraped by with C-pluses as a freshman in general chemistry, and then
as a sophomore, I was in organic chemistry and continuing to be towards the lower part of the class in terms of performance. And I had a professor, Cynthia Selassie, who took those of us who were having trouble and created a study group for us. She was spending her own time, and this was before I even knew about office hours and how you could go ask professors for help.
She worked through these problems with us and showed us how we needed to put what we were learning in the classroom and in the lab into practice, understanding molecular structure and, you know, complex organic chemistry, in a way that was different from simple fact or knowledge regurgitation, which I had been fine with up to that point, but which wasn't sufficient to succeed. And so the time she took to help those of us learn how to take information and use it to create knowledge, and to be able to build on that,
Russell Ligon (11:20.697)
was something that I am very grateful for. By the second semester of organic chemistry, I was able to get a B-minus, and that's the grade I'm most proud of in all of my college years: the slow upward trajectory from, I think it was exactly, C-minus to C, to C-plus, to B-minus. And so thank you, Professor Selassie,
Ryan Lufkin (11:31.959)
Nice. A real impact, yes.
Russell Ligon (11:50.289)
for taking that time to, again, teach in a different way. It's analogous to Al's story: how to learn, how to use information and build on it.
Melissa Loble (12:02.018)
Yeah, and that's a classic, you know, what we'd call a weeder course, or however you want to put it. I mean, I know so many people who didn't have that opportunity and never made it through that course. And then you stayed in science, which is just rad. I love that; it kept you in a field you were really passionate about. That's so cool. So, I mentioned at the beginning, this is a topic that...
Alexandra Lee (12:02.081)
Mm-hmm.
Melissa Loble (12:26.464)
I'm super passionate about, or an area of education I'm super passionate about, but I'm not sure all of our listeners have the same foundation that maybe Ryan and I do, just because we have all sorts of people listening to this podcast. Al, if we can start with you: we'll use the word assessment, but let's focus it on student performance and the indicators within that.

So where are we today, as compared to, say, 10 years ago? Maybe give us a baseline or grounding that we can build our conversation on, and make sure all the listeners are thinking about this the same way we are.
Alexandra Lee (13:06.977)
Yeah, and I like thinking about assessment as a broad way that we understand what students have learned, and I like to think about learning as including things like social-emotional learning, soft skills, hard skills for careers, and then of course learning standards and academic types of learning.

And so when I think about where we are today versus where we were in our history, 10 years ago is the somewhat recent past. One thing that comes to mind is how federal policy in the US has really shifted in how we're holding teachers and schools accountable for helping students learn. And so I'm really interested in the policy changes and accountability models, and how these have shifted away from a centralized model and towards more of a decentralized model over time, really shifting power and decision-making back towards the states. And since we do have lots of folks listening to this, I wanted to walk through a couple of key historical moments related to this.

So when we look at the trends over the last 10 years, but really going a little further back, to 20 years ago, we can really see how this shift from centralized accountability models towards more of a decentralized model has taken hold. A little over 20 years ago, the No Child Left Behind legislation was passed. It had, I think, great intentions of wanting to make sure all students are learning, but it really centralized accountability at the federal level. And then we saw a shift away from federal accountability models towards the states with the Every Student Succeeds Act, passed about 10 years ago. With the Every Student Succeeds Act, states took on more of a role in defining their own accountability models, and we saw assessments broadening out to include more differences in learning standards and other competencies. And then today, in 2025,
Alexandra Lee (15:22.997)
we can see this move increasing even more, really decentralizing education decision-making towards the states. So, more to come; that's the trend I see looking back at our history of accountability models. And accountability models are important because they have implications for assessment, which is why I'm really interested in them. These differences in accountability models from state to state have resulted in a greater need for assessments to be tailored and customized to each state and each state's differing standards and learning priorities. And in addition to tailoring assessments to the unique learning standards of each state, there are also increasing differences between states in the other outcomes being measured for accountability. So recently some examples have been coming across my desk where states are including a greater emphasis on career skills,
Ryan Lufkin (15:58.447)
Mm-hmm.
Alexandra Lee (16:22.787)
portfolios of learning, and really assessing the whole learner beyond just the academic standards. And so as someone who does assessment research, I'm really interested in thinking about how this state-to-state variability impacts assessment, and how we design good assessments that rise to the occasion of this variability.
Ryan Lufkin (16:25.711)
Yeah.
Ryan Lufkin (16:48.335)
Yeah, this is really important right now as we think about how we create outcome-oriented and equitable learning environments. It's funny, I talk a lot about AI, and that shift towards more skills-based, more outcome-oriented learning is kind of omnipresent in every conversation. So Russell, where does the work of a psychometrician fit into that broader conversation?
Russell Ligon (17:11.279)
Yeah, that's a great question. So one of the fantastic toolkits that a psychometrician or an assessment research team has available is called item response theory. It's a class of analyses that lets you understand the items and the students simultaneously. "Items" is a general term: it can include specific questions, but also broader formats, like fill-in-the-blank tasks. So item response theory basically lets you see how items behave in the real world, and you can use these tools to identify places where individual items, or assessments at a broader level, are behaving differently than expected for different subgroups of assessment takers, i.e., students. And so when you talk about equitable learning, you want to be able to evaluate or assess those students equitably and fairly as well.
And so a psychometrician can help ensure that the items we include in assessments are free of bias, or have at least minimal bias. Through repeated cycles of evaluation and data cleaning with these psychometric tools, we can do our best to ensure that the items we're creating and putting in front of students, to evaluate mastery on topic A or B, are items that are
Russell Ligon (18:52.699)
essentially free of the bias that we inherently impute into them when we create them. No one can help it, right? We all have our cultural and historical backgrounds, and that informs what we... It's not. Right.
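To make Russell's point concrete, here is a minimal sketch in Python of the standard two-parameter logistic (2PL) model from item response theory, plus a toy version of the subgroup check he describes. It is an illustration only, not Instructure's code; every number in it is invented, and real differential item functioning (DIF) analyses are far more careful.

```python
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL item response function: probability that a student with
    ability theta answers an item with discrimination a and
    difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Two hypothetical items across the ability range:
# item 1 is easy and sharply discriminating; item 2 is hard and flat.
for theta in (-1.0, 0.0, 1.0):
    print(f"theta={theta:+.1f}  "
          f"item1={p_correct(theta, a=1.8, b=-0.5):.2f}  "
          f"item2={p_correct(theta, a=0.6, b=1.5):.2f}")

# Toy bias flag: if the same item looks much harder for one subgroup of
# comparable-ability students, send it for human review. Real DIF methods
# (e.g., Mantel-Haenszel or IRT-based tests) do this far more rigorously.
b_group_a, b_group_b = 0.2, 0.9  # hypothetical per-group difficulty estimates
if abs(b_group_a - b_group_b) > 0.5:
    print("Flag item for bias review")
```

The point of the sketch is simply that item behavior is modeled explicitly, so unexpected differences between subgroups become measurable rather than anecdotal.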
Ryan Lufkin (19:07.353)
I was gonna say, it's not really one-size-fits-all, right? Like Al was pointing out, you've got these regional differences, even global differences. We've got to account for those.
Russell Ligon (19:15.833)
Yeah, and so, you know, that's a big part of it, the nuts and bolts. We can evaluate assessments using psychometric tools; you need a quantitative measure of the degree to which they are or are not performing differently. The other thing is...

we want to be able to deliver those assessments in a way that makes sense to all learners. And so with technology, not necessarily AI in this case, but technology, we want to make sure that our assessments are interpretable and fairly delivered, and scaled and scored quickly so that teachers can then use that information rapidly to make decisions. Because that's the end goal: we're not assessing students for the sake of assessment; we're doing it, in many cases...
Ryan Lufkin (20:03.555)
Making it actionable, yeah. Is that cool?
Melissa Loble (20:06.862)
No, I'd love to build on that. So Russell, you've shared how to think about the role of assessment creation and delivery, and the importance of the science behind ensuring that we remove bias and other problems from assessment. And Al, you've chatted already about the history and what we need to be thinking about from a legislation or policy perspective. As you look forward, how would you advise districts or states on what they need to know or think about with assessment? How do we combine these two worlds and give them some practical ideas about how they should be thinking about assessment in their state, in their district? How, even, should a teacher be thinking about assessment? So that, regardless of whether you're teaching a second grader or a lifelong learner, we can all help people progress and achieve the outcomes they're hoping for from their learning.
Alexandra Lee (21:15.369)
Yeah, so when I think about what schools, districts, teachers, and instructors in higher ed should be thinking about, I really think it's important to consider a balanced assessment system as the approach. No single assessment, even one with really strong items confirmed by item response theory, is going to adequately measure and provide all the information that different stakeholders need to make informed instructional decisions.
Assessments are designed with different purposes in mind, and so you really do need to think about balancing different forms of assessment with different purposes, and about assessment being an ongoing process. I did pull a paper, because I'm a researcher, off the National Academy of Education website. They recently published a report
Melissa Loble (22:11.403)
Love it.
Alexandra Lee (22:18.371)
around reimagining balanced assessment systems. The way they define a balanced assessment system is one that is intentionally designed to provide feedback to students and information for teachers to support ambitious instructional and learning opportunities. And when I read that definition, a question arises for me, which is: how can educators make sure they're strategic and intentional in how they're using assessment, to learn what they need to from it, which then enables them to make sound instructional decisions and really increase student learning? And so how I like to think about a balanced assessment system is thinking about assessment as being of learning, for learning, and as learning. That, to me, is an intuitive way to conceptualize what this looks like in the real world and how these assessments have different purposes. So I wanted to walk through those three ideas.

First, thinking about assessment of learning. I think this is what we most often think of when we think of assessment: an assessment that's summative in nature and is used to evaluate students. So when I was talking at the beginning about my learning moment and getting an F on a test, that was an assessment of learning, like an exam. Another example of that sort of assessment would be state assessments at the end of the year, like that end-of-course assessment I helped my students get prepared for when I was teaching English. And these assessments are useful. They're useful data points, especially at the school or district level, or at the state or national level, to identify trends in what students are learning and not learning, and to think about what macro changes we can make to help them learn more. So think about curriculum changes, professional learning initiatives, those sorts of things.

The next idea is thinking about using assessments for learning.
Alexandra Lee (24:32.477)
And so these assessments would be quicker, more readily available assessments that aren't summative; they're more formative, and they're really being used on a very regular, even daily, basis so that teachers can get real-time feedback on what their students know and do not know, and can readily adjust their instruction in real time. Those formative assessments are really helpful for teachers in particular, and students can also get feedback from them.

And then a final way to think about a balanced assessment system is assessments as a learning tool in and of themselves, and I think this is one that sometimes blows people's minds.
Alexandra Lee (25:24.181)
But research in cognitive psychology has found that when students take assessments, they have increased memory for the new information, and they're more likely to identify misconceptions and learning gaps themselves. That's especially true when they're given feedback. And so when assessments are used as learning, another important component to consider is making sure there's a feedback cycle that students are a part of when they're taking assessments, one that helps them develop their self-regulated learning processes and metacognition, and that will ultimately set them up to be highly skilled learners for the rest of their lives.

And so that's the advice I would give to folks when they're thinking about how to design an assessment: it's not going to be a one-size-fits-all approach. You really want to think about different forms of assessment giving you different pieces of information, and together that helps fill out a whole puzzle that will allow you to make strong decisions around instruction and interventions.
Ryan Lufkin (26:28.857)
Yeah. So as we evolve into these new areas of assessment and try to do them at scale, this is one of those areas where technology really plays an important role, right? Russell, what role does technology play in good assessment design and delivery?
Russell Ligon (26:45.169)
Yeah, that's such a good question, and building off of Al's point, it depends on the goal of a particular assessment. So to build on an analogy that's probably been used before, I really like thinking about these assessments for learning, these formative assessments that provide information to teachers and students about where they are in their learning, as measurements of where you are on a map. If you have a goal you're trying to reach, taking an assessment lets you understand your position on the map relative to that end goal.

But in this analogy, you're not simply pressing a button on your GPS and getting your location. You have to take some measurements, get your bearings, sight different landmarks to work out your position. It takes time, because just like in the real world, assessments take time. And although some assessments can be used as learning, via those processes Al mentioned, they are also often taking away from instructional time. So they provide valuable information, but it doesn't come for free, right? It comes at a cost in time.

And so one of the key ways that technology can improve assessments, and improve how assessments are part of a balanced pedagogical plan, is by speeding them up. Now, you want efficiency, not just speed: you still want good-quality information from the assessments, but you don't want them to take an hour and a half out of your day. And one of the ways technology has been used effectively within the realm of assessments, especially formative assessments, is through computer adaptive testing.
Ryan Lufkin (28:29.602)
Mm-hmm.
Ryan Lufkin (28:41.135)
Yeah.
Russell Ligon (28:41.637)
Computer adaptive testing takes into account individual student learning trajectories. Ideally it draws on a large bank of items that can be customized based on a student's estimated ability at the start of the assessment, and then it incorporates their answers to more quickly hone in on their true ability or mastery of a given subject or topic. So that's a key way that technology...
Ryan Lufkin (29:09.571)
It'd be almost impossible to do that without the technology to access the right item banks, yeah.
Russell Ligon (29:12.825)
Absolutely, it's 100% impossible without that technology. And it's also tailored, right? So you can get high-precision information about both your highest- and lowest-performing students in a given classroom from the same delivery, assuming the technology is in place, the item bank is in place, and the information, which comes from psychometrics, is there to inform how that test is delivered.
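As a rough illustration of the adaptive loop Russell describes, here is a hedged sketch: it picks each next item by choosing the one whose difficulty sits closest to the current ability estimate (a crude stand-in for maximizing item information) and nudges the estimate after every answer. Real CAT engines work from calibrated IRT parameters with maximum-likelihood or Bayesian ability updates; the item bank and step sizes below are invented.

```python
# Minimal computer-adaptive testing loop (illustrative only).
# Bank entries: (item_id, difficulty). Real banks store full IRT parameters.
BANK = [("q1", -2.0), ("q2", -1.0), ("q3", 0.0), ("q4", 1.0), ("q5", 2.0)]

def run_cat(get_response, n_items=4, theta=0.0, step=0.8):
    remaining = list(BANK)
    for _ in range(n_items):
        # A 2PL item is most informative when its difficulty is near theta,
        # so choose the remaining item closest to the current estimate.
        item = min(remaining, key=lambda it: abs(it[1] - theta))
        remaining.remove(item)
        correct = get_response(item)          # True/False from the student
        theta += step if correct else -step   # toy update; real CATs use MLE/EAP
        step *= 0.6                           # shrink steps as evidence accumulates
    return theta

# Simulate a student who reliably answers items easier than difficulty 1.2.
estimate = run_cat(lambda item: item[1] < 1.2)
print(f"Estimated ability: {estimate:+.2f}")
```

Even this toy version shows why adaptivity saves time: each question is chosen to be maximally revealing, so far fewer items are needed than on a fixed-form test.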
Melissa Loble (29:28.973)
Okay.
Russell Ligon (29:42.373)
So that, to me, is a super important way that technology and assessment can be married moving forward. And they have been; I don't mean to imply that this is not already widely used. I would say another key one is integrating student information systems so that...
Ryan Lufkin (29:53.167)
Yeah, yeah.
Russell Ligon (30:08.901)
teachers and district administrators have access to holistic sets of information, not just performance on assessment A in class B, but also attendance, performance in previous years, and other key, important parts of who a student is.
Ryan Lufkin (30:29.327)
The life metrics, right? That holistic picture of student success, yeah.
Russell Ligon (30:31.641)
Yeah, the learning journey, right? That's exactly right. And I'll mention this as well: no assessment is going to do everything, so you need a balanced set of assessment strategies. And yeah, we want holistic information about students, and that's what students want as well, in many cases.
Alexandra Lee (30:32.321)
Hmm.
Ryan Lufkin (30:55.119)
Yeah, yeah, awesome.
Melissa Loble (30:57.535)
Yeah, I think one of the things technology has surfaced is that there's so much more assessment out on the market: products, content, services, you name it, right? I feel like there's more and more of it. Some of it is doing really impactful things, like you've described, Russell, and then some of it, you wonder about. So I'm curious, how do we know,
Ryan Lufkin (31:06.959)
Hmm
Melissa Loble (31:24.015)
for a particular assessment or approach or technology, how do we know it's effective, and where does the evidence play into this?
Alexandra Lee (31:30.721)
Yeah, that's a really important question, and I think it's one everybody needs to have a working knowledge of so that they can be informed consumers in the market.

In my view, for an assessment to be effective, it has to give you the information you expect it to give you. You really need to be able to trust that the assessment is giving you valid and reliable information on what students know and do not know. If you can't trust that it's giving you the best information possible about what your students know, then you can't use it to inform instruction; or you might be trusting it when you shouldn't be, and making decisions that, in the worst-case scenario, are going to harm students, or at least not help them, right? And so being able to evaluate an assessment at a high level, to see whether it's valid and measuring what it says it measures, is really important. As consumers and teachers, everyone should have a bit of a working understanding of how to look at validity evidence for an assessment.

And so, me as a researcher, and Russell as a psychometrician, we look towards a couple of sources to figure out how we show the validity of an assessment, how we establish that it's valid. There was a researcher, Samuel Messick, who provides a framework that we use a lot in our work. He identified key sources of validity evidence, and these sources are also what underpin the standards of test development. These are the standards that assessment providers
Alexandra Lee (33:15.843)
should be using to stay aligned with research-based best practices. They're updated by leading education research associations: the American Educational Research Association, the National Council on Measurement in Education, and the American Psychological Association make these standards, and they really build off Messick's work around sources of validity evidence. I'm not going to go through all of them in detail, but I wanted to hit a couple of top
Ryan Lufkin (33:43.215)
Hahaha
Alexandra Lee (33:45.924)
ones I think folks can really look for. And so one source of validity evidence is looking at the test content itself.
To my earlier point, the test should be aligned closely with the standards it's seeking to measure. So making sure the questions are aligned with your state blueprints or your instructional pacing is an important part of ensuring it's going to be a valid assessment. Another thing you can look at is the characteristics of the items on the assessment. And this is Russell's
Melissa Loble (34:23.822)
Okay.
Alexandra Lee (34:25.239)
bread and butter as a psychometrician. You should be able to look at some documentation and see that there's been some sort of item analysis done to validate the items included on the assessment; that's an important component. Another thing you can look at is whether the assessment, if it's on state learning standards, is correlated with other assessments that are measuring the same thing. And so a best practice to look towards would be
Ryan Lufkin (34:29.315)
Ha ha.
Alexandra Lee (34:55.139)
for correlation studies or predictive validity studies showing that a benchmark assessment, for example, is highly correlated with a state assessment, because they should be measuring the same things, and they should be doing that comparably well. And then a final source, which is my personal favorite source of validity evidence, so I am a nerd about this stuff, but I love this idea: showing that a test is valid because it has positive consequences on student learning.

And this does make sense. Assessment providers are saying, if you use our assessment, you're going to be able to make better instructional decisions and drive higher student outcomes. Well, let's test that. Let's see if it's a valid assessment; if it's not valid, it's not going to make those informed decisions possible, and it won't lead to higher student outcomes. And so research studies can be designed to really look at: is the
Melissa Loble (35:32.238)
Mm-hmm.
Melissa Loble (35:47.79)
Hmm?
Alexandra Lee (36:01.929)
use of the assessment tied to higher student achievement? And that's another way of looking at the validity of the assessment. And I like that one in particular because I think it's important to ask, at the end of the day, is this helping students? And if it's not, then we shouldn't be doing it. So.
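To make the correlation-evidence idea concrete, here is a hedged sketch of the simplest version of such a check: the Pearson correlation between hypothetical benchmark scores and later state-test scores for the same students. The data and the 0.7 cutoff are invented for illustration; real validity studies also attend to sample size, score reliability, and subgroup differences.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical paired scores, one pair per student:
# fall benchmark vs. spring state assessment.
benchmark = [412, 450, 388, 505, 467, 430, 521, 398, 476, 443]
state_test = [418, 461, 395, 512, 470, 441, 530, 390, 480, 455]

r = correlation(benchmark, state_test)
print(f"Pearson r = {r:.2f}")

# Rough reading (a convention for this sketch, not an official standard):
# a strong positive correlation is one piece of evidence that the benchmark
# measures the same construct as the state test.
if r >= 0.7:
    print("Benchmark tracks the state test; supports this validity claim.")
else:
    print("Weak relationship; examine alignment and item quality.")
```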
Melissa Loble (36:20.834)
Yeah, and it helps us weed through, you know... I think everybody has intentions of being good actors in this space, or I'm going to hope so; I'm going to glass-half-full that one. But it helps us weed through what's hope versus what's real, or what's marketing versus what's actually happening on the ground, which is why I love that one as well. And just for our listeners, I'll remind you again: Al and Russell have already mentioned a number of different research studies and resources, and we'll make sure there's a robust collection of links in the show notes. That way, if you want to dig deeper into any of these areas, if something really struck your fancy, you'll be able to follow the trail they were sharing.
Ryan Lufkin (37:06.681)
Well, it wouldn't be the EDUCAST 3000 podcast if we didn't talk about AI at some point. And so much of the conversation around AI has been focused on academic integrity, right? And how it's really undermined traditional assessment models. Russell, what role does AI play in assessment moving forward? How do we actually leverage these powerful tools?
Russell Ligon (37:28.933)
Yeah, I think there's an inherent assumption, from the perspective of a psychometrician, that the data you get from an assessment is generated by a student, right? So with that assumption in place, yeah, let's make sure that it's actually a kid taking the test, or a lifelong learner, and not a bot built for that purpose.
Ryan Lufkin (37:44.271)
How do we make sure it is actually generated by the...
Ryan Lufkin (37:53.827)
We hear that fear a lot from educators, right? That it's going to be students or AI submitting homework that's graded by AI, and no learning happens in that loop in between, right?
Russell Ligon (38:04.433)
Yeah, yeah, that's a bit of a horrifying worst-case scenario from my perspective. Al and I were talking about this in advance of this opportunity to chat with you guys: learning is such a fundamentally human process. Of course there are statistical elements that have been extracted and incorporated into these models, and they produce incredible content, but still, a joy of life for many people, myself included, is learning. It's like the production of art. Yes, AI can do art, but that doesn't take away the joy and the beauty of human-made processes.

So, stepping back from that broader perspective to the roles of AI in assessment: one of the key areas, of course, is content generation, as you mentioned, Ryan. There is great potential for AI to generate varied yet topically relevant questions and items that could be used to evaluate student learning.

But, and you knew the "but" was coming, you have to trust it. Especially depending on the kind of assessment you're talking about, any content that's generated by someone who is not a content expert, who doesn't have that deep background, and who is
Ryan Lufkin (39:32.175)
Always.
Russell Ligon (39:56.645)
more degrees removed from the actual teaching of that topic, in the context in which it's being used, you run the risk of decreasing that assessment's validity. As Al was mentioning, for an assessment to be valid, someone has to be able to look at it and say, yeah, that's measuring what it set out to measure. This is a third-grade math assessment; I can tell because I've been a third-grade math teacher. Not me personally, but in this scenario.

So AI has the potential to streamline some of those item-creation processes. But at the end of the day, depending on the use case of the assessment, it's still really important to have people involved in the, yes, for sure, you have to have that. And because assessments have to be trusted for them to be used, there may be a mandate for a given district to use a particular tool that they purchased, right? But if the teachers are getting results they don't trust, because the test has the low validity we were talking about,
Ryan Lufkin (40:41.187)
human in the loop aspect, right? Yeah.
Russell Ligon (41:02.417)
or it hasn't been empirically demonstrated, either quantitatively or even qualitatively, or there's a disconnect where the teacher looks over a student's shoulder and says, what is that question even asking? Why is that on this assessment? Then they're going to do what the district mandates, but they're not going to use that information in a way that informs their instruction, and rightly so, if they don't trust it, right? So there is definitely a role for AI in streamlining a lot of these processes. But at the end of the day, and this is true across all content, in my perspective and probably yours too, Ryan, there are many places where AI can streamline; there still have to be quality checks by humans, and the question is where that trade-off still gives you a net gain in productivity.
Ryan Lufkin (41:46.575)
100%.
Ryan Lufkin (41:52.291)
Yeah, well, one of the conversations that's popping up more recently is the idea of AI drift, right? The idea that initially you might get some really great, consistent answers, and three months down the road you might find that it's drifting, and we don't really know why. So we've got to have that kind of constant human-in-the-loop maintenance on it. And as we think about leveraging these tools, even to scale these really powerful approaches, we've got to make sure we've got humans plugged in to make sure it's not wandering off into the forest.
Russell Ligon (42:20.251)
Yeah, 100%. And that's part of best assessment practice as well, regardless of AI: continuous evaluation and refinement. So I talked about the ability of psychometric tools to identify biased items. We can also identify high-performing and low-performing items, where a high-performing item is one that gives you lots of information.
Ryan Lufkin (42:25.027)
Yeah.
Ryan Lufkin (42:39.012)
Yeah.
Melissa Loble (42:39.533)
Mm-hmm.
Ryan Lufkin (42:42.852)
Yeah.
Russell Ligon (42:43.377)
And it gives you information about the difficulty. Using that information lets you build an assessment based on a blueprint, based on IRT. But you have to continue to evaluate, because student populations change over time.
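Alongside IRT, classical item statistics are a common way to spot the high- and low-performing items Russell mentions. This sketch, using an invented response matrix, computes each item's difficulty (the proportion of students answering correctly) and a point-biserial discrimination (the correlation between item scores and students' total scores); very low or negative discrimination is a typical flag for review.

```python
from statistics import correlation  # Python 3.10+

# Rows are students, columns are items; 1 = correct, 0 = incorrect.
# Toy data: item 4 is deliberately answered better by weaker students.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
]

totals = [sum(row) for row in responses]  # each student's total score

for j in range(len(responses[0])):
    item_scores = [row[j] for row in responses]
    difficulty = sum(item_scores) / len(item_scores)   # proportion correct
    discrimination = correlation(item_scores, totals)  # point-biserial
    flag = "review" if discrimination < 0.2 else "ok"
    print(f"item {j + 1}: difficulty={difficulty:.2f}  "
          f"discrimination={discrimination:+.2f}  ({flag})")
```

In this toy data, item 4 correlates negatively with total scores, exactly the kind of behavior that, at scale, prompts a human content review.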
Ryan Lufkin (42:54.691)
Yep. Well, Al's point about assessment as learning I think was really interesting, because a lot of the time, as I've used AI tools to explore topics I'm writing about or talking about, I'm learning a lot about them in really powerful ways, even as I'm preparing. If my test, my assessment, is the actual presentation I'm giving, right, then I'm learning through the process. And so it strikes me that there's some opportunity there to really use AI, again with the proper checks and guardrails, as almost a part of the learning process. Yeah.
Melissa Loble (43:30.892)
Yeah, yeah. This is such an interesting topic. I'm particularly concerned about this in the lifelong learning space, or the continuing education space. I think in K-12 in the U.S., and the equivalent primary and secondary levels in other countries, there are more checks and balances, more structure, more formality to it. I still think we have gaps, right? But there's more opportunity to bring in the human.
Ryan Lufkin (43:55.279)
Yeah.
Melissa Loble (43:58.366)
I recently, and Al and I were chatting about this briefly: I'm a scuba diver, and I recently did some scuba diving instruction and took an assessment alongside some e-learning in order to learn a new skill. And it was very clear that the assessment was built by AI; there were questions it was pulling that made no sense and had no alignment to the content. So I go over this with my professional instructor, and he looks at it and says, this doesn't make sense. Who wrote this? And I'm thinking, I know who wrote that. And that's probably because they don't have those same checks and balances, or that practice, right? As we start to expand, this is why we wanted to do this episode: let's give folks some foundation for how to think about assessment so that, regardless of where in the learning life cycle you are, you can have
Ryan Lufkin (44:33.615)
Yeah.
Melissa Loble (44:48.428)
the right kind of impact, and you can use it in the right ways so that it's actually driving outcomes. So, sorry about that side note, but it made me think of an immediate example of where that recently happened to me. Let's end on future predictions, if that's okay. Our last question will be about the future. Al, where is the future going when we think about assessment, particularly in the framework you've shared?
Ryan Lufkin (44:56.547)
No, I love that.
Ryan Lufkin (45:00.228)
Yeah.
Alexandra Lee (45:13.557)
Yeah, I mean, if I could wave my magic wand and have something happen in the future, what I'd really love to see is positive cultures around assessment. I think assessment a lot of the time makes people feel really negative, or they've had really bad experiences with it, and I think it's really important to create and foster positive cultures around assessment and around how it can be used. And when I think about how that can happen, culture is a big thing; it's hard to change. So I like to think about it at a systems level, with the learner at the center of this ecosystem and different layers on top of it. Education is part of a complex, dynamic system where there are lots of different relationships that are really important to consider. And when we want to change cultures around assessment, we need to think about all of these interlocking relationships, at different levels of this big ecosystem, as influencing what the student experiences at the end of the day.

And I really think it's important to acknowledge the complexity of making culture change towards a more positive culture around assessment.
Ryan Lufkin (46:29.55)
Yeah.
Alexandra Lee (46:35.105)
I'm a motivation researcher, so I've thought a lot about mindsets and beliefs around assessment and how those really matter for what folks experience around it. And so really thinking about how the mindsets of district leaders influence the mindsets of teachers, which then influence the mindsets of students, and how we need to consider all of those interlocking parts and relationships really strategically to build positive cultures of
Ryan Lufkin (46:54.636)
Yeah.
Alexandra Lee (47:04.919)
assessment. And, you know, one personal example that comes to mind around this idea: when I was teaching, I tried to essentially be an open book with my students and acknowledge my mistakes. And I realized that in doing that, I was modeling for them how you learn and grow and improve over time, and that all feedback is important to helping you master something
Melissa Loble (47:30.638)
Hmm.
Alexandra Lee (47:35.02)
new, and really leaning into that.
And so when we would do assessments, I would try to model for my students what I was seeing at the class level in the data about my own teaching, saying, here are some things I'm not doing very well on; I'm seeing in the data that these are some gaps in what you're learning, and that's a bit on me as the teacher, right? But then I'd also have them individually look at their own data and come to those same realizations. And I think having that sort of feedback loop, where we're all sharing how we're learning and growing and using the assessment information to provide feedback on that process, is really critical. And I think it's about really emphasizing that, at the end of the day, we all have the same goal: our goal is for you to be successful, and the administrators, the teachers, and the students all need to be excited about this goal they're working towards together.

And so when I think about the future of assessment, I'd love to put all the bad history, the teaching to the test, the drill and kill, all that stuff, away, and think about how we can embrace assessment with a growth mindset, a mastery orientation, and a positive culture around it. So.
Melissa Loble (49:00.396)
Love it. Russell, how about the future for you?
Russell Ligon (49:05.029)
I have a micro and a macro. My micro is about the opportunity cost that comes with assessment: the idea that when you're giving an assessment, you are not actively teaching. And so,
Melissa Loble (49:07.352)
Right?
Russell Ligon (49:20.625)
aside from the assessment-as-learning component, which would be a fantastic way to shift mindsets about assessment design and creation, where learning can be a key additional outcome, it's about finding that balance, which only comes from using high-quality, valid, reliable assessments. So: making sure that the assessments used to inform instruction and make educational decisions deliver the best information in the shortest amount of time, because there's only so much time in a day, and that's especially true if you're a teacher.

And then the macro is a broader, big-picture question: how are we going to develop ways to more authentically evaluate student growth
Ryan Lufkin (49:57.753)
Yeah.
Russell Ligon (50:13.689)
and mastery? Traditional psychometric tools and assessments are fantastic, but they provide information of a very particular and somewhat limited scope. And so it's about finding that true match between what a teacher sees in a classroom, where he or she can observe a student in all these different contexts, and building assessments, or assessment strategies, that better capture that.
Melissa Loble (50:15.32)
Mm-hmm.
Ryan Lufkin (50:40.271)
Yeah. I feel like we need to wrap up with a Super Friends-style theme song: Psychometricians! I love this. I love what you guys are doing. This is amazing. Thanks for being on the show to share this with us.
Alexandra Lee (50:46.155)
Ha ha ha.
Russell Ligon (50:52.251)
Thank you so much for having us.
Alexandra Lee (50:53.225)
Yeah, happy to be here.
Melissa Loble (50:55.084)
Yeah, so much inspiring work here. Thank you. And as the future evolves, we're having you back, just so you know. We've got all sorts of topics parking-lotted here that we want to talk more about. But thank you again for being here.
Ryan Lufkin (51:00.567)
Absolutely.
Alexandra Lee (51:06.527)
Yeah, thanks for having us.
Russell Ligon (51:06.801)
Thank you guys.