Good morning, I call this meeting of the House Education Committee to order. Today is Friday, January 30th, 2026, and the time is 8:04 a.m. Apologies for the late start. Members present in the room today are Representative Elam, Representative Dibert, Representative Schwanke, Representative Eischeid, Representative Underwood, Co-Chair Story, and myself, Co-Chair Himschoot. We have a quorum to conduct business. I'd like to remind members and folks in the audience to please silence your cell phones. We're in the Betty Davis Committee Room 106 in the State Capitol Building in Juneau, Alaska. The documents for today's meeting have been distributed to members, are available on the table outside the door, and are on BASIS. I would like to thank our recording secretary, Kayle Brown, and our moderator from the Juneau LIO, Sue Quickly, for being here to help us today. And of course, our very capable committee aides, Tammy Smith and Ella Lubin. Thank you for all that you do. On today's agenda, we have one item: the state of assessments on a national and statewide scale. This agenda item is comprised of several presentations. We're going to move fast. I want to try to be done by 9:45. It doesn't always work that way, but if I set 9:45, then we should be done by 10. This agenda item is composed of several presentations: Dr. Martin West, academic dean of the Harvard Graduate School of Education and vice chair of the National Assessment Governing Board; Dr. Randy Trani, superintendent of the Mat-Su Borough School District; Dr. Cyndy Mika, superintendent of the Kodiak Island Borough School District; and Deputy Director Kelly Manning and Assistant Director Karen Melin from the Department of Education and Early Development. Members have copies of these slide decks, and there are extras on the table outside this room. I would like to welcome our first presenter, Dr. Martin West, the academic dean of the Harvard Graduate School of Education and vice chair of the National Assessment Governing Board.
You have copies of his slide deck, and I just want to say that we're really, really glad that you could join us today. I attended a session that Dr. West was on a panel in, I think it was at an NCSL conference in the summer. He provided, or the panel provided, so much context about the NAEP that we thought it would be valuable if he could share with us today. So that's how we arrived at this presentation, so Dr. West, if you could please introduce yourself, put your name on the record, and begin your presentation. We're so grateful for your time. Thank you so much. My name is Marty West. As you said, I'm on the faculty and serve as academic dean at the Harvard Graduate School of Education, and I also serve as vice chair of the National Assessment Governing Board, which oversees and sets policy for the National Assessment of Educational Progress. I should say at the outset that I am not an expert in Alaskan education, but I'll be trying to offer some insight on what we can learn from the data gathered by the NAEP program. I'll also say that visiting Alaska and spending some time there is very high on my bucket list. This is as close as I've come. So, let me tell you a little bit about the National Assessment of Educational Progress. It started as an internal research program within the Department of Education in the late 1960s, and it was codified into law by Congress first in 1988 and then reauthorized in 2002. It is a joint product of the National Center for Education Statistics, which is the federal statistical agency for the domain of education, and the National Assessment Governing Board, on which I sit. That board comprises 26 individuals who are appointed by the Secretary of Education, and they include designated seats for state legislators. You would like to know that there is always a Republican and a Democrat as members of the board. One from Tennessee is currently our board chair.
There are also other representatives, as well as testing experts and members of state boards of education like myself. I should have said at the outset, I also serve on the Massachusetts Board of Elementary and Secondary Education. So what is the greatest value of the program? It's that it is really the only common yardstick that makes it possible to compare what students know and are able to do across time and across all 50 U.S. states. Obviously, states have their own assessment systems, but those systems tend not to be directly comparable from one state to the next, and they often change over time as well. And then we have national exams like the SAT or the ACT, which are taken by a subset, and an unrepresentative subset, of students. So NAEP is designed to address those challenges. The main NAEP test is administered in 4th and 8th grade in reading and math every two years, and that test is administered to representative samples of students nationally, as well as representative samples within each state. So that is the assessment that allows us to compare, at least in reading and math, performance from one state to the next. We also, as budget allows, conduct assessments in other grades, grade 12 in particular, and in... Dr. West, I'm going to interrupt for just a second. Do you want questions as we go? I should have clarified this with you. I will welcome questions as we go. As an academic, I am used to getting interrupted angrily all the time, and so I get a little nervous if I am not being interrupted. Well, clearly you've never been to Alaska, because I don't think you'll be interrupted angrily by anyone on this committee. I do have a question, if we could just pause for a sec on slide three. When you say representative samples, and again, we welcome you to come visit our state, I have heard, but do not know if it's true, that a representative sample could include a school, and testing a fourth grader, where that school has only one fourth grader in it.
Would that be considered representative for that region, or do you need a certain n size or a certain number of kids to call it representative? How do you handle our very... in Alaska, like all states, we have urban, but we also have remote, and in some of our remote schools you might have 11 kids in the entire school and only one fourth grader. So can you talk a little bit more about that? Yeah, so I will address that very directly in just a few slides, so if I could defer your question, but it is the case that a school like that could end up in the NAEP sample, and we make it work. It's important to us that that would be the case. So let me come back to that in just a moment. In addition to reporting at the state level, and in the case of some large urban school districts that participate, at the jurisdiction level, we also report out results for different categories of schools. That typically is public schools and charter schools; we seek to be in a position to report results separately for private schools, but unfortunately we have not in recent years been getting the participation rates we need in order for our colleagues at NCES to feel confident in the representativeness of those results. We are, however, able to report separately on the performance of Catholic schools. So let me get directly to this issue of NAEP sampling. In most state systems, every student enrolled in traditional public schools in grades 3-8 typically takes the state test every year. That is a census-based approach, but NAEP instead relies on a sample. We are drawing a subset of students within the state to provide representative information on all of the students in the state. The sampling process unfolds in the year leading up to a given NAEP administration. So NAEP 2026 is actually in the field right now, data collection started this week, and you can see the process of forming that sample occurred from November of 2024 to March of 2025.
The foundation for drawing that sample is two data collections that NCES does, one called the Common Core of Data and one called the Private School Universe Survey. Each of those data... yes. I'm sorry to interrupt, and I am not doing it angrily. We are struggling to know which slide you are on, so if you could say "next slide," that helps. I was presenting through and I wasn't sure, so it is slide six, and I will make it clear when we are advancing. Thank you very much. So, NAEP 2026 is in the field right now, and the process for drawing the sample was in these months leading up to it. And as I said, the foundation for that are these data collections, which do actually take a census of all established and recognized schools in the United States, public schools in the Common Core of Data and private schools in the Private School Universe Survey. They do not include homeschooling, but all state-recognized public and private schools do have a chance of being included in the NAEP sample. I should also say that one of the strategies we use to reduce the burden is that not every student takes the NAEP, even within a given school that is selected for the sample. And no student takes the entire assessment; rather, what we do is split up the assessment into four or five different parts, have each student take one of those parts, and then aggregate those scores together into a composite. That means we aren't able, nor are we permitted by law, to report results for an individual student or even for an individual school. We're only interested in assessing performance at higher levels of aggregation. So we have a question. Yeah. Thank you, Co-Chair Himschoot. I had a question, sir, on something I heard you say. I believe you said NAEP is not administered to homeschool students.
And if I heard that correctly... it's just my own personal experience of being interviewed extensively by the census folks in the past, because they do do deep dives on a sample, and so the question is, why not homeschoolers? The National Assessment Governing Board would be very interested in assessing homeschooled students, especially as that is a population that has been growing, and growing particularly quickly in the wake of the pandemic. Unfortunately, we do not right now have the capacity, the technological capacity and infrastructure in particular, to either draw a representative sample of homeschooled students or administer the assessment in the diverse settings in which those students are being educated. There is also some resistance to the idea of gathering achievement or outcome information on homeschoolers in some segments of that community, but we would be very interested in trying to find strategies to close that gap in our monitoring system. Thank you, Dr. West. So in terms of drawing the sample, to comment on something that was just raised: I should be clear, first we draw a sample of schools, and then within those schools students are randomly drawn to be a part of that sample. If there's only a single eligible student, say a single eighth grader, in a school that was sampled, obviously that student would be included. And that is possible because there is no minimum size that makes a school eligible for NAEP. Any recognized public or private school can end up in the sample. That being said, the probability that a school will end up in the sample depends on the number of eligible students, so a school with few fourth or eighth graders is unlikely to end up in the sample in a given year. And so that might give a sort of intuitive sense that they're not included, when in fact they are included at the appropriate rate for the overall results to be representative.
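The two-stage design Dr. West describes, drawing schools with probability proportional to their number of eligible students and then drawing students within the selected schools, can be sketched roughly as follows. This is an illustrative simplification, not NAEP's actual procedure, and the school names, enrollment counts, and sample sizes are invented for the example.

```python
import random

# Hypothetical frame of schools with their counts of eligible fourth graders.
schools = {
    "Large Urban Elementary": 120,
    "Mid-Size Elementary": 45,
    "Small Rural School": 1,  # a school with one fourth grader is still eligible
}

def draw_school_sample(frame, n_schools, seed=0):
    """Sample schools with probability proportional to eligible enrollment.

    Sampling here is with replacement for simplicity; real designs select
    without replacement from stratified, sorted lists.
    """
    rng = random.Random(seed)
    names = list(frame)
    weights = [frame[name] for name in names]
    return rng.choices(names, weights=weights, k=n_schools)

def draw_students(frame, school, n_students, rng):
    """Within a selected school, pick students at random.

    If enrollment is at or below the target, every eligible student,
    even a single one, is assessed.
    """
    enrolled = list(range(frame[school]))
    if len(enrolled) <= n_students:
        return enrolled
    return rng.sample(enrolled, n_students)

rng = random.Random(1)
for school in draw_school_sample(schools, n_schools=2):
    students = draw_students(schools, school, n_students=25, rng=rng)
    print(school, "->", len(students), "students assessed")
```

The weighting is what makes the one-fourth-grader school unlikely but not impossible to appear, while keeping the statewide aggregate representative.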
So what does this look like in practice? The National Center for Education Statistics first classifies all of the public and private schools in a state into different groups based on their urbanicity or rurality status, as well as by the demographics of the students that they serve, to create different groups or subsets of schools within states that are more similar to one another. They also use a prior measure of achievement, usually based on the state test, in order to sort schools within each of those groups by their expected achievement. And I was just then on slide nine, and now I'm moving forward to slide 10. Dr. West, could we pause for a question, please? Yeah. Thank you through the chair. Thank you for being with us today, appreciate that. I'm sorry if I missed it. Are you showing us how you're choosing a sample? Because step one is "identify all potential schools in each state." Can you just tell me what makes you a potential school? Yeah, so that would mean you are a public or private school that is recognized by the state and that serves a fourth-grade population for the fourth grade assessment or an eighth-grade population for the eighth grade assessment. So a K-8 school might be sampled for either assessment, but a standalone elementary school would only be eligible for the fourth grade assessment, and a standalone middle school or a 7-12 school would only be eligible for the eighth grade assessment. So we're just starting with the full list of all the schools in a state that serve either fourth grade or eighth grade. Thank you. So we've gotten so far as, once we've developed these lists, we then use a random process to select schools from different points on that list. And then we get in touch with the schools to confirm their eligibility, for example, that they actually do serve a fourth grade or an eighth grade population.
And that they do agree to participate in the assessment. Again, we keep the burden as low as possible, and then, when it's feasible, when the enrollment is not too small within a given school, there will be random selection of which students participate in the NAEP test. And this is an actual photo of two of our field staff boarding a bush plane, I believe in the Kenai region of Alaska. On the right of slide 11, you will see their sleeping quarters, as they stayed in the school, having brought their own food, until the next plane out allowed them to return back to their base in the lower 48. And so, you know, we are currently in the process of trying to transition the NAEP from its original technological platform for a digital assessment, in which NAEP provided devices that our field staff would bring with them in cases to every school that they visit, to a system that relies less heavily on NAEP field staff traveling to schools and more heavily on school devices. We are in the process of making that transition, but as we've made it, and even after we have made it, if a school needs us to travel there to be able to administer the assessment, and if they have ended up in the NAEP sample, then we are committed to making sure that we are there to assess them, to make sure that even states like Alaska, with some unusual, from our perspective, schooling arrangements, can trust that the results really are representative. And Dr. West, I have one question before we move on, and just as a reminder, we have about 20 minutes left for this presentation. And I'm the one slowing us down with the questions, but it's very good. It's the Kenai. I hope when you come up, you will visit the Kenai. The Kenai, great. Kenai Peninsula. Yep.
And then my question, though, is, as a former educator myself, I would go to school pre-pandemic when I was not feeling well on testing day, because the research that I knew at the time suggested kids test better with a person they know than with a substitute or strangers. Do you have anything to add to that? I mean, I am grateful that NAEP staff are willing to make the trip to faraway places to get good data, but what about the impact on student testing of having the test administered by visitors? We do rely on local educators to be involved in the administration of the assessment; the NAEP field staff are there primarily to deliver the technology and to sort of oversee the process. But I think you're right that that is a phenomenon. If the presence of those field staff were a significant influence on the results, that would be something that would be showing up everywhere, not just in a specific state. But again, given the involvement of local educators in the actual protocols, my guess is that it would be more similar than different from a state testing experience. Okay, thank you. We are confident that the samples reflect the school population in each state. We're also able to report out on results by race and ethnicity, by sex, by disability status, by eligibility for the school lunch program, and by language status as well. The one limitation that we have is that Alaska currently does not participate in the administration of the contextual questionnaires for school leaders and students that we make available. They're not legally required, but we make them available to states, and again, Alaska does not currently participate, and so that makes us unable to break out the results as we do for other states, like the share of students who report that they themselves live in a rural area, or the percentage of school administrators who report high levels of student mobility in their school. Dr.
West, on that question, do most states participate in that, and what... So I don't have the exact numbers in front of me. The vast majority of states do participate; it's definitely north of 40 states that do. The ones that don't cite concerns about not wanting to place the burden of completing surveys on educators, and not wanting to devote any additional instructional time for students, beyond the completion of the exam, to data collection. It's relatively short. I don't have the numbers in front of me, but I believe it's 20 or 30 minutes at most. But we understand and respect that judgment. But that is a decision made by each state department of education. Has it been a long-standing practice that we do not participate? I know it's been several cycles, but I would have to go back, and we'd be more than happy to get you that information. Okay. Co-Chair Story. Yes. Thank you through the chair. Dr. West, what is the advantage of participating in the contextual questions? So it would allow more fine-grained reporting of the results, by students' access to different educational resources, for example, and then you can relate the achievement results to the types of instructional activities and resources that students have had access to. And it allows Alaska to be contributing to and represented in any national analysis that is done based on the NAEP data, and NAEP data is a major driver of education research. So again, we do encourage all states to participate and think there are some drawbacks to not doing so, but we of course respect states setting their own priorities when it comes to the use of time. I should have said at the outset, as vice chair of the National Assessment Governing Board, I see our primary role as making sure that the NAEP program serves the needs of state and local decision makers.
Interestingly, of all the reserved seats for different stakeholders on NAGB, there is not one for a representative of Congress or the Department of Education or a federal official. It's really a program that is governed by the states. Thank you, and we have a question from Representative Eischeid. Who's from the Kenai? Elam. Sorry, Representative Elam. Eischeid is a good friend of mine, and he is from Anchorage. Yes, I am from the Kenai, thank you. No, I appreciate the presentation and the information here, but I am hoping that you can help me understand some of this information. This particular slide that we were just on, or that's up there on our screen right now, says that NAEP can confidently claim that Alaska's results represent the state. But yet we're seeing that the sample rates you can have are as little... and I guess I'm just trying to do the numbers here, and I am getting kind of lost in maybe where we are at, but you can confidently claim that this can represent Alaska's results with as little as one student in a community? Dr. West. That would be a hypothetical situation in one of the schools that was selected for the NAEP sample. So, what I wanted to make clear in response to the prior question was, if there's a school in Alaska that has one fourth grade student, they could end up in the NAEP sample. It's, again, unlikely given the probabilities, but they very well could. But I don't know the exact count, whether it's several dozen schools that would be included in the sample in Alaska each assessment cycle, and it's the sort of aggregate results across all of those schools that are used to report on Alaska. Does that help to clarify?
So it sounds like it would give a sampling of that particular student from that particular community, and maybe there are not very many students in that community, but it will then lead to a checkerboarding of similar challenges throughout small communities. And so in one of my schools on the Kenai Peninsula, Cooper Landing, we have K-12 in one building, in two classrooms. I mean, that's a lot. And I don't know that that would give a good sample rate compared to, say, Soldotna, which is not that far away but has several hundred students. So what is your ratio of students when you're going across larger geographic areas that have that kind of difference, when, again, just a couple of hours away we have Anchorage, which has other large schools? It seems to me that that would skew the numbers. Remember, what we're trying to do in drawing the sample is not characterize achievement levels in any given area within Alaska, whether it be the Kenai Peninsula or Anchorage. Rather, we are trying to be able to speak to averages across Alaska as a whole. For any type of community, we would want them to be represented in the sample with a probability based on their share of the overall Alaska population, and that is what the process that I just described accomplishes. The sample will be more concentrated in larger population centers like Anchorage, and more of the state's average will be driven by what's happening there than by what is happening in very small settings on the Kenai Peninsula. But those smaller settings are still contributing to the average by virtue of their inclusion in the data collection process. And one last, very quick, very short follow-up question: based off of that information, how are we supposed to decide policies that are rural or remote versus urban?
Use the NAEP results to help you learn about how Alaskan students are faring on average over time, how that compares to other states, and what some potential drivers of those trends are, and then you want to combine that with information from your own state assessment and other data sources in order to think about problems that you might diagnose or have your attention called to as a result of the NAEP data. We are much more about saying what is happening to student achievement on average over time and from one jurisdiction to another; we're not meant to then directly tell you what to do about it. And of course, when you're thinking about mandates and rules versus flexibility in very unique and varied educational settings, you all, as members of the Alaska Legislature, have vastly more experience wrestling with those issues than anyone on the current National Assessment Governing Board. Thank you. Understanding what the tool is to be used for is extremely valuable. I think those are great questions. Dr. West, let's continue. We're gonna try to wrap up in about 10 minutes. Yeah, so let me... I'm not gonna go into as much detail, given the time, as is available to you for analysis on the slides. I'm just going to hit some high points when it comes to the results that we've been seeing in the last several cycles of NAEP administration. And the news, frankly, that the NAEP has been putting out has not been good for the nation or for Alaska, and we would love for that to change, and that's why we're very interested in sharing these results with audiences like yourselves. So a lot of attention has been paid to the last three NAEP administrations, because they allow us to look at the phenomenon of pandemic learning loss. Here, where I'm on slide 14, you can see the main assessments: reading grade four and grade eight, and math grade four and grade eight.
You see that there have been substantial declines in students' measured achievement from where they were in 2019 to what we saw in 2022. The 2021 assessment was canceled because of the pandemic and the school closures, and so we shifted into even years rather than odd years to comply with our every-two-year legislative mandate. In math in grade four, you saw a hint of recovery between 2022 and 2024, but not so much as to bring things back to 2019 levels, and in reading, perhaps surprisingly, we saw continued declines in achievement nationwide from 2022 to 2024. And I would just say, in reading it's much harder to actually identify a discrete impact of the pandemic and school closures; rather, achievement nationwide has been declining from around 2015 through the present in roughly linear fashion. If you go to slide 15, what you'll see is that pandemic learning loss is a pervasive phenomenon: nearly every state saw a statistically significant decline in performance in at least one grade or subject. For Alaska there, you see three asterisks, meaning three of its four subject and grade combos were lower in 2024 than they were in 2019. A couple of states actually surpassed their 2019 scores in one grade or subject: that's Louisiana in fourth grade reading and Alabama in fourth grade math. The other interesting jurisdiction that actually did not experience a decline in its performance over the course of the pandemic is the Department of Defense Education Activity schools, which are operated by the Department of Defense around the world. Nationally, we saw limited progress between 2022 and 2024 in math, but continued declines in reading. If you go to slide 17 and following, you can see the results for Alaska, fourth grade math first, representative of this pattern of achievement that started declining not only with the pandemic, but really around the middle of the prior decade, in 2013 or 2015. You see that here in fourth grade math. Back in 2003, when the state first appears in these data, Alaska was much closer to the national average, and the decline since has been a bit larger than that for the nation as a whole over the past decade.
Here's a similar picture on slide 18 for eighth grade math, again, in this case, a widening of a gap; on slide 19 for fourth grade reading; and again a widening of gaps with the nation in eighth grade reading. There's some information in the slide deck on examples of states and districts that have demonstrated considerable progress, overall and relative to the nation, over the same time period, states that were making gains even as many states suffered declines. Alabama, Louisiana, and Mississippi have received a lot of attention, and so may be places to learn from. Dr. West, we're going to stop for a second; we have a couple of questions. So we'll start with Representative Elam. Thank you through the chair. Thank you. Again, this is very interesting information. And I'm looking at the math results and the reading results, both, kind of the combination of slides. How does NAEP take into consideration differences in curriculum? I know that certain schools will use different curricula than others, and kind of the same thing whenever we look at reading curricula as well; some of them are phonics-based, others have, you know, different platforms. I know that we've got different schools using different math curricula here. How does this accommodate different types of curricula? So, great question. The NAEP assessments are based on frameworks, assessment frameworks that are established by the National Assessment Governing Board with a lot of input from stakeholders and experts. And we do try with the main NAEP to make sure that the assessment sort of reflects broad trends in curriculum in American schools. That being said, those curricula can vary from one place to the next, from one state to the next, and from district to district within states. And so differences in alignment to curricula can be one possible explanation for some of the variation in achievement we see from one place to the next; we can't rule that out.
I don't think it's a major driver of the types of differences in scores you see presented on these slides, but certainly in extreme cases it could be possible. Follow-up, please. Yeah, we'll do a quick follow-up, and I just want to mention, Dr. West, our other presenters today are local, or in state at least, and so your time is incredibly valuable to us, and I'd like to go ahead and extend the time if you're able to stay with us for, let's say, 15 minutes, and we may shift the rest of our schedule today just to maximize our opportunity with you, if you're able to stick with us. I can certainly stay for 15 minutes, and even a bit longer. I do think I can hit a couple of high points and be done very soon. Okay. But I really do appreciate the questions. Yeah, thank you. Yeah. The quick follow-up is, and I like statistics and numbers and stuff, but I just noticed the DoD Education Activity. I don't know a lot about it, but it seems like that's going to be significantly more standardized than potentially across the nation. And so I noticed between 2019 and 2024, there were no significant changes. Is there any correlation? Dr. West? It is probably fair to say that the DoDEA schools are more standardized than the schools in a given state, given the importance of local control as a phenomenon throughout American public education. As for whether that is an explanation for the relative stability of their performance, I suppose it's possible, but I would be more interested in sort of first understanding the decisions they made within that governance framework about the continuity of instruction over the course of the pandemic. And I'll just add, there are articles written about it. DoDEA has done amazing things in recent years, and they talk a lot about their focus, that they've managed across the entire planet to take their system and bring it into focus. And so there's a lot written about it out there, and I can't speak knowledgeably, but I know I've read a few articles.
So it's... Absolutely, there is a great deep dive in the New York Times on DoDEA that you can find from these years. And what I took away is that they really have, you know, maintained measured achievement and basic skills as a top priority system-wide. Okay, thank you. We'll go next to Representative Dibert. Great. Yeah, good morning through the chair to Dr. West. Thank you for your presentation. My question is on the reading as well. How do you put into context, or how do you deal with, cultural differences like in Alaska, where, like, a community up in the far north, like an Anaktuvuk Pass school, where they don't have any trees, and let's say in vocabulary there are words having to do with, like, an oak tree? You know, how do you deal with a state like Alaska, which is very vastly different than other states that take the NAEP? Thank you. Dr. West. Yes. The short answer is, as best as we possibly can. We have a committee within the governing board which is charged with reviewing assessment items with that concern about accessibility across geographic and cultural differences very much in mind, and items are reviewed item by item, again with input from experts with exactly those concerns in mind. When items are piloted, as they always would be before being included on an assessment, we look for what's called differential item functioning, which means a set of items that a given population of students might perform less well on than would be expected based on their performance on other portions of the exam, and we then review those items. I will admit that this is not a perfect process, and it cannot ensure, especially for very, very small populations... you know, I think we can't ensure success, but again, those concerns matter less for characterizing average performance statewide, even in a setting like Alaska. Thank you. Representative Story. Co-Chair Story. Thank you, Co-Chair Himschoot. Through the chair, Dr. West, you had said, based on Rep Elam's question, you don't think curriculum was the major driver of improvement.
I believe it was in Alabama. So I was curious what you thought the major drivers were to look for for improvement. And then also, this is a follow-up on the contextual questions: would Alabama report, say, "we've gone to using the science of reading," if that was what they felt was the reason? Is that what we should be looking at? Thank you so much. Representative, sorry, I want to clarify what I said to Representative Elam slightly, which was: I don't think that differences in the degree to which local curricula are aligned to the specific content of the NAEP exam are big drivers of differences in achievement from one state to the next. That is not to imply that using better curricula is not a major driver of improvement in a given setting over time. And I actually think that is going on in places like Alabama, where they've seen very rapid progress in math in particular. And Mississippi is the one that people often talk about, the miracle that's really a marathon of a couple decades of improvement in reading. I do think increasing use of high-quality instructional materials aligned to the science of reading is part of the story in that case. I will step back, though, having said that, and say that NAEP data on their own are much better at telling us what's happening to student achievement than why. They are not well designed to pinpoint precise causes and effects of the trends that we see. That would involve a closer look at the strategies that have been in place in a given state. Thank you, and I was just going to ask to go back to slides 17 and 18, especially because just over 20 years ago, Alaska exceeded the nation, the average in Alaska exceeded the national average, in eighth grade math, and we've seen a decline since then. But overall, in the 2000s, Alaska was closer to the national average than we are now. How could we use that information? You know, the NAEP is a fairly coarse tool.
Do you have any recommendations on how to use that information diagnostically? And maybe it can't be used that way. I do think broad patterns on the NAEP can be useful for shaping our hypotheses. So there has been a huge amount of attention nationally, and I assume in Alaska as well, on a phenomenon of pandemic learning loss, and it's clear that school closures were not good for learning. But as I suggested earlier, the current declines in American student achievement started in the early to mid part of the 2010s. And so we should be looking for explanations that can account for that timing. The other thing we know, and I haven't mentioned this yet, is that the declines on average, and this is true nationally and in Alaska, are concentrated among lower performers. We can look separately at performance at the 10th percentile of the distribution and the 90th percentile of the distribution. At the 90th percentile, students in the US and in Alaska are doing about as well as they ever have been. But at the bottom of that distribution, things have really fallen off. So we would be looking for explanations that are particularly impactful for that population. This phenomenon of recently declining achievement is not unique to the US. It is also evident in other developed countries that participate in international assessments. And so all of this evidence, and there's more we could talk about, does lead me to worry about the role of screens and social media targeting youth, and the decline of reading outside of school, as one major challenge for American students and for American school systems. And so again, you'd want to think about the plausibility of that kind of explanation, whether it could account for the patterns you see here as well as those on state tests. But that might be one way in which you could make sense of some of these patterns. Right. So the idea of social media and the use of screens is at least correlational, but not necessarily causal. Okay.
That's right. I don't think we have smoking-gun evidence, but as I think about all of those factors, and then as I look at my own household and children, it leads me to be interested in that hypothesis. OK, which slide are you on right now? We'll match up to where you are. So I'm going to just say I want to close with two additional points. One is that in addition to just talking about average scores, and you could go to slide 25 for this, the NAEP data also allow you to look at the percentage of students nationwide, and in a given jurisdiction like Alaska, that hit different achievement levels, and we as a board have set three achievement levels: basic, proficient, and advanced. Our goal, as we state it, is for every student to have the opportunity to hit proficiency. That means that they are roughly on track to complete high school ready for post-secondary coursework without remediation. And slides 25, or I guess it's slides, yeah, 25, 26, and 27 can give you a sense of some of the example skills that operationalize that definition of proficiency. Because of what I just mentioned, though, that the recent declines in achievement nationwide and in Alaska have been driven by low-achieving students, what we're seeing is a growing share of students who are not even meeting the basic achievement level set by the board. And so this will take you to slide 28 and the next three slides beyond it. And so, what you can see, for example, in math grade 4, there's been an increase from 27% in 2019 to 36% in 2024. That's the share of Alaskan students who are not meeting that basic benchmark. It really means that they are not on track to emerge from primary and secondary schooling with strong literacy and numeracy skills. And that is the population that I worry about most nationally, and I think the same is true for my colleagues on the board. The last thing that I want to share with you, and this is if you jump all the way to slide 34.
One other thing the NAEP allows us to do is to compare across states the ways in which states have defined proficiency, or being on grade level, in their own assessment regimes. I explained how NAEP has the basic achievement level and the proficient achievement level. Here, what you can see for math in grade 4, and again there are parallel slides for the other grade-subject combinations, is in the dark blue data points where a given state's definition of being on grade level falls, and how that compares to NAEP's definition of proficiency. And for Alaska, and you can see how the states' definitions compare with one another, you see that Alaska as of 2022 was in the very top three states in terms of the rigor of its definition of math proficiency. And like many other states, it had increased its standard pretty dramatically as compared to 2007, which is the gray data point. You can see that Alaska is not alone. Many states during this period raised their standards to more closely approximate the definition of proficiency on the NAEP, but Alaska has been sort of at the forefront of that. The same is true in grade 8 math, where you have the most rigorous standards in the nation, and in grade 4 and 8 reading. If I could interject for a second, Dr. West, Representative Dibert had a question. Thank you, through the chair, to Dr. West. On this slide, in my community, we have a military base and an Air Force base, and in Alaska we have a lot of migration in and out of the state. When you look at achievement of students across the nation, do you all look at transience rates? Thank you. So in those states where we have contextual questionnaires, we are able to look at a variable reported by the school leader, which is the share of students who have been enrolled in the school for more than a year. And I, again, can't remember exactly how it's operationalized. And we are then able to do some analysis related to that.
We don't currently have the ability to do that in Alaska because of what I shared about the contextual variables. But you're right that it is a phenomenon to keep in mind as we interpret these patterns, which is that not all of the students who are being assessed in a given jurisdiction have been there for an extended period of time. Okay. Thank you, Dr. West. We can continue. So I will close. I know I'm over time, and I've appreciated all the questions and conversation. I will just remind you that the 2026 NAEP assessment is in the field right now, including in Alaska. The data collection will wrap up later this spring, and the results will be released for the nation and for each state, including Alaska, in early 2027. We are also gathering data only for the nation as a whole; that is not an assessment where we're able to break out results by state, and those results will be available in early 2027 as well. And so I really appreciate your interest today, and I will hang around for some of the following sessions in case anything comes up. But again, thank you for the opportunity. And before I let you go, Dr. West, one last question that I ask all the time, and I think I've had answered and can't remember what the answer was: Alaska does not participate in the science test. Is that correct? Oh, yes. You tripped me up for one second there, until I remembered. So there is a congressional mandate: states, as part of their obligations under the Elementary and Secondary Education Act, do need to participate in math and reading every two years. In other subjects, like US history, civics, and science, we occasionally, as budget allows, give states the opportunity to participate separately in those subjects, and the share of states that take up that opportunity varies over time and from assessment to assessment. In science, it's been as high as the high 30s or low 40s; in history and civics, more recently, it has been closer to a dozen.
And Alaska has generally not participated in those optional assessments. As a board member, I would love to see broader, and perhaps even universal, participation in assessments in civics and science, for example, so that we can provide evidence on a broader array of subjects. And there are some upcoming opportunities on the schedule for states to voluntarily participate in those assessments, but Alaska has generally not done that. There was so much pressure about the amount of testing that we did everything we could to reduce it. So the questionnaires that we're not answering, and not participating in science, could be a response to trying to accommodate families' desires for less testing. So, okay. Thank you. Let me just say, I think it's very wise to do assessment audits and be cognizant of instructional time. NAEP, though, again uses this approach of giving any individual student only a small portion of the assessment. It's a very low-burden version of testing for the quality of the data that it provides. So just something to think about as you all contemplate these opportunities in the future. Okay, thank you so much. Really appreciate your time today. With no further questions, before the next presentations we are going to take a brief at-ease to transition to our next presenters. Okay, so we're back on record, and we are going to hear now from Dr. Randy Trani, Superintendent of the Mat-Su Borough School District, and Dr. Cindy Mika, Superintendent of the Kodiak Island Borough School District. Members have copies of this slide deck. We have about 20 minutes for this presentation. Dr. Trani, welcome. Oh, thank you, Madam Co-Chairs. For the record, my name is Randy Trani. I'm the current sitting ASA president, and then I'll let my co-presenter introduce herself as well. Thank you, co-chairs. This is Cindy, for the record. I'm Dr. Cindy Mika, the Superintendent for the Kodiak Island Borough School District, and thank you for the opportunity to speak today. Thank you for pronouncing your name for me. I will get it right in the future.
Please go ahead. Next slide, please. So often educators speak in terms that no one outside of education understands, and so we just quickly wanted to go through some vocabulary so that you can relate when we're speaking about it. When we talk about formative assessments, those are assessments that teachers use to inform instruction immediately. When we talk about summative assessments, those are assessments that are used to measure a student against a standard criterion, typically at the end of a unit or at the end of the year. We have standards-based assessments, and those assessments measure a specific knowledge or skill based on a standard. We have screeners, which are assessments that we use to collect valid and reliable data multiple times per year on our students and specific skills. And then we have adaptive assessments, and those are assessments that are computer-based, in that they adjust in level of difficulty based on the student's responses. Next slide, please. I'm going to interject just for a second: a standards-based assessment could be adaptive, and a standards-based assessment can be a screener, right? Correct. They can be adaptive, they can be screeners, they can be formative, and they can also be summative. So they're not exclusive. Thanks. Okay, let's go on. So, we have three assessments in the State of Alaska. With the passage of the Alaska Reads Act, we have the mCLASS DIBELS, and that is our literacy screener for our earliest learners. It's given three times a year in kindergarten through third grade. And if students are not proficient on the assessment, then they are placed on an individual reading plan, we provide the intervention, and then we continue to screen them and monitor them. The next one is the MAP Growth, which also can be used as AK STAR. This is given three times a year in reading, language usage, and mathematics in grades three through nine. In the spring it is integrated with AK STAR, and questions are embedded that are aligned to the Alaska standards.
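The "adaptive assessment" definition above can be illustrated with a small sketch. This is an editor-added toy, not NWEA's actual MAP Growth algorithm (real computer-adaptive tests use item response theory and a calibrated item bank): it simply moves a running ability estimate up after a correct answer and down after an incorrect one, with shrinking steps, and draws each next item near the current estimate. The scale, step sizes, and the simulated student are all invented for illustration.

```python
# Minimal illustration of adaptive item selection (invented toy, not a
# real testing algorithm): each correct answer raises the running ability
# estimate, each wrong answer lowers it, and the step size halves so the
# estimate settles near the student's true level.

def run_adaptive_test(responder, start=200, step=16, n_items=6):
    """responder(difficulty) -> True/False for a correct answer.
    Returns the item difficulties administered and the final estimate."""
    estimate = start
    administered = []
    for _ in range(n_items):
        administered.append(estimate)       # give an item at the estimate
        correct = responder(estimate)
        estimate += step if correct else -step
        step = max(step / 2, 1)             # shrink steps, but not below 1
    return administered, estimate

# Invented student who answers correctly whenever the item difficulty
# is at or below 210 on this arbitrary scale.
items, final = run_adaptive_test(lambda d: d <= 210)
print(items, final)
```

The point of the sketch is why adaptive tests like MAP Growth can be efficient: the item difficulties bracket the student's level within a few questions, so every student sees items near their own ability rather than a fixed form.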
The MAP Growth, as we'll talk about a little on another slide, is an adaptive test. We also give the Alaska Science Assessment one time in the spring to 5th, 8th, and 10th grade students. I'm going to pause for a second. Oh, I am sorry, Representative Story has a question. Hi, yes. Thank you, Co-Chair Himschoot. Through the chair, Dr. Mika, can you tell me about how long each of these assessments that are given to the students are, please? Yes, through the chair, the MAP Growth assessment can take anywhere from 60 minutes to 2 hours, and so when we give it, we have to block out an amount of time. The DIBELS doesn't take as long, but it's administered individually, and so it takes more time for a teacher to get through the entire class; it is a faster assessment on the early literacy skills, but like I said, it is given one-on-one. And I don't know about the AK science assessment. Yeah, thank you. For the record, Dr. Randy Trani. It's a similar amount of time, depending on the student: an hour, hour and a half, two hours, and also depending on grade level. I mean, they aren't really long tests that take half a day, but logistically, when you're trying to plan for all of the students in the school and the supports you'll need, it ends up consuming a day. But from an individual student standpoint, it's an hour, hour and a half. Thank you. And Dr. Mika, the screener, the DIBELS, is just a couple of minutes per student? Each student does it individually, but they're very quick? Through the chair, it can take about 15 minutes, 15 to 30 minutes, per student when we give that assessment. Okay, thank you. I think we're ready for slide 4. For the record, Randy Trani. I wanted to try to use an analogy, because, like Dr. Mika says, we use this language as educators, and we're trying to provide analogies so people who aren't necessarily educators might understand the distinction between formative and summative.
I think a formative assessment is like that diagnostic blood test that you get from your doctor, where they're checking, you know, "we have you on this new medication for your high blood pressure, let's see how it's doing," and then you take that blood test, and based on that result, you change or modify what you're doing. So we do that with education, and our MAP Growth tests in the fall and winter, and our DIBELS testing, are the tests that we use that are formative. Summative assessments are like that final medical report after you've been discharged. They document what occurred: you were in the hospital, they tell you what has happened. It's like an after-the-fact update. And that's more similar to our AK STAR MAP testing results that we take in the spring, but we don't get the results until the fall. And NAEP, from our perspective as educators who are trying to teach student by student, standard by standard, is more like an autopsy report of an unknown Jane Doe or John Doe. It arrives long after the fact, we don't know anything that happened with our individual patients, or our students, or even our schools. It's just some statewide data that comes out, and nothing in it is actionable from a district's perspective. It looks like Dr. West is gone, so he doesn't get to hear you compare the NAEP to an autopsy. That's fabulous. Please go ahead, Dr. Trani. So I think we can go on to slide 5. And Dr. Mika, are you going to take this one? We haven't rehearsed this, so. Yes, for the record, this is Cindy Mika. So, in terms of how we use the results in our classrooms, our schools, and our district: you know, as educators, we believe that both the DIBELS and the MAP Growth give us information that we can use immediately to inform instruction. Both of these do measure growth in our students in that school year, which we can use as actionable data.
And so, you know, as an educator, I consider those actionable: we can take the information immediately, and we can change the way that we are providing instruction or intervention to the students. The AK STAR, although we do use the MAP as the vehicle for it, the AK STAR results come after the end of the school year, and so it's difficult for a classroom teacher to impact instruction for a particular student when the results are given after the fact. Same with the Alaska Science Assessment. And, as Dr. Trani pointed out, with NAEP we don't actually get data at the district level. We know when our schools are assessed and our students are assessed, but we do not get that data at the school district level. And we're going to pause for a second for Co-Chair Story. Yes, thank you. Through the chair, and I don't know who would like to answer this, either of you is fine. But I thought, with the AK STAR, what was so great about our being allowed to include our statewide assessment is that it told parents how their student did at the end of the year, because they're taking the same test they took in, you know, the beginning of the year and the middle of the year, and now in the spring, and that data goes to them, and they know individually how their student did at the end of the year. Is that correct, Dr. Mika? Would you like to respond? Sure, through the chair, this is Cindy Mika for the record. That is correct. The parents do get the results, but they get them the following year. And so I believe this year our results went out in either the end of September or early October timeframe. So we do like that it's embedded in the MAP, because we've given it several times. They do get the results, but they get them the following year. And if I could just interject, from my perspective as a classroom teacher, it felt like a miracle when the two were put together, because it was one less test.
And yeah, and we're used to the spring assessment, a summative assessment; that didn't change, but the amount of time spent on testing changed significantly. Did you have a follow-up, Co-Chair Story? Yeah, thank you, a follow-up through the chair. I'm a little confused that it takes so long, because with the MAP data, I've heard it informs instruction, so we can get that data quickly. How come the summative data takes so long? Through the chair, this is Randy Trani. You know, I'll add just some thoughts about that end-of-the-year spring assessment that's combining the two. I think the answer to your question is probably going to be best answered by somebody from DEED. What I do know is that there's a projection that we can access when we take that spring assessment, so we kind of have a hint page that tells us this kid is likely to have met, or not likely to have met. I don't understand just how come that takes so long; I don't know, you need to ask the people who work on that. I think that's probably going to come from DEED. I do, I have seen some information that the correlation is really high, like they're really confident, it's like 0.84 or 0.82, something like that, for those predictions that they are giving us in the spring. But I think your question is probably a DEED question. Thank you. And a question from Representative Dibert. Yeah, good morning through the chair. My question is, in my experience as a third grade teacher for two decades, it's kind of like that no-man's land where we start the MAP testing. You know, it was very difficult for me to look at last year's data, because it was DIBELS, and it tests for different skills. And also, the beginning of testing was actually a very difficult time, because there was a lot of high stress with the third graders and this first long testing.
But my question is just, if there are any third grade teachers watching on Gavel to Gavel, are there any tips, you know, that can help with this grade level, as you do your presentation? Thank you, through the chair. Thank you all for coming and presenting here today. And so, kind of tying this together, I guess I'm curious, if we're looking at Alaska from a testing perspective: do we have a coherent single system here, or do we have a collection of individual tools? And if one piece breaks, are the others still aligned? Like, what breaks, and if anything should change in this, do we lose alignment within these testing systems? That's a great question. I'm just going to remember 2016 or '17, when the internet actually broke and we didn't test. So, Dr. Trani, would you like to respond, or Dr. Mika? Yeah, through the chair, Randy Trani. So, hypothetically, if I think about which one of these tests would do the most damage if it went away in my district, I guess that's how I'm phrasing it in my head: the MAP Growth is the test that we use that is most actionable for promoting student achievement, because we get data very quickly, by student, by standard, and then we can adjust our instruction. So if you made me pick, that's the one I'd keep. If the others broke and went away, we would still continue to improve instruction. Dr. Mika may have similar thoughts, I bet. For the record, this is Cindy Mika. I agree with Dr. Trani, but I'm also going to add in the DIBELS assessment. With this assessment, in our district we have seen the most growth in our primary grades, as a result of having an assessment that we can give that's a quick screener for every student. I'm a proponent of the assessments that measure multiple times a year, where we receive information that's immediate. So for me, if anything went away and we changed, it's like going from apples and oranges to zucchini and lettuce, which we've seen in the past here in Alaska when the state assessments have changed.
But I'm a proponent of the DIBELS. It is information that's immediate for my educators, and we can adapt quickly to it. Also, we utilize it at the district level for seeing where our professional development needs are with teachers, campuses, and the entire district as a whole. And we have a question from Representative Schwanke. Thank you, through the chair. Dr. Mika, I have quite a few schools that are in rural communities, and even our largest is right in the middle of Glennallen. I know that it is very common for educators to have to stop and start the MAP Growth test because of internet connection issues and the lack of bandwidth just within our school; my own son just randomly started talking about it one day. And I'm curious, from someone who's looked at a lot of test data, what kind of impact do you think that has on the overall results? Because I am confident that our schools are not the only ones that have to deal with those types of interruptions. Dr. Mika? Yes, through the chair, for the record, this is Cindy Mika. We see this a lot in our rural schools. You know, Kodiak's an interesting community, because we have schools that are on the road system, where we do have broadband, but we also have five communities that are off the road system and do not have broadband in the schools. And we experience the same issue in those schools, where we are having to start and stop. The DIBELS we are able to give; it's not internet-based. While we do record it online, we can actually record on a teacher's iPad that is not connected to the internet, and then when we have internet stability, we can upload the data. So that's a benefit of how the DIBELS is administered. You are correct in that the MAP Growth does rely on the internet, and we don't have a way to utilize it without being online. And we have had that issue; I've been present when we've been trying to get online with one of my schools, Akhiok, one of my most remote schools. Very quick.
We have about 10 minutes for the rest of this presentation. It's super quick: is there a method, or is there a requirement from DEED, to report those types of interruptions? Because I think that would be extremely important data that we should collect. Through the chair, this is Cindy Mika, and I do not know that we collect that data, but I would be interested in that as well. Thank you. Thank you, Representative Schwanke. Okay, we're on slide 5; should we go to slide 6? Great. So here I just was giving an example of some of the ways that we are able to look at data immediately in MAP Growth. The first one is across my entire district, so I can see how many students grew from the fall to the winter assessment, and what our growth is for the entire district against where we would typically see students' growth nationally, and then the bottom one is a student's chart. So over the years and over the assessments, we can track how the student is growing and how our interventions are working. Next slide, please. I'm just going to interject: there's the quadrant graph, which is not here, that is super helpful for a teacher. You know, you've got your kids who are, anyway, I'm not going to take the time right now, but as a classroom educator, the MAP gave me actionable information immediately. So, okay, go ahead. Through the chair, the quadrant map, our educators absolutely love it. It's high achievement, high growth; low achievement, low growth. And so we're able to plot those students by quadrant. Yes, so this is the DIBELS, and the DIBELS provides early identification of reading risk in specific skill areas for early literacy. And so we're able to see a school summary and a class summary, and then the chart on the graph is of a specific student that we were doing interventions with. And we were able to, I guess I call it a dipstick, where we're able to assess them multiple times.
We assess every two weeks when we have a student in an intervention, and we are able to see their growth in the specific skills. And so the DIBELS assessment is fast and frequent, and we can have data that is actionable immediately. I believe Dr. Trani was taking this one. Yeah, for the record, Randy Trani. So what we really wanted to do is try to illustrate what the NAEP does for the State of Alaska and what it can't do for the State of Alaska, at least from our perspective as district leaders. So NAEP is very good at providing state-level snapshot data every two to four years. We heard exactly how the sampling occurs, and we heard that sometimes that sampling can be very small. It does allow national comparisons, which I think is the strength of the NAEP, and it's used often in policy decisions nationally. Then, I think we've driven home the idea that it doesn't provide student-level data for us, or school-level data; we can't use it to help promote growth. So it's really not a tool that we can use to inform our instruction while it is actually happening. It happens too late, and it isn't specific enough, as a district leader, to be a valuable tool for that for us. And anecdotally, I might add, because we are such a small state and there are different categories, and I know we all saw that study about our charter schools in the state. Well, I happen to have the highest-performing charter school in the state in my district, and it's one of the highest-performing schools in the whole state, all schools considered. It has been selected many, many times to be part of the NAEP, which makes our charter school scores look really great, because, as we heard when we had that professor from Harvard tell us how the sampling is conducted, a very significant portion every year happened to be from our high-performing charter school. Which is great for the state, but it does illustrate this problem: we do have very small sample sizes in Alaska.
Next slide, please. Just a moment, Dr. Trani. Okay. Yes, through the chair, Dr. Trani, I just want to refresh: we are really mandated to take the NAEP to get federal funding, for, like, our Title I and all those resources that add to the supports we can give our students. Is that correct? Can you confirm that? Yeah, through the chair, I believe that is the case. So we get a letter each year telling us which schools in the Mat-Su are being selected; every superintendent gets that. It's, you know, very well organized; we get that information well in advance, we know when the test is going to occur, and then we comply with that, because there are funds associated with it at the state level. And Representative Eischeid. Thank you, Co-Chair Himschoot. Two quick questions on the NAEP. So, you know, as a former teacher, we often told our students on the NAEP that, you know, our goal is to get proficient; we have basic, proficient, and advanced. But my understanding is proficient is oftentimes above grade level. Is that correct, Dr. Trani? That graph that we were shown earlier showed how our state assessment is one of the hardest, and it was at the level of the NAEP. So the NAEP is like a high bar, I think. So, you know, I'm not an expert on NAEP, but I believe that that is the case. Follow-up: that's my understanding too, so I knew the answer to that before I asked it, but I thought it was important for the public to know proficient is above grade level. And then, I'm also a former track coach, I coached high jump. So if you raise the bar for what constitutes proficient, but nothing else changes at the school assessment level, doesn't that essentially create a perception problem? Dr. Trani, do you want to respond? I'll respond this way, I guess: there are lots of different tests that we take, and they measure different things.
So for example, in the Mat-Su, we're really proud of how we do on AK STAR, you know, but if you look from the outside, we might be 40%, a little more than 40%, proficient in both ELA and math. At the same grade levels where I'm taking the PSAT, which I can reference against the rest of the nation, we have a much higher take rate than the rest of the nation, and we're outperforming the rest of the nation. So as an outsider, you might see the AK STAR results, and you'd think, kind of what you were just saying, that the bar has been raised very high on our AK STAR test, and it makes it look like we aren't doing a very good job, with only a 40% proficiency rate. Conversely, we can go look at our PSAT and our SAT results, and we'd see the students in the Mat-Su outperforming the nation. So I appreciate your question, because it does point to how changing the bar that we're jumping over might change somebody's perception of how we are doing. Okay, and a very brief question from Representative Elam. Thank you, I will try to be brief. It looks like we have kind of a peppering, and, for the record, I am not an educator. So we have kind of a checkerboard or kaleidoscope of testing here, and I'm just wondering, from an administrator's perspective, in all of your testing, where does the NAEP testing get prioritized? We have a lot of different tests here that are coming through, and so are we teaching to which testing model? Dr. Mika, do you want to take that? I don't like the term "teaching to the test." I think that we teach to the standards, and I think in Alaska our assessments are based around our standards. I do not believe that the NAEP is built to our standards, and also, it would be really difficult for us to use it at the student level or the district level. We aren't able to change our instruction based on that assessment, because we don't ever see the questions, or the results, at the district or school level.
I know that for Kodiak, we've been chosen in the past, in 2022 and 2026, and my most remote village, Akhiok, was chosen both times. And in both situations, I only had one student at the grade level chosen. Three of my five rural schools have been chosen, and at each of those schools the numbers are small: in Akhiok, I have one student in the grade level; in Ouzinkie, I have two students in the grade level; and in Port Lions, I have three students at that grade level. So the sample size is very small for rural schools, as was asked earlier. For Kodiak, it is quite probable that we might only have one or two students tested at that grade level in our rural sites. So while I know the NAEP is important for the nation and the state, because we don't get the results disaggregated at our specific level, we can't use that data to improve our instruction like we do with DIBELS and MAP Growth. Thank you. Okay. And I think it's really important, on this slide eight, that NAEP is a reporting tool, not an improvement tool. And I was comforted today to learn that not every student takes every section, so it's a fairly low-burden test that gives very high-level data. And so I think there's still value in it, but in terms of my room as an educator working with individual students, it had zero value. Co-Chair Story. Thank you, through the chair. When I think about the history of Alaska, and this is more or less a comment, unless I'm remembering incorrectly, we used to have a lot more assessments given to students, and this was really kind of the bare bones that we've gotten down to because of feedback that we have gotten from our communities. But I think the bare bones that we have gotten down to are really ones that do inform instruction and help instructors, educators, do their work for our kids. Does anybody remember the TerraNova? That's gone.
But also, having the AK STAR embedded in the MAP has helped to reduce the overall testing burden. Would you like to add anything, Dr. Trainey or Dr. Mika? For the record, Randy Trainey. I would agree with that. I think there's been a pretty consistent effort at the statewide level and district-wide level to reduce the number and types of tests and try to winnow it down to those ones that help us most to inform instruction, and clearly that's DIBELS and MAP Growth right now. AK STAR is summative in nature, but... Okay, let's move along here. Slide 9. Dr. Mika, do you want to take this, or do you want to take it? Oh, you can take it. Okay. I won't run through the whole chart. You can see that we've tried to summarize our presentation on what each of these different tests can and can't do. Down at the bottom, I do want to point out the row where it says "all students." So, do all students take mCLASS and DIBELS, and do all students take MAP Growth? And we say "sort of," and that "sort of" is the big issue that we have in the state of Alaska, where you can opt out of the testing. In our correspondence programs, and we certainly have one of the largest correspondence programs in the state, the opt-out rate is really high. And in the Mat-Su in particular, I have a high opt-out just in general. So there's the "sort of": there is no mandate that every student take every test. And so whenever we look at the data, we have to understand that we are not getting a representative sample or a complete sample of everybody in any district or any particular school. Dr. Trainey, a quick question about that. Are the districts paying for MAP, or is the state paying for the MAP? Is this a district expense, or is this test provided for you? I think that's a shared expense, and there's some individual perspectives that each district might take on how often they're using it. But
I'm really not the best one to ask, because we've had MAP in the Mat-Su for, like, decades, so it's just been in the background. Dr. Mika, do you have any more information on that? Through the chair, for the record, this is Cindy Mika. The state-required portions of MAP are provided to us, but our district chooses to purchase outside of that, and so we actually assess other grade levels besides those that are required by the state, because we do find value in it. Okay, thank you. Co-Chair Story. Hi, yes, thank you, through the chair. Dr. Trainey, you said for DIBELS, "sort of," but I would think with DIBELS everyone would want their child to have those quick assessments of their reading by their teacher, to help them adjust instruction for growth. Dr. Trainey? Yeah, for the record, Randy Trainey. "Sort of." I would say that of all the tests, that's the one where we have the most participation at all the grade levels that it covers. There are families that do opt out of the DIBELS. There's also life that gets in the way, so it doesn't necessarily happen. In the Mat-Su, the most challenging environment would be our correspondence students who are co-enrolled in, say, another private school. Operationally, getting the DIBELS testing done can just be problematic, and sometimes those folks opt out of it altogether. Thank you, and a quick question from Representative Dibert. Yes, thank you, through the chair. When I was teaching, MAP, from what I remember, gave me weekly activities that came with the program so that I could target that with my students. Is that part of the purchase? I don't know how that works or how school districts are using those. I think there were assessments and maybe activities that, as a teacher, can help my students. Thank you, through the chair. So does the MAP package include all of the tools? Dr. Mika? Through the chair, Cindy Mika. It does not include all of the tools.
I think you can add on to it, but it does let teachers look at where the student is currently, and it does have suggestions of where their next level of instruction should be in order to gain proficiency or move up a step. So some of that is provided, but not all of the robust activities that can come with MAP are provided. Thank you. If we could move to slide 10. This is a review of the assessments that we give in Alaska, what the overall state testing requirements are, and what they meet. So the DIBELS meets the Reads Act requirements, and it also informs instruction at the classroom level and student level. The MAP Growth and AK STAR, as educators we really appreciate that it can inform instruction and can also help us know how our students are growing against a national norm. The Alaska Science Assessment is a summative assessment, and it meets that federal ESEA requirement as well. Okay, any final questions for our superintendents? Thank you so much for making time to be with us and giving us that refresher on different types of assessments. We are lucky to have such knowledgeable people available to us; so grateful for your time. We're not even going to take an at-ease; we're just going to roll right into DEED and spend about ten minutes with them. I think for this portion of the morning, if we could just write questions down, we'll submit those to our partners at DEED. And we are so glad, I think we have two folks from DEED here today, or more. Yeah. We have Deputy Director Kelly Manning and Assistant Director Karen Malin with the department. And then we also have folks online from NWEA, where MAP is housed. If we need to reach out to Scott Peters, Patrick Meyer, or Kelly Schmidt, they can help us with questions about the MAP. They are the creator of the Measures of Academic Progress. And I think Alaska is one of the only states to embed the AK STAR, our statewide summative assessment, in the MAP.
And we're one of the only states to do that, and I think that's something to celebrate, so we want to make sure to bring that to light today. So we are going to start. I think we have Ms. Manning and Ms. Malin online, and we have with us in the room Deb Riddle. Are you the driver or the speaker? Okay, Deb Riddle is our driver. And so we'll go to Deputy Director Manning and Ms. Malin. Good morning, for the record, this is Kelly Manning, Deputy Director for the Division of Innovation and Education Excellence. Are you hearing me okay? You sound great. Wonderful. Thank you so much for having us this morning. We are going to be presenting on the state NAEP assessment, the Alaska state NAEP assessment, and AK STAR. We will move through some slides that were covered in the NAEP presentation so that we can get to things that weren't covered. But we will go over the NAEP assessment, what it is, the design, and some reports and data information. And then we have information for you today on AK STAR, again, what that is, how the assessment is designed, and reports and data, and I will hand over to Karen Malin to begin. Can we go to slide three, please? Thank you, Deputy Director Manning. This is Karen Malin, Administrator for Standards and Assessment for the Alaska Department of Education and Early Development, for the record. Just want to do a sound check. We hear you okay. Our understanding was that we were going to just skip over NAEP today and start on slide 17. Would that be possible in the interest of time? We have about 10 minutes. That would absolutely be possible, and we will do that for you, Co-Chair Himschoot. Thank you so much. So, beginning on slide 17 with AK STAR, just a quick overview of the assessment itself. Our partner NWEA is the vendor that creates our assessment. It is a required assessment, as we've heard already today.
About 68,100 students across the state take the assessment each spring. It is an integrated assessment and, as was mentioned earlier, one of the only integrated assessments of this type, and it is aligned to the Alaska math and ELA standards. Next slide, please. So, about the framework: Alaska educators do participate in the creation of this assessment. They are part of that item review; they look for sensitivity and bias. And the proficiency levels are divided into four, as opposed to the NAEP, which has three. There's advanced, proficient, approaching proficient, and needs support. Next slide, please. So, just a little bit about the standard-setting process that happened: they reviewed the items that would appear on the assessment against the ELA and math standards to assure alignment in the assessment. Then, after the first administration of the assessment, DEED and our partner NWEA conducted a validity study, a validation study, in the summer of 2023, and that review did show the validity of the cut scores, and because of that process, there was a slowdown of the reporting timeline for the 22-23 assessment. Next slide, please. So here's just a list of the reports that are available from AK STAR. And as you heard earlier, the integrated portion of the assessment, MAP Growth, is available to instructors and school leaders right away after the assessment. So all of those reports from the MAP portion of the assessment are available right away. And I also want to put kind of an exclamation point on something that Dr. Trainey said about the projected proficiency. In the fall and in the winter administrations of MAP Growth, there is a report that's called the projected proficiency, and that allows educators to know how a student is projected to perform on AK STAR. There are also district-level reports available. The districts get a data file, and then the state reports the percentage of students that are proficient and not proficient on the assessment. Next slide, please.
English language arts data from the 2025 administration of AK STAR. Next slide, please. The next few slides are going to be the same data presented in different forms. In this first form, we're looking at grade-level data across all four proficiency levels, with the orange representing needs support, the kind of mustard color proficient, and the blue advanced. Next slide, please. This slide represents all students that took the ELA assessment, and on this slide we have a three-year trend of the ELA test. This slide, however, is just representing those students that were proficient or advanced, so this is just representing two of the four reporting levels, that being proficient and advanced, and these are the trends that we've seen in the last three years. Next slide, please. And then this is that same body of data represented by district. This again is just the proficient and advanced reporting levels and how each district fell across the continuum in the proficient or advanced category for the ELA standards, and that again is the 2025 administration. Next slide, please. Now we'll talk math, similar data presented in similar ways. Here's the grade-level data for that 2025 administration with all reporting levels. Next slide, please. Again, we have the math data reported across all four achievement levels, and again, this is statewide, all students that took the math assessment. Next slide, please. And this, again, is the three-year trend by grade level. And just a reminder, this is the proficient and advanced category only. Next slide, please. And here are the math graphs by district. Next slide, please. And with that, we'll take any questions that you might have. Okay, I know Rep. Elam has a question. Let us know the slide number, and we're going to take questions for about three minutes. Thank you. I appreciate that.
I just wanted to go back to the graph slides, because I thought it was interesting that there was some, you know, if we look at this slide, you'll see needs support is at 38. That would be head and shoulders taller on the bar graphs than all of those proficient ones on both slide decks, and I'm wondering if you can explain that a little bit, the stylistic choice to not put it in. Right, Ms. Malin? Sure. Through the chair, Representative Elam, again for the record, Karen Malin. If I'm understanding your question correctly, your question is around the difference in the way that that is represented from the bar graphs to the pie graph? Correct. Because in the pie graph, if you look at those, like on math for example, 48.9 percent needs support, which is almost half of the pie graph. Then when we look at proficient, I mean, that line in those bar graphs would be significantly taller for the ones that need support versus the ones that don't need support. And I just found it curious that we would not be including that statistic when we're looking at these numbers. Thank you. I can take this for you, through the chair. Member Elam, this is Kelly Manning. You know, really what we've tried to do with the data is just look at it in some different ways. So the different slides represent different pictures of the data.
Here in this particular table, the goal was really to kind of look at where we see some trends in students that might be moving into that proficient category. That's what this is intended to look at. But we still, on our website and in our reporting, have been sharing all of those pieces, so that you can see what it looks like in the different categories. When you go to the 2025 data on that first slide, you can see the different grade performances in each of the subject areas. So this really was an effort to look at the trend in our students moving toward proficient. Each of the slides is a different picture of how we might look at the data: what does it look like in '25, what is it like across all grades, and then what are some of the trends that we may see when we look across the years where we have comparable data, now that we have the three years of data from '23 to '25. Okay, and our last question today is from Co-Chair Story. Thank you. Through the chair, just a request: it would be great to have the needs support and approaching proficient levels in the same bar graphs that we have for the school districts. I'd like to see that. And then I'd also like an answer to the question that came up under the NAEP discussion, or actually the AK STAR discussion: why does it take so long to get the results of our spring assessment? The AK STAR, the one we're really glad we were able to combine with our national test, why does it take so long to get the results, such that districts don't get them until the fall? Ms. Manning, is that a quick answer? Yes, through the chair, Member Story, I can answer that question about the reporting timeline. So, as we've discussed a couple of times, when we moved to the new assessment design with MAP Growth embedded, we embedded the reporting that Karen shared earlier, where you get your MAP score and then the projected proficiency, so that districts could track and see how a student might
perform if they continue at this rate. And then in the spring, the spring MAP Growth data is released right away, just like it is on the other MAP assessments. But for the summative assessment, there are hand-scored items for writing, and there's a number of data analysis components that the vendor, NWEA, does with the data. Some of those things are aligned with federal reporting requirements. We have worked really hard with NWEA to shrink that as much as possible, and right now we're able to release that data at the end of July. Prior to AK STAR, we weren't able to release state results until later in October, and now we have them at the beginning of September. So we have been working to shrink that as much as possible, but the summative data does have certain data analysis requirements that take a little bit longer to get out. Districts do have that projected proficiency throughout, though they do get a proficiency, just a rank; it doesn't give them cut scores and individual student data to the same level. Yeah. Thanks for your patience today, everyone. Thank you for being here, our friends from the department. I'm sorry things just always go over time; we have a lot of questions on this committee, but we're really grateful for your time. We may come back to cut scores another time to try to understand how those are set. This concludes our meeting for today. Our next meeting is on Monday, February 2nd, 2026 at 8 a.m. That meeting will feature the introductory hearings of HB 261 and SB 6 and will only include invited testimony. Seeing no further business before the committee, we are adjourned.