Release Event of The Nation’s Report Card: 2013 Math and Reading, Grade 12

Good morning, and welcome to
Dunbar High School in Washington, DC. We’re here for today’s event
to release the results of The Nation’s Report Card: 2013
Mathematics and Reading, Grade 12. Thank you for coming and
thanks to those of you who have tuned in to the webcast. I’m
Cornelia Orr, the executive director of the National
Assessment Governing Board, and moderator for today’s event. The
National Assessment Governing Board is a nonpartisan
organization created by Congress to set policy for the National
Assessment of Educational Progress, or NAEP, also known as
The Nation’s Report Card. The Governing Board is committed to
making NAEP an accessible, useful resource for you. The
report card results we’ll reveal today show how
twelfth-graders nationwide and in 13 pilot states performed on
NAEP in reading and mathematics, subjects that are critical to
students’ success in their pursuits following high school.
We will examine recent and long-term twelfth-grade
achievement, how students in the 13 pilot states compared to the
nation as a whole, and what level of knowledge and skill
12th graders have. I would like to thank Dunbar High
School, a school with a rich history of student academic achievement,
for hosting us in this beautiful space. And now I would
like to invite the principal, Stephen Jackson, and Jonathan
Johnson, a senior here at Dunbar, to provide our official
welcome and share some brief remarks. Gentlemen? Thank you. Welcome to the new
state-of-the-art, historic Paul
Laurence Dunbar High School, where leaders are cultivated and
dreams are born. In our 140-year history, Dunbar has always
emphasized the importance of education as the essential
requirement for becoming a productive member of our
society. Greatness is in our past and is part of our legacy
at the historic Paul Laurence Dunbar High School. Giants such
as Dr. Charles Drew, the Honorable Eleanor Holmes Norton,
and Mayor Vincent Gray are all alumni of Dunbar.
Since its founding in 1870, Dunbar has had many great
educators. Our first principal, Mary Jane Patterson, was the first
African-American woman to receive a college degree;
she received it from Oberlin in 1863. The fourth African-American woman to
receive a Ph.D., Anna J. Cooper, was our principal from 1901
until 1905. The other three African-American women who
received their Ph.D.s were also graduates of Dunbar. We also had
the famous Carter G. Woodson, who is the father of Black history.
He taught here for many years. In fact, he founded Black
History Week at Dunbar. In addition, we have the great
Charles Hamilton Houston, an attorney who became the dean of
students and the dean of the law school at Howard University, and
he is responsible for teaching the great Thurgood Marshall. In
addition, we have the second principal of Dunbar, Richard
T. Greener. Richard T. Greener was actually the first African
American to graduate from Harvard College (which became
Harvard University) in 1870. We also had the famous Edward
Brooke, the first black senator since Reconstruction. He also
graduated from Dunbar High School. Earlier this year, the
US Department of Education announced the launch of a new
initiative, the Principal Ambassador Fellowship. In
unveiling this program at the National Association of
Secondary School Principals conference, Secretary Arne
Duncan noted that after department
staff spent the day shadowing principals across the DC
area, the participants highlighted the lack of
principals’ voices in the dialogue surrounding education
policy. To further expand Secretary Duncan’s vision to
include more principals, he, on behalf of the US Department of
Education, put together a panel, a standing committee
of principals, for the National
Assessment of Educational Progress, or NAEP, also known as
the Nation’s Report Card. I would like to thank Dr. Peggy
Carr and her team for allowing me to be a part of this
prestigious panel of principals that represent schools from
around the country to provide expert counsel to inform the
future of NAEP. And with that said, I want to welcome you all
again to the great, historic Dunbar, and at this particular
point what I would like to do, is I would like to introduce you
to one of our seniors, Jonathan Johnson who will say a few words
of inspiration to you. Thank you. [Applause] Good morning, everyone. Welcome to the,
sorry, welcome to the historic
Paul Laurence Dunbar Senior High School. Think back to a time
that you were forced to read Greek mythology by your excited
English teacher. Each Greek god of excellence you skimmed
through equated to an admiration that you remembered forever.
Picture Oedipus standing at the gates of Thebes, or
Cupid flying too high. Or, more appropriate for NAEP,
let’s use Odysseus. From the time of his childhood, Odysseus
had a dynamic mother and a fearless father and many wise
teachers that prepared him for battle and equipped his very
bloodline with the tools he needed to be a conqueror. My
name is Jonathan Johnson, I’m a senior here at the historic
Paul Laurence Dunbar Senior High School. And out of a random
sample of test takers for NAEP I could easily be chosen; but
would I be ready? Today the students of America should
have a vast vocabulary and the stamina to compete in a well of
knowledge from which to draw rich water. The preparation for
any test — not just our nation’s scorecard, should start
in primary schools. There are three elements that will create
a generation of scholars that America is searching for. First,
the motivation by which students learn should be
intrinsic. Second, the middle school years, where much is
lost to hormones and prepubescent discovery, need to
be filled with disciplined learning and the discovery of power
in more malleable minds. And thirdly, our high schools need
to become the institutions that the founders intended: hubs of
learning and inspiration which are disciplined and cared for.
Today our panelists will ponder these questions. Are students
making progress? How are students in pilot states
performing? What level of knowledge and skills do students
have? But I ask first: when was the last time you spent a
day in the life of a teenager? They say times have
changed and people change, but the role and the
definition of a student should never change. Thank you. Wow, isn’t it an honor to be in a
place with such a legacy? Thank
you so much, both Principal Jackson and Jonathan Johnson, for
those remarks, well thought out and well delivered. Today we
also have a panel of experts who will give their thoughts and
reactions to the report card. I will briefly mention their names
and then each will bring their remarks; I will introduce
them more formally at that time. The panel includes John Easton,
the director of the Institute of Education Sciences and acting
commissioner of the National Center for Education Statistics.
Next we will hear from Dale Nowlin, a 12th grade mathematics
teacher in Columbus, Indiana in the Bartholomew Consolidated
School Corporation. And also a National Assessment Governing
Board member. Following that we will hear from Susan Pimentel,
education consultant and curriculum specialist in the
English language arts and vice chair of the National Assessment
Governing Board. Following Sue’s remarks we’ll have a question
and answer session during which you will be able to ask
questions about the report card. You can see there are mics —
there is a mic positioned right here. For the webcast attendees,
you can submit a question for the panelists at any time during
the webcast. Simply use the webcast screen to type your
question in the chat area at the right of your screen and then
click send to get the question to us. If you need technical
assistance with the webcast there is also a help tab at the
right of your screen that you can use. We encourage you to
join in an ongoing conversation that will take place after —
during and after this session using #NAEP. Now it’s my
pleasure to introduce first speaker — John Easton is the
director of the Institute of Education Sciences and acting
Commissioner of the National Center for Education Statistics.
He came to IES in 2009 from the Consortium on Chicago School
Research at the University of Chicago, where he most recently
served as its executive director. As IES director, John
leads four national research centers and a staff of 200 that
oversees research and grant disbursement throughout the
United States. John also served from 2003 until 2007 as a member
of the National Assessment Governing Board and now he is
also an ex officio member of that Governing Board. He is the
author of numerous reports and articles and two books about
Chicago’s schools — Charting Chicago School Reform,
Democratic Localism as a Lever for Change; and Organizing
Schools for Improvement — Lessons from Chicago. John?
[Applause] Thank you very much,
Cornelia. It’s a great
pleasure to be here today to present the results of the 2013
12th grade mathematics and reading assessments from the
National Assessment of Educational Progress, The
Nation’s Report Card. We released results for
students in grades four and eight last November. The 12th
grade results I’m presenting today include both national
results and the results for 13 states that volunteered to
participate in the 2013 state pilot. Eleven of those states also
participated in 2009, the first year of the pilot state
assessment. The pilot is examining the
feasibility of conducting the NAEP 12th grade assessments at
the state level for every state, as we do in grades 4 and 8.
These assessments were administered about a year ago,
early in 2013, to 92,000 12th graders. We have national results
that we’re going to present both for public and private
school students and these results are based on
representative samples for the nation as a whole, not just the
13 pilot states. We present student performance on NAEP in
two different ways — average scale scores and the percentage
of students scoring in the different NAEP achievement
levels. These achievement levels were developed by the National
Assessment Governing Board and the Board set standards for what
students should know and be able to do. For each grade and each
subject the governing board establishes standards for basic,
proficient, and advanced performance levels. Ultimately,
our goal is to have all students performing at or above the
proficient level. When I talk today about differences in
scores and make comparisons, I’m only pointing out the
score differences that are statistically significant. For
the most part we will be comparing student performance in
the 2013 administration with scores from the prior
administration in 2009 and with the earliest assessment: 2005 for
mathematics and 1992 for reading. Our pilot 12th grade
program began in 2009 with the 11 states you can see on the map
shown in gray. In 2013, 2 additional states — Michigan
and Tennessee — also participated. The next slide is
important to us and shows the participation rates over time at
the 12th grade. At the most recent administration — 90% of
the schools in the initial sample agreed to participate in
the NAEP assessment; this is the highest we’ve ever had over this
time period. We require a participation rate of at least
70% for us to report results and we have easily achieved that
rate since 1998. The next slide shows the percentage of students
within those schools who agreed to participate. You can see that
we hit a low of 67% in 2005. But at the most recent administration
in 2013 we were pleased to see an 84% student participation
rate. So what we’re really interested in is the overall
participation rate, which is obtained by multiplying the
school and student rates. So, for example, for the most recent
administration in 2013, we had an overall participation rate of
75% — and that’s actually 84% of the 90%. So we are quite
pleased to have reached that. I want to point out some
important background, or context, about the
change in demographics of the 12th grade population over the
years that we’ve administered 12th grade NAEP; this slide
provides an overview of those changes. Since 1992, the
percentages of students in some of the major groups have
shifted. For example, the percentage of 12th grade
students that are Hispanic has increased from 7% to 20%, while
the percentage who are white has decreased from 74 to 58%. We
don’t show the percent for black students, but it has remained
relatively constant at around 15%. The percentage of
twelfth-graders identified as students with a disability has
increased from 5% to 11%, while the percentage identified as
English Language Learners has increased from 2% to 3%. At the
same time, we’re really pleased to see an improvement in
the inclusion of students who are taking the assessments. The
exclusion rate of English Language Learners and
students with special needs decreased from 5% to 2%. Another
change since 1992 is the size of the NAEP 12th grade sample,
which has increased from 9,900 to 47,200. And this larger sample
size increases greatly the precision of our results. The
12th grade population has also changed because more students
are graduating from high school now. The average freshman
graduation rate rose from 74% in 1992 to 81% in 2012. What this
means is that some lower performing students who in the
past might have left school before reaching the 12th grade
are persisting with their high school education. We don’t have
1992 graduation rates broken down by race and ethnicity, but
for the 2012 — but for 2012 the graduation rate ranged from 85%
for white students to 68% for black students. When we make
comparisons in 12th grade reading in 1992 and 2013 we must
remember the extent to which the 12th grade population itself has
shifted. Now we have an overview of the results of the reading
and math assessments over time. As indicated by the up arrow,
mathematics scores were higher than in 2005, the earliest
assessment to which results from the current assessment can be
compared. But, the results were not significantly different from
2009 as indicated by the sideways arrow. Scores in 2013
for 12th grade reading were four points lower than in the
assessment — the first assessment year in 1992 but not
significantly different from 2009. So to take a closer look
at some of these results we will begin with mathematics. All of
our NAEP assessments are based on content specifications and
assessment frameworks developed by the National Assessment
Governing Board. The most current mathematics framework
was developed for use in 2009, incorporating changes made to
the previous 2005 framework for mathematics. Scores for the NAEP
12th grade mathematics assessment are reported on a 0
to 300 scale. In 2005, the average score for the
mathematics assessment was 150. In 2013, the average score was
153; higher than in 2005, but unchanged from 2009. The score
from 2005 has an asterisk, indicating that it is
significantly different from the score in 2013. In this case the score for
2013 was three points higher. The triangle shown
for 2009 indicates that that year’s score was significantly
different from the score of the previous assessment. Again, in
this case, a three point gain. This figure shows the changes
that have occurred in the 12th grade average mathematics scores
for the six racial/ethnic groups identified by NAEP as well as
the changes in scores for male and female students. Scores are
shown on the left in the green half of the chart, while
score changes are shown in yellow on the right. Since 2009,
scores for the six racial ethnic groups did not change
significantly as shown by the diamonds. Since 2005, scores
have increased for all groups except for American Indian/
Alaska Native students, as indicated by the arrows
in the column on the far right. The numbers to the right of the
arrow show the size of the increase ranging from 4 points
for white students to 13 points for students of 2 or more races.
In 2013, the white-black score gap was 30 points while the
white-Hispanic gap was 20 points. The male-female gap was
three points with males outperforming females. None of
these gaps changed significantly compared to either 2009 or 2005.
In 2013 the percentage of 12th graders at or above proficient
was 26%, an increase over the 23% for 2005. This bar chart shows
the percentages of 12th graders below basic, at basic, at
proficient, and at advanced in 2005, 2009, and 2013.
Percentages that were different from 2013 are indicated by
asterisks. For example, the percentage at proficient was
higher in 2013 than in 2009, while the percentage below basic was
lower. Now we’re going to take a look at the relationship
between student performance and students’ course-taking in
mathematics, beginning with students who scored below the
25th percentile. So this is the lowest quarter of students by
score. As the bar chart shows, 58% of students who scored below
the 25th percentile on the assessment reported that
Algebra II/trigonometry was the highest level math course
they took. Few of these students reported taking math courses
higher than Algebra II or trigonometry. Only 9% took
precalculus and 3% took calculus. Now in contrast,
students in the top quartile, that is above the 75th
percentile — only 15% of them took Algebra II/trigonometry as
their highest level mathematics course. 34% of the higher
achieving students reported taking precalculus and a full
half — 50% — reported taking calculus as their highest level
math course. These demographic indicators for the 13 states
that volunteered for the state-level pilot assessment help us
understand some of the background and the differences
in the student populations in these states. The participating
states varied greatly in population. On the one hand we
have Florida, one of the largest states in the nation, and on the
other hand South Dakota, which is one of the smallest. They
also vary in the makeup of their populations. These figures show
the range in 12th grade population percentages for the
13 states along with the national percentage. The
percentage of black students ranged from 1% in Idaho to 23% in
Tennessee, compared to the national average of 15%. Hispanic
students made up 1% of the student population in West
Virginia and 27% in Florida, compared to the national average
of 21%. There is also a wide range in the
percentage of students who attend suburban schools, from a
percentage that approaches zero in South Dakota to as much as
77% in New Jersey, compared to a national average of 35%. The
range in the percentage of students reporting having a parent who
graduated from college is narrower, from 41% in Arkansas
to 58% in Iowa, New Hampshire, and South Dakota, compared to a
national average of 47%. On this map we are looking at the state
performance of the volunteer states. In 2013 the percentage
of 12th grade students at or above proficient in mathematics was
higher than the national average in South Dakota, New Jersey,
Connecticut, Massachusetts, and New Hampshire. The percentages
were comparable to the nation in Idaho, Iowa, Illinois, and
Michigan, and lower than the national average in Arkansas,
Tennessee, West Virginia, and Florida. So we’re able to show
changes in scores for the 11 states that
participated in both 2009 and 2013. Scores increased in 2013
for the four states shown at the top of this figure: Arkansas,
Connecticut, West Virginia, and Idaho. In New Jersey, Florida,
New Hampshire, South Dakota, Illinois, Iowa and
Massachusetts there was no significant change in scores. If
we look at the state gains according to different subgroups
within the overall student population, we see different
patterns across these four states. In Arkansas, scores
increased for male students but not for females. Average scores
increased for white, black, and Hispanic students. In
Connecticut the scores increased for male students but not for
female students. Mathematics scores did not change
significantly for any of the three racial/ethnic
groups. For Idaho, scores did not change significantly for
either male or female students. They did increase for white
students, but not for Hispanic students. For black students the
sample size was not large enough for us to report results. In
West Virginia we again saw an increase for male but not female
students as well as increases for white and black students.
Now we’ll turn to reading. The NAEP scale is a little different
for reading: it spans from 0 to 500 rather than the 0 to 300 we
saw in math. In 2013, the average reading score was 288,
unchanged from 2009. But lower than the score of 292 recorded
in 1992, the first year of the assessment. The triangles for
assessments in the intervening years indicate increases over
the prior assessment when they point up and they indicate
decreases when they point down. This table shows the changes
that occurred in the 12th grade average reading scores for the
six racial/ethnic groups identified by NAEP as well as
the changes in scores for male and female students. The
diamonds indicate that there were no changes in scores since
2009. Since 1992, scores declined for black students,
male students, and female students. In 2013 the
white/black score gap was 30 points, five points wider
than in 1992. The white/Hispanic gap was 22 points
in 2013 and did not show a change from 2009 or 1992. The
gender gap in reading favored female students — the reverse
of the gender gap for mathematics. Female students
scored 10 points higher than males in 2013 with no change
either from 1992 or 2009. This bar chart shows the percentage
of 12th graders at the three achievement levels for the 12th
grade assessment since 1992. In 2013, 32% of students were at
proficient and 5% were at advanced. The percentage at
proficient in 2013 was lower than in 1992, while the percentage
at advanced was higher. The combined percentage of students
at or above proficient was lower than in 1992. The NAEP
assessment asks a number of what we call background questions.
One asked students how often their class discussed
interpretations of what they read. As this bar chart shows,
the more frequently the students discussed their reading in
class, the higher the scores were likely to be. In 2013 the
percentage of 12th graders at or above proficient in reading was
higher than the nation in seven states — Idaho, South Dakota,
Iowa, New Jersey, Connecticut, Massachusetts, and New
Hampshire. The percentage was comparable to the national
average in Illinois, Michigan and Florida, and lower than the
nation in three states — Arkansas, Tennessee, and West
Virginia. Eleven states participated in the first 12th grade pilot
program in 2009. Scores increased in 2013 for the two
states shown at the top of the figure, Connecticut and
Arkansas; for the remaining states, the score changes were
not significant. In 2013, four of the participating states
recorded increases as compared to 2009 for at least one of the
student groups shown here — male and female and white,
black, and Hispanic. Average scores for Arkansas and
Connecticut improved between 2009 and 2013. While the average
score for Idaho and West Virginia did not. In Arkansas,
male students and white students had higher scores and in
Connecticut there were increases for male, white, and black
students. In Idaho, scores increased for white and Hispanic
students and in West Virginia scores increased for male
students only. This final map shows a summary of state results
for reading and math. As we have seen, two states, Arkansas
and Connecticut, had score increases in both subjects.
Idaho and West Virginia had increases for mathematics only.
The two bar charts at the bottom show
the percentage of 12th graders at or above proficient in 2013
for the 13 participating states in mathematics and reading. In
mathematics the percentages ranged from a high of 34% in
Massachusetts to 14% in West Virginia. In reading they ranged
from 50% in Connecticut to 28% in West Virginia. There is much
more information on student performance both nationally and
for the 13 states that participated at the state level
pilot on our interactive website. In addition you can
access released questions through the NAEP question center and
run your own analyses using the NAEP data explorer, our online
data analysis tool. Now I would like to take a few minutes and
briefly demonstrate some of the new features we added to the
online report card for the 12th grade release. Additional data
from the 12th grade mathematics and reading assessments are
available online. I’m going to walk through a brief demo of the
interactive report, which allows users to explore areas of
interest in greater detail. Because the websites for grades
4 and 8 have the same interactive features, today I
will focus on the enhancements made to the 12th
grade report. As you can see, the homepage provides a visual
overview of the results. Each section of the page highlights a
different aspect of the findings. If you scroll down, you will
find the nation’s average score changes in reading and
mathematics; the percentage of students at or above proficient
in both subjects for the nation and select student groups; the
score gains and percentages of students at or above proficient
in the 13 volunteer pilot states; and information highlighting
students’ experiences in the classroom. You can dive directly
into the report and learn more about the data presented by
clicking on the blue links located at the bottom right hand
of the screen. So now let’s take a look at the new features on
the report by using the navigation bar on the left. On
the “changes by student group” page, the table at the
top shows score changes by student group since the last
assessment and the base year. The drop-down menus above
the chart allow you to choose a subject and student group of
your choice. Click on the labels in the left column to see the
results by percentile. If we select mathematics and parental
education level, for example, we see there has been a six-point
score increase from 2005 to 2013 for students who reported that
neither parent finished high school. When we click on the
group label, we can see that the average score gains occurred in
the 10th and 25th percentiles, with seven- and eight-point
gains, respectively. Another new feature of the report is the
ability to review results by race/ethnicity in the context of
other student characteristics. For example, we can now view
race ethnicity results by parental education level,
gender, and school location. We can isolate and compare any
category across each racial ethnic group by clicking the
appropriate bar. For instance, the chart shows the scores for
each racial ethnic group by parental education level. The
key at the bottom tells us that the black bar denotes the scores
of students who reported that their parents did not finish
high school. By clicking any of the black bars you can see the
scores across all racial/ethnic categories. By re-clicking the
bar, the graph resets and other factors can be explored.
Scroll down the page to see how racial/ethnic groups are
making gains over time by the same characteristics. You can
click on each of those trend lines for information on
specific scores. The report also highlights a number of factors
related to mathematics and reading education under focused
learning contexts. Here, you can see scores by subject and
learning-context variables including, among others, highest
mathematics course and the question “mathematics is my
favorite subject.” For example, the top graph for mathematics
shows the scores of students by the highest level of mathematics
course they reported taking. You can see that students who reported
taking calculus scored higher than those in lower level
courses. You can see the exact questions and answer options
students encountered on the questionnaire by clicking the
orange link to the right. Scroll down the page to see student
responses to these survey questions by gender, race
ethnicity, and parental education level. For example, by
selecting highest level mathematics course by race/
ethnicity, we can see that the percentage of Asian/Pacific
Islander students who reported taking calculus, that’s the
dark green on the right, is higher than for other groups and
more than double that of students overall. Click on any
bar to see exact percentages. As in previous releases, for the
2013 results we offer several ways to explore state results.
Users can view maps and charts showing state achievement level
results, score gains, and achievement gaps by selected
racial ethnic groups. For example, the map on the state
achievement gaps page shows whether the gap between white
and black students, or white and Hispanic students, has
narrowed, widened, or stayed the same in the volunteer pilot
states. Scroll down the page to see how these gaps and
respective scores compare across participating states and the
nation’s public schools. Here you can see that the nation’s
score gap between white and black students in mathematics,
that’s that black bar at the top — is 29 points. If you
look just below to the green bar representing West Virginia,
you’ll notice that the 12 point white-black gap is significantly
narrower than the nation’s. In fact, West Virginia has the
smallest white-black score gap of all the participating
states. If you focus on average scores, you can see that black
students in West Virginia scored on par with the nation’s, while
white students scored significantly lower. As always,
we encourage you to further explore this site. This wraps up
my part of the presentation today and I would like to close
by thanking the schools and students who participated in
this assessment and who helped make NAEP possible. Thank you.
[Applause] Thank you, John. That
was a lot of data. I’m sure you
have a perspective on what you saw in the data he
presented, and he showed us how to dig a
little deeper. And I hope that some questions are running
around in your mind that you can ask at the end of our
presentation today. We’re first going to hear the
perspectives of a mathematics educator and an English language
arts educator about what these results mean to them. Our next
speaker is Dale Nowlin, a 12th grade mathematics teacher and
also a member of the National Assessment Governing Board. Dale
is a 37 year teaching veteran who has taught in his home state
of Indiana since 1979, long before some of you were born. He
has taught at Columbus North High School since 1997 and served
as the math department chair for North High School and North Side
Middle School of the Bartholomew Consolidated School Corporation.
Dale is also an adjunct faculty member at Indiana University in
Bloomington, where he teaches calculus. Dale is past president
of the Indiana Council of Teachers of Mathematics and has
won numerous awards and grants, including the Presidential Award
for Excellence in Mathematics and Science Teaching, and also
the Lilly Endowment Teacher Creativity grant. We are pleased
he could be with us here today and we look forward to your
comments. [Applause] Thank you. It’s really a pleasure to be
here in this fabulous building. As a high school teacher I
always enjoy seeing different school buildings and this makes
me a little envious. I have 5 minutes or so to share my
thoughts and reactions, really looking at the data through the
lens of the classroom teacher. This is a rare treat for me
because as a high school teacher I rarely get to speak for five
or six uninterrupted minutes. You saw the big picture
in mathematics: basically, compared to 2005
there were some gains made, but since 2009 it’s been kinda
flat. And you also saw the breakdown by ethnicity. The
first thing that I would like to mention, and it
doesn’t relate to the slide you’re looking at, is that
if you look at the breakdown by achievement
levels, most of the ethnic groups made numerical gains in
the achievement levels, but they weren’t significant, looking
from 2009 to 2013. There was one group that was significant and
it was the Hispanics — the percent of Hispanics who scored
at or above basic had a significant gain from 2009. That
was the only one of those that was significant. John showed you
data on achievement gaps and this is another way to look at
that data. If you look at the graph, all the groups — we are looking at whites, blacks, and Hispanics; no, that’s just whites — they’ve made gains. But the size of the gap between the two groups has remained the same. And if you look at the gap
between whites and Hispanics‚ maybe we don’t
have that one, but you’ll see a similar thing. Oh,
there’s all three. All three groups have made some gains
since 2005, but the size of the gaps has really remained
unchanged. And that’s really important because what you saw
in Dr. Easton’s slides was the changing demographics. Just to
relate that to my local level, when I came to North high school
in 1997, we had a handful of Hispanic students. Few enough
that you could count them on your fingers. And today we are
pushing close to 200 Hispanic students. So if these gaps
persist, and our scores keep going up, our overall average
could still go down because our demographics are changing.
It’s really really important that we look at ways to close
these gaps. And if you think about the changes — I mean,
I’m in a small city in southern Indiana and if the
demographics are changing that much there, that‚ that really
says that they are changing all across the country. The next gap
I want to look at is the male-female gap. The interesting thing about this is that it’s not very large, but it has been really consistent at about a three-point gap. And the other
gaps you might attribute to things like they might live in
inner-city versus suburban or difference in parent educational
background or socioeconomic differences. But, if you’re
looking at male-female, they’re coming from the same
places and they have basically the same parent background. So,
we don’t have other things we can attribute those differences
to. When I see those, I wonder if it goes back to the myth that
boys are better at doing math and we need to do all we can to
debunk the myth. I saw that actually recently at my school: we had 4 students who were in the national finals for Moody’s Mega Math Challenge in New York City. And I’m telling you that for one of two reasons. One is that I get to brag about the students, because they went to New York City for the national finals; we were the only one of the 6 finalist schools that was not a magnet school for math or science. But the other is that our team was all 4 guys. We didn’t send any girls to the finals, which had me a little upset. And when I actually
watched the finals on a webcast of the six schools there were 23
teams and only 5 of them were girls. So at all levels we see
that we really need to encourage girls to get into higher-level
math classes and higher-level science classes as much as we
can. When the students take the NAEP assessment, obviously when
they do the math assessment they are asked a whole lot of
questions about math, but they’re also asked other
questions. And you saw those on some of the slides that Dr.
Easton was showing us. They’re called contextual questions and
on the National Assessment Governing Board we have spent
time in the past couple of years rethinking those and looking at
those and how much information comes from those kinds of
questions. And you saw on the data explorer that you can dig
in yourself and look at all kinds of things relating to
those contextual questions. For me they provide some interesting
insight. One of the contextual questions is about parental
level of education. It really looks at the student background
and it also is a way to look at socioeconomic level. And what I
noticed is that there is obviously a big difference. It makes a big difference to a student, and here’s where I see that in my school. We have a lot of students who come from homes where they get lots of support from the parents. Their parents either help them with homework, or by the time you’re in high school maybe they can’t help, but at least they make sure that the
homework gets done. They take them on trips every chance they
get and they have educational parts to those trips. They
support them in lots and lots of ways. But we also have students
whose parents can’t afford to take them on trips or their
parents are working a couple of jobs, a couple of
minimum-wage jobs just to survive. In fact, we have many
students where the students are working 20 to 40 hours a week.
We have parents who might not have had such a good experience themselves with school, so they might not necessarily value the education, or they’re intimidated by the schools, so they really can’t support their son or daughter in that way. So,
the challenge for us as educators is we need to set up
systems that support those students that don’t get the
support at home. Often we set up systems that work really really
well if the parents are involved in working with their son or
daughter. And the challenge is how do you make it work when
they don’t have all that support at home. The next contextual
question I want to look at is — highest math course taken. We
saw that in a couple of slides that Dr. Easton showed us. And
there’s a definite difference between — this one is looking
at the lowest 25% — those that scored at or below the 25th
percentile versus those that scored at or above the 75th
percentile. Obviously, if you took calculus you tended to end up with a higher score. One conclusion from that could be that we should put all the students into calculus. But that would definitely be a bad idea.
Because we don’t want to put students into courses that
they’re not ready for. This actually reminds me of several
years ago, we had‚ as professional development‚ we
had a futurist come and speak to our faculty. And one of his
comments to us was that we should have 30% of our students
taking calculus. And of course afterwards all the math teachers came to me and said, that was certainly ridiculous, wasn’t it? At the time we had about 5% of our students taking calculus — about one or two calculus classes a year. And I said, well, that seems like a stretch, but I think we can probably get more students than we have. And we worked at putting things into place to get more students. We opened up 8th
grade algebra to more students, we started a seventh-grade
algebra program, we rethought how we teach algebra to
students. We offered students the opportunity to do a block
algebra 2/geometry class so they could do 2 math courses in a
year. And next year — I don’t know the exact number, but 31% of our students will have had calculus before they graduate next year. What seemed unrealistic when we first heard it now seems like normalcy to us. I think the lesson there is that students really are capable of a lot more than we think they are if we set things in place to help them achieve that. I should add: I am
a calculus teacher so I’m a little biased but aside from the
31% taking calculus, we also have three sections of students
taking AP statistics and three sections taking college credit
finite math and one section of students taking multiple
variable calculus. So we have come a long way from where we
were not too many years ago when we heard the futurist come speak
to us. Another contextual variable is student engagement.
And I suppose it shouldn’t surprise us that 36% of the
students in the lower quartile found that their math class was
never or hardly ever engaging. And at the same time that is a
challenge because if we want those students to do better,
then we have to do something to make sure that they are engaged
in math class. In our school, the way we approach that is with our theme: who’s doing the math? If the students are spending their class time watching the teacher do the math at the board, then they’re not engaged and they’re not learning math; they’re just watching someone else do math. They come away thinking that the teacher is really good at math. So we’ve continued to make efforts to make sure that during class time it is the students doing the math and that they’re engaged in doing it. Finally, kind of a related
question, when it comes to the question of whether students find their math class to be challenging, and I thought this was interesting, there’s not much difference between the 25th percentile and the 75th percentile here. And from a
practical standpoint you would think that if students are
performing poorly on the test, they should find everything they
do in their math class to be challenging. So I don’t know if
this is what this data is showing us, but it reminds me that recently a NAEP curriculum study came out, and the results were that in math classes, the title of the course often didn’t necessarily match what was being taught. And one of the pitfalls that we always have to be careful of, I
guess, is if we have students who are struggling, which would
be the students in the lower quartile, we can’t water down
the curriculum for those students. We have to work really, really hard at finding ways that they’re doing challenging math and that they stay engaged in it. And
that’s not an easy task. Finally, I’ll just put a
little local perspective on it: I live in a community in the
Midwest that is really heavily into manufacturing. We make diesel engines where I live. And I’ve had the chance to tour
several different factories in town. And one thing that really
amazes me is that — actually when I was in college I worked
in a steel mill — and the difference between my experience
working in a factory and when I see the factories now, the
mathematics that typical line workers need in the factories
I’ve been in is comparable to the mathematics that several
years ago was required to go to college. So when we’re looking at math achievement, we really can’t separate the kids into “these kids aren’t going to college and these kids are.” If we’re going to prepare kids for the workforce, we really need to look at preparing all students. Thank you. [Applause] Thank you, Dale. I think he
makes me want to be in his math
class. Our next speaker today is Susan Pimentel, an education
consultant and curriculum specialist and also vice chair
of the National Assessment Governing Board. Sue is a
founding partner of the nonprofit, Student Achievement
Partners, and a standards and curriculum specialist. For close
to three decades her work has focused on helping communities, districts and states work together to advance education reform and champion proven methods for increasing academic rigor. Her local and state level curriculum work has included advising such school systems as a local one, Prince George’s County, Maryland. On the national level Susan was a lead writer of the Common Core State Standards for English language arts and literacy, as well as a chief architect of the
American Diploma Project benchmarks, which were designed
to close the gap between high school demands and postsecondary
expectations. We look forward to your insights, Sue. Thank you and good morning. I just
want to start my remarks by saying that
I think that the 12th grade NAEP is so critical because it gives
us a glimpse into how well prepared — academically
prepared — our students are to go on to college and careers. So
I just want to start with a little recap of the results that John gave us, which really show that since 2009 not much has changed. Which maybe isn’t bad news, but it isn’t really good news. But more troubling are the next results, which show that from 1992 the trend is down. And that’s really concerning, because here we are trying to prepare our students for college and careers, and things are dropping. It’s even more concerning when we look at the results for certain students, which show some unsettling trends, we’ll call them. So if we could have the next slide, go to the
next slide. Okay, so African-American performance —
we can see this here, right? The average reading score was 5 points lower in 2013 than in 1992. The gap has gotten wider, even as we give so much attention to closing the gap. And then look at this one — only 4% of the students scoring at or above the 75th percentile are African-American, and yet African-American students make up 14% of the population. So, this is
troubling. The next slide will show us something about ELL
students. And here, what I want you to know is that the average score was 10 points lower than in 2005, and look at the whopping gap there is between ELL students and non-ELL students: 53 points.
So, what do all of these results mean? Because we want to make
sense of them. So I want to read to you — and you can see it here, just read along with me — a quote from an ELL expert, Lily Wong Fillmore, whom maybe several of you know. She says, “Language minority students are” — and I just want to pause here. Language minority students are students who speak English, so they have not come from other countries with other languages, but they are scoring at about where English language learners score. That is why the researchers call them language minority students. She says they are “finding themselves increasingly segregated, whether by schools or by classes where the materials are pitched at a much lower level than materials meant for mainstream students.” English learners, Lily says, are especially provided adapted text: “Materials that are so greatly simplified that they provide virtually no exposure to the forms and structures of the language they should be learning. There is a lot of attention and energy focused on turning ELLs into English speakers and not nearly enough on educating them. What ELLs and language minorities need are authentic and age-appropriate texts which they work on with appropriate instructional support from teachers who know how to support language development.” Now this is about English language
learners, but these practices also trap a lot of our students.
I believe they drop a lot of our African-American students into a
vicious cycle. And when we look at the contextual variables, you can begin to see it. I want to talk about these because they corroborate what the researchers are telling us may be going on. So the first
one that you see is that reading is enjoyable. Students who strongly disagree with this score 45 points lower. 45 points. Now, you can ask, do students who read well enjoy it, which is like the chicken and the egg, or do they enjoy it and therefore they read better? We don’t know. But then look at the next one; 45 points too: when I read books I learn a lot. Students who strongly disagreed with this statement scored 45 points lower. So now, go back to what Lily Wong
Fillmore said. She told us that a lot of students are reading mush; they’re reading simplified texts that aren’t going to get students where they need to be. So, if you are given mush to read, how are you going to learn much? You’re not going to learn much from it, right? And you won’t see text as something you can learn from. So I think there is a correlation
there. And the contextual variables go on, again, from the students’ own mouths and thoughts. The next slide? So we get into — and John, you mentioned this one too — students being able to discuss reading interpretations in class. Again you see that students who do it rarely or never scored lower. The next one is students being able to explain what they read in class — the students who do it a lot score about 28 points higher than students who do not. So, practice matters here, and practice of the right
stuff matters here. So I sort of come down to the place here
where we sort of think about students being ready for college
and careers — we think about them being able to read
challenging text and to be able to cite evidence from it. Which
is what they’re going to be called on to do in the workplace
and it’s what they’re going to be called on to do in their
college and classes. So I do think that while the results are
troubling and they’re stagnant at best, they are going down and
gaps are getting wider. I think we have something of a way forward here when we put all of the data together. Which is asking 12th grade students — and I might also say asking students from the earliest ages on up, and I think we heard this from our Dunbar student — to talk and write about the challenging text that they read. And that practice matters. And I know that this is all
correlational data, but think back on your own life: there may be a couple of us who, I suppose, could go down the slope like an Olympic champion without ever having skied before. But for most of us, we need to start and then we need to have lots of practice with it. So I
think that there’s a way forward here as we think about
preparing all of the students for college and careers.
Thinking about what students are telling us they are getting practice with and what they’re not, and that they need challenging text so that they can learn from it. And with
that, I will stop. Thank you. [Applause] Thank you, Sue. Now
it’s time for questions and answers. And I know that there are some questions that you may have been formulating in your mind. If you would like to ask a question, I want you to come
over to this microphone on my right. And we’re going to
begin with a question from our web audience first while you get
your thoughts together so we can pace ourselves. But if you’ll
come over here, I’ll come to you second. Valerie, do we have
a question from the web audience? We do, Cornelia. This
first question is for John — how much do earlier scores predict a student’s score in grade 12, and what about graduation rates? Does that provide any indication? Let me first address
the question about correlation. We don’t have longitudinal NAEP data; that is, we don’t follow the same students, so we can’t do NAEP studies to look at the correlations over time. But
typically, research has shown for decades that test scores are
highly correlated from year to year. One year after another
they tend to be very highly correlated and they get less
correlated over time. Now I’m not sure what exactly you meant
about the graduation rate, but I’m going to go off NAEP a
little bit and talk about research that I did in Chicago
and my colleagues continue to do. We find that the strongest predictor of high school graduation is how well kids do in their freshman year of high school. High attendance and good grades are the most important predictors of high school graduation. I would be happy to provide a citation for that.
Thank you, John. Alright, from our audience now — step to the mic
and introduce yourself and then ask your question. Thank you,
I’m Lisa Hansel with the Core Knowledge Foundation. Thank you
for an interesting morning here. Just a quick question, actually, for Susan — my favorite part of the Common Core standards is
actually the call for content rich curriculum. And I really
appreciate everything you said about talking and writing about challenging text. But I think there’s another part of that — I would love for you to talk a little bit about building vocabulary and knowledge across disciplines.
Absolutely. So the Common Core really talks about a content-rich curriculum. In fact, in some cases it even names the founding documents, Shakespeare, and early American lit as content that students should have. The fact is — think back to the 45 points lower for students who said they don’t learn from text. The whole point of being able to learn from text and a
coherent sequence of topics and that you read about coherent
sequences of text around a topic is so critical. We know from the
research that if students don’t have a knowledge base, they are
less likely to be able to understand a text. You can imagine yourself, or imagine me — physics, for example. Hard for me to get. The more knowledge I have about physics, the better able I am to understand challenging text about physics. And vocabulary is critical.
We’ve known about the critical nature of vocabulary for a long time; I want to say the research dates back about 100 years. And the way that you build vocabulary is by reading a lot and talking a lot about topics, about knowledge of the world. That is how you build a vocabulary. So just taking me
back to physics — the more I read in the area of physics and get an understanding of the context of physics, the kind of vocabulary used, and the way the information is presented, the more I understand about physics and the more I am able to read about physics. So you’re absolutely right, and that’s why those 45 points on “I don’t learn from text” are especially troubling. So it is about challenging text
about the things we want students to learn and about
things that students want to learn. I’m glad you raised
that, thanks. Thank you, Sue. We will take another question from
the audience. Please introduce yourself and ask your question.
Good morning. Erica Taylor from the National Education
Association and my question is for John Easton. To what do you
attribute the huge change in sample size from 2009 to 2013 in participants? That’s an easy one: it’s the participation of the volunteer states. Okay, thank you. Alright, another from
the audience, go ahead and step to the mic. I’m sorry, please introduce yourself. I’m Kerry Thornhill and I’m a member of the Dunbar Alumni Federation; we are delighted to have you here. The last question may have related to my concern
about understanding better the methodology because you are the
nation’s report card and yet you have volunteer states, so I want to understand: are you using the data from the pilot states to generalize for the nation, or how are you determining what the nation’s rate is? That’s a very
good question and a number of people have asked a similar
question because it is confusing. We, first of all, take a nationally representative sample, separately and independently. Okay. Then in the 13 volunteer states we take extra samples. And we use the entire nationally representative sample to present the national findings.
Okay. I have a second part to my question. And that is, during
the course of the study, did you learn from any of the
communities what is helping the most in terms of the gains in
both reading and math that we can use to help us advance those areas in our jurisdictions? I can’t give you specific
findings, but I can tell you some of our process. Dr. Peggy
Carr, who is sitting right close to you, meets with the state coordinators before the public release and asks the states about the kinds of things that might be going on. So we saw, for example, in Connecticut, very
nice improvement for black males in reading. We saw in West
Virginia a great improvement for black males in mathematics. So,
we can’t actually give you an answer, but we strongly
recommend that some folks out there do some investigation into this that can be shared more broadly. Okay. Yes, that exceeds the limits of what NAEP is able to do. I suspected so, but I’m sure they learned something that we can use. Yes. Thank you
very much. We’ll take another question from the audience now.
Hello, my name is Devin Daniels, and I am a junior at Dunbar high
school. I wanted to ask more along the lines of — when you
showed that from 2009 to 2013 there was not a reasonable change within reading and math — can you tell us, like, if there is a certain reason for this? Or was it something that you may
have noticed why there wasn’t much of a change during
this‚ sort of, time period‚ in grade 12? I’m afraid that’s
really another question that we don’t know the answer to. We are
asking basically the same — not exactly the same questions, but
the same types of questions, the same format. So nothing is
changing in the test. As I pointed out, there’s been a
slight change in the demographics of the kids who are
taking the test. But we don’t — we are not really able
to answer the questions about why did this happen. That goes
back to the question we just heard, that this assessment
doesn’t really provide much about the causes of changes.
It’s just really descriptive about what is going on. So Sue
Pimentel does work nationally and may have some theories about
your question. Yes, so I do think Dale can probably respond
a little to this too. While we can’t say cause and effect, I
think when we look at what students self-report in terms of the kinds of practice and the kinds of things they are asked to do, and the kinds of math classes they are allowed and have opportunities to take, and you look at the differences in their scores, which are substantial, I think there are places here —
if I were still in a school and a principal of the school or a
teacher in the school, I would be looking at this data and I
would look at not just what is happening, but I would then go back and see what is happening in my own class and what is happening in my own school, and I would have a conversation about — are we asking students to read challenging text? Are we asking them to read a coherent selection of texts so they learn from the text (remember, we had the 45 points)? Are students
being asked to explain and talk about the text they are reading
in class? Because if the students are saying that they
are not and they are scoring a lot lower, I would want to make sure that in my own school something would be changing. So I think the contextual variables give us some hints about what we need to pay attention to as educators. I don’t know, Dale,
if you have anything to add to that? I don’t know if I have
anything different to add, but the contextual variables are the
key there. They don’t tell us what causes things. But if we see that where certain things are in place, those students are getting better scores, then it would make sense to try to put those things in place — to do what we can to put those in place well so they work. Thank you for your question. Alright, one more from
the audience and then we will go back to the web questions. My
name is Todrick Williams and I’m a junior here at Dunbar
senior high school. I have a question about the graph that
you showed about students not liking reading. So I would ask you, how will you be able to, like, make sure that students actually start to like it, and how do you know it’s not based on their motivation? It can be based on motivation, but I think
as educators our job is to help students see the wonders of
reading. And for teachers to bring forth text that they are
passionate about because they are content rich or they are
just beautiful words and well-written. So, yes, I think
that there are a lot of distractions in life for all of us, with technology and the like. But I think that as educators — and you may all have some ideas about this — we need to make sure that students are reading, reading a lot, and enjoying it. I do think that there is a mix between really good texts in class, where a teacher is bringing these texts, assisting the students, and sort of grappling with them together.
Also, making sure that the students are given a volume of reading, where you have choices about what you want to read deeply on or what you’d like to research. But, you know, I do think that there is — when I’ve seen
students — we have seen this with some of the data, that
students like to be challenged. They don’t want just some simple
thing. They want their brains to be respected. If it is
challenging and it feels important, we’re finding around the country that as teachers bring more challenging text and content-rich text, as we heard one of our questioners ask about, the students are really getting into it. So you may have
some ideas, too. Thank you. Alright Valerie, from the
website — and then we’ll go back to an audience question.
Sure, Cornelia. I have a question — we talked a lot
about what teachers and students can do and we talked a lot about
contextual background variables, but let’s talk about what parent
leaders can do. We have a great question from Sheila Holmes,
she’s an NAACP parent leader, as well as a PTSA leader in her
school. And so she wants to know — with the knowledge from this data — what next steps can parent leaders take? Sue and
Dale, would you like to take a stab at this one? Yes, that actually reminds me of not just this recent NAEP release, but also the math curriculum study that was released. I think many of the same things keep coming out. One
thing parents can do is really get an idea what is going on in
their schools and what is going on in the classes. Do all
students have access to high-level classes? And then, not just do they have access to high-level classes, but go deeper than the title of the course and find out whether they are teaching the curriculum as it is meant to be taught. And, I guess as a
personal bias, I would say, to find out what is going on in the
classroom in terms of student engagement. Are the students
spending most of their time watching things be done or are
the students jumping in and actually — whether it is reading
or math — are the students doing it and getting engaged in
the discussion and conversation? So I say yay, yay for everything that Dale just said. And to go underneath that, if I went to a parent meeting now, I might have some of those contextual variables in my mind — whether it is higher-level courses, or how often the students are asked to explain what they are reading. I might even
go in and observe. The other thing is though — I do think
that parents can really help — we just talked about motivation
for reading, that parents can help challenge their kids to
read — read through the summer. We know from the research that
the volume of reading is important and it can be about a
topic that they are learning in school where the parents can
help find some other text that can be interesting for them
around those topics, or it could just be a topic of interest to the child — the teenager himself or herself. So I think
that parents have a role to play here in the volume of reading.
Alright, now a question from our audience again. Good morning. My
name is Mary Ann Wilmer. First of all, I want to congratulate NAEP and NCES on what has happened since the onset of NAEP and how far it has developed from the beginning to the present time. And I think we’re at a stage of the game
where we can collect a lot of the information that people here
in the audience have asked for regarding what has made a
difference. And one way to do this, perhaps, is to offer graduate students looking for topics for theses and dissertations the chance to collect that research data. Perhaps NAEP, or NAGB, or the Department of Education can do that — offer some research grants to teachers or people in graduate programs.
Thank you, Mary Ann. Alright, Valerie? Cornelia, this will be
our last question. And I think a lot of people might be wondering
this — we have a question from our webcast audience — this is
for you, John. If our state was not included in 2013, is there
an equation or is there some way that we can find an equivalent
measure on math and reading levels? I want to repeat the
question because I’m not sure I get it, so if a state did not
participate in 2013 12th grade, is there a way they can estimate
their performance? Actually, at this point there is
not. We have done specialized studies in the past that relate
state results to NAEP scores in elementary grades. I’m not
aware of such a study with the 12th grade. Thank you, John. I want to thank this
audience. You have been a
wonderful audience and very engaged in listening to the
presentation today. And thank you to all of the participants
on the webcast. I hope that today's event marks for you a
beginning of an engagement with this Report Card and the
additional data that are available on the website. We
know that there are many more questions and ideas to be
considered after hearing the results of this report. We hope
that after today’s event you will continue the conversation
on Twitter at #NAEP. Please visit the NAEP report card website
that features the report card data from today and also on the
event page you can actually find links to the comments from the
panelists and other materials that were presented today. Next
Wednesday, actually, the Governing Board will release
findings about today’s results about 12th graders on NAEP and
their academic preparedness for college. This might answer the
last questioner's question about how a state can know
how well it is doing, because there are some links in
the research that will be presented to the national
assessments. The difficulty is that NAEP is the only assessment
that tests all 12th graders; other assessments test samples
of students even in 11th or 12th grade. You can find out about
upcoming events by following the Governing Board and NAEP on
Facebook and Twitter. Journalists with additional
questions on the Report Card may contact the Governing Board
specialist, Stephaan Harris. In closing, I want to thank each of
you — John, Dale, and Sue for your presentation today. And of
course I would like to thank all of you for your participation in
this event. Thank you very much. [Applause]
