Blog List

Measuring the Impact of the Pandemic on Grade 2-3 Reading Performance for 2020-21

For this brief report, the impact of being out of school and doing home-based learning for grade 3 students was examined in a rural school district. The sample size is approximately 300 students.

Measures:

  1. I-Ready beginning of the year (BOY) reading test overall percentile score converted to an NCE (normal curve equivalent) score.
  2. Beginning of Grade 3 (BOG) ELA test percentile rank score converted to NCE score.
  3. Difference scores between each student’s I-Ready BOY NCE score and the corresponding BOG NCE score. Because intra-individual differences were used, differences in overall average scores between the two cohorts were not a confounding factor in this analysis.
  4. BOG percent proficient was computed for each school by dividing the count of Level 3-5 scores by the total number of scores (a sample SQL sketch follows this list).
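
As a minimal sketch of measure 4, the per-school percent proficient can be computed with a single query. The table and column names below (BOG, SCH_CODE, ACH_LEVEL) are illustrative assumptions, not the district's actual schema.

-- Percent proficient (Levels 3-5) by school; table and column names are hypothetical
select SCH_CODE,
       round(100 * sum(case when ACH_LEVEL between 3 and 5 then 1 else 0 end)
                 / count(*), 1) as PCT_PROFICIENT
  from BOG
 group by SCH_CODE
 order by SCH_CODE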

Procedure:
For fall 2019 and fall 2020 (the current year), the BOG scores were collected and matched into a single table with the previous year’s I-Ready BOY score for each student.

  1. The joined table included columns for the year, school, SID, I-Ready tier, NCE BOG, and NCE I-Ready scores.
  2. A difference score was computed for each student by subtracting the BOY I-Ready NCE score from the BOG NCE score. For example:
    2020 BOG NCE = 50 and BOY I-Ready NCE = 45, so the difference = +5
  3. For each school, the average of the differences was computed (a sample SQL sketch follows this list).
  4. NCE score differences were disaggregated by school and by reading tier (Tier 1, Tier 2, and At-risk for Tier 3).
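
As a minimal sketch of steps 2-3, assuming the joined table from step 1 is named SCORES and uses illustrative column names YEAR, SCH_CODE, BOG_NCE, and IR_BOY_NCE for the two NCE scores (these names are assumptions, not the district's schema), the per-school averages could be computed as follows:

-- Average BOG-minus-I-Ready NCE difference by school and year
select YEAR,
       SCH_CODE,
       count(*)                            as N_STUDENTS,
       round(avg(BOG_NCE - IR_BOY_NCE), 1) as AVG_NCE_DIFF
  from SCORES
 group by YEAR, SCH_CODE
 order by YEAR, SCH_CODE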

Findings: 

  1. The average I-Ready to BOG NCE difference for the 2019-20 school year for the district was 6.5 with a range of 1.0 to 9.7.
  2. The average I-Ready to BOG NCE difference for the 2020-21 school year for the district was -19.8 with a range of -24.2 to -17.8. 
  3. Differences by Tier for the two different school years were as follows:
    1. 2019-20:  Tier 1: 5.0     Tier 2: 6.7     At-Risk for Tier 3: 10.1
    2. 2020-21:  Tier 1: -26.8   Tier 2: -20.7   At-Risk for Tier 3: -9.9
  4. The 2019 BOG district proficiency was 24.1%, which is consistent with the district’s results for the last three years. The 2020 district proficiency was 11.5%.

Conclusion:
The impact of being out of school for the spring of 2020 due to the pandemic is measurable for grade 3 students using the methodology described above. Furthermore, the results suggest that there is a negative impact on reading performance for students in this district, and the impact cuts across all reading tier levels.

Actions:
 Present this data to the schools and create a roster of students for each school so that school leaders and teachers can identify the students who experienced the greatest negative impact and provide interventions.

Use the I-Ready diagnostic data to determine if there is a pattern of weak areas across the most negatively impacted students. For each student, determine which specific areas had the greatest negative impact and provide targeted individual intervention.

Policy Impact:
While growth for grade 3 students can be determined by comparing BOG to EOG NCE scores, percent proficient for grade 3 is quite likely to be lower than the 2018 EOY percent proficient. This is likely to be the case for ELA in grades 3-8. Accountability using the current targets will be problematic.

Test Score Analytics Using APEX

What is the value for a school district to have all of its student performance data in one database?

The answer is quick and accurate multi-year analytics. After the initial report is created, each new year’s data is uploaded and automatically populates all of the reports.

Example

Here is an example from a school district that is looking ahead to this year’s Beginning of Grade 3 (BOG) test data and making comparisons to other test data. Using a brief SQL statement, the mean scores and standard deviations can be computed and reported for each school for each year in the APEX data system.
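
A minimal sketch of that statement, assuming a BOG results table with the same names used in the correlation query below (BOG, SCH_CODE, BOG_NCE, YEAR), might look like this:

-- Mean and standard deviation of BOG NCE scores by school and year;
-- table and column names are assumed from the correlation query below
select BOG.YEAR,
       BOG.SCH_CODE,
       count(BOG.BOG_NCE)            as N_SCORES,
       round(avg(BOG.BOG_NCE), 1)    as MEAN_NCE,
       round(stddev(BOG.BOG_NCE), 1) as SD_NCE
  from BOG BOG
 group by BOG.YEAR, BOG.SCH_CODE
 order by BOG.YEAR, BOG.SCH_CODE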

Then the distribution of scores for each year can be viewed as a histogram. Using a simple drop-down filter, the year and the individual school can be overlaid and viewed.

The question then is: what is the relationship between the BOG scores and the I-Ready reading scores when both are converted to Normal Curve Equivalent (NCE) scores? Here is the sample SQL code for that report:

-- Correlation between I-Ready BOY and BOG reading NCE scores, by school
select
    BOG.SCH_CODE as SCH_CODE,
    count(BOG.BOG_NCE) as N_STUDENTS,
    round(CORR(BOG.BOG_NCE, I_READY_RD.IR_RD_NCE), 3) as NCE_CORR
  from I_READY_RD I_READY_RD
  join BOG BOG on I_READY_RD.SID = BOG.SID
 where BOG.YEAR = 2020
   and I_READY_RD.TEST_DATE = 'BOY'
 group by BOG.SCH_CODE
 order by BOG.SCH_CODE

The next question is: what does this data look like as a scatter plot? In the scatter plot, an entire district can be shown for a testing year, or the plot can highlight one school in one color while a second color shows all of the remaining schools. In conclusion, when a district has the BOG and I-Ready (or I-Station) data for this year, it will be better able to understand the significance of the information and perhaps understand the impact that being out of school for months has had on its students.

Curriculum: What Is Being Taught and Learned?

Perhaps it is time to finally put informative assessment into action.

Preface
My work involves collecting, reporting, and analyzing student performance data from common assessments taken by students. With schools closed and state and local assessments suspended, there is no data to collect and use for guiding schools and teachers. This situation has given me time to revisit the topic of formative assessment and the underlying concepts of curriculum and instruction.

Facets of Curriculum
The intended curriculum refers to the curriculum documented in state and local curriculum guides. These curricula are also further refined and detailed in curriculum unpacking and pacing guide resources. These documents and resources form the outline of what is to be taught from a policy and standards perspective.

The assessed curriculum is closely linked to the intended curriculum when assessments are built to determine the extent to which the intended curriculum was learned by the student. It is understood and expected that there is alignment between the two curricula. Frequently, the assessed curriculum is documented in statements of the standards and the weight each standard carries on the assessment used to measure mastery of the curriculum.

There is a danger of creating a mismatch between the intended curriculum and the assessed curriculum when teachers create assessments of what they have taught. Differences in interpretation of the standard, poor matching of items to the standards, and a lack of rigor in the teacher’s assessment can produce what may appear to be valid results but may not provide meaningful information about how students’ performance relates to the intended curriculum. At the core of assessment is validity: an assessment item or task must be representative of the intended learning outcome.

To compound the problem of alignment and validity, what is taught in the classroom may not be aligned with the intended and/or assessed curriculum. The enacted curriculum is what is actually taught in the district and classroom. District differences between the intended and enacted curriculum may be due to local emphasis on some content, availability of instructional materials, teacher preparation, and school or teacher bias. Efforts have been in place for decades to ensure alignment between the intended curriculum and the enacted curriculum at the classroom level. Principals have had a practice of reviewing teacher lesson plans, and more recently long-term instructional plans, as a means of monitoring alignment.

The final facet of the curriculum paradigm is the learned curriculum. While this concept is closely linked to the assessed curriculum, it differs in that the learned curriculum is what is actually acquired by the student. The learned curriculum connects the enacted curriculum to student performance and requires some means of assessment to determine if there was a positive connection between the enacted and received curricula.  

Documenting Enacted Curriculum
The curriculum schools need to be informed about is what teachers actually teach: the enacted curriculum. From this data, an evaluation can be made that compares what is going on in the classroom to what should be going on in the classroom from a curriculum perspective. Documenting the enacted curriculum has traditionally been done by looking at a teacher’s lesson plans and by surveying teachers.

There are two major problems with this information: 1) planned instruction is still in the realm of intended curriculum, and 2) surveying teachers may not accurately represent what was actually taught if the data collection is not done regularly and systematically.

In a short-term, limited study in math, I had a teacher select each day, from a list of math curriculum standards, the skills that were being taught that day. Additional information, such as the depth of knowledge (DOK) and the instructional methodology employed (direct instruction, for example), was also recorded in the online data collection system. Over time, the system was able to provide reports of the dates, the standards taught, a count of how often each standard was taught, and the sub-skills for each standard. This information was then compared to the district pacing guide. The data showed that the teacher’s enacted curriculum matched the district’s intended curriculum. While this data collection could be done daily, the data was collected each time a new standard and its sub-skills were taught. The strength of the data collection system was the granularity of the data collected and the reporting.
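
As a minimal sketch of the kind of report the system produced, assuming an illustrative table named LESSON_LOG with one row per teaching session (SESSION_ID, LESSON_DATE, STANDARD_CODE); these names are hypothetical, not the study's actual schema:

-- How often each standard was taught, with first and last teaching dates
select STANDARD_CODE,
       count(*)         as TIMES_TAUGHT,
       min(LESSON_DATE) as FIRST_TAUGHT,
       max(LESSON_DATE) as LAST_TAUGHT
  from LESSON_LOG
 group by STANDARD_CODE
 order by STANDARD_CODE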

Documenting the Received Curriculum
While it was worthwhile to know what was being taught, a missing component was insight into the received curriculum. Essentially, this took the form of curriculum-based assessment. Phase two of the study added student performance data collection using a simple 6-point scale (0-5). For each class in which instruction was recorded in the system, the teacher also recorded each student’s performance, from 0 = absent to 5 = full mastery. Using this data, the system could then report a class average of performance by standard and a student profile showing all of the standards and the average performance on each. Teachers could use this data as informative assessment and modify classroom instruction to improve class performance or to identify students who need extra instructional attention.
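
A minimal sketch of the class-average-by-standard report, assuming a hypothetical STUDENT_RATING table joined to the LESSON_LOG table sketched above on an assumed SESSION_ID key (again, illustrative names only):

-- Class average performance by standard on the 0-5 scale;
-- ratings of 0 (absent) are excluded here, an assumption about how absences were handled
select L.STANDARD_CODE,
       count(R.RATING)         as N_RATINGS,
       round(avg(R.RATING), 2) as CLASS_AVG
  from LESSON_LOG L
  join STUDENT_RATING R on R.SESSION_ID = L.SESSION_ID
 where R.RATING > 0
 group by L.STANDARD_CODE
 order by L.STANDARD_CODE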

The resulting information provided a summary look at the long-term enacted curriculum, data on class and individual student performance, and a student report that could be shared with the student. Student performance as recorded in this system could then be compared to the student’s benchmark or end-of-year assessment results.

Sources:
https://repository.upenn.edu/cgi/viewcontent.cgi?article=1058&context=cpre_researchreports
https://www.jstor.org/stable/10.1086/428803?seq=1

Oracle Application Express 20.1 Version Released

Oracle continues to support the ongoing improvement of Application Express (aka APEX). This application building tool is a low-code environment that makes building database-driven applications easier than writing all of the HTML and SQL code from scratch. Also, because it is embedded within the Oracle database, there is no need for add-on reporting tools. This low-code development tool saves hundreds of hours in development time and is very versatile.

For example, NC DPI uses APEX for the development of its EDDIE system.
(See http://apps.schools.nc.gov/ords/f?p=125:1)

This new version includes an enhanced search feature for its reports, responsive report widths in its interactive reports, and an expanded library of graphing and charting types.
To see more about its features go to this URL.  
https://apex.oracle.com/en/platform/features/whats-new/

Data Smart LLC uses APEX to build custom student reporting systems for school districts. Other applications created include a student performance recording system for direct daily measurement (DDM), a team meeting recording system, and a curriculum development tool. Within the “family” of applications, all data can be combined and accessed for reporting and analysis.

Do You Provide Files to Vendors?

If your school district does, then you will want to learn about ETL.

What is ETL?

ETL stands for Extract, Transform and Load. It is the process in which data is exported to an ETL program, changed in some way, and then pushed to a new location, such as a local or remote database table.

The process can be accomplished by a series of scripts, but those scripts require expertise to write and attention to make sure they execute correctly. In my work at Data Smart LLC, I no longer write scripts to transform data for uploading to our Oracle database. Instead, I use an ETL program that automates the process.

While I have used a few ETL programs, I found one that I really like and recommend: EasyMorph (https://easymorph.com/). You really need to check out their website and watch the video demonstration. Now here is the great news about this program: there is a free version so you can get a feel for the program, and I use it for most of my ETL work. The professional version is about $750 per year.

Here is a typical process that I create (in about 30 minutes):

The process runs on a pre-defined schedule that I set up in the program.

  1. Set up a scheduled export from a source like Power School so the file is saved on your computer.
  2. EasyMorph will import the file(s) from my desktop or a database source. (EXTRACT)
  3. Then the program will change the column headings, change the data type if I need a number instead of text, remove unwanted characters such as “N/A”, and convert the file from a .txt or tab-delimited format to a comma-separated .csv file and save it. (TRANSFORM) This transforming process is accomplished by selecting pre-programmed actions, which are arranged in a timeline sequence.
  4. The program then moves on to uploading the data file to the destination. (LOAD) The destination can be back to your desktop for attaching to an email, a local database, or a remote hosted database table.
  • The real power comes from the ability to execute SQL queries on the data table BEFORE uploading the data. So if you are replacing data daily on a remote server, you can merge your previous data file on the front end and select only new data, run a TRUNCATE statement to remove all data and start fresh, or run an UPDATE statement to revise rows whose data has changed (see the sketch after this list).
  5. Lastly, I get an email from the program telling me that the process ran, along with a log of errors if there were any.
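
As a minimal sketch of that pre-load step, assuming a hypothetical destination table named VENDOR_EXPORT and a staging table named VENDOR_STAGE holding the newly transformed rows (neither name comes from a real vendor specification):

-- Option 1: remove all existing rows and start fresh before the load
truncate table VENDOR_EXPORT

-- Option 2: update rows whose values have changed, keyed on student ID (SID);
-- VENDOR_STAGE is an assumed staging table loaded from the new file
update VENDOR_EXPORT v
   set v.SCORE = (select s.SCORE from VENDOR_STAGE s where s.SID = v.SID)
 where exists (select 1 from VENDOR_STAGE s
                where s.SID = v.SID and s.SCORE <> v.SCORE)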

This is by far the best solution for getting your data from point A to point B in a format that meets the vendor’s specifications. Now this is what I call being data smart.

Dr. Lew Johnson
Oracle APEX Developer
Data Smart LLC