Wednesday, May 31, 2006

Data analysis helping teachers tailor lessons

A good use of computerized systems would be digital portfolios that follow children from grade to grade, enabling truly holistic and authentic assessment. -Angela

Data analysis helping teachers tailor lessons
High-tech systems comb test results to try to boost students' skills

Tuesday, May 30, 2006
By ANDREW D. SMITH / The Dallas Morning News

Stacy Kimbriel has been teaching kids to read for more than a decade. But only this year – with the help of a computer – could she quickly discern which skills were eluding which students at Meadows Elementary School in Plano.

And after receiving the year's first round of TAKS results, she reported a success not only for most of the 27 students pulled aside for extra help but also for the new data-management system that combed through test results to single them out for tutoring.

Such systems, according to the U.S. Department of Education, have the potential to help improve education across the nation.

"Schools can analyze student performance today in ways they couldn't have dreamed of a couple of years ago," said Tim Magner, the department's director of education technology. "More importantly, these systems often allow them to analyze data in real time, so they can solve problems as soon as they arise."

School districts across the country are working to roll out new technology. Locally, new systems are being discussed in Coppell, installed in Mesquite and used in Frisco, Highland Park, Irving, Plano and Richardson. Allen, McKinney and Dallas are expanding their systems.

"None of what we do now would have been possible just five years ago," said Jim Hirsch, Plano's assistant superintendent for technology.

"No one was selling the sort of data storage and data analysis systems that schools need to spot trends or to provide customized instruction for each student."

In addition to keeping long-recorded information such as attendance and grades in a single, accessible place, some new systems note each standardized-test question, the skill it measures and each student's answer. By matching student errors with skills tested, the systems show who knows what. Systems can also spot classwide weaknesses, so teachers know when they are underteaching, or misteaching, particular topics.
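
The matching the article describes amounts to tagging each test question with a skill and rolling wrong answers up by skill. The sketch below, in Python, is a minimal illustration under that assumption; the field names and sample data are hypothetical, not any vendor's actual schema.

```python
# Minimal sketch of item-to-skill matching (illustrative data, not a real schema).
from collections import defaultdict

# Each test question is tagged with the skill it measures.
question_skill = {"Q1": "main idea", "Q2": "inference", "Q3": "main idea"}

# Each student's graded answers: True = correct, False = incorrect.
answers = {
    "student_a": {"Q1": True,  "Q2": False, "Q3": True},
    "student_b": {"Q1": False, "Q2": False, "Q3": False},
}

# Roll wrong answers up by skill, per student and class-wide.
student_gaps = defaultdict(lambda: defaultdict(int))
class_gaps = defaultdict(int)
for student, graded in answers.items():
    for question, correct in graded.items():
        if not correct:
            skill = question_skill[question]
            student_gaps[student][skill] += 1
            class_gaps[skill] += 1

# Who is missing which skills, and which skills the whole class is missing.
for student, gaps in student_gaps.items():
    print(student, dict(gaps))
print("class-wide misses by skill:", dict(class_gaps))
```

A real system layers grades, attendance and benchmark results on top of this kind of tally, but the core idea is the same: every wrong answer is counted against the skill its question was written to measure.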

Individualized teaching

Kathy Hargrove, associate dean of the school of education at Southern Methodist University, said that ideally, improved test scores would be a byproduct, rather than the goal, of more individualized teaching.

And though there is the potential for data misuse, such as teaching to the test, she said she would have relished such a program during her 15 years as a classroom teacher.

"It would make things much more efficient than having whole-class instruction," she said. "You can group and regroup" students according to what they have mastered.

Joan Herman, co-director of the National Center for Research on Evaluation, Standards and Student Testing at UCLA, said data systems are only as good as the tests they analyze.

"Does that slicing and dicing give you a reliable result?" she said. "The first question should always be the caliber of the test."

For example, an awkwardly worded question might trip up students who know the material, she said. But if such traps are avoided, she said, data analysis systems can be an invaluable tool.

System vendors make no claim that data analysis systems can replace good teaching or hard work by students.

"Data systems are not a magic bullet," said Jonathan Harber, CEO of the system vendor SchoolNet. "Our systems are tools. They help teachers and schools measure what kids need and what strategies work. But measuring ... is only the first step down a long road toward improvement."

Computer programs designed to handle high-level math and statistics first became commercially available in the early 1990s. Then it took nearly a decade before companies started selling products tailored for schools. Widespread use of the products is newer still; several vendors sold more systems last year than in all previous years combined.

The systems still are not able to do anything that people cannot do for themselves. Teachers and administrators can track achievement skill by skill, student by student and school by school. They can also give benchmark tests – and many Texas districts long have – to figure out which kids are ready for tests such as TAKS.

Without computers, however, the process is slow. Ms. Kimbriel, the literacy specialist from Plano, used to spend several weekends a year holed up with calculator and graph paper, analyzing test scores. It took her weeks to compile fairly basic information.

The Dallas school district decided two decades ago to compile data and use computers to analyze each school's effectiveness, but only now is it parsing all its data at the individual student level.

"We're working now to bring our system into individual classrooms, so that teachers can view all the data for the kids in their classes," said Bob Mendro, assistant superintendent for research and evaluation.

After testing this spring, the system should go districtwide in the fall, he said: "Teachers should have immediate access to benchmark tests. As time goes by ... and we transfer more data from our old Microsoft storage programs to our new Oracle system, the teachers will be able to see more and more."

With achievement-tracking technology still in its infancy, few districts have used it long enough to gauge its impact.

The largest and longest case study comes from Philadelphia. In 2002-03, just 22 of the district's schools met federal standards for making adequate yearly progress. Last year, after two years of using data analysis to guide teachers' efforts, 132 schools met the mark.

"We have obviously made a lot of reforms recently, but I think our data management is easily the most important," said Gregory Thornton, Philadelphia's chief academic officer.


$17.88 per pupil

To finance its system, which soon will let parents monitor their children's progress, Philadelphia spends $17.88 per pupil. "I think we spend more money on bathroom supplies," said district CEO Paul Vallas.

Basic models can cost less than $2 per student per year, and the most elaborate cost 10 times as much. But at that price, when new test scores come out, the vendor types them in.

Plano just switched from a system that costs $340,000 a year – about $6 a student – to one that costs $260,000 upfront and $87,000 a year going forward.

Mesquite will pay $342,000 this year to set up its system and $150,000, or just under $5 per student, going forward. Coppell expects to go with a larger system that would cost $15 to $18 a year per student.
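
The per-student figures above are simply annual cost divided by enrollment; the article does not give the districts' enrollment counts, so the numbers below are back-calculated from the quoted rates and should be read as rough implied values.

```python
# Rough check of the per-pupil arithmetic (enrollments implied, not from the article).
plano_annual = 340_000        # old Plano system, dollars per year
plano_per_student = 6         # "about $6 a student"
print(round(plano_annual / plano_per_student))        # roughly 57,000 students implied

mesquite_annual = 150_000     # Mesquite ongoing cost per year
mesquite_per_student = 5      # "just under $5 per student"
print(round(mesquite_annual / mesquite_per_student))  # at least 30,000 students implied
```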

Despite the systems' potential, some see peril as well.

"With so many vendors and such rapidly evolving products, school districts are naturally quite nervous about committing to what turns out to be the wrong system," said James Rusk, a former science teacher who oversees Mesquite's installation.

Still, even skeptics are optimistic about computerized student evaluation.

"Things will go wrong," said Todd Oppenheimer, author of The Flickering Mind: Saving Education From the False Promise of Technology. "Some districts will waste millions on lousy systems. Others will use these systems as an excuse to focus even more narrowly on skills that can be measured by standardized testing.

"But these systems are fundamentally different from most of what has preceded them. They don't promise to change the learning process and make life easy for the student. They promise to make life easy for the teachers, to show teachers where students need help and give teachers more time to provide that help. It's hard to argue with those goals."

E-mail asmith@dallasnews.com


THE CANS AND CANNOTS OF DATA ANALYSIS

Given sufficient data, analysis systems can:


Quickly determine which skills individual students have and haven't mastered.

Spot classwide trends such as a consistent problem with certain skills.

Enable administrators to see the strengths and weaknesses of various teachers.

Evaluate the efficacy of textbooks and teacher training programs.

Allow parents to track their kids and measure them against their peers.

Data analysis systems cannot:

Grade essays or short-answer questions.

Devise questions to measure student ability.

Determine why students are struggling or how to help them.

Guarantee success on the TAKS or the SAT or any other test.

Do anything that a person could not do, given time, with the same data.


DATA MANAGEMENT IN LOCAL SCHOOLS

Districts using data management systems:

Frisco

Highland Park

Irving

Plano

Richardson

Expanding their systems:

Allen

Dallas

McKinney

Installing a new system:

Mesquite

Discussing a new system:

Coppell

Online at: http://www.dallasnews.com/sharedcontent/dws/news/localnews/stories/DN-schooldata_30met.ART.North.Edition1.1354bd38.html
