September 10, 2010
Critics of his administration felt the city had juked its school stats. To address their concerns, money was set aside for the Independent Budget Office to hire a DOE data watchdog. Nearly a year later, Raymond Domanico has arrived as the IBO’s Director of Education Research. Prior to joining the IBO, Domanico worked for 11 years as the Senior Education Advisor to the Industrial Areas Foundation – Metro NY, a network of community organizations.
What about being the IBO’s director of education research appealed to you?
Back in July, I was hosting a group of people from Germany, from Berlin, who had come to visit our schools. At the close of dinner they said to me, “Ray, if you were in charge, what would you do with the school system?” And I gave them the same answer I’ve been giving a lot of people over the last year and a half. I said, “You know, there’s been so much change in the New York City schools and it’s happened so quickly, and we really don’t have a very deep sense of what worked and what has not worked.”
And so I found myself unable to answer the question as to what we should do going forward. It seems to me that given the amount of change that’s gone on, this is the appropriate time to step back and to do some in-depth analysis.
How will this job compare to your data analysis work for the Board of Education in the 1980s?
The world has changed a lot. In those days we never evaluated schools; we only evaluated programs. The whole concept of talking about good schools or low achieving schools was not even on the radar. Towards the end of my tenure in the mid-to-late 1980s was the first time we started putting together indicators of school performance. So this is sort of a return to where I started in some ways.
I authored the first cohort report on the graduation and dropout rate before I left. Prior to the release of that report, as is the case today, there was a big argument over whether the Board of Ed was overstating the graduation rate. When we looked at the data, there was a third factor that hadn’t been discussed before, and that was simply that some students, many students, were not resolved after four years. They were coming back for a sixth or seventh year. And that’s really the first time we were able to shed light on that.
State officials recently admitted that, over the past several years, the state tests have gotten easier and students’ scores have become inflated. Do you think this is true of other data points — graduation rates, for example?
I think all data points are worthy of in-depth and fair analysis. We know what some of the issues are around the graduation rate; a lot of it we read in GothamSchools. We have an analyst beginning to look at those questions. Some of the questions about the current graduation rate we’ll be able to answer quickly, I think. Others are going to require really delving into the data that the Department of Ed, in the last couple of weeks, has started to provide us on individual students.
So for example, on the issue of credit recovery, it’s not clear what the best indicator is of what’s actually going on. As far as we know, there’s not an indication in the automated record that the child was given credit because of credit recovery. So the question of the impact of credit recovery on the graduation rate is something we’ll really have to roll up our sleeves and get into. There’s not going to be a quick answer to that.
What data are you the most interested in beginning to look at? Are graduation rates first on your list?
No, that just happens to be a part that was under way before I got here. I should say that prior to my arrival, the senior staff at the IBO had engaged in a rather extensive listening tour, talking to many different people about what they think some of the issues are. I’m going to continue that work. I think it’s very important, as the director of this effort, that I keep my ear to the ground as to what issues of concern people in both government circles and the larger public want answers to.
What issues have people told you they want you to look into?
I think if we went year-to-year, there are always hot button issues. So last year, the big issues seemed to be the closing of those high schools and certainly the attendant issue of ATRs and whatnot. I think a lot of people are interested in having us look at the means by which the DOE evaluates schools and comes up with those decisions. And also to look at some of the total costs and benefits of closing schools.
I think there are two issues I’d like to investigate. First, what is the impact of what we now know about the state testing program and the way the Department of Ed uses that data to come up with the progress reports? And second, the definitions and cutoffs have changed from year to year, and I think in any weighting or assessment system, whether it’s in education or in the financial world, you want to see consistency from year to year.
You’ve supported some of the chancellor’s initiatives in the past (e.g. fair student funding). Is it going to be difficult to convince people that you’re impartial?
Well, I don’t know that I’ve publicly supported a lot. Certainly fair student funding is an issue that we were concerned about, before it was called that, and before even Joel Klein came to the job. We worried a lot about the allocation of senior teachers in the school system. Klein picked up on this on his own, and this thing called fair student funding emerged.
But the story of fair student funding has gotten real, real complicated. I’m not sure at this point whether or not it has fulfilled its promise. I would have no problem reporting, if the data says so, that hey, it’s not working the way that we had hoped. I don’t think I’d have any problem being impartial about that.