Second attempt (2005)
On the surface, the rejected 2004 proposal had laid out a methodology very similar to that used in the 1999-2000 pilot study in Baltimore. It called for many more volunteers: 700, instead of the fewer than 150 that took part in the pilot. But after five years in the community, it had become easier to recruit based on word of mouth. By that time, says GHCC director Miller, "the esprit de corps among members became so strong that they just brought people in. [Participants] were the best recruiters." [15]
The 2004 proposal for an RCT had also called for more schools: 20 in the treatment group and 20 in the control group, compared to only three each in the pilot. This seemed feasible to Fried and her team. In the years since the pilot, Experience Corps had been implemented in 12 schools; GHCC and COAH could hardly keep up with demand from interested principals.
Doubts. But the 2004 grant application had in fact masked growing tensions between COAH and GHCC over methodology. McGill and Miller, while convinced of the value of a large-scale study, were increasingly concerned about the way it was to be carried out. Recruitment of 700 volunteers was not a sure thing from their perspective. Since the 1999-2001 pilot, it had become easier to recruit based on word of mouth, but volunteers in the intervening years did not have to agree to be randomized.
Similarly, it had been relatively straightforward to get the six schools within GHCC's neighborhood to agree to be randomized for the pilot, but GHCC had been working with those principals for years. By 2004, the program had expanded well outside GHCC's home turf; the RCT would take in schools even further afield. Would these principals agree to randomization? "How do you explain to somebody that really wants to go into the schools and do good, and to principals who are dying for the program to happen, that you have half the people sitting outside?" McGill asks. She believed it would not be an easy sell.
Another factor was politics. Miller cites four constituencies that had vested interests in shaping the RCT: in addition to COAH and GHCC, there was also the school system (BCPSS) and the city government, including the mayor and members of the City Council. BCPSS was in a period of turmoil, with frequent turnover of school superintendents. In the early years, Fried, McGill, and their organizations had worked directly with school principals, instead of going through a formal approval process with BCPSS.
But as the program spread, and with the prospect of a long-term research study on the horizon, BCPSS wanted to assert some authority. One of its demands was that Experience Corps be introduced to schools with the greatest need, not randomly selected ones. The mayor and City Council, meanwhile, wanted the treatment schools to be allocated geographically, with each member's district hosting at least one. Again, this precluded a randomized sample. McGill and Miller felt these demands had to be acknowledged. Political support would be important to the long-term viability of Experience Corps. The politicians and bureaucrats could not simply be ignored.

Finally, there was the matter of funding. Although the cost of recruiting volunteers would be covered by the NIH grant, the burden of raising money for implementing the much-expanded program would fall on GHCC. "The NIH money provided a great deal of resources for the research, but provided almost no resources for the implementation," says Miller. Yet what most GHCC funders wanted to see was immediate impact. They didn't want to wait years to learn the result of an academic study. The same was true of principals, the school system, and the mayor's office. Quick data would be more persuasive than robust data. "Everybody talks about the need for measurable data and quantifiable data and all of that," says Miller. "But as much as they all say that, my experience with foundations is, they make their decisions based on anecdotal information."
If they won an NIH grant, the COAH researchers could afford to operate on a longer timeframe and fully analyze the data for scientific publication. "So there was a real conflict in priorities based on their need for analysis and withholding the data until it could be published, and our need to use that data," says McGill. Because of GHCC's immediate needs, it had steadily collected its own performance data. True, that did not carry the same validity as a randomized study. But, as she points out, "one of our big supporters was the mayor, and you don't tell the mayor you don't have any idea whether this is working."
These latent disagreements came to a head as Fried prepared to resubmit for 2005. McGill, Miller, and the GHCC board were increasingly concerned that the future of Experience Corps in Baltimore would be jeopardized by the strictures of the RCT. Recruitment, political support, and funding were at risk. In most matters related to the study, they had been willing to defer to the researchers. They supported Fried's mission and believed in the value of the academic research. But the methodology for randomizing the schools, they felt, required compromise. "I had to get my director [Miller] to back me up saying, No, we aren't going to give up," recalls McGill. As Miller describes it: "We had a little bit of a different agenda. Both agendas I felt were valid. But they were different agendas."
They met with Fried and her research team to raise their concerns. Fried was attuned to the challenges of conducting research in the community. She understood the perspectives that militated against randomization. At the same time, there was a place for robust research. That was what policymakers and major foundations required if they were to support the model embedded in Experience Corps. Quick data would not be enough to truly understand the benefits of the program. McGill summarizes the standoff:
The researchers, appropriately enough, felt under the gun with NIH funding the study, that it had to be done right, so they couldn't let me foul that up. On the other hand, I was responsible to my community and to my funders, and I couldn't let them foul that up, either.
Fried considered how to develop the 2005 NIH grant application, and whether there was a way to alter the methodology from the previous year's submission. The NIH required the strictest scientific methods, and had already rejected an application that observed those. Would it accept the proposal if the schools were not randomized? On the other hand, the NIH also wanted proof that the community at large was on board for the duration of the study, and that was now in doubt. If the researchers could develop an alternative to randomizing schools, would the NIH accept it as the price of winning buy-in? What would the methodology be, exactly? Implementing the program only in the worst-off schools was risky; Experience Corps might not produce a measurable impact in a school with overwhelming needs. Moreover, even small adjustments to the methodology up front could lead to difficulties, anticipated and unanticipated, when it came time to analyze and make sense of the data later on. How might the validity of the results be affected? Would they be convincing?
If COAH insisted on strict randomization, it risked losing political support, which could endanger the long-term survival of Experience Corps in Baltimore. Was it worth jeopardizing the program in one city to prove its value more generally? Or GHCC might simply refuse to carry out the study altogether, and continue delivering Experience Corps without COAH's involvement; the researchers would lose the opportunity to see if it really worked. In a series of internal meetings, and in consultation with executives at GHCC, Fried and her team considered how best to proceed.
[15] Author's telephone interview with William Miller, December 13, 2013. All further quotes from Miller, unless otherwise attributed, are from this interview.