
NIH must tell whole truth about conflicts of interest
Little has yet been disclosed about a consulting spree that has left a legacy of harm.

Sir — Your Editorial “Taking a hard line on conflicts” (Nature 433, 557; 2005) understates the extent of the embarrassing failure by the US National Institutes of Health (NIH), during the past year, to disclose information about NIH scientists serving as paid consultants to private companies. Having been a scientist at the NIH since 1967, as a physician, a cell biologist and now a science administrator, I must add that the habit of non-disclosure continues, making further embarrassments likely.

In November 1995, the then-director, Harold Varmus, made an unpublicized change to the rules on paid consulting by NIH scientists. The permissiveness of the 1995 rules opened the way for a spree of consulting that ended on 1 February 2005, when a stricter policy was announced. The previous decade left a legacy of harm, only part of which has come to light.

In articles published since December 2003, the Los Angeles Times has given ten specific examples of NIH scientists with financial conflicts of interest. However, details of hundreds of others remain hidden, and the extent of the damage caused since 1995 is unknown.

NIH director Elias Zerhouni told a Senate subcommittee last year that he had found no harm to patients as a result of NIH scientists’ outside financial arrangements. However, he provided no evidence to support this claim and did not explain how he arrived at his conclusion.

On 2 February 2005, I attended a large meeting of NIH scientists at which Zerhouni spoke of “very disturbing” product endorsements by NIH scientists “speaking for a company on behalf of a product to entice physicians to prescribe a product at greater levels”. He provided no information about the products, the companies, the payments or the scientists.

Correspondence — NATURE | VOL 434 | 17 MARCH 2005 | www.nature.com/nature, p. 271

During the past year, Zerhouni has said that achieving transparency (defined by the NIH as public availability of financial information filed by NIH employees) is “absolutely critical” and “first on the agenda”. But much of the information about past consulting, going back to November 1995, remains undisclosed.

The NIH’s defensive approach — one congressman called it stonewalling — has proved a disastrous miscalculation. If the NIH does not unflinchingly seek the facts and release them, they may come out anyhow in other ways. Then the NIH’s reputation, already at the lowest point in its history, will suffer further.
Ned Feder
National Institutes of Health, 2 Democracy Plaza, Bethesda, Maryland 20817, USA

Limits to growth may be subtle but still inexorable

Sir — Dick Taverne, in his review of James Gustave Speth’s recent book Red Sky at Morning, comments on a “wildly inaccurate” 1972 report by researchers at the Massachusetts Institute of Technology (MIT): “The Limits to Growth predicted that we would run out of gold, zinc, mercury and oil before 1992” (“When greens see red”, Nature 432, 443–444; 2004). This is a very common myth which needs correction.

The precipitous collapse of economic growth projected by the MIT report does not occur before 2020 in any of the model runs, so it is hard to see how its doomsday projections have been disproved. For a 30-year-old projection using some eight state variables, the MIT model did a surprisingly good job of predicting where we were in 2004, as an update shows (D. A. Meadows et al. Limits to Growth: 30 Years On, Earthscan, London, 2004).

One run of the model even had the natural-resource constraint removed by assuming an unlimited source of energy.

The model’s mechanism of collapse is more subtle than simple resource exhaustion. Suppose, as an example, that when the Chinese want to replace the steel mills they are now building, they find that materials are available but costs are 10 times higher, while their productivity in the interim has only doubled. Hard-pressed to keep steel production growing at all, they will have even less mill capacity available to make materials for replacement in following years, and so eventually will be overwhelmed by the spiralling legacy of repairs.
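The arithmetic of this squeeze can be made concrete with a toy capital-stock calculation. This is not the MIT World3 model; every number below (depreciation rate, reinvestment share, the growth rates of replacement cost and productivity) is an illustrative assumption, chosen only to show the mechanism by which capacity can grow for years and then decline.

```python
# Toy sketch: a capital stock whose replacement cost rises faster than
# productivity, as in the steel-mill example. All parameters are
# illustrative assumptions, not calibrated values.

def simulate(years=40, capacity=100.0, output_per_unit=1.0,
             cost=1.0, cost_growth=1.08, productivity_growth=1.035,
             depreciation=0.05, reinvest_share=0.1):
    """Each year a fixed share of output is reinvested in new capacity,
    but the price of a unit of capacity outpaces productivity gains."""
    history = []
    for _ in range(years):
        output = capacity * output_per_unit
        new_capacity = reinvest_share * output / cost   # units affordable
        capacity = capacity * (1 - depreciation) + new_capacity
        output_per_unit *= productivity_growth          # productivity gain
        cost *= cost_growth                             # dearer replacement
        history.append(capacity)
    return history

h = simulate()
# Capacity grows at first, peaks once reinvestment can no longer cover
# depreciation at the inflated replacement cost, then declines.
```

With these assumed rates, the stock expands for roughly the first half of the run and erodes thereafter, even though materials remain available throughout: the collapse comes from cost, not exhaustion.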

Whether or not this is a model for a reckless world’s future, failure of revenue to cover accelerating physical depreciation is a frequent explanation for urban decline.
David Fisk
Department of Civil and Environmental Engineering, Imperial College London, London SW7 2AZ, UK

More power needed to probe cloud systems

Sir — By using the distributed power of personal computers around the planet, D. A. Stainforth and colleagues (Nature 433, 403–406; 2005) have quantified uncertainty in forecasts of global warming resulting from a doubling of CO2. If warming occurred at the upper end of the predicted range, the effects for humankind would be utterly catastrophic.

Such forecast uncertainty arises, in large part, from the way cloud systems are represented in existing global-climate models. Because of limitations in computer power, the underlying laws of physics cannot be used directly to simulate such systems; instead they are represented by approximate formulae with uncertain parameters. In the paper by Stainforth and colleagues, an ensemble of many thousand individual global-warming forecasts was analysed. The forecasts were all made with the same climate model. However, the parameter values varied from one forecast to another — the intra-ensemble variation in these values being consistent with their inherent uncertainty.
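The perturbed-parameter design can be sketched in a few lines. This is a toy stand-in, not the climateprediction.net experiment: the single feedback parameter, its uniform range and the sensitivity formula below are illustrative assumptions, chosen only to show how a symmetric uncertainty in a model parameter maps into a skewed spread of warming forecasts.

```python
import random

# Toy perturbed-parameter ensemble: the same "model" is run many times,
# each with a parameter value drawn from its assumed uncertainty range.

def toy_climate_model(feedback, forcing=3.7):
    """Hypothetical stand-in for a climate model: equilibrium warming
    for doubled CO2, warming = forcing / (lambda_0 - feedback)."""
    lambda_0 = 3.2          # W m^-2 K^-1, illustrative baseline response
    return forcing / (lambda_0 - feedback)

random.seed(0)
ensemble = []
for _ in range(10000):
    # The parameter varies from one forecast to the next, within its
    # assumed (here uniform) uncertainty range.
    feedback = random.uniform(0.0, 2.4)
    ensemble.append(toy_climate_model(feedback))

ensemble.sort()
low = ensemble[len(ensemble) // 20]     # ~5th percentile of warming
high = ensemble[-(len(ensemble) // 20)] # ~95th percentile of warming
# A symmetric spread in the feedback parameter produces a skewed
# warming distribution with a long upper tail.
```

Even in this caricature, the upper bound of the forecast range sits far above the lower one, which is why the tail of the real ensemble matters so much.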

Global-climate models are now being developed in which one of the most important types of cloud system, ‘organized deep convection’, is computed explicitly. Such convective systems, with horizontal scales of tens of kilometres, play a key role in climate: they cool the Earth’s surface, they transport water from the surface into parts of the troposphere where it can contribute to the greenhouse effect, and their kinetic energy influences global-scale climate circulations. Unfortunately, even on the most powerful computers, climate simulations in such models take nearly as long as real time. Such high-resolution climate models cannot be run efficiently using distributed computing technology.
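The computational barrier can be illustrated with back-of-envelope scaling (the numbers are illustrative, not a benchmark of any particular model): refining the horizontal grid multiplies the cost per simulated day through both horizontal dimensions and, via the CFL stability condition, through a proportionally shorter time step.

```python
# Rough cost scaling for horizontal grid refinement: two horizontal
# dimensions plus a proportionally shorter time step (CFL condition)
# give roughly cubic growth in cost per simulated day. Vertical
# resolution, assumed fixed here, would add further cost.

def relative_cost(dx_km, reference_dx_km=100.0):
    """Cost per simulated day relative to a run at the reference
    grid spacing (an assumed 100-km baseline)."""
    refinement = reference_dx_km / dx_km
    return refinement ** 3

# Resolving organized deep convection (horizontal scales of tens of
# kilometres) explicitly calls for grid spacings of a few kilometres:
cost_10km = relative_cost(10.0)  # ~1,000 times the 100-km cost
cost_2km = relative_cost(2.0)    # ~125,000 times the 100-km cost
```

Factors of this size explain both why such simulations run barely faster than real time on today's machines and why parcelling the work out to home PCs, as climateprediction.net does for coarser models, is not an option here.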

To reduce uncertainty in estimates of global warming, the climate-modelling community requires substantially more powerful computational resources than are currently available, so that more of the climate system can be simulated directly from the known laws of physics.

In view of the seriousness of the global-warming problem, plans for developing such a facility for climate prediction should occur at the international level, so that national resources can be pooled and scientific collaborations enhanced. Within Europe, the European Union could play a leading role.
T. N. Palmer
European Centre for Medium-Range Weather Forecasts, Reading RG2 9AX, UK



© 2005 Nature Publishing Group