Monday, October 08, 2007

Getting Rid of Scientific Noise: Lifetime Publication Limits?

Here's an interesting thought experiment proposed in a Nature Medicine editorial: what if each scientist could only publish 20 papers during his/her entire career?

"you get 20 tickets. Every time you publish a paper, you hand over one of them. Once you run out of tickets, your publishing days are over. As simple as that.

If we adopted this model, many articles reporting incremental advances would no longer be written, and many specialized journals would disappear. And with far fewer papers to read, each one reporting a much more complete piece of research, search committees or funding bodies could directly evaluate the work of a given scientist, instead of (as is often the case) leaning on surrogate indicators such as a journal's impact factor or number of citations.

At the extreme, we might not even need journals (and editors) anymore; everything would be published in preprint servers like those used by physicists, and the community would simply evaluate and rank the different contributions as they become available. This way, the whole community could act as reviewers, doing away with the existing peer-review process. This is somewhat reminiscent of what some websites are already trying to do, so far with limited success. But if everybody agreed to publish just 20 papers to keep the size of the literature manageable, then the journal of the future might conceivably be a preprint server."

It's a thought-provoking proposal, and I would certainly love to see a renewed focus on quality, rather than quantity, in scientific research. Good science requires time and commitment to a specific problem, not a mad rush to produce meaningless data and minimum publishable units. You know there's a problem with the system when, even at the undergraduate level, many students don't seem to feel they have the time to learn how to pH solutions or perform basic assays because they're so busy calculating the shortest possible path to getting their names on their lab's next paper.

However, imposing an arbitrary lifetime publication limit is simply not a practical solution. First of all, it's impossible to say what the appropriate number of lifetime "quality pubs" would be, and that number would vary from one researcher to the next, from one field to another, and so on. Second, such a proposal does not address the issue of career advancement through the academic hierarchy. Currently, the total number and impact factor of pubs are what get nascent scientists recognition, respect, jobs, and funding. What we need, if we want to get back to a focus on quality over quantity, is a shift in scientific culture, not a voucher system.


3 comments:

Anonymous said...

The most important flaw in this proposal is that it would be a step in the opposite direction from an open, communicative scientific community. Increasing the time between discussions of results with colleagues, which is exactly what this would cause, is stupid in "the information age." This proposal is not well thought through.

rasmussenanders said...

I work on the cerebellum and the role of Purkinje cells in classical conditioning. Though this may seem like a small field, I read one estimate in Nature Neuroscience that one new article about Purkinje cells is published every single day. In other words, it is completely and utterly impossible to read everything new, even about a single cell type in a not-so-hyped part of the brain...

It is really a problem when researchers think in terms of the "smallest publishable unit" - an extreme focus on quantity. I think that some measures favoring "larger," more comprehensive articles would definitely be desirable. On the other hand, a 20-publication limit does sound a bit radical. Maybe the change has to come from the journals?

Great post, as always!

Keith Robison said...

There are worse ideas out there, but this one is a doozy.

First, the limit is set ridiculously low. Many very good scientists have published more than 20 solid papers in their lifetime.

Second, the idea that publication needs to be restricted is seriously flawed. Which is worse: too many papers, or scientific findings never seeing the light of day? Only when findings are published can others build upon them. In integrative fields such as systems biology, any solid piece of information can be drawn into the analysis and pay dividends.