Answer by Josh Siegle, Ph.D. student at MIT:
First of all, I want to say that being a scientist is awesome. I love getting paid to think deeply about the mysteries of the world. I love being able to choose the focus of my research and to take credit for all my achievements. It's hard to imagine myself in any other career. At the same time, I can't help but turn my inquiries onto the practice of science itself. Are there ways in which science could be made more efficient? Absolutely. Are these shortcomings severe enough to warrant leaving science altogether? Certainly not. But they're still worth mentioning, with the hope that reforms will one day change things for the better.
So here's what frustrates me most about being a scientist: Although the tools have become more advanced, the general practice of science hasn't changed in hundreds of years.
Everything from the way projects are funded to the way results are publicized is based on a model that's been around since the time of Newton. There are some notable exceptions, but the majority of institutions are structured in a way that worked in the past but now hinders scientific progress. I speak mainly from the perspective of the biological sciences, but I think the following antiquated practices affect most fields.
Submitting papers to peer-reviewed journals. Journals used to be the perfect solution for disseminating scientific findings. Because publications had to be distributed on paper, authors were required to compete for the limited space available each week, month, or quarter. The peer-review process ensured that all the studies appearing in a given journal satisfied its exacting standards. It was a completely sensible solution to a very real problem. Now, in the age of the Internet, this model is obsolete. Space for hosting publications is virtually unlimited. Sites like Quora demonstrate that crowdsourced peer review can be effortless and elegant. And the stipulation that every manuscript be a self-contained, immutable entity now seems downright ludicrous. Although things will likely change in my lifetime, it's frustrating that I still have to deal with the politics of publication.
Funding individuals rather than projects. The idea of the scientist as a lone thinker—experiencing "Eureka!" moments in the midst of heavy contemplation—still dominates the public imagination. But I don't think solitary science is a viable way forward. This is partly inspired by the optimistic view that the Web will make collaboration easier, encouraging scientists to engage in projects that span multiple laboratories. But it also stems from the pessimistic view that all the easy problems have been solved. Perhaps scientists have been hacking away in isolation for so long that they've exhausted all the fundamental laws of nature that can be discovered this way. It's possible that the next big breakthrough will necessarily come through massive collaboration. The "ideal" scientists may no longer be Darwin with his notebooks or Pasteur with his flasks, but rather the nameless workers at NASA or the Large Hadron Collider who make progress by pooling their mental and monetary resources.
Entangling labs with universities. At one time, it made sense that the creation of new knowledge and the teaching of old knowledge took place at the same institutions. Universities furnished the money, the books, and an endless supply of malleable minds, while famous professors bestowed prestige upon their gracious hosts. But now we've reached a point where neither party is upholding their end of the bargain. Researchers can no longer depend on universities for money, and they must look to the government to fund their science. Cutthroat competition for funding leads scientists to neglect their teaching duties; few and far between are the professors who value their courses as much as their publications. Once again, the Internet makes it much more efficient to do away with the old model. If science moved to dedicated research institutes and lecture courses moved online, everyone would benefit.
Training scientists through apprenticeships of indeterminate length. This is the thing that frustrates me the most as a current Ph.D. student. Yes, it's true that the path to any prestigious position will have a high rate of attrition. But it's not clear that all those who make it through the Ph.D. process are well-prepared to be professors. And those who fail are usually too disgruntled to stay in science. All their enthusiasm for the subject they once loved so dearly has been sucked out of them, and they leave to pursue more lucrative undertakings. Ph.D. programs could be improved by making expectations more realistic and especially by promoting respectable endpoints that involve staying in science but not becoming a professor. The system made sense when available professorships were abundant relative to the number of incoming students, but it's almost criminal in today's environment.
Taken together, these practices have the effect of discouraging risk-taking, which is exactly the opposite of what we'd like to do to scientists. The all-encompassing importance of publications makes it necessary to tackle projects that have a high likelihood of success. Negative results—despite the crucial role they play in moving science forward—rarely make it past peer review. Funding agencies also favor proposals that promise incremental progress. While this has probably prevented a few crackpots from receiving government money, I'm sure it's also stopped many able-minded individuals from pursuing radical, potentially game-changing ideas. Meanwhile, the university system bogs down professors with teaching duties, committee meetings, and other responsibilities that are tangential to their research. Tenure may give them job security, but they're still stuck in a cycle of writing papers to support their grant proposals. It's no wonder that so many visionary projects get shelved in favor of safer undertakings. On top of all this, graduate students are getting pulled into a system where growing demoralized and leaving entirely is the rule rather than the exception.
Clearly something is wrong with today's science practices, and with scientists being as rational as they are, almost everyone agrees. So why isn't anything being done about it? A quotation from Machiavelli, written in 1513, sums it up perfectly:
It ought to be remembered that there is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things. Because the innovator has for enemies all those who have done well under the old conditions, and lukewarm defenders in those who may do well under the new.
Basically, scientists are aware of the problems, but nobody wants to take the reins and guide the system in a new direction. The people best situated to bring about change are also those with the least to gain from it. And the young scientists (like me), who hope for a better future, would have to put their careers on the line for a cause that might not succeed.
It's hard to know what to do, especially when I'm thriving in my current position. I hope that once my job outlook is more secure, I'll be able to fight to implement new policies. But at that point I'll have others who depend on me, and they might not be so keen to step outside the comfort zone of incremental progress. It's easy to come up with new models for how things should work, but maybe I should take advantage of the one we have now. In the scheme of things, it could be much, much worse.
Furthermore, the "solutions" to the above problems could easily have unforeseen consequences that end up changing things for the worse. For example, the current system of journal publications may be somewhat arbitrary, but at least it makes your goals concrete: publish in Nature or Science in order to get a job. If this model is scrapped in favor of something "fairer," it may well move the arbitrary judgments to another step in the process. Likewise, changing the funding system to promote visionary research may lead to cries for more of the incremental projects that have become unfashionable.
My general feeling is that we should take steps to update science for the 21st century, but we shouldn't complain too much about the current state of the system. In fact, we should actually be doing the opposite. There are frustrating things about every industry, but there's only one that gives you such incredible freedom to probe the workings of the universe. While we should take note of the ways in which the practice of science has become outdated, we shouldn't get too distracted. There's research to be done!
* * *
Answer by Colin Gerber, Parkinson's researcher:
How much luck ends up being involved in your experiments, and the fact that negative results are almost impossible to publish.
You can design a great experiment and perform it very well, and after a year or two of working on it, you can still end up with nonsignificant results. Unfortunately, now you have two years of work that you cannot publish because journals rarely accept nonsignificant results.
This bothers me a lot. Because negative experiments (ones with nonsignificant results) go unpublished, different scientists will repeatedly run the same experiment, never knowing that someone else has already done it and found that the results were not significant. Sometimes knowing that something is not involved can be just as important as knowing that something is.