College Week

America’s Top University

Does college need to be reformed?

A recent survey of “the world’s top universities” by Jiao Tong University in Shanghai reports that 17 of the top 20 institutions are in the United States, with Cambridge (No. 3), Oxford (No. 8), and Tokyo (No. 14) the exceptions on the list. The rankings are largely based on quantifiable measures of research performance, mostly articles published in prestigious journals and internationally significant research awards, such as Nobel Prizes. Reviewing these results, a recent issue of the Economist concluded that our country has “almost a monopoly on the world’s best universities [and] provides access to higher education for the bulk of those who deserve it.”

It is indeed clear from such objective measures that we are very successful at supporting research and, because of its close connection to research, graduate education. But I am not so sure how to evaluate our success in undergraduate education. We are quite good at drawing a large number of highly diverse students into our undergraduate programs, but there is a growing chorus of complaints from parents and students—and some professors themselves—about excessively large classes, too many courses taught by grad students, and a lack of educational guidance for undergraduates. How shall we determine whether we are doing the right thing educationally for them?

The purposes of undergraduate education have changed fairly dramatically over the past century. Higher education in the 19th century was conducted primarily by private church-sponsored colleges, four-year institutions that provided a very traditional undergraduate curriculum based in the classical subjects, from mathematics to rhetoric, and geared toward the moral development of young men (and some women). The few private universities at the time were little more than colleges of the sort just described, and they coexisted with the emerging professional graduate schools, especially law schools. The counterpoint to this set of institutions was the advent of “land grant” universities under the Morrill Act of 1862. Embodying a strongly utilitarian notion of public higher education, the act established public universities dedicated to training in the practical arts, especially agriculture and engineering.

With the creation of Johns Hopkins University and the University of Chicago late in the 19th century, however, the German notion of the secular, more scientifically based, research-oriented university began to emerge. At the same time, the professionally organized, modern academic disciplines developed—economics, sociology, physics, chemistry, and the like. The disciplines were soon institutionalized into academic “departments,” which quickly became the organizing principle of higher education. These new universities were dedicated to research and Ph.D. training, a model soon adopted by both public and private universities.

The collegiate tradition remained strong, though it also became secular, especially in the rapidly growing number of so-called “liberal arts” colleges. That tradition also took root within the research universities. The older curricula were replaced by “the liberal arts”—the notion that undergraduates needed to be exposed to something like the full range of disciplinary knowledge. In the first half of the 20th century, the universities struggled to balance their commitment to the academic disciplines around which graduate education was organized with their historical commitment to liberal education. Undergraduate educators began to experiment both with “general education,” broadly interdisciplinary courses aimed at giving students a more sweeping perspective on their cultural heritage, and with distribution requirements (a smorgasbord, requiring students to choose one course from history, one from life sciences, etc.) to force breadth of knowledge upon students. Both colleges and universities also strove to add education for democratic citizenship to their agendas, in response to the patriotism of World War I and, especially, as progressive sentiments and ideology infused the public culture. Social science courses were instituted, for example, that used urban problems as sites for investigation and reflection on how local democracy should work.

By the middle of the 20th century, the agenda for undergraduate education was broad and growing. It soon had to cope with the tremendous expansion of college attendance after World War II when, thanks to the GI Bill, the democratization of higher education was under way. Curricula did not change much, although there was renewed interest in providing the sort of broad, interdisciplinary underclass (freshman-sophomore) courses that Columbia University had pioneered early in the century, and that the University of Chicago had championed in its “Great Books” approach before World War II. After the war, Harvard instituted a new general-education program for underclassmen enunciated in its “Red Book,” General Education in a Free Society. Many other schools followed suit, embracing the Red Book’s argument that the more socioeconomically diverse undergraduate student bodies needed the broad exposure to the intellectual currents of Western culture that had primarily been transmitted to the elite educated in private secondary schools.

General-education programs like Harvard’s were a response to the new demographics of higher education, but were built on a half-century-old tradition. Much bigger changes began to occur in the 1960s. To some extent they were the product of the political radicalization of students and younger faculty in response to the civil rights movement, the ongoing tensions of the Cold War, and the grinding pressures of the Vietnam War. The traditional hierarchical lines of authority in the university came under attack and students demanded empowerment, insisting on greater freedom to think and act for themselves. At the same time, and for some of the same reasons, new and frequently interdisciplinary fields of study began to appear, ranging from Afro-American and women’s studies to biophysics and neuroscience. Simultaneously, universities were compelled (by the same forces) to recognize that not all “civilization” was Western. One of the few beneficial effects of the Cold War, after all, was the emergence of awareness of the rest of the world—Africa, Asia, Latin America, the Pacific—in university curricula.

Knowledge in general was expanding at such a stupendous pace that it was hard to know any longer what ought to count as “coverage” in undergraduate education—or even whether “coverage” was a plausible goal in an information age dominated by identity politics. The assumptions of general education began to look naively blinkered, or tradition-bound, or at any rate like a hopelessly inadequate attempt to bring students into meaningful contact with the bewildering range of intellectual life. One widespread response to this sense of inadequacy was the development, at Harvard and elsewhere, of “core” programs. These were (and are) intended to divide the life of the mind into methodological rather than substantive categories: quantitative and historical reasoning rather than the great ideas of literature and philosophy or the history of science. In essence they teach undergraduates to use the same analytical categories as their instructors.

But if educators were unsure what to do for undergraduates, the implications for graduate education were clear enough: The drive to ever greater research-based specialization was on. Over the past two decades in particular, universities have further reorganized themselves to emphasize research, especially scientific research. This has meant adopting the superstar model of faculty recruitment (which generally includes an enticing package of high salaries, research funding, and reduced teaching). It has also meant the creation of research centers, stocked with graduate and postgraduate students, as sites often equal in importance to the disciplinary departments, and more important than departments for their capacity to attract external research funding. The rapidly growing research specialization of the university has had the effect of making the content of undergraduate majors themselves more and more specialized and research-based.

This has not happened everywhere, especially not in liberal arts colleges, which have mostly remained bastions of general education, focused as they are entirely on undergraduate students. But even among these colleges, there are many Harvard wannabes that demand high levels of research productivity from faculty members who used to be primarily teachers. These institutions also encourage the same sort of disciplinary specialization for students that has distorted the mentoring capacities of their teachers.

To be sure, the news is not all bad. Many of the best research scholars are also brilliant and dedicated teachers. The same can be said of many of the graduate students who increasingly instruct younger students. In addition, countless millions of dollars have been poured into improvements for libraries and other physical facilities, many of which are primarily for the use of undergraduates.

Still, I do not think we are doing all we can do to come to terms with either the intellectual or the structural difficulties that confront American undergraduate education in the 21st century. I’m dubious that the U.S. Education Department’s recent appointment of a commission on higher education to develop what Secretary of Education Margaret Spellings calls “a comprehensive national strategy” will offer much more than blue-ribbon-style pronouncements on the thorny financial problems facing a higher-education system that has become prohibitively expensive.

But there are promising signs of interest elsewhere. The Association of American Colleges and Universities has launched a decade-long initiative “to expand public and student understanding of what really matters in college—the kinds of learning that will truly empower them to succeed and make a difference in the 21st century.” Harvard University has begun a serious effort to re-imagine its core curriculum, and what Harvard does always has an impact on other educational institutions. The University of California has appointed a prestigious commission to re-examine undergraduate general education across all of its campuses. The fact of the matter is that our system of higher education is so diverse and complex, and the challenges are so many, that there are not going to be national or simple answers.

I see twin issues confronting us. The first is organizing undergraduate educational experiences in light of the breadth and complexity of contemporary knowledge across all fields. Can we sustain the ambition of the first half of the last century to cover or at least sample the great ideas of the (Western) world? What constitutes “general education” in a globalized world? Or do we need to reconceive the problem and require students to dig deeper and more imaginatively? How to do that? This is a difficult intellectual problem, but it is also a pedagogical problem. Do the techniques of teaching and learning that we have traditionally employed for undergraduates suffice in our new intellectual circumstances? What we have learned, I think, is that the most effective learning is active learning, that teaching must involve presenting students with problems to solve rather than merely lecturing about those problems. We need to ask whether we are getting the most out of technology for both teaching and learning, and how we can use information technology as a better handmaiden of active learning—for instance, by creating Web sites that permit students to research a wide variety of primary sources in order to create their own solutions to the sorts of problems that animate their courses.

The second issue is structural, and it particularly (but not solely) concerns universities. I suppose we are past the point of no return in restructuring the university as an organization based on research centers, and the recruitment of faculty almost entirely according to their aptitude for research. If so, what can we do within the university to utilize this reality for the benefit of undergraduate education? There is, for instance, widespread agreement on the importance of undergraduate research as an effective learning strategy. It has been highly successful, especially in the sciences. We are coming to believe that students in all fields must engage in collaborative learning experiences. How can those be better used in the humanities and social sciences? Is there anything to be done about reorienting the reward system in faculty recruitment, promotion, retention, and compensation to encourage more engagement with undergraduate students? Does “Mr. Chips” have to be a figure of fun in the contemporary university, or could he (or she) be a model to emulate?

I once carelessly said that if I had a magic wand I would know what to do in order to begin reforming undergraduate education. So, Slate has asked an assortment of academics—professors and a president, from large and small, public and private institutions—to answer the question: What would you do with the magic wand? Their answers will post over the course of the next three days.