Higher Learning in the Digital Age
First, Some Nostalgia
The Challenge of Change
The ITFRU Project
The Executive Leadership Core Workshops
Some Final Remarks
This event brings back many memories. It was precisely 15 years ago this week, in mid-October 1989, that I attended my first Educause (or rather, as it was then known, Educom) meeting. In fact, one of my first duties as the newly installed president of the University of Michigan was to host the Educom conference in Ann Arbor.
It was a memorable experience for another reason. As some of you may remember, it was during that meeting, on October 17, 1989, that the Loma Prieta earthquake hit the San Francisco Bay Area (during the World Series, in fact). We had many hundreds of guests from the Bay Area, and fortunately, using our networks, we were able to put them in touch with their families and friends.
This week’s meeting seems a bit more secure, at least geologically. However, when we look back over the past 15 years, many of the developments in digital technology over that period have been shaking the foundations of higher education, much like that earthquake. And that is my subject this morning.
My talk will be divided into three parts:
- First, I think it important to share with you some of my own background since, not only have my six decades on the planet roughly overlapped the existence of the electronic digital computer, but my discipline, nuclear science and engineering, drove much of the remarkable evolution of digital technology.
- Next, I will summarize a National Academies study of the past several years aimed at understanding the impact of information technology on the future of the university and share with you some of the conclusions of that study.
- However, most of my remarks will concern the recent activities of a follow-on effort known as the IT-Forum of the National Academies, aimed first at getting information technology squarely on the radar screens of academic leaders as one of the most critical strategic issues they must deal with, and then at helping them develop strategies appropriate for their institutions and for the higher education enterprise more broadly (including federal policy).
First, Some Nostalgia
Throughout my life I’ve been an insatiable consumer of digital technology. As a nuclear scientist, it was key to my research and teaching. Then later as a dean, provost, and president, it remained essential to my role as an academic leader.
Early in my career I worked on national projects such as the Rover Program at Los Alamos to develop nuclear rocket engines to power a manned mission to Mars, and later in Q-Division at Lawrence Livermore National Laboratory to explore laser-driven thermonuclear fusion. Since both phenomena are rather hard (and expensive) to study in a laboratory, we used the fastest computers in the world to simulate these nuclear systems. In fact, during the early 1970s, I found myself in hog heaven at LLNL with an allocation of one hour a day of CDC 7600 time for my calculations!
As the shift from big iron to minicomputers to microcomputers continued, my computing needs also shifted, from one of the first HP-35 calculators to a room full of Apple II computers (networked into our mainframes) that I used to teach one of the first microcomputer classes in the 1970s. In the early 1980s, we formed a relationship with Steve Jobs and at one point probably had the largest installation of Apple Lisa computers in the world! (We gave one to Chuck Vest as a memento, reconfigured as a planter, when he left Michigan for the presidency of MIT.)
Later in the 1980s, we recruited Doug Van Houweling from Carnegie Mellon to help us set up and manage NSFnet, a precursor to the Internet, joining with IBM and MCI to build and operate the backbone. It was just about a decade ago when one of my colleagues, Dan Atkins, dropped off a little piece of software he had picked up at the University of Illinois and asked me to give it a spin over the holiday season. It was Mosaic…and my days as a “Gopher” and “FTP-er” had come to an end. The World Wide Web captured the attention of the commercial sector, and the Internet (and the dot-com bubble) took off! Michigan stepped out of the way of the commercial juggernaut, but we have continued in Ann Arbor with important collaborative efforts such as the SAKAI project to develop middleware to support learning and scholarship, Internet2 to develop the next-generation internet, and several other important initiatives “soon to be announced”…
During the last years of leading the University of Michigan, we dreamed up an interesting experiment, known as the Media Union, a place where we could use information technology to bring together not only the disciplines but the various approaches to scholarship and learning, creativity and innovation. (In fact, this immense playpen for students and faculty now carries a new name: “The Dude”.)
But my real interests in recent years have concerned how digital technology is reshaping the university itself, and that brings me to my next subject: ITFRU!
The Challenge of Change
Most of you in this audience realize that higher education has entered a period of significant change as our universities attempt to respond to the challenges, opportunities, and responsibilities facing them in the new century. The forces driving change are many and varied:
- The globalization of commerce and culture,
- The advanced educational needs of citizens in a knowledge-driven global economy,
- The exponential growth of new knowledge and new disciplines,
- And the compressed timescales and nonlinear nature of the transfer of knowledge from campus laboratories into commercial products.
We are in a transition period where intellectual capital is replacing financial and physical capital as the key to prosperity and social well-being. In a very real sense, we are entering a new age, an age of knowledge, in which the key strategic resource necessary for prosperity has become knowledge itself, that is, educated people and their ideas.
Our rapid evolution into a knowledge-based, global society has been driven in part by the emergence of powerful new information technologies such as digital computers and communications networks.
- Modern digital technologies have vastly increased our capacity to know and to do things and to communicate and collaborate with others.
- They allow us to transmit information quickly and widely, linking distant places and diverse areas of endeavor in productive new ways.
- This technology allows us to form and sustain communities for work, play, and learning in ways unimaginable just a decade ago.
- It has broadened access to knowledge, learning, and scholarship to millions throughout the world.
Information technology changes the relationship between people and knowledge. And it is reshaping in profound ways knowledge-based institutions such as our colleges and universities.
Of course higher education has already experienced significant change driven by digital technology. Our management and administrative processes are heavily dependent upon this technology. Research and scholarship are also highly dependent upon information technology, for example, the use of computers to simulate physical phenomena, networks to link investigators in virtual laboratories or “collaboratories,” and digital libraries to provide scholars with access to knowledge resources. There is an increasing sense that new technology will also have a profound impact on teaching, freeing the classroom from the constraints of space and time and enriching learning by providing our students with access to original source materials.
Yet, while information technology has the capacity to enhance and enrich teaching and scholarship, it also poses certain threats to our colleges and universities.
- We can now use powerful computers and networks to deliver educational services to anyone, anyplace, anytime, no longer confined to the campus or the academic schedule.
- Technology is creating an open learning environment in which the student has evolved into an active learner and consumer of educational services.
- Faculty loyalty is shifting from campus communities and universities to scholarly communities distributed in cyberspace.
- The increasing demand for advanced education and research from a knowledge-driven society, the appearance of new for-profit competitors, and technological innovations are stimulating the growth of powerful market forces that could dramatically reshape the higher education enterprise.
In fact, some believe that the very future of the university, at least as we know it, is at risk:
Peter Drucker: “Thirty years from now the big university campuses will be relics. Universities won’t survive. It is as large a change as when we first got the printed book.”
William Wulf: “If you believe that an institution that has survived for a millennium cannot disappear in just a few decades, just ask yourself what has happened to the family farm.”
Frank Rhodes: “I wonder at times if we are not like the dinosaurs, looking up at the sky at the approaching comet and wondering whether it has an implication for our future.”
The ITFRU Project
It was just such concerns that stimulated the National Academies of the United States (i.e., the National Academy of Sciences, the National Academy of Engineering, the Institute of Medicine, and their umbrella research organization, the National Research Council) to launch a major project to understand better just how this technology was likely to affect the research university, a project that I have been chairing for the past three years.
The National Academies have a mandate to track the health of the nation’s scientific and technological capability, and the nation’s research universities represent a very significant component of that intellectual infrastructure.
The premise of the National Academies studies was a simple one: The rapid evolution of digital technology will present many challenges and opportunities to higher education in general and the research university in particular. Yet there was a sense that many of the most significant issues were neither well recognized nor well understood, either by the leaders of our universities or by those who support and depend upon their activities.
The first phase of the ITFRU Project (Information Technology and the Future of the Research University) was aimed at addressing three sets of issues:
- To identify those technologies likely to evolve in the near term (a decade or less) that might have a major impact on the research university.
- To examine the possible implications of these technology scenarios for the research university: its activities (teaching, research, service, and outreach); its organization, structure, management, and financing; and the impact on the broader higher education enterprise and the environment in which it functions.
- To determine what role, if any, there was for our federal government and other stakeholders in the development of policies, programs, and investments to protect the valuable role and contributions of the research university during this period of change.
The steering group for the effort was composed of leaders from higher education, the chief technology officers of major IT companies, and leaders in national science policy.
ITFRU Steering Group
James Duderstadt (Chair), President Emeritus, University of Michigan
Daniel Atkins, Professor of Information and Computer Science, University of Michigan
John Seely Brown, Chief Scientist, Xerox PARC
Marye Anne Fox, Chancellor, North Carolina State University
Ralph Gomory, President, Alfred P. Sloan Foundation
Nils Hasselmo, President, Association of American Universities
Paul Horn, Senior Vice President for Research, IBM
Shirley Ann Jackson, President, Rensselaer Polytechnic Institute
Frank Rhodes, President Emeritus, Cornell University
Marshall Smith, Professor of Education, Stanford; Program Officer, Hewlett Foundation
Lee Sproull, Professor of Business Administration, NYU
Doug Van Houweling, President and CEO, UCAID/Internet2
Robert Weisbuch, President, Woodrow Wilson National Fellowship Foundation
William Wulf, President, National Academy of Engineering
Joe B. Wyatt, Chancellor Emeritus, Vanderbilt University
Raymond E. Fornes (Study staff), Professor of Physics, North Carolina State University
Over two years the steering group met on numerous occasions to consider these issues, including site visits to major technology laboratories such as Bell Labs and IBM Research Labs, and drawing upon the expertise of the National Academy complex. At the end of this period, we assembled over 100 leaders from higher education, the IT industry, the federal government, and several private foundations for a two-day workshop at the National Academy of Sciences to focus our discussion. Beyond the insight brought by these participants, perhaps even more striking was their agreement on a number of key issues:
The first finding of the Academies’ steering committee was that the extraordinary pace of the IT evolution is likely not only to continue but could well accelerate.
In thinking about changes to the university, one must think about the technology that will be available in 10 or 20 years, technology that will be thousands of times more powerful as well as thousands of times cheaper. Put another way, over the next decade, we will evolve from “giga” technology (in terms of computer operations per second, storage, or data transmission rates) to “tera” and then to “peta” technology (one million billion, or 10^15). We will denominate the number of computer servers in the billions, digital sensors in the tens of billions, and software agents in the trillions. The number of people linked together by digital technology will grow from millions to billions. We will evolve from “e-commerce” and “e-government” and “e-learning” to “e-everything,” since digital devices will increasingly become our primary interfaces not only with our environment but with other people, groups, and social institutions.
The second finding of the committee, in the words of North Carolina State University chancellor Marye Anne Fox, was that the impact of IT on the university is likely to be “profound, rapid, and discontinuous,” affecting all of its activities (teaching, research, service), its organization (academic structure, faculty culture, financing, and management), and the broader higher education enterprise as it evolves toward a global knowledge and learning industry.
If change is gradual, there will be time to adapt gracefully, but that is not the history of disruptive technologies. As Clayton Christensen explains in The Innovator’s Dilemma, new technologies are at first inadequate to displace existing technology in existing applications, but they later explosively displace the application as they enable a new way of satisfying the underlying need.
Although it may be difficult to imagine today’s digital technology replacing human teachers, as the power of this technology continues to evolve 100- to 1000-fold each decade, the capacity to reproduce with high fidelity all aspects of human interaction at a distance could well eliminate the classroom and perhaps even the campus as the location of learning. Access to the accumulated knowledge of our civilization through digital libraries and networks, not to mention massive repositories of scientific data from remote instruments such as astronomical observatories or high-energy physics accelerators, is changing the nature of scholarship and collaboration in very fundamental ways. Each new generation of supercomputers extends our capacity to simulate physical reality to a higher level of accuracy, from global climate change to biological function at the molecular level.
The third finding of the committee suggests that although information technology will present many complex challenges and opportunities to universities, procrastination and inaction are the most dangerous courses to follow during a time of rapid technological change.
More specifically, we concluded that for the near term, meaning a decade or less, the university will continue to exist much in its present form, although meeting the challenge of emerging competitors in the marketplace will demand significant changes in how we teach, how we conduct scholarship, and how our institutions are financed.
Because of the profound yet unpredictable impact of this technology, we urged universities to adopt strategies that included:
- The opportunity for experimentation,
- The formation of alliances both with other academic institutions and with for-profit and government organizations, and
- The development of sufficient in-house expertise among the faculty and staff to track technological trends and assess various courses of action.
The first phase of this study, its conclusions, and its recommendations were published in a report, Preparing for the Revolution, available both online and in hard copy from the National Academy Press.
More recently, the National Academies have extended this effort to involve directly a large number of research universities by
- Creating a National Academy Forum on Information Technology and Research Universities (“the IT-Forum”) to track the technology and identify the key issues,
- Conducting a series of workshops for university presidents and chief academic officers in an effort to help them understand better the transformational nature of these technologies and the importance of developing strategic visions for the future of their institutions, and
- Raising the awareness of research sponsors such as nonprofit foundations and government agencies as to the potential of these technologies for engaging research universities to better address national and global priorities.
These events revealed not only a broad interest in and awareness of the importance of these issues, but a willingness to explore new paradigms such as national consortia, open-source projects, and knowledge commons. It was our sense that the leadership of U.S. research universities is prepared to undertake major efforts and consider very substantial changes (in organization, function, and culture) to respond to the opportunities and challenges posed by information technology.
IT Forum Membership
James Duderstadt, President Emeritus, University of Michigan (Forum chair)
Daniel Atkins, Professor, School of Information, University of Michigan
John Seely Brown, Chief Scientist, Xerox Corp.
Jared Cohon, President, Carnegie Mellon University
Stuart Feldman, Vice President, Internet Technology, IBM
Nils Hasselmo, President, Association of American Universities
Brian Hawkins, President, EDUCAUSE
Shirley Ann Jackson, President, Rensselaer Polytechnic Institute
Sidney Karin, Professor of Computer Science and Engineering, University of California, San Diego
Kevin Kelly, Editor-at-Large, Wired
Shirley Strum Kenny, President, Stony Brook University
Susanne Lohmann, Director, Center for Governance, University of California, Los Angeles
Anne Margulies, Executive Director, OpenCourseWare, Massachusetts Institute of Technology
Michael McRobbie, Chief Information Officer, Indiana University
Diana Oblinger, Vice President, EDUCAUSE
James O'Donnell, Provost, Georgetown University
Marshall Smith, Professor, School of Education, Stanford University, and Program Officer for Education, Hewlett Foundation
Lee Sproull, Professor, Stern School of Management, New York University
Doug Van Houweling, CEO, University Corporation for Advanced Internet Development
Robert Weisbuch, President, Woodrow Wilson National Fellowship Foundation
Wm. A. Wulf, President, National Academy of Engineering (Program chair)
Over the past two years the IT Forum’s activities have included
- 2-22-03: A kickoff meeting at the National Academies (Washington)
- 4-15-03: Summit Meeting of the AAU Presidents (Washington)
- 9-5-03: IT Forum on Learning at Carnegie Mellon (Pittsburgh)
- 9-9-03: Summit Meeting of the AAU Provosts (California)
- 10-29-03: Tutorial for NSF Leadership (Washington)
- 3-11-04: IT Forum on Virtual Worlds at Institute for Creative Technologies (LA)
- 3-15-04: NSF Cyberinfrastructure Program
- 9-1-04: Executive Leadership Core Workshop (MIT-CMU-Cornell) (Cambridge)
- 11-12-04: IT Forum on Cyberinfrastructure (Ann Arbor)
- 1-10-05: Executive Leadership Core Workshop (Austin)
- Executive Leadership Core Workshops (Chapel Hill, California, Chicago)
The AAU Presidents’ Workshop (April 15, 2003)
We first targeted the presidents of the nation’s leading research universities, namely those members of the Association of American Universities, by asking them to stay for a daylong workshop following their annual meeting last spring.
Heeding the old adage that to get a mule to move, you first need to whack it over the head with a 2x4, we asked Lou Gerstner, former CEO of IBM, to kick off the meeting the evening before with a dinner address, describing how he had transformed IBM. Gerstner made two key points that quickly gained the presidents’ attention. First, he noted that when he arrived at IBM during the early 1990s, IBM’s stock value was plummeting and serious consideration was being given to breaking the company up; despite the fact that IBM was developing much of this technology, the company really didn’t understand the implications of its disruptive character for its own business. Second, he argued that technology strategies require the attention of the very highest levels of an organization’s leadership: simply delegating this assignment to others such as CIOs or CFOs puts the organization at great risk.
With full awareness that university presidents listen most carefully to their own voices, we structured the workshop the next day into panels of presidents. First, we asked several presidents to discuss what was currently in their in-boxes, the here-and-now issues. As you can imagine, these included concerns such as how they could meet the seemingly insatiable demand for computing resources (particularly bandwidth), how they could pay for this technology, and how they could handle privacy and security issues. You will also not be surprised that most of the presidents believed that they had these issues well in hand (a perception quite different from what we were to find with their provosts several months later).
We then tried to move the presidents somewhat further into the future by asking them to speculate about the technology challenges of the decade ahead. Here we stimulated the discussion by having members of the IT Forum toss occasional hand grenades into the conversation.
For example, Stu Feldman of IBM asked how the presidents would respond to the strong possibility that he would be able to hand them a device the size of a football (choosing an object particularly familiar to university presidents) that would contain the entire Library of Congress.
Dan Atkins (Michigan), coming off his recent experience as chair of the NSF Blue Ribbon Panel on Cyberinfrastructure, asked how the presidents believed faculty loyalty and mobility would be affected by the rapid emergence of knowledge nets, cyberspace-based environments for scientific collaboration that are clearly independent of space and time.
Bob Dynes (UC) observed that technology is moving so fast that there are vast differences between the seniors and the freshmen at his institution. The freshmen are completely wireless and communicate in very unexpected ways. If we enable students, they will drive us. He also noted that campus boundaries are less and less meaningful, which poses additional challenges.
Stu Feldman (IBM) raised two more important questions: (1) Is it possible to manage universities as unified enterprises, or will they always function as decentralized entities? And (2) will the university build its value proposition around the student (e.g., the University of Phoenix) or the professor? He noted the degree to which e-infrastructure could disintegrate, disaggregate, reintegrate, and reaggregate the functions and roles of a university. The real disruptive force is the marketplace, brought onto campuses by new technologies in a highly competitive and disruptive fashion. He questioned whether the current package of activities that has emerged as the U.S. research university would survive intact.
Bill Wulf (President of the National Academy of Engineering) noted that past predictions of the future social impacts of technological advance have been notably bad. They typically assume some version of the status quo, only faster, cheaper, or bigger, that is quickly blown apart by unanticipated developments.
After about an hour of this wide-ranging discussion, one of the presidents stood up and said: “OK. Now you have convinced me. This technology is creating a future that is so uncertain that I don’t have a clue how presidents can provide effective leadership. We need your help!”
Hence, we had managed to bring the group far along the classic stages of grief, from denial through bargaining to acceptance, and on to seeking help…
The AAU Provosts’ Workshop (September 9, 2003)
We had an opportunity to conduct a very similar workshop for the AAU provosts, following their September 2003 meeting in Newport Beach, California. (Several months later I engaged the provosts of the major public universities in a similar session at the annual NASULGC meeting.)
The provosts’ workshop was organized very much like the AAU presidents’ workshop: we first asked a panel of provosts to lay out the issues as they saw them at the moment, then moved the discussion to a longer-term perspective, and finally concluded with a discussion of next steps.
It is not surprising that many of the near term issues raised by the provosts were very similar to those raised by the presidents:
- Network and bandwidth management
- How do we pay for this technology?
- How do we protect security and privacy?
- Data management and preservation issues
We next tried to bump the discussion up a notch to look at longer-term issues such as:
- The digital generation (students and faculty)
- The emerging needs for cyberinfrastructure
- Competition vs. cooperation
- The instability of the current research university paradigm
- The survival of the research university (an issue that would have been hard to put on the table with the university presidents)
Perhaps not surprisingly, the provosts showed a far greater degree of sophistication in understanding and addressing these issues than the presidents had, since they were on the front line. But there was an even more significant difference: unlike the presidents, the provosts already recognized that these were very difficult issues and that they certainly didn’t have the answers. This was also an interesting contrast with a quite similar workshop on technology held five years earlier, when the provosts neither understood nor accepted the strategic nature of technology issues. Clearly these academic leaders have moved far beyond denial about the transformative nature of technology and are searching for effective strategies.
Some of the highlights of the discussion included:
There was a growing concern about the degree to which universities were being victimized by the effective monopolies created by providers such as PeopleSoft, Blackboard, and, of course, Microsoft. As one provost put it, universities act like deer paralyzed in the oncoming headlights, continuing to re-invent the wheel and getting devoured by the marketplace. The provosts were essentially unanimous in their belief that it was time for the universities to set aside their competitive instincts and to build consortia to develop together the technologies to support their instructional, research, and administrative needs through an open-source paradigm that would break the stranglehold of the current marketplace. Similar cooperation was needed in areas of cyberinfrastructure such as Internet2, the Open Knowledge Initiative, SAKAI, and the Open CourseWare effort.
Many provosts suspected that while the faculty believed they knew how their students learned, in reality they didn’t have a clue, particularly in technology-rich environments. (This was a theme we were to encounter again and again in our later workshops.) The provosts believed that their universities needed far more sophisticated help (perhaps through NSF-sponsored programs) to understand the learning and cognitive processes, although the provosts also recognized the disruptive nature of such studies, which might, over time, eliminate the rationale for the lecture-classroom paradigm.
IT-Forum Meeting at Carnegie Mellon University on “Cognition, Communication, and Communities” (September 5, 2003)
To learn more about how learning occurs in technology-intensive environments, we held the September meeting of the IT Forum at Carnegie Mellon, famous both as one of the nation’s most wired—and now wireless—campuses, and also for its great strength in the cognitive sciences.
As their faculty put it, their students these days are “electrified.” They are a transformative force, frequently forcing the CMU faculty to react to their learning activities. An example is the way students use this technology for communication. From instant messaging to e-mail to wikis to blogs, students are in continual communication with one another, forming groups or entire communities that are always interacting, even in classes (as any faculty member who has been “Googled” can attest).
A second example: a young professor of physics told us he had been forced to give up trying to “teach” difficult concepts in his classes. Instead he introduces a topic by pointing to several resources until a few students in the class figure out a way to teach themselves the concept. Then they teach their fellow students, and through peer-to-peer learning, the concepts propagate rapidly through the class.
As Kevin Kelly (Wired) put it, the CMU students are using instant messaging and Google to create their own learning environments. They will determine not only which learning technologies but also which learning methods work best. The faculty is reduced to catching up, formalizing what the students have developed.
In fact, many CMU faculty have now concluded that perhaps the best approach is to turn the kids loose, to let informal learning lead and shape formal learning in a way that responds to the great diversity in how students learn. Peer-to-peer learning is rapidly replacing faculty teaching as the dominant educational process on this technology-rich campus. There is not yet a consensus among the faculty as to where they are headed, but there is strong agreement that IT is changing the learning process in very fundamental ways. The students are forming learning communities on their own, using instant messaging, e-mail, and other IT-mediated communications technologies.
The traditional classroom paradigm is being challenged today, not so much by professors, who have by and large optimized their teaching effort and their time commitments to a lecture format, but by our students. Members of today’s digital generation of students have spent their early lives immersed in robust, visual, electronic media—Sesame Street, MTV, home computers, video games, cyberspace networks, MUDs and MOOs, and virtual reality.
Unlike those of us who were raised in an era of passive, broadcast media such as radio and television, today’s students expect—indeed, demand—interaction. They approach learning as a “plug-and-play” experience. They are unaccustomed and unwilling to learn sequentially—to read the manual. Instead they are inclined to plunge in and learn through participation and experimentation. Although this type of learning is far different from the sequential, pyramidal approach of the traditional college curriculum, it may be far more effective for this generation, particularly when provided through a media-rich environment.
John Seely Brown and his colleagues at Xerox PARC have studied the learning habits of the plug-and-play generation and identified several interesting characteristics of their learning process. First, today’s students like to do several things at once: they “multitask,” browsing websites and reading e-mail at a computer while listening to music or talking on a cellular phone. Although their attention span appears short as they jump from one activity to another, they appear to learn just as effectively as earlier generations.
Furthermore, it is clear that they have mastered a broader range of literacy skills, augmenting traditional verbal communication skills with visual images and hypertext links. They are particularly adept at navigating through complex arrays of information, acquiring the knowledge resources they seek and building sophisticated networks of learning resources.
To be sure, for a time, such students may tolerate the linear, sequential lecture paradigm of the traditional college curriculum. They still read what we assign, write the required term papers, and pass our exams. But this is decidedly not the way they learn. They learn in a highly nonlinear fashion, by skipping from beginning to end and then back again, by building peer groups of learners, and by developing sophisticated learning networks in cyberspace. In a very real sense, they build their own learning environments that enable interactive, collaborative learning, whether we recognize and accommodate this or not.
However, their tolerance for the traditional classroom and four-year curriculum model may not last long. Students will increasingly demand new learning paradigms better suited to their learning styles and more appropriate for preparing them for a lifetime of learning and change.
One can imagine the impact of millions of students from the digital generation as they seek the interactive, collaborative, and convenient learning experiences they have come to expect from other digital media. We should not underestimate the impact of the plug-and-play generation on the university.
After all, their use of digital technologies such as Napster and other peer-to-peer applications quickly overloaded our IT infrastructures and threatened the recording industry. Their use of the Net and other digital resources is already far more sophisticated than that of most faculty and staff. They will drive rapid and profound change in higher education, since through market forces they will demand that we adapt the university to their learning needs and characteristics.
The Institute for Creative Technologies, Marina del Rey
To understand this better, the spring meeting of the IT-Forum was held at the Institute for Creative Technologies in Marina del Rey. Here the University of Southern California is applying the entertainment and gaming technologies developed by Hollywood and others to create a “holodeck” used to train military officers in higher-level decision making. They have learned something that universities have yet to grasp: how technology can be used to create an emotional connection between knowledge and learning.
Some of the key themes of this discussion included:
Simulation-based learning: Play is, in reality (at least for animals), an evolved mechanism for learning. Emotion is necessary for sustained learning. Play evolved to solve the problem of learning a skill required for a situation that has not happened yet. To be sure, there are topics that require a more structured learning strategy. But remember, games teach not facts but rather how to build goals and strategies.
Virtual Worlds: The entertainment industry has figured out how to engage large numbers of people with quite primitive technology. We can talk all we want about this stuff as scholars, but the students are moving on. Some 30 percent of EverQuest players effectively “live” in Norrath, the game’s virtual world. We need to understand better what is going on here.
This technology is forcing us to rethink the nature of literacy: from literacy in the oral tradition, to the written word, to the images of film and then television, to the computer and multimedia. Of course there are many other forms of literacy: art, poetry, mathematics, science itself, and so on. But more significantly, the real transformation is from literacy as “read-only” (listening and viewing) to literacy as composition: first in rhetoric, then in writing, and now in multimedia.
From another perspective, our society increasingly values not just analysis but synthesis, enabled by the extraordinary tools of the digital age. Increasingly, we realize that learning occurs not simply through study and contemplation but through the active discovery and application of knowledge. From John Dewey to Jean Piaget to Seymour Papert, we have ample evidence that most students learn best through inquiry-based or “constructionist” learning.
As the ancient Chinese proverb suggests “I hear and I forget; I see and I remember; I do and I understand.” To which I might add, “I teach and I master!!!”
But herein lies a great challenge. While universities are experienced in teaching the skills of analysis, we have far less understanding of the intellectual activities associated with creativity. In fact, the current disciplinary culture of our campuses sometimes discriminates against those who are truly creative, those who do not fit well into our stereotypes of students and faculty.
The university may need to reorganize itself quite differently, stressing forms of pedagogy and extracurricular experiences to nurture and teach the art and skill of creation. This would probably imply a shift away from highly specialized disciplines and degree programs to programs placing more emphasis on integrating knowledge.
Perhaps it is time to integrate the educational mission of the university with the research and service activities of the faculty by ripping instruction out of the classroom–or at least the lecture hall–and placing it instead in the discovery environment of the laboratory or studio or the experiential environment of professional practice.
A Tutorial for the National Science Foundation
Last October we were invited to conduct a day-long “tutorial” for the leadership of the National Science Foundation concerning the potential impact of information technology on learning, broadly defined. We began with our concern that the changing learning needs of our society, and the disruptive nature of digital technology, may extend beyond the capacity of our existing learning infrastructure of schools, universities, training programs, and cultural institutions. Approaching the challenge by reforming existing institutions may not be sufficient. After all, “a butterfly is not simply a better caterpillar!” Instead perhaps it was time to explore entirely different types of learning organizations and ecologies.
It is important to realize that digital technology drives a shift in epistemology from “learning about” to “learning to be.” While traditional approaches to education focus on content, IT-based learning focuses on process, on being and doing. This is important, since it is likely that IT will have more impact on informal learning than on schools and curricula. Gaming provides an excellent example: the popular computer game Age of Empires is really a graduate course on the Middle Ages. Can digital technologies be used to link informal and formal learning? This could be a very rich and powerful linkage, particularly important for the motivated student. But what about other students who may not be motivated or cyber-literate? Rather than “linking,” perhaps the more appropriate focus should be “integrating.” After all, formal and informal learning or communication are more of a continuum than a dichotomy.
Technology has created a huge experience gap, determined more by the exponential pace of technology development than by age. The current generation of educators (and NSF program directors and proposal reviewers) is probably the wrong one to deal with these issues, since we really don’t understand the digital generation very well. Do we really know how young people learn in technology-intensive environments? Is NSF funding enough “observational” efforts simply to look, listen, and learn?
Furthermore, it is important to look more broadly at cyberinfrastructure—not just technology, but people, organizations, and policies. Here we were not recommending an “EDUnet” but rather a more holistic approach, a knowledge environment, a learning ecology that adapts, mutates, and evolves. Yet most educational institutions (schools, colleges, universities) are not true learning organizations. Neither is NSF, for that matter. Where are the experiments today that are as bold as those in the 1950s (UCSC, UCSD)? Probably in the for-profit sector (e.g., the University of Phoenix).
There is an urgent need to broaden the EHR portfolio far beyond its traditional programs, practices, and policies, all of which tend to constrain the directorate to funding the past rather than shaping the future. We need to put even bolder questions on the agenda. For example, why doesn’t NSF (EHR) launch major projects that attempt to explore the design of entirely new learning ecologies that begin with a “green-field” approach to determining the needs of citizens (and workforces) in a global, knowledge-driven economy, then build resource maps and conduct a gap analysis to see what is missing in our existing educational infrastructure, and finally develop technology roadmaps aimed at developing new educational resources? (This approach worked fairly well in post-cold war Eastern Europe to help people break out of old organizational structures and apply their skills and talents in new ways.)
Another example: Information technology is driving quite extraordinary change in higher education on both a national and global scale comparable to the restructuring of other economic sectors such as health care, financial services, transportation, and energy. What is NSF doing to understand and influence these changes in a way that protects the scientific capacity of the nation?
Throughout its half-century-long history, NSF has stepped up from time to time as an important change agent to address major national priorities. The partnership between the federal government and higher education articulated in Vannevar Bush’s Science, the Endless Frontier, created the American research university as we know it today. Much of the digital revolution in scientific research, education, and our broader society was stimulated by NSFnet and the resulting Internet. Today the human resource needs of the nation, an increasingly competitive global, knowledge-driven economy, and the challenge and promise presented by exponentially evolving digital technology present a new and compelling challenge to NSF to provide leadership and stimulate change in our nation’s learning enterprise.
We felt it important to stress the urgency of the human resource crisis facing the nation and the role that NSF-EHR could (should, indeed MUST) play in addressing this national priority. Much of the nation’s K-12 teaching cadre will turn over within the next 5 to 10 years. If substantial reform in teacher education and training is not accomplished soon, a generation will be lost (of both teachers AND students).
There is an urgent crisis in the availability of STEM human resources precipitated by the discontinuity in the flow of talented international students to the United States as a consequence of the concerns about homeland security and global attitudes toward America in the aftermath of 9/11 and Iraq. This is a crisis of monumental importance to high-tech industries (not to mention research universities) in this country, and it should be high on the list of NSF priorities. Is it? Does the current portfolio of EHR or NSF activities address such issues? If not, why?
New federal and state policies in testing and school accountability are driving a revolution at the K-12 level. This provides both a challenge and an opportunity to NSF: a challenge if teaching to the test dominates the student learning environment, and an opportunity if NSF were able to influence the testing and accountability process in STEM areas to enhance learning.
Finally, the human resource implications of a global, knowledge-driven economy are driving massive changes in workforce education and training needs that must be addressed at all levels of the educational enterprise: K-12, higher education, postgraduate, workplace, and lifelong learning. Again this poses both a challenge and an opportunity for NSF.
Clearly time is not on our side in addressing these multiple human resource challenges. The NSF needs to determine what it can accomplish in the near term with existing resources. But to do so, it needs to approach its current inventory of activities in a much more strategic and rigorous fashion and then make the necessary changes. It also must launch far bolder initiatives that anticipate a radically different future for learning and learning institutions.
In other words, NSF first needs to know what it knows. It then must transform itself into a learning institution capable of providing leadership, stimulating change, and responding to the needs of the nation.
Our most recent activities have been stimulated by an important study by the National Science Foundation Blue Ribbon Advisory Panel on Cyberinfrastructure, chaired by Dan Atkins. The Panel concluded that we are approaching an inflection point in the potential of rapidly evolving information and communications technology to transform how the scientific and engineering enterprise does knowledge work, the nature of the problems it undertakes, and the broadening of those able to participate in research and the related educational activities. To quote the concluding paragraph of its report:
“A new age has dawned in scientific and engineering research, pushed by continuing progress in computing, information, and communication technology, and pulled by the expanding complexity, scope, and scale of today’s challenges. The capacity of this technology has crossed thresholds that now make possible a comprehensive ‘cyberinfrastructure’ on which to build new types of scientific and engineering knowledge environments and organizations and to pursue research in new ways and with increased efficacy. Increasingly, new types of scientific organizations and support environments for science are essential, not optional, to the aspirations of research communities and to broadening participation in those communities. They can serve individuals, teams, and organizations in ways that revolutionize what they can do, how they do it, and who participates. This vision has profound broader implications for education, commerce, and social good.”
Clearly, cyberinfrastructure is not only reshaping but actually creating new paradigms for science and engineering research, training, and application. The availability of powerful new tools such as computer simulation, massive data repositories, ubiquitous sensor arrays, and high-bandwidth communication is allowing scientists and engineers to shift their intellectual activities from the routine analysis of data to the creativity and imagination needed to ask entirely new questions. New paradigms are evolving for the sharing of scientific knowledge, such as the open source movement (the Open Knowledge Initiative) and powerful search engines (Google). Globalization is a particularly important consequence of the new forms of scientific collaboration enabled by cyberinfrastructure, which is allowing scientific collaboration and investigation to become increasingly decoupled from traditional organizations (e.g., research universities and corporate R&D laboratories) as new communities for scholarly collaboration evolve.
Several more specific examples illustrate the importance of this project:
- The globalization of scientific activity, as new collaborations enabled by cyberinfrastructure compete with traditional organizations such as the research university for the loyalty and participation of scholars, and the evolution of global research communities increasingly independent of traditional institutions such as universities or industry.
- Newly emerging scientific communities that compete with and break apart the feudal hierarchy that has traditionally controlled scientific training (particularly doctoral and postdoctoral work), empowering young scholars and enabling greater access to scientific resources and opportunities for collaboration and engagement.
- The impact of cyberinfrastructure on the “culture” of scientific activities and institutions, e.g., publication, collaboration, competition, travel, and the ability of participants to assume multiple roles (master, learner, observer) (leader, learner, lurker) in various scholarly communities, the increasing importance of creativity relative to analysis as powerful new tools of investigation (e.g., simulation, massively pervasive sensor arrays) enabled by cyberinfrastructure appear.
- At its most abstract, the “university” is a community of masters and scholars (universitas magistrorum et scholarium), a school of universal learning that embraces every branch of knowledge and all possible means for making new investigations and thus advancing knowledge. These two characteristics, scholarly community and breadth of both intellectual topics and tools, have remained the core elements of the various forms taken by the university from medieval times (e.g., Paris and Bologna), through the Renaissance and Enlightenment, to today’s research universities. We already see these elements appearing in new forms enabled by cyberinfrastructure, e.g., global, domain-specific communities of scholars detached from traditional institutions such as universities, and exceptionally broad digital collections of knowledge such as digital libraries or the archives of search engines such as Google. Could these be the precursors of a new form of the university, essentially appearing spontaneously out of the vacuum state of the cyberspace enabled by cyberinfrastructure?
While promising significant new opportunities for scientific and engineering research and education, the digital revolution will also pose considerable challenges and drive profound transformations in existing organizations such as universities, national and corporate research laboratories, and funding agencies such as NSF. Here it is important to recognize that the implementation of such new technologies involves social and organizational issues as much as it does technology itself. Achieving the benefits of IT investments will require the co-evolution of technology, human behavior, and organizations.
There is a clear need to involve and stimulate as well those organizations that span disciplinary lines and integrate scholarship and learning. Perhaps the most important such organization is the research university, which despite the potential of new organizational structures will continue to be the primary institution for educating, developing, and sustaining the American scientific and engineering enterprise.
But universities will need help to understand, explore, and develop the cyberinfrastructure necessary to support their educational and scholarly activities. Indeed, without such assistance, federal efforts such as those at NSF are unlikely to achieve their potential, since the existing culture, structure, and function of the research university will likely resist, and possibly reject, new approaches that challenge the status quo.
There is a sense among many in the research university community that we will see a convergence and standardization of the cyberinfrastructure necessary for state-of-the-art research and learning over the next several years, built upon open source technologies, standards, and protocols, and that the research universities themselves will play a leadership role in creating these technologies, much as they have in the past.
The Executive Leadership Core Workshops
One of the important concerns voiced by the provosts was the difficulty in getting universities to recognize the strategic implications of rapidly evolving digital technologies as they reshape the most fundamental aspects of learning and scholarship. They believed that part of the challenge was getting the executive leadership core of the institution–the president, provost, CFO, CIO, director of libraries, key deans–on the same page, communicating with one another rather than simply dumping a diverse array of issues and demands on the CIO and saying, “Handle it!”
To this end they suggested that we conduct a series of roundtable workshops around the country, bringing together the executive leadership of several institutions in a facilitated roundtable discussion to compare notes on what they see as challenges and opportunities. The sense was that engaging in a candid and confidential discussion with peer institutions would force each of the participating teams to get their act together. They would learn from each other and perhaps develop the basis for further collaboration.
We were able to persuade Chuck Vest to host the first of these executive leadership core workshops at MIT in early September, and as peers we chose Carnegie Mellon and Cornell, since all three institutions have long been distinguished by leadership in IT (e.g., Project Athena at MIT, the Andrew Project at CMU, and the Theory Center and supercomputing at Cornell).
Facilitator: Bob Zemsky, Chair and CEO, The Learning Alliance
Carnegie Mellon University
Jared Cohon, President
Mark Kamlet, Provost
Joel Smith, Vice Provost/Chief Information Officer
Randal Bryant, Dean, School of Computer Science
Jeannette M. Wing, Head, Computer Science Department
Pradeep Khosla, Dean, Carnegie Institute of Technology
Cornell University
Jeffrey Lehman, President
Biddy Martin, Provost
Polley McClure, Vice President, Information Technologies
Robert Constable, Dean for Computing and Information Science
Sarah Thomas, University Librarian
Robert C. Richardson, Vice Provost for Research
Massachusetts Institute of Technology
Charles M. Vest, President
Robert A. Brown, Provost
John Curry, Executive Vice President
Jerrold Grochow, Vice President for Information Services and Technology
Jeffrey Schiller, Network Manager
Lorna Gibson, Professor of Materials Science and Engineering
M.S. Vijay Kumar, Assistant Provost and Director of Academic Computing
Pat Dreher, Deputy Chair, IT-SPARCC
Forum members
Jim Duderstadt, President Emeritus, University of Michigan (Forum chair)
Michael McRobbie, VP for IT and CIO, VP for Research, Indiana University
John Seely Brown, Former Chief Scientist, Xerox Corp.
We were fortunate in persuading Bob Zemsky, former provost at Penn and long-time leader of the Pew Higher Education Roundtables to serve as facilitator. Although we promised to keep the discussions confidential, there are some general observations I believe I can make.
We began with a dinner in the MIT Faculty Club the evening before the workshop, asking each president to lead a discussion of “What excites you?” and “What frightens you?” Here the key issue was the degree to which technological change was driving a dramatic change in student culture, one that increasingly distanced students from both the faculty and the traditional curriculum.
Instant messaging, blogs, wikis, Google, always-on-always-connected: these were allowing students to build quite sophisticated social and learning communities, relying more on peer-to-peer relationships for learning than on formal classroom experiences. Although some faculty were frustrated and perplexed (longing for a return to 110-baud modems and banning wireless networks from the classroom), others in the workshop believed that the multitasking, rapid context-switching, and community-building nature of today’s students (at least at MIT, CMU, and Cornell) could well be the best preparation for leadership roles in the very complex, fast-moving social situations characterizing 21st-century society.
The next day we launched into the workshop itself, working through the various elements of IT from applications in research, instruction, and administration to issues such as connectivity, bandwidth, security, and, of course, financing. Strategically, all three institutions felt we are in a time of chaos and rapid change. Hence it is important to pick what you can manage, and let the rest go. Pick your places, where you can see strategic opportunities for influence. The biggest threat is the frustration over constant change, both technological evolution and security. People don’t like to live with constant change.
One of the greatest challenges is managing expectations. Students and the faculty believe that “bandwidth should flow like water from a faucet.” Perhaps the best response is simply to give them the tools and let nature take its course (although there is still the question of how one pays for this).
What is the driver here? Students? Or the faculty? Or technology itself? Where is the faculty? Are they going to lead, follow, or just get out of the way…or perhaps LEARN!
How does one develop a strategy? Here the discussion broke into two groups:
- The Optimists: Just let it happen; we’ll be OK!
- The Pessimists: We need to guide the “revolution!” We need to get the faculty engaged in this as an intellectual challenge.
Perhaps the new game is to outsource the stable infrastructure and focus our resources on what is still evolving. Over which utilities does the institution need the most control? Bandwidth? Connectivity? Security? Clearly these, since they are the real challenges that could bring down the whole enterprise if not handled correctly. They must be solved to keep the revolution going.
There was a strong sense of confidence at this first workshop that while the revolution would continue, these institutions would remain in a leadership role. (One colleague mentioned the old proverb that one need not outrun the tiger, only one’s companion…)
However, there was also very little conversation about the major changes occurring in scholarship (not just research). There was little acknowledgment of the way that higher education is rapidly morphing into something far beyond conventional learning. Just look at the challenge faced by business CIOs, who must rebuild their infrastructure every 18 months while reducing IT costs (per unit of activity) by factors of 10 to stay competitive.
There is one final observation from this first workshop. The confidence (perhaps complacency) we noted is not simply a characteristic of leaders like MIT, CMU, and Cornell but of the AAU-class research universities as a group. It may not characterize other elements of the higher education enterprise that do not have the vast resources and history of leadership to keep the wolf from the door.
Whatever! It is clear that many institutions, including some of our leaders, are still not thinking deeply about the strategic implications of the extraordinary evolutionary pace of digital technology.
We will be conducting further executive leadership core workshops around the country, in Chapel Hill, Austin, Newport Beach, and San Francisco to name a few.
The next meeting of the IT Forum will be in November in Ann Arbor. We will concentrate on two issues:
- What can cognitive scientists tell us about how learning occurs in media-rich environments? (Some of you may have seen the NAS report “How People Learn,” which summarizes the extraordinary advances in neuroscience and cognitive science over the past couple of decades. We believe we need to understand more about this.)
- A deeper exploration of cyberinfrastructure: Note that CIOs are reaching a consensus on what the IT infrastructure for a research university should look like for the next five years or so (e.g., open source, Sakai, Internet2). The scaffolding for knowledge work is being built, and the research and CIO communities are converging.
The Most Difficult Question of All…
Those of you in this audience know the good news-bad news character of digital technology. We overestimate the impact in the near term, because we implicitly assume that the present will continue, simply at an accelerated pace, and fail to anticipate the disruptive technologies and killer apps that turn predictions topsy-turvy. Yet, we also know that far enough into the future, the exponential character of its evolution makes even the boldest predictions about digital technology come true.
This is the good news-bad news: This stuff is just as disruptive as we predicted it to be! (Good News) And this stuff is just as disruptive as we predicted it to be! (Bad News)
To this end, the IT Forum is beginning to shift its attention from exploring the question of “How to save the university” to consider instead “How will research and learning occur in the digital age?”
There is another reason for this shift in emphasis. While university presidents are sometimes reluctant to speculate about the longer-term future of their institutions, our workshops found the provosts somewhat less inhibited. In fact, our discussions with provosts frequently covered a very broad range of fundamental issues such as the mission, roles, values, and traditions of the university.
One of our IT Forum members, Susanne Lohmann, reminded the group that within a single generation after the Civil War period, American higher education changed essentially every one of its characteristics in a radical fashion:
- Evolving from the colonial colleges to the Humboldtian model of a research university.
- Empowering the faculty.
- Growing from institutions enrolling hundreds of students to institutions enrolling thousands.
- Through the Land-Grant acts, creating the new paradigm of the engaged public university.
- Adding research and service to the mission of education (or, in many of the colonial colleges, socialization).
Everything that could change, in fact, did change.
The consensus in several of our workshops has been that we are well along in a similar period of dramatic change in higher education. In fact, some of our colleagues were even willing to put on the table the most disturbing question of all: Will the university, at least as we know it today, even exist a generation from now? Disturbing, perhaps. But certainly a question deserving of very careful consideration, at least by those responsible for leading and governing our institutions.
So what might we anticipate as possible future forms of the university? The monastic character of the ivory tower is certainly lost forever. Although many important features of the campus environment suggest that most universities will continue to exist as a place, at least for the near term, digital technology is making it increasingly possible to emulate human interaction in all the senses with arbitrarily high fidelity, so perhaps we should not bind teaching and scholarship too tightly to buildings and grounds.

Certainly, both learning and scholarship will continue to depend heavily upon the existence of communities, since they are, after all, highly social enterprises. Yet as these communities become increasingly global in extent, detached from the constraints of space and time, we should not assume that the scholarly communities of our times will necessarily dictate the future of our universities.
Some Final Remarks
Although we feel confident that information technology will continue its rapid evolution for the foreseeable future, it is far more difficult to predict the impact of this technology on human behavior and upon social institutions such as the university. It is important that higher education develop mechanisms to sense the changes that are being driven by information technology and to understand where these forces may drive the university.
The impact of information technology on the university will likely be profound, rapid, and discontinuous—just as it has been and will continue to be for the economy, our society, and our social institutions (e.g., corporations, governments, and learning institutions). It will affect our activities (teaching, research, outreach), our organization (academic structure, faculty culture, financing and management), and the broader higher education enterprise as it evolves into a global knowledge and learning industry.
As information technology continues to evolve at its relentless, indeed, ever-accelerating pace, affecting every aspect of our society and our social institutions, organizations in every sector are grappling with the need to transform their basic philosophies and processes for collecting, synthesizing, managing, and controlling information. Corporations and governments are reorganizing in an effort to utilize technology to enhance productivity, improve quality, and control costs. Entire industries have been restructured to better align with the realities of the digital age.
Yet, to date, the university stands apart, almost unique in its determination to moor itself to past traditions and practices, to insist on performing its core activities such as teaching much as it has done in the past. Our limited use of technology thus far has been at the margins, to provide modest additional resources to classroom pedagogy or to attempt to extend the physical reach of our current classroom-centered, seat-time-based teaching paradigm. True, many of our faculty and students make extensive use of this technology in their work—and their play. But information technology is rarely considered a strategic issue for university leadership, at least at the level of other cosmic issues such as fund-raising, campus construction, and for all too many, intercollegiate athletics.
It is ironic indeed that the very institutions that have played such a profound role in developing the digital technology now reshaping our world are the most resistant to reshaping their activities to enable its effective use. For all the institutional inertia, there is considerable change underway in higher education. Yet, as one moves up the higher education hierarchy, from community colleges to regional universities to research universities, there is less and less activity, particularly at the level of research universities. While many AAU-class institutions are involved in interesting experiments such as Internet2, the Open Knowledge Initiative, NSF’s new cyberinfrastructure program, and Sakai, these are largely “hands off,” without broad faculty participation.
As a result, most research universities are simply not learning how to implement this technology. To some degree this has to do with their privileged position, at the top of the higher education food chain. It may also be due to their relative prosperity, which buffers them from the pressures of external forces such as technological change and the marketplace.
But sooner or later, a technology characterized by exponential growth will overcome all resistance. To use an often-exploited analogy, today's research universities may be like bathers sunning on the beach in the warm glow of the complacency engendered by past success, unaware that the gentle surf lulling them to sleep is the precursor of a 100-foot tsunami of technology-driven market forces beyond the horizon that could sweep over them before they can react or escape.
To repeat an earlier conclusion, for at least the near term, meaning a decade or less, we believe the university will continue to exist much in its present form, although meeting the challenge of emerging competitors in the marketplace will demand significant changes in how we teach, how we conduct scholarship, and how our institutions are financed. Universities must anticipate these forces, develop appropriate strategies, and make adequate investments if they are to prosper during this period.
Put another way, for the near term (meaning a decade or less), we anticipate that information technology will drive comprehensible if rapid, profound, and discontinuous change in the university. For the longer term (two decades and beyond), all bets are off. As we have noted, the implications of a million-fold increase in the power of information technology are difficult even to imagine, much less predict, for our world and even more so for our institutions.
To be sure, there are certain ancient values and traditions of the university that should be maintained and protected, such as academic freedom, a rational spirit of inquiry, and liberal learning. But, just as it has in earlier times, the university will have to transform itself once again to serve a radically changing world if it is to sustain these important values and roles.
Erich Bloch, Director of the National Science Foundation, testimony to Congress, 1988.
Clayton M. Christensen, The Innovator's Dilemma (Boston: Harvard Business School Press, 1997).
Preparing for the Revolution: Information Technology and the Future of the University (Washington, D.C.: National Academies Press, 2003), www.nap.edu.
Daniel E. Atkins (chair), Revolutionizing Science and Engineering Through Cyberinfrastructure: Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure (Washington, D.C.: National Science Foundation, 2003).