This July 16th will mark the fiftieth anniversary of the launch of the Apollo 11 spaceflight that landed the first two humans on the moon. The computers aboard the command and lunar modules that assisted in the historic and successful lunar landing were revolutionary for their time, yet they would be considered primitive compared to today’s smartphones. Beyond computing power, the ubiquity of modern computing devices has transformed nearly every aspect of business and society: how we shop, communicate, define privacy, and conduct our work, to name a few. This pervasive impact of computing is perhaps why record numbers of students want to study computer science. As a result, enrollments are skyrocketing at universities nationwide. Indeed, computer science is now the most popular major at many top universities. Unfortunately, there are not enough trained faculty to adequately prepare this new wave of computer scientists. This shortfall is concerning because it limits the ability of our modern workforce to innovate and secure vital systems. Doing nothing will contribute to the United States falling behind in key areas of computing such as artificial intelligence, machine learning, and cybersecurity, all of which impact technological and economic competitiveness, as well as national security.
Soaring enrollments are creating many challenges, including overcrowded classes, long course waiting lists, and overburdened computer science faculty. The gap between student enrollment and available computer science faculty is exacerbated by the abundance of employment opportunities outside of academia; many potential faculty members are taking high-paying positions in the private sector. A 2017 survey found that computer science major enrollments have more than tripled since 2006 and more than doubled since 2011. According to a recent study conducted by the National Research Council (NRC), there is no indication that the enrollment boom will end anytime soon. The NRC study captures well the challenge to universities, the future of computing, and society more generally, stating that “the field of computer science (CS) is currently experiencing a surge…which is straining program resources at many institutions and causing concern…about how best to respond to the rapidly growing demand. There is also significant interest about what this growth will mean for the future of CS programs, the role of computer science in academic institutions, the field as a whole, and U.S. society more broadly.” Finding solutions to this challenge is critical, as the students entering the field of computer science today will fuel tomorrow’s technology innovation.
Many universities have begun taking innovative steps that might serve as a roadmap for others. The Stanford Computer Science department, for instance, recently created a graduate certificate program in computer science education for individuals who already hold a doctorate in another field. The purpose is to cross-train experts from outside computer science with the hope that they will teach undergraduate computer science courses. The program requires applicants to demonstrate a track record of excellence in teaching, and this model of cross-training could be adopted by other colleges and universities to help ease the tension between enrollment and faculty resources.
Another option concerns partnerships between industry and academia that allow a researcher to have a foot in both camps. In multiple instances, computer science faculty members who would otherwise be teaching the next generation of innovators in computer science have been tapped to lead or conduct research at corporations like Facebook and Uber. A number of universities encourage dual appointments in which a trained computer science professional can productively split time between a university and a corporation. This sort of relationship is a win-win: the corporation helps close the university’s computer science teaching gap while building a strong relationship with a local institution, and each side gains access to specialized expertise not available in its own organization. Beyond helping fill the teaching gap, computer science departments benefit from these partnerships through access to up-to-date information about workforce needs and what graduates must know for future employment. Encouraging these university-corporate relationships may be the key to ensuring that the nation’s best innovators have ample opportunity to inspire and mentor the next generation of scientists.
Facing its own version of the digital skills gap, the British government has launched the Institute of Coding, an ambitious and important consortium of over sixty universities, businesses, and industry experts designed to enhance computing skills in the United Kingdom. Significantly, the government is investing £20 million in the consortium, with an additional £20 million in matching funding to come from industry partners. Highlights of this comprehensive consortium include a focus on undergraduate and master’s education for university students and at-work learners, as well as on previously under-supported groups across the United Kingdom. Additional areas of emphasis include enhancing equality and diversity in technology education, re-training workers in new digital technologies, and conducting research to anticipate future skills gaps. Industry partners in the consortium include large technology firms such as IBM, Cisco, BT, and Microsoft, as well as a collection of small and medium businesses.
Like this British initiative, a number of U.S. universities have also begun investing in digital technologies, such as online delivery and automated assessment, to reduce the instructional burden on existing faculty. The most popular approach by far is hybrid (sometimes referred to as “blended”) online content delivery, whereby university faculty record lectures and graduate students or adjunct instructors handle face-to-face sessions. While this approach scales reasonably well, numerous questions remain about the quality of instruction, student accountability, and student retention. Moreover, the upfront costs of building and maintaining online delivery systems can limit universities’ ability to implement such programs. While these programs expand the reach of existing instructors, the systems cannot be scaled indefinitely without curtailing faculty-student interactions such as organic discussion, detailed grading feedback, and personalized mentoring. Other technology solutions have also been proposed, such as Massive Open Online Courses (MOOCs). However, a recent study in Science summarizes the significant limitations of these courses, including poor retention, low quality, and rampant cheating, all of which call into question their overall benefit to either students or universities. In other words, technology will help to mitigate the effects of instructor shortages, but it should not be viewed as a panacea. More research is sorely needed to understand how automated delivery and assessment can be applied without a significant loss in quality. Such research should be incentivized and expanded rapidly, because the use of these technological methods is inevitable as the scalability problem worsens.
At the United Kingdom’s Institute of Coding, the use of digital technology is built around connecting industrial partners with academia, providing incentives for universities to scale up their course offerings and giving industrial partners more say in tailoring curricula to their hiring needs. By comparison, U.S. universities have largely focused on expanding existing curricula through digital offerings, and each must pursue industrial partnerships at its own discretion. There is much the United States can learn from this new British initiative, and its progress and success will be important to monitor.
We must have a workforce that can innovate, create, and defend in the digital age. It is imperative that we think through and take action to overcome the instructor shortages that may limit the number of students able to achieve a high-quality computer science education. The transformation we are witnessing today will continue, and considering the rapid advances taking place in artificial intelligence and machine learning, the possibilities are both exciting and sobering. Those possibilities will, at least in part, be a function of the number of students studying computer science and the quality of their instruction. These students will join the workforce that designs systems, creates new algorithms, develops and tests software, and works to make systems as secure as possible against malicious attackers, among many other things. Ensuring an ample and sustainable supply of well-trained computer science students has never been more important. The challenges are formidable, but the alternative is simply no longer an option.
Frederick R. Chang is Professor and Chair of the Department of Computer Science at Southern Methodist University, and the Bobby B. Lyle Centennial Distinguished Chair in Cyber Security. He is the former Director of Research at the National Security Agency and is a member of the National Academy of Engineering.
Eric C. Larson is an Assistant Professor in the Department of Computer Science at Southern Methodist University. He is active in machine learning education for computer scientists and is an active member of the ACM.
Mark E. Fontenot is a Clinical Professor and Assistant Chair of Undergraduate Programs in the Department of Computer Science at Southern Methodist University. He teaches courses at all levels of the undergraduate computer science curriculum.