Chapter 5: Research

5-A       Gen-1 (1985–1990)

5-A(a)    Directed Research from Fundamentals Through Testbeds to Systems

Engineering Research Centers came on the scene as powerful forcing functions for research partnerships across sectors and disciplines, intended to expand the capacity of academic engineering research beyond the exploration of first principles to the advancement of emerging technology, thereby strengthening U.S. competitiveness. The prevailing scene in academe was one of disciplinary faculty working largely alone on research projects exploring first principles of their disciplines—a culture of “silos,” in which each discipline developed its own vocabulary and functioned in a reductionist mode rather than a synthesis mode. The incentive structure—grants for single investigators and rewards for solo achievements—reinforced this culture. Industry too had its silos—research separated from design, which was in turn separated from product development and manufacturing—although the ultimate goal, if not always the incentive structure, was synthesis to achieve cost-effective and innovative processes and products.

By the 1980s a culture change in both academe and industry was needed if the U.S. were to become competitive in the face of rising competition from Japan and Europe. As the NAE guidelines recommending the ERC Program pointed out:

  • Rapid advances in technology are driving engineering toward cross-disciplinary interactions…there is a growing need for engineering education that cuts across the engineering subdisciplines and applied science;
  • Technological advances are also leading toward integration among design, engineering, manufacturing and marketing…(and) a need for engineers with a broad understanding of the overall manufacturing systems.[1]

Synthesizing the recommendations in those guidelines, the goals for the ERC Program that would bring about that needed culture were:

  • Develop fundamental knowledge in fields that will strengthen U.S. competitiveness;
  • Increase the proportion of engineering faculty committed to cross-disciplinary teams;
  • Focus on engineering systems and increase competence in new fields needed by industry;
  • Increase the number of engineering graduates who can contribute innovatively to U.S. productivity;
  • Include practicing industrial engineers as partners to stimulate technology transfer; and
  • Join research and education.[2]

The ERC Program was a mandate for a new culture of engineering research—from first principles to engineering systems. The focus on engineering systems was an especially important component of ERCs, designed to impact the engineering research culture and produce an engineering workforce better able to strengthen U.S. competitiveness because a systems view of research required integration and synthesis of disciplinary knowledge. As discussed in Chapter 1, U.S. engineering schools had become too theoretical and too analytical. Rigorous grounding in fundamentals is a crucial component of engineering education, but it has to be complemented by experiences that give an integrated picture of engineering in practice and of the relationship between design and synthesis needed to build and manufacture complex engineering systems. Industry also was losing sight of systems issues and diminishing its support for the long-term research needed to address them, as short-term profits and a near-term outlook began to drive industry in the 1980s.[3] The ERCs were designed to spearhead a change in academic and industrial cultures.

In response, the ERC Program team developed ERC research features to specify how this culture change was to come about through the research component of ERCs. Initially these were:

  • Provide research opportunities to develop fundamental knowledge in areas critical to U.S. competitiveness in world markets. 
  • Focus on a major technological concern of both industrial and national importance.
  • Involve a cross-disciplinary team effort, contributing more to the focus and goals of the Center than would occur with individually funded research projects. 
  • Emphasize the systems aspects of engineering and educate and train students in synthesizing, integrating, and managing engineering systems.
  • Provide experimental capabilities not available to individual investigators.
  • Include the participation of engineers and scientists from industrial organizations in order to focus the research on current and projected industry needs and enhance understanding of systems aspects of engineering.[4]

Taken together, these features represented a radical approach to research—i.e., directing the research to achieve a desired next-generation engineering system, rather than waiting for those advances to emerge “spontaneously” from basic research. This approach required academic engineers to work in close research partnerships with industry engineers, as the academic culture lacked an understanding of how to advance technology and products and of how to integrate and synthesize knowledge in real time to develop and manage engineering systems. In addition, at the time there was a lack of synergy among research, design, product development, and manufacturing in industry. Both sectors stood to gain significantly by building bridges across the academic/industrial divide as it then existed. The participation of undergraduate and graduate students in this new research culture would achieve the desired integration of research and education and prepare a new generation of engineers better able to “hit the ground running” as they entered careers in industry and, also, better able to spearhead a broader culture change in academic engineering as they entered academic careers.

To address this mandate, 21 Gen-1 ERCs were awarded in the start-up period for the ERC Program, with 6 awarded in the first Class of 1985. The Gen-1 ERC research programs had the systems goals described in the file “Gen-1 Systems Goals.”

Given these centers’ complex systems goals, the challenge at NSF was to help the ERCs develop new ways to manage the cross-disciplinary research programs needed to address these goals. Through strategic oversight of the ERCs in partnership with industry, the NSF ERC team began to craft a new approach to managing academic research with complex systems goals. Steven Currall characterized the approach as “Engineering Innovation” in his study of ERC strategic planning in 2004, and as “Organized Innovation” in his 2014 book based on the ERC Program.[5],[6] During the Gen-1 period, the ERC Program was learning how to put in place a new “framework or systematic method of leading the translation of scientific discoveries into societal benefit through technology and commercialization.” The ERC team put in place “conditions for technology breakthroughs that would lead to new products, companies and industries.” Currall characterizes organized innovation as consisting of three pillars: channeled curiosity, boundary-breaking collaboration, and orchestrated commercialization.[7] That outcome would take some time to achieve, and this chapter will explore how “organized innovation” came about through experimentation with new concepts in research management and organization.

5-A(b)    Building Cross-Disciplinary Research Platforms

The first challenge was that each of these centers would have to take the time to build a collaborative research “space” where faculty from different disciplines could bring their skills and perspectives together to address the systems goals of their center. In the 1980s, support for joining disciplines in research was rare, and faculty were not encouraged to form cross-disciplinary teams that would enable them to collaborate and bring their insights to bear on advancing technology or addressing societal problems. The ERC Program broke ground for building these new spaces, setting an example for other programs in NSF, across the government, and around the world to follow, and eventually stimulating a reorganization of the academic culture that would play out over decades. Some new fields, such as biological engineering and neuromorphic engineering, were generated as a consequence of these investments. However, for the ERCs to be successful, the faculty had to be sufficiently motivated by shared goals to establish these collaborative spaces between their disciplines. It also took time and mutual respect for them to learn enough across fields to be able to communicate and collaborate. Only then could they bring their skills and knowledge to bear on challenging new problems that could not be solved without that collaboration and convergence of epistemologies. They were also learning to create research spaces that blended basic and applied research.

The following is a summary of the complex cross-disciplinary research spaces built by the Gen-1 ERCs.

BIOLOGICAL AND BIOMEDICAL ENGINEERING (Joining Engineering, Biology, and Medicine)

The following Gen-1 ERCs laid down new cross-disciplinary platforms that joined engineering, biology, and medicine and formed the basis for the start of new disciplines:

  • Bioprocess Engineering Research Center at MIT joined biochemical engineering, chemical engineering, and molecular biology—forming the basis for the start of the new discipline of biological engineering.
  • ERC for Emerging Cardiovascular Technologies at Duke University joined biomedical engineering, electrical and optical engineering, mechanical engineering, computer visualization and simulation, chemistry, cardiology, and physiology.
  • Biofilm ERC at Montana State joined chemical engineering, electrical engineering, microscopy, and microbiology—forming the basis for the start of the new discipline of biofilm engineering.


The following Gen-1 ERCs joined disciplines in research that led to the integration of structural engineering with sensing systems, of chemical with environmental engineering, and of structural engineering with materials engineering and ocean science.

  • Center for Advanced Technology for Large Structural Systems at Lehigh joined structural engineering, material science and engineering, electrical engineering, manufacturing engineering, robotics, and computer science.
  • Hazardous Substance Control ERC at UCLA joined chemical engineering, civil engineering, manufacturing engineering, and plant and animal sciences.
  • Off-shore Technology Research Center at Texas A&M/UT Austin joined materials engineering and science, ocean engineering and science, and structural engineering.


The broad array of ERCs focused on manufacturing systems and design joined several disciplines to bring about new paradigms in manufacturing, integrating industrial processes with information technology.

  • Advanced Combustion ERC at Brigham Young University and the University of Utah joined chemical engineering, process engineering, electrical engineering, environmental engineering, and computer-aided design.
  • Engineering Design Research Center at Carnegie Mellon University joined chemical engineering, civil engineering, computer science and robotics, electrical engineering, mechanical engineering, public policy, and management.
  • Center for Composites Manufacturing and Science at the University of Delaware and Rutgers University joined materials science and engineering with manufacturing.
  • ERC for Compound Semiconductor Microelectronics at the University of Illinois, Urbana-Champaign joined microelectronics, optoelectronics, and computer science.
  • ERC for Interfacial Engineering at the University of Minnesota joined chemical engineering, materials engineering and science, and atomic microscopy.
  • ERC for Net Shape Manufacturing at Ohio State University joined mechanical engineering and materials science and engineering.
  • Center for Intelligent Manufacturing Systems at Purdue University joined electrical engineering with manufacturing engineering and computer science.
  • ERC for Plasma-Aided Processing at the University of Wisconsin joined electrical engineering, chemical engineering, materials science and engineering, and nuclear engineering and physics.


Some of these ERCs laid the groundwork for the transfer of knowledge of optics from physics to engineering, and joined electrical engineering and computer engineering and computer science with materials science, chemistry, and physics to advance micro and optoelectronic technology.

  • Data Storage Systems Center at Carnegie Mellon University joined microelectronics, optoelectronics, and materials engineering and science.
  • Engineering Center for Telecommunications Research at Columbia joined microelectronics, optoelectronics, computer science, industrial engineering and operations research.[8]
  • ERC for Robotic Systems in Microelectronics at the University of California, Santa Barbara joined electrical engineering with robotics and manufacturing engineering.
  • ERC for Optoelectronic Computing Systems at the University of Colorado/Colorado State joined microelectronics, optoelectronics, materials science and engineering, computer science, chemistry, and physics.[9]
  • Systems Research Center at the University of Maryland and Harvard University joined systems engineering with electrical engineering, design automation, computer science, and information science.
  • Center for Computational Field Simulation at Mississippi State University joined electrical engineering, computational science, computer science and visualization.
  • ERC for Advanced Electronic Materials Processing at North Carolina State University joined electrical engineering, materials science and engineering, mechanical engineering, chemistry, and physics.

5-A(c)    The Challenges of Systems and Testbeds

Understanding that the systems goals would be the most challenging aspect of the ERC Program, in 1985 the Program funded the Steering Group for Systems Aspects through the Cross-Disciplinary Engineering Research Committee of the Commission on Engineering and Technical Systems of the National Research Council. An Engineering Systems workshop was held in 1986. At the start of their deliberations, the members of the Steering Group remarked that graduating engineers were “equipped with the in-depth knowledge to adapt to rapidly changing technologies. What has suffered, however, is the crucial orientation toward industrial practice and needs that traditionally helped ensure technological eminence for the United States. The focus on analytical solutions is valuable, but in some cases it has gone too far. Engineering graduates entering industry no longer have the same ‘feel’ for systems synthesis that they once possessed, and the emphasis on specialized tasks in industry has done little to strengthen that orientation among practicing engineers.”[10] 

The resulting culture emphasized theory and science, which were important, but the hands-on, product-and-practice orientation of academic engineering diminished. “A certain snobbishness appeared:  Those who preferred to think in terms of synthesis or design of products, rather than research, became in some vague way second-class citizens.”[11]  This culture was reinforced in industry through the large-scale industrial research and development laboratories that began to take on an academic “flavor” within a firm. Those who hired that type of engineer were impressed with the depth of knowledge but dismayed at how long it took for them to come up to speed. This engineering research and education culture was producing graduates who were “acquiring a notion that analysis itself—rather than the solution of engineering problems—is the focus of engineering work…but the existing curriculum tends not to impart an integrated picture of engineering, nor does it give a synthesis of complex, engineering systems. From the standpoint of industry needs, these are serious shortcomings.”[12]

“Industry had found that graduates from engineering schools were so immersed in the single-discipline focus of their professors that they were unfamiliar with technology and the integrative approaches needed to advance technological systems. New entrants had to be taught how to work in teams and how to depend upon the paradigms of other research disciplines needed to make incremental advances in production systems and develop new products.”[13]

These were the issues that drove the creation of the ERC Program and its focus on integrating disciplines to address engineering systems, from fundamentals to technology. The guidance to the ERC Program and the funded ERCs was to create a research and education culture that would address the shortcomings of the post-war academic engineering culture and build a new culture or “systems environment—going from narrow technical aspects of manufacturing to the broader techno-economic aspects, to the broadest techno-social concerns for national impacts.”[14]  The Committee recommended that the systems environment of an ERC would have the following characteristics:


  • Cross-disciplinary teams of engineers and scientists from separate disciplines should work toward the solution of engineering research problems that have a direct bearing on near- and longer-term needs of industry or society.
  • A systems approach should focus on development of generic processes and principles, rather than an optimized product alone.


  • Understanding of how systems are designed, manufactured, and supported in the field
  • Not extensive curricular changes, but an increased exposure to the practical application of existing course material to the synthesis of engineering systems, with no single correct answer
  • Hands-on experimentation and experience in systems design and development through exposure to industry personnel and methods of practice.


  • Interdepartmental approach to design and manufacturing as an integrated whole
  • Understanding of all elements of the systems environment
  • Ability to understand how the separate activities contribute to the solution to improve both product and process.[15]

Also in 1986, Erich Bloch asked the Office of Cross-Disciplinary Research staff, the home of the ERC Program, to support a workshop to explore the development of the emerging interdisciplinary field of management of technology. Bloch had been approached by Richie Herink, then the Program Director for Technology Management and Process Education at IBM, about the need to integrate the various disciplines that had been focusing on the issue of how to stimulate and manage technological innovation into a new field—i.e., Management of Technology. Lynn Preston and Fred Betz, an ERC PD who had managed an NSF program of Industry/University Cooperative Research projects, supported the workshop and worked with Herink to develop the steering committee and workshop agenda. The Management of Technology workshop, held in May 1986, brought together academic and industrial experts working in various aspects of the field.[16],[17]

The participants in the workshop agreed that in an “era of rapidly changing technology, a better understanding of the causes of inefficiencies in product development and better tools to manage technology development were needed. They recommended linking engineering, science, and management disciplines to address the planning, development, and implementation of technological capabilities to shape and accomplish the strategic and operational objectives of an organization.”[18]

Today the Management of Technology field is characterized by the image in Figure 5-1.[19] The Steering Group recommended that NSF support the development of this field because of the following catalysts for change in the 1980s:

  • The pace at which new product and process technology was generated had grown exponentially, creating rapidly changing sources of competitiveness; U.S. companies had to stay abreast of and lead these changes.
  • Product life cycles shortened significantly because of increasing engineering capability and consumers who more readily adapt to change.
  • International competitors were dramatically reducing product development times—Japanese automakers had a 3–4-year product development time, compared to 6 years for U.S. automakers.
  • These trends would force U.S. firms to adopt flexible equipment that could adapt to changing production needs and facilities that could manage integrated systems.[20] 

NSF did provide support for the development of the field. A new Program Director with experience in the field was hired and a program announcement was issued.

Figure 5-1:  Technology Management:  The Integration of Management, Analysis, and Engineering Skills (Credit: University of Bridgeport)

The Management of Technology and Engineering Systems workshops both had significant impacts on the management of the ERC Program. Because of their experience before the initiation of the ERC program, Preston and Betz understood that a passive approach to research management in ERCs would not enable these centers to achieve their systems goals and the envisioned changes in their research and education cultures. Given the complex systems goals of the ERCs and the implied mandate for technology management, they began to explore how effective the first two classes of ERCs were in organizing and directing research programs to achieve their systems goals.

The first task was to work with the funded ERCs to better define their engineering systems goals and to expand their horizons beyond those goals to reach for a vision for new technology systems. It quickly became apparent that if the ERCs were to achieve their complex systems visions, they would have to move beyond theory and modeling to synthesis and integration in “an experimental demonstration of a systems concept.” Some of the early Gen-1 ERCs, like the Center for Telecommunications Research (CTR) at Columbia, understood at the proposal stage that they would have to build integrative systems, or testbeds, on an academic scale in order to demonstrate their systems idea, a process that would reveal additional barriers to demonstrating functionality and provide a flexible testbed for future systems concepts.

CTR Director Mischa Schwartz told the audience at the symposium announcing the ERC Class of 1985 that his ERC was:

“…implementing a highly flexible network testbed called MAGNET. MAGNET is a local area network of our own design capable of supporting integrated services such as data, facsimile, graphics, voice, and video communications. Through proper software design it will also emulate, at higher levels, integrated networks of various types. Once completed it can be used to study integration of services on local area networks, as well as to provide a testbed for trying out new system concepts as they are developed.”[21] 

At the same symposium, Daniel I.C. Wang, the Director of the Bioprocess Engineering Research Center at MIT, also voiced a vision for new testbeds that would be needed to support the development of large-scale bioreactors capable of processing mammalian cell-based material and protein-secreting microorganisms through biosynthesis or biocatalysis. The research to support the testbeds would require the integration of knowledge from biology, engineering, and industry.[22]

During discussions at the 1986 ERC Annual Meeting between the ERC Program team and the ERC Directors and faculty about the need to develop testbeds, some ERC directors remained reluctant to develop testbeds because they were concerned that academic research should not produce “prototypes.” It was apparent that when the ERC Program pushed the academics into a research space between traditional academic culture and industry’s research culture, they became uneasy. The outcome of the dialogue was to define the ERC testbed as a research tool, not a product prototype, and over time testbeds took on a proof-of-concept role. By 1987, demonstrating systems concepts in testbeds became a requirement. (See Section 5-D(b) for several examples of large ERC-developed testbeds.)

By 1993, the end of the Gen-1 period, systems were defined as follows:

“Systems integrate components to serve a processing function or product need. Some of the engineering systems that are being explored (in Gen-1 ERCs) include a knowledge-based design modeling system for rapid prototyping; a wide-band optical telecommunications network that integrates a signal transmission control system with voice, data, and image presentation systems, a deep-ocean tension-leg platform for offshore recovery of oil in deep water; and next-generation magnetic or magneto-optic data storage systems that optimize head/media interface to achieve higher rates of data storage.”[23]

5-A(d)    Strategic Planning—Industry Guidance and Initial Attempts

As indicated in Chapter 3, reviews of the first two classes of ERCs found mixed capability to organize faculty—accustomed to working alone—to develop shared goals and work together to address them. Some ERCs had begun to build a strong base of research to address their engineering systems goals, and a new research culture was evolving that joined disciplinary capabilities to explore new barriers to technology. Some, however, continued to look like collections of single-investigator projects, with little or none of the synthesis needed to address higher-level engineering systems goals.

Nam Suh, then the Assistant Director for Engineering at NSF, pointed to a major concern “that some centers do not have a vision…knowing simply where they are going…where the center should be three or five or ten years from now…. Some have lost sight of their goals to accommodate existing institutional power structures. It is business as usual for them…. They lack this vision because they are not doing the work proposed. Some have not expanded existing operations into a new effort. And some have not emphasized the cross-disciplinary thrust so important in any systems approach to a problem.”[24]

Industry voiced the same concerns: “What are ERCs going to deliver? Students and papers? We can already get that with the way we fund research now. The ERCs have to be focused on ‘deliverables’.”[25]

These comments foreshadowed the reasons why several Gen-1 ERCs, created both before and after the comments were made, would not succeed in their third-year renewal reviews.

To address these concerns, Preston led a team of ERC PDs to work with industry in February 1987. The result was a requirement that ERCs develop strategic plans for research. This reflected a philosophy that “directed” fundamental and applied research was a more effective means of achieving engineering systems goals than research motivated by the separate interests of faculty or targeted problem solving for industry. As an ERC Program staff member voiced to the GAO in 1987, “The goal was for these plans to serve to organize the research to reflect industry’s needs for deliverables and the researchers’ needs for freedom to pursue individual research interests.”[26]

The next ERC Annual Meeting, in 1987, was used to explore the strategic planning construct and how to develop it in an academic culture. There were working sessions on how to develop a strategic plan, how to define a deliverable, what a testbed would be in an academic setting, and what would constitute a technology deliverable or prototype. Preston remembers that the use of the words “deliverables” and “prototypes” caused some of the academics a lot of confusion and anxiety, as their primary goals in the past had been to deliver knowledge and publications, not technology, and that was how they were rewarded in tenure and promotion.

This conflict was ameliorated over time by a clarification that a technology deliverable would be an academic-scale proof-of-concept testbed, as opposed to an “industrial” product or process prototype. Industry also recommended that these deliverables be early-stage and flexible so they could be adopted by member firms and pursued further in different ways. James Solberg, the Director of the Purdue ERC for Intelligent Manufacturing Systems, noted at the 1991 ERC Symposium that “strategic research planning was at first difficult for ERC participants to accept; but …all ERCs now agree that such planning is essential and that it does not necessarily lead to restrictions on the individual’s research if it is conducted in the right way. The ‘right way’ for strategic research planning in an ERC is not the same way that industry would pursue it; instead, it must be a form of planning that is appropriate to the university. It must provide ample freedom to maneuver, to shift directions, and capitalize on new ideas and opportunities. That is essential for good academic research. But it must also provide a sense of direction, expectations, and intended goals—i.e., overall strategy—so that the efforts of the group can be integrated.”[27]

At this symposium, Anthony Acampora, then the Director of the Columbia ERC, pointed out that an ERC’s “Strategic Research Plan:

  • Identifies emerging technological trends that impact the Center’s charter
  • Articulates the vision that drives the Center’s work
  • Assesses technical feasibility
  • Identifies key systems and technological challenges and their interdependency
  • Contains research and education goals and provides a plan for achieving these goals
  • Selects cross-disciplinary projects, identifies technological thrust areas, and projects milestones
  • Provides a mechanism for industrial input
  • Establishes review and updating procedure.”[28]

During the Gen-1 period, the initial approach to strategic planning was to organize the research programs into manageable “groupings”—what were termed research program thrusts—and milestone charts were developed to provide a visual representation of the plan at the center and thrust levels. Figure 5-2 is an example of one of these thrust-level milestone charts, used by the Wisconsin Plasma-Aided Manufacturing ERC (Class of 1988) to manage Thrust 3, which was focused on plasma synthesis, sintering, and spraying.[29] The long-term goal was proof-of-concept testbeds for automatic controls and a testbed to enable analysis of selected borides and carbides. This is a typical milestone chart for the period; its shortcoming was a failure to depict the knowledge and intermediate technology needed to reach the proof-of-concept testbed stage.

By the end of the Gen-1 period in 1993, Preston voiced the issues as follows:

“Traditionally, research projects originated from avenues of inquiry generated by individual investigators or in response to a particular problem posed by an industrial sponsor. In contrast, the strategic plans put a fundamental new twist on a research program, directing it to ‘strategic’ knowledge creation. They focus on long-term advances, plan for intermediate demonstration of concepts in experimental testbeds to explore ideas along the way, and serve as road maps for identifying and integrating projects necessary to move toward the needed advances. The strategic plans involve a combination of science or knowledge-driven and technology-driven research. They are flexible and evolutionary, allowing industry and academe to explore technological options, with the luxury of the possibility of success and failure. Both avenues lead to greater understanding of needed advances.”[30]

While that was the new paradigm for ERC strategic planning, there was still a long way to go before it would be operationalized from the technology vision on down to the research platform. For some ERCs with a strong and guiding vision for a technology, the technology-driven research program was easier to operationalize. That could be seen in ERCs where the systems testbed was crucial to exploring the new concepts. For example, one of the technology systems goals of the ERC for Emerging Cardiovascular Technology at Duke was to develop technologies for cardiac defibrillation and arrhythmia prevention, so the center developed research projects to understand the electrical fields within the heart and the waveforms of a shock to the heart, to model and simulate them, and to develop new electrodes and analog circuits. This work produced the pioneering insight that biphasic waveforms delivered to the heart require less voltage and energy to defibrillate, leading to improved defibrillators and eventually to industry’s development of the portable defibrillator discussed in Chapter 11.

Figure 5-2:  Milestone Chart for Plasma-Aided Processing ERC’s Thrust 3 (Source: Center for Plasma-aided Manufacturing)

By the end of the Gen-1 period, it had become clear to ERC research leaders that strategic planning was essential for the success of an ERC, and that there were a number of essential components of such a plan:

  • Establish a team culture, rather than the more traditional individual-research culture, both intra- and inter-university;
  • Ensure that the overall ERC vision and mission are articulated in the plan and shared by those in each research thrust leader’s area of responsibility;
  • Define resource and budget needs, given the goals;
  • Lay the groundwork to take advantage of the best communication technology (e.g., to facilitate “brainstorming sessions” and other necessary interactions); and
  • Define succinct deliverables and outcomes on reasonable timelines.[31]

These basic elements remained consistent from then on in ERCs, with the addition in Gen-2 of a crucial planning tool that will be described in section 5-B.

5-A(e)    New Paradigms in Research

The combination of strategic knowledge creation driven by complex engineering systems goals and experimental testbeds led some ERCs to create new research areas. Prominent among these are biofilm science and engineering, catalyzed by breakthroughs at Montana State University’s Center for Biofilm Engineering (CBE); and bioprocess engineering advances at MIT’s Biotechnology Process Engineering Center (BPEC) that laid the foundation for the new field of biological engineering. The Synberc ERC provided major impetus to the emerging field of synthetic biology. Additionally, ERC Program investments in optics and optoelectronics in the late 1980s and early 1990s at several ERCs built a solid foundation for the field of optoelectronic engineering. All of these stories are told in some detail in Section 5-D(b), Emerging Fields Catalyzed by ERCs.

5-B       Gen-2 (1994-2006)

During the second generation of the ERC Program, the research programs of the Gen-1 ERCs were in their later years of productivity. The research programs of the new centers created during this period benefitted from an increased sophistication of knowledge, within the ERC Program and among the ERCs, about how to develop a coherent engineered systems vision, structure a plan, and manage a research program to achieve it.

By 1998, the key research features required of ERCs reflected this greater refinement:

  • A guiding strategic vision to produce advances in a complex, next-generation engineered system and a corresponding new generation of engineers needed to strengthen the competitive position of industry and the Nation in a global economy;
  • A dynamic, evolutionary strategic research plan to focus and integrate the ERC to achieve its vision; and
  • A cross-disciplinary research program, promoting synthesis of engineering, science and other disciplines, spanning the continuum from discovery to proof-of-concept in testbeds, and involving undergraduate and graduate students in research teams.[32]

The phrase “engineering systems” was changed to “engineered systems.” This changed the focus from a broader, almost cultural context that was hard to implement through research, to a technology concept that required engineering to deliver functionality. The ERC Program defined engineered systems as “deriving from integrating a number of components, processes, and devices to perform a function. The system may be living or inanimate in origin. It must be complex and challenging enough to justify a ten-year program of research. Analysis, modeling or development of individual components of a system, without their integration into a complex engineered system, is not an appropriate focus for an ERC.”[33]  

The definition of ERC testbeds was articulated as follows: “proof-of-concept testbeds in ERCs are used to explore an ERC’s next-generation engineered system to determine if all components work together as planned and the system is feasible. The process of building the testbed and beginning to integrate various devices and components or processes often revealed barriers, which generated new fundamental research projects. These testbeds help to ensure that the research outcomes are integrated and tested and supply a framework for faculty, students, and industry representatives to work together and gain a better understanding of the realities of the system they are exploring and demonstrating.”[34] (See chapter 3, section 3-B for further discussion of engineered systems and testbeds.)

There were 28 ERCs awarded during the Gen-2 period. These were the ERCs that proposed systems visions, as opposed to systems goals, and those visions and their testbeds became increasingly complex as ongoing ERCs and new proposers better understood how to use strategic research planning to develop integrated, cross-disciplinary research programs to address their technology goals and systems visions. These 28 ERCs and their systems visions are summarized in “Gen-2 Systems Visions.”

Because of the complexity of systems visions of the ERCs awarded between 1994 and 2006, these ERCs also had to build cross-disciplinary platforms across fields where there had been little past collaboration.

Preston reiterated her definition of cross-disciplinary, as opposed to multi-disciplinary or interdisciplinary, in her plenary address to the ASEE’s Engineering Research Council’s Summit in 2004, as follows:

  • Multidisciplinary Research:  Involves different disciplines that are not necessarily integrated
  • Cross-Disciplinary Research: The integration of the capabilities of different disciplines to address a major challenge in research or technology
  • Interdisciplinary Research:  Long-term cross-disciplinary collaboration blurs the lines between the disciplines, often leading to new fields such as bioengineering, photonics, MEMS, etc.[35]

Some strong and lasting research collaborations were built between engineering and medicine that continue today. For example, the partnership between the schools of engineering, computer science, and medicine created through the CISST ERC at Johns Hopkins continues today and has significantly impacted surgical techniques there and across the country. As was noted earlier, some of these collaborations created new disciplines, such as biological engineering and synthetic biology. Some ERCs joined engineering and social and policy sciences, like the three Earthquake Engineering ERCs and the CASA ERC headquartered at the University of Massachusetts at Amherst, reaching beyond academe to include emergency management personnel.


The ERCs in this cluster built strong collaborations between engineering and molecular biology and neurobiology in order to address their visions for gene biotechnology delivery systems, synthetic biology, or neuromorphic systems, for example. Some of the ERCs with technical underpinnings in electrical engineering reached out to form collaborations with medicine and biology to achieve their goals, such as an artificial retina at the BMES ERC at USC, the cochlear implant at the WIMS ERC at the University of Michigan, and a cortical implant at the Neuromorphic Systems Engineering Center at CalTech.

The cross-disciplinary research platforms built by these 13 ERCs were:

  • Biotechnology Process Engineering Center at MIT joined biochemical engineering, chemical engineering, molecular biology, chemistry, and medicine.
  • Center for Neuromorphic Systems Engineering at CalTech joined electrical engineering, chemistry, computer science, neurobiology, and medicine.
  • Engineered Biomaterials ERC at the University of Washington joined biomedical engineering, biology, chemistry, materials science, medicine, and dentistry.[36]
  • Georgia Tech/Emory Tissue Engineering Center joined biochemical engineering, biology, and medicine.
  • Computer-Integrated Surgical Systems and Technologies ERC at Johns Hopkins joined electrical engineering, biomedical engineering, computer science, and medicine, and in the process built one of the strongest partnerships between engineering and medicine among all the ERCs.
  • Marine Bioproducts ERC at the University of Hawaii joined chemical engineering, biological engineering, biology, chemistry, and ocean science.
  • The VANTH ERC at Vanderbilt University built unique partnerships between biomedical engineering, medicine, and engineering education research that plowed new ground in integrating core knowledge in engineering with that in engineering education to explore new ways of teaching biomedical engineering.
  • The Wireless Integrated MicroSystems ERC at the University of Michigan devoted part of its effort to developing a cochlear implant that required the integration of skills in electrical engineering, materials science, and medicine.
  • The Center for Subsurface Sensing and Imaging Systems at Northeastern University devoted part of its efforts to breast cancer imaging, which required joining electrical engineering, biomedical engineering, computer science, and medicine.
  • The Biomimetic MicroElectronic Systems ERC at the University of Southern California joined electrical engineering and medicine not only through the educational background of its director, Dr. Mark Humayun, M.D., but also through collaborating faculty via a partnership between the Keck School of Medicine, with its Doheny Eye Institute, and USC’s Viterbi School of Engineering. The cross-disciplinary team also included faculty with disciplinary expertise in electrical engineering, biomedical engineering, physiology, neurobiology, and biology.
  • The Quality of Life Technologies ERC at Carnegie Mellon built a complex cross-disciplinary space that joined computer science and robotics, electrical engineering, biomedical engineering, mechanical engineering, psychology, gerontology, and sociology.
  • The Synthetic Biology ERC at the University of California at Berkeley joined biological engineering, chemical engineering, biology, ethics, and genetics to form over time a new discipline, synthetic biology.
  • The ERC on Mid-Infrared Technologies for Health and the Environment at Princeton formed a cross-disciplinary space to work on medical diagnostics that involved electrical and optical engineering and medicine.

During her plenary address to the ASEE Engineering Research Council in 2004, Lynn Preston pointed to powerful examples of how new discoveries can come from the interface of biology and engineering, such as at the Center for Biofilm Engineering, an ERC at Montana State University.

“Biofilms form when different strains of bacteria bind together in a sticky web wherever there is water. Some biofilms can serve a beneficial role in reaction systems for the treatment of waste-containing liquids or in the bioremediation of contaminated groundwater aquifers. However, detrimental biofilms have been implicated in problems ranging from diseases such as cystic fibrosis and blood poisoning to infected catheters. Others secrete acids that eat away tough metals and minerals, corroding the legs of oil derricks and even your teeth.

Before the ERC began to bring the power of engineering and microbiology to bear on the study of bacterial biofilms, our concept of these adherent populations of bacteria was that biofilms stuck to the surface by means of their own slime and the only way to treat them was with chemicals. However, the ERC was determined to understand the biological nature of these films so they could engineer them to control their formation and use.

They began by using confocal microscopy and physical probes that could be used to examine the structure of the films. They found complex, sophisticated architectures. The biofilms were seen to live in slimy towers and mushroom-shaped structures with water channels that carried nutrients to all parts of the community. The ERC team found that the bacteria communicated through chemical sensing mechanisms to form these structures.

More importantly, the ERC team discovered that the introduction of a mutant strain of the bacteria, that did not contain the signaling chemical, caused the towers and structures of the biofilm to collapse. Without the signaling molecule, the cells cannot make a biofilm.

This interdisciplinary team of engineers and biologists has discovered a class of compounds that can prevent biofilm formation. The applications in industry and medicine are legion. We can now manipulate at least one and probably many more behaviors of bacterial cells instead of simply killing them with toxic agents that harm the environment or the host. We can use these simple nontoxic molecules, active blocking analogues, which will be cheap, stable, and environmentally friendly. The whole business of biofilm control, in industry and in medicine, has entered a new era in which chemical manipulation will replace indiscriminate killing with toxic agents. It is a true green revolution that came from the driving desire of engineers to understand a phenomenon in order to control it, and that understanding required the collaboration of biologists and engineers.”[37]

The cartoon of a biofilm in Figure 5-3 illustrates this cell-to-cell communication, which results in an architecture that enables the flow of nutrients within the biofilm.

Figure 5-3: Cell-to-cell communication in a biofilm. (Source: CBE)[38]


Five ERCs in this cluster that were focused totally or in part on mitigating or sensing environmental pollution formed cross-disciplinary spaces that joined chemical engineering, electrical engineering, environmental engineering, chemistry, and medicine.

  • The NSF/SRC ERC for Environmentally Benign Semiconductor Manufacturing at the University of Arizona joined chemical engineering, electrical engineering, environmental engineering, and mechanical engineering.
  • The Wireless Integrated MicroSystems ERC at the University of Michigan devoted part of its effort to developing an environmental pollution sensing system that required the integration of skills in electrical engineering, environmental engineering, and chemistry.
  • The Center for Subsurface Sensing and Imaging Systems at Northeastern University devoted part of its efforts to detecting hazardous wastes and other hazardous materials beneath the surface that required joining electrical engineering, optical engineering, environmental engineering, and civil engineering.
  • The Center for Environmentally Beneficial Catalysis at the University of Kansas joined chemical engineering with environmental engineering.
  • The ERC on Mid-Infrared Technologies for Health and the Environment at Princeton University devoted part of its effort to developing chemical sensing capabilities for environmental monitoring that required joining electrical engineering, optical engineering, chemical engineering, chemistry, and physics.

The three Earthquake Engineering Research Centers (EERCs) built complex, cross-disciplinary research platforms that joined earthquake engineering, civil (structural and geotechnical) engineering, geology, and sociology. Each EERC was required to have a research thrust devoted to societal response to earthquake hazards.


  • The Institute for Systems Research (formerly the Center for Systems Research) at the University of Maryland continued to function with its cross-disciplinary team that involved electrical engineering, design automation, computer science, and information science.
  • The ERC for Collaborative Manufacturing at Purdue started its new award phase building a team that joined electrical engineering, manufacturing engineering, mechanical engineering, and communications science.
  • The Particle Engineering Research Center at the University of Florida built a team that joined mechanical engineering, chemical engineering, chemistry, pharmacy, dentistry, and medicine.[39]
  • The Center for Innovative Product Development at MIT joined faculty from mechanical engineering, computer engineering, and the school of business administration.
  • The Center for Reconfigurable Machining Systems at the University of Michigan joined mechanical engineering with electrical engineering, optical engineering, systems engineering, and economics.
  • The Center for Advanced Fibers and Films at Clemson University joined mechanical engineering, with chemical engineering, chemistry, computer science and visualization, and materials science and engineering.
  • The ERC for Compact, Efficient Fluid Power at the University of Minnesota built a cross-disciplinary platform that integrated mechanical engineering with electrical engineering and biomedical engineering.[40]
  • The ERC for Structured Organic Particulate Systems at Rutgers University integrated chemical engineering, biochemical engineering, mechanical engineering, pharmaceutical engineering, and pharmacy.[41]


  • The Center for Neuromorphic Systems Engineering joined faculty from electrical engineering, chemistry, computer science, neurobiology, and medicine.
  • The Packaging Research Center at Georgia Tech joined electrical engineering with mechanical engineering and optical engineering.
  • The Integrated Media Systems Center at USC joined electrical engineering, computer science, and music.
  • The Extreme Ultraviolet Science and Technology ERC at Colorado State University built a robust platform that integrated electrical engineering, optical engineering, biology, chemistry, and physics.
  • The Collaborative Adaptive Sensing of the Atmosphere ERC at the University of Massachusetts-Amherst joined electrical engineering with mechanical engineering, atmospheric science, sociology, and public policy.

5-B(a)    Gen-2 Strategic Planning

Given the complexity of the engineered systems visions of the ERCs in this period and the scope and complexity of their cross-disciplinary teams, effective strategic research planning became even more important for these Gen-2 ERCs to succeed. These ERCs started out with a framework for strategic research planning, built by the Gen-1 ERCs, that was based on milestone charts plotting knowledge and technology advances on a timeline over the life of the center, at both the center and research thrust levels. Figure 5-4 is an example of one of the more complex of these charts, developed by the Particle Engineering Research Center at the University of Florida.[42] It shows that developing novel coatings for microbe removal from surfaces was a technology goal in 1998, projected out to the graduation of the Center in 2006. However, the chart does not clearly depict what research would enable that goal or how that goal contributed to higher-level systems goals. This was a typical problem with milestone charts, not a weakness of that particular ERC.

Therefore, while the milestone charts did serve as a way of organizing the research, helping to keep an ERC’s team focused on its knowledge and technology goals, Preston was concerned that the charts were not effectively showing how the engineered system was a driver for the research and were not depicting the dynamic nature of an ERC’s research program—the push and pull between fundamental knowledge and technology. In addition, there was a tendency to leave the proof-of-concept testbed to the last year of the ERC whereas it was more likely that intermediate-stage testbeds would be needed to test components before the systems-level testbed could be started.

Accordingly, in 1997 Preston developed the 3-plane ERC strategic research planning chart to address these concerns. She worked with Fred Betz and Cheryl Cathey, ERC PDs at the time, to refine how to show the flow of research from fundamentals, to enabling technology testbeds, to systems-level testbeds. As shown in Figure 5-5, the approach was to put the engineered system testbed(s) and systems-level research at the top of the chart, with arrows driving down to the fundamental research plane to show how the system requirements would be addressed through fundamental research. The chart also visualized how those fundamental insights would flow back up to the enabling-technology plane, where enabling-technology research would occur and enabling technology would be tested and strengthened, eventually delivering technology insights up to the systems plane for testing in a systems-level testbed. The intent was to display the research program in one comprehensive yet simple chart, so that funders and reviewers could better understand the ERC’s research program and deliverables, and faculty and students could see the work that needed to be done and how their skills could contribute to the team’s achieving its goals. The chart was designed to be flexible and dynamic over time and was not to be perceived as a “product production plan.”
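The planes-and-flows structure described above can be sketched, purely as an illustration, as a small data model: projects sit on one of three planes, and directed "feeds" edges capture how results flow up from fundamentals through enabling technology to a systems-level testbed. All project names and the helper function below are hypothetical, not drawn from any actual ERC chart.

```python
from dataclasses import dataclass, field
from enum import Enum

class Plane(Enum):
    SYSTEMS = 3        # engineered-system testbeds
    ENABLING = 2       # enabling-technology research and testbeds
    FUNDAMENTAL = 1    # fundamental knowledge projects

@dataclass
class Project:
    name: str
    plane: Plane
    # Projects that consume this project's results (the upward flow)
    feeds: list["Project"] = field(default_factory=list)

def downstream(project: Project) -> set[str]:
    """Names of all projects that (transitively) build on this one."""
    seen: set[str] = set()
    stack = list(project.feeds)
    while stack:
        p = stack.pop()
        if p.name not in seen:
            seen.add(p.name)
            stack.extend(p.feeds)
    return seen

# Illustrative chart: fundamental circuit research feeds VLSI enabling
# technology, which in turn feeds a systems-level testbed.
testbed = Project("neural prosthesis testbed", Plane.SYSTEMS)
vlsi = Project("VLSI design", Plane.ENABLING, feeds=[testbed])
circuits = Project("circuit fundamentals", Plane.FUNDAMENTAL, feeds=[vlsi])

assert downstream(circuits) == {"VLSI design", "neural prosthesis testbed"}
```

Tracing `downstream` from any fundamental project answers the question the chart was built to make visible: which enabling technologies and systems testbeds depend on this piece of fundamental research.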

Figure 5-4:  Milestones for Research, Particle Engineering Research Center, University of Florida (Source: PERC)

The 3-plane chart crystallized Preston’s management of the ERC Program, which has been characterized as “’Engineering Innovation’—here, ‘Engineering’ is used as a verb in the sense that the raison d’être of the ERC Program is to create and foster new technical innovations (i.e., incremental or disruptive improvements to a technology, service, or standard). That is, ERCs are devoted to ‘engineering’ (i.e., creating) new technical innovations.”[43]

This was a significant shift in the way engineers planned their research portfolios. Initially, the intellectual discipline and visualization skills it required were a challenge for some ERCs, so the 3-plane chart met with a mixed reception from the ERC community. However, ERC review teams and industrial supporters welcomed it for the clarity of thought it required of the ERCs. Over time, it became the standard by which to measure the effectiveness of an ERC’s strategic research planning.

Figure 5-5:  Standard ERC 3-Plane Chart

From Figure 5-6, it is possible to see that the Center for Neuromorphic Systems Engineering (CNSE) was focused by four systems-level testbeds: autonomous vehicle systems, networked sensing, neuroprosthetics, and human-machine interfaces.

The chart clearly displays how the fundamental research needed to achieve those systems-level testbeds fed into a broad range of enabling technologies, each supporting one or, in most cases, several of the systems testbeds. For example, the neural prostheses testbed drove fundamental research in circuits and optoelectronics, which fed into VLSI, optoelectronics, MEMS, and nanofabrication enabling technologies. It also drove fundamental research in distributed systems theory and sensory-based behavior in order to develop the sensory systems for the prostheses. Finally, fundamental knowledge of attention and awareness, as well as cortical physiology and anatomy, was needed to develop effective enabling brain-computer interface technology. The final system was used to enable a chimpanzee to manipulate a game to retrieve a “treat” through mind control rather than motor control.

Currall and his team found that the ERC Program’s requirement to couple the 3-plane chart with detailed milestone charts by thrust area resulted in more effective delivery of research findings and technology, as shown in Figure 5-7 from the CalTech ERC.[44]

Figure 5-6:  3-Plane Strategic Research Plan for the Neuromorphic Systems Engineering Center (Source: CNSE)

The CNSE joined engineers with neurobiologists, and for those scientists the chart made it easier to see how their work contributed to the long-term systems goals of the center. It also enabled many of the neurobiology faculty and students to better understand engineering thinking. As a consequence, the ERC produced a new generation of neuromorphic engineers who fanned out across the country in engineering and science departments, building bridges between the two fields of inquiry. Preston remembers the assembly of young CNSE graduates who gave presentations at the ERC’s “celebration of graduation” from NSF/ERC support in 2006.

Figure 5-7:  CNSE Thrust-Level Milestone Chart (Source: CNSE)

At the same time that the 3-plane chart improved the management of research to address long-term systems goals, it should be understood that there was always room in an ERC’s research program for “opportunistic” research relevant to the ERC’s long-term goals, whose role in fulfilling those goals or in opening new pathways was not yet well understood.

While Preston could see that the 3-plane chart improved an ERC’s communication and research management, she wanted to understand how the ERCs were using it in the day-to-day process of research management and to be sure they weren’t just “trotting it out” for display in their annual reports and annual reviews. In 2004, she and Linda Parker, the ERC Program evaluation specialist, gave a grant to Steven Currall, who was then at Rice University, to study how ERCs used the 3-plane chart in planning. He was from the Rice School of Management and had provided guidance to Professor Vicki Colvin, the PI of the Rice Center for Biological Nanotechnology (which was not an ERC), on how to develop a 3-plane chart for her proposal to NSF for a Nanoscale Science and Engineering Center.

Currall and his team interviewed 22 ongoing ERCs in 2005 to determine the impact of the 3-plane chart on research publication productivity and technology applications. They concluded that:

“…the three-plane framework and a formal process of strategic planning were vital tools for organizing the research endeavor within ERCs. Also, the three-plane framework was a useful tool for illustrating each center’s strategic plan. Yet, the method of implementing the three-plane framework critically determined whether it was beneficial to overall planning formality and quality of planning (i.e. comprehensiveness) and organizational outcomes. The most important determinant of whether planning benefited organizational outcomes was the overall comprehensiveness of the planning, rather than commitment to the planning tool or process.”[45]

The results corroborated Preston’s intent that the chart be a flexible tool, dynamic throughout the life of the ERC. “As an ERC evolves, it periodically submits a revised three-plane framework, presenting a revised strategic plan to the NSF Program for review and comment.”[46] The study found that most of the ERCs viewed strategic planning and the three-plane framework as valuable. Leaders who had used the chart in their proposals or were familiar with strategic planning found value in top-down planning and became champions of planning and the three-plane framework in their conversations with faculty members within the ERC. A second cohort valued strategic planning but found “modest” value in the three-plane framework. Currall indicated that might be because they were still learning how to use it, and Preston surmised that most likely they were also learning how to think from the top down. Currall also found that a number of ERCs placed more emphasis on curiosity-driven research, doing only the minimum required to adhere to the ERC Program’s planning requirements.[47]

The study found that the ERCs were creatively coming up with solutions to how to display their research and its complexity, especially when they had more than one systems goal. There was some concern about the timeline, as ten years was thought to be too short for most significant advances emanating from the life sciences; some even recommended that there be a precursor center program at NSF that fostered very early-stage interdisciplinary basic research,[48] with presumably a potential for eventual interfaces with engineering.

Currall and his team found that for the 3-plane chart to be an effective strategic planning tool it needed a champion who fully understood its value and could effectively communicate that value to the ERC’s team. They also found that where there was no effective champion, little use of the chart took place: “In those cases, a preexisting research plan was retrofitted into the three-plane chart and no real changes in future-oriented thinking occurred. The framework was used primarily for communications to the NSF,”[49] just as Preston had worried.

One team she didn’t have to worry about was the team from the Center for Collaborative Adaptive Sensing of the Atmosphere (CASA), as David McLaughlin, the Director, had enthusiastically embraced the 3-plane chart from start-up, even with a sense of humor, as shown in Figure 5-8.

Figure 5-8:  The CASA 3-Plane Strategic Planning Cake presented to Lynn Preston by Brenda Philips and David McLaughlin at the ERC’s Start-up Celebration in 2003 (Credit: Janice Brickley)

CASA’s strategic plan, shown in Figure 5-9, illustrates the ERC’s complex systems vision: build small-scale radars with sensors that could “collaborate” to communicate their findings, determine whether a storm had the telltale hook that signaled a tornado, and transmit that finding to the emergency response personnel responsible for tornado warnings.

Figure 5-9:  The CASA ERC 3-Plane Strategic Research Plan (Source: CASA)

The need to integrate the findings from the radars with those responsible for hazard warnings took a few years to be fully understood and implemented, because it required engineers and atmospheric scientists to communicate effectively with each other and with social scientists and emergency response personnel. By Year 4 that effort had begun, and it culminated in a four-radar testbed aimed at supporting hazardous weather response in “Tornado Alley,” Oklahoma. The experimentation between CASA and the National Weather Service was the achievement of then-ILO Brenda Philips, who knew how to communicate between the world of the weather service, with its responsibility for accurate forecasting and timely warnings, and the scientists and engineers who built the CASA system. That test yielded warnings for tornadoes that were three minutes faster than the NOAA radar system—a critical time difference.

Another testbed resulted in spotting a tornado two hours ahead of impact and tracking its path—which saved lives, as shown in Figure 5-10, transmitted to Preston by the CASA team. The Oklahoma City Journal Record reported on July 1, 2011, “The data from a new radar system being tested in Newcastle was so precise that refugees from the storm were able to time the closing of the town’s public shelter down to the last minute,” City Manager Nick Nazar said. “The opportunity to use this advanced technology was very helpful and probably saved lives. It was literally up to the minute and it made a difference.”[50]

Figure 5-10:  Outcome of Radar System Test in Newcastle, OK (Source: CASA)

5-B(b)    Adoption of Earthquake Engineering Research Centers

In 1999, the Assistant Director for Engineering, Eugene Wong, transferred three Earthquake Engineering Research Centers (EERCs) to the ERC Program in order for them to benefit from the ERC Program’s “seasoned” post-award oversight system. The Division of Civil and Mechanical Systems had solicited proposals to create new EERCs in 1996 and three were awarded in 1997. They were:

  • Multidisciplinary Center for Earthquake Engineering Research (MCEER), headquartered at the University at Buffalo
  • Mid-America Earthquake Center (MAE), based at the University of Illinois at Urbana-Champaign
  • Pacific Earthquake Engineering Research Center (PEER), led by the University of California at Berkeley.

However, the CMS division staff had little or no experience with centers. They issued highly detailed cooperative agreements, required NSF approval for each subaward to each partner of each EERC, and provided little guidance on how to operate a center. The EERCs complained about micro-management and delays in funding to Dr. Wong. As a consequence, he implemented the transfer of these centers and their NSF budgets to the ERC Program.

The transfer brought the following benefits to the EERCs:

  • More flexible cooperative agreement
  • NSF approval clause for subcontracts removed
  • ERC Best Practices Manual providing guidance from other ERCs on:
    • Research
    • Education
    • Industry collaboration
    • Administrative management
  • Revised renewal plan that gave the centers one more year before their first renewal review, due to the transfer.

Overall, the research goals of the EERCs were to advance knowledge and technology in earthquake engineering research and earthquake hazard mitigation through the integration of engineering, earth science, and social science. To achieve these goals, each EERC was restructured with the following ERC key features:

  • Vision for systems aspects of earthquake hazard mitigation
  • Strategic plan to focus and integrate the resources of the EERC to achieve its vision
  • Research program integrating engineering, earth science, and social sciences, from discovery to proof-of-concept (testbeds and demonstration projects), involving both undergraduates and graduates in cross-disciplinary research teams.[51]

That transfer meant that the centers had to reorganize their research programs to more effectively join faculty from various disciplines to address the systems issues and opportunities in the field. They could no longer function like a collection of single investigators clustered around the important experimental equipment.

As an example, Figures 5-11 and 5-12 characterize the changes in the PEER research plan introduced by the 3-plane strategic research planning process.[52]

Dr. Joy Pauschke, an ERC PD with earthquake engineering training, was assigned oversight responsibility for the EERCs. She and Preston visited each of the EERCs in 1999 to bring them up to speed on the ERC Program’s goals for research, education, and industrial collaboration and to become familiar with each center’s goals and research/education teams. These visits included training in how to structure their research programs using the ERC 3-plane strategic research planning tool. To Preston, there was no better example of an engineered system than a hazard mitigation technology designed to enable a structure to withstand the forces of an earthquake as they move through the earth and soil and impact the structure. While each EERC was designed to build research programs to develop and test these technologies, there was a tendency to break each effort down by discipline, with little incentive to integrate the knowledge from a systems perspective.

Figure 5-11:  PEER EERC Strategic Research Plan in 2000 (Source: PEER)

Figure 5-12:  Project Selection Before the ERC 3-Plane Strategic Plan and After (Source: PEER)

The systems perspective of PEER focused on Performance-Based Earthquake Engineering (PBEE) technologies for buildings and infrastructure to meet the diverse economic and safety needs of owners and society. The center laid the groundwork and made inroads toward creating the data, models, and performance criteria, and applied the performance-based techniques to testbeds and studies to quantify the expected performance of current engineering design practices. PEER’s 3-plane strategic plan, shown in Figure 5-13, illustrates how the plan is “driven by the Needs and Requirements of Clients, Stakeholders, and the Marketplace; involves research within the Technology Integration, Enabling Technologies, and Knowledge Base Planes; and produces Products and Outcomes that respond to the Needs and Requirements” of users.[53]

Figure 5-13:  PEER EERC’s Strategic Research Plan[54] (Source: PEER)

The chart demonstrates how the research planning strategy supported fundamental research in the knowledge base plane in ground motion, risk decision-making theory, non-structural performance assessment, foundation subassembly behavior, and other fundamental areas. This research fed into the loss assessment and reliability frameworks and tools, ground motion hazard models, and structural and other component simulation and performance models, which in turn supported the real-world systems testbeds: highways, bridges, and building systems in the Bay Area. The systems/Technology Integration Plane in the chart represents the systems-level applications and studies in PBEE. The system includes the seismic environment; the soil, foundation, structure, nonstructural, and contents systems; and the facility-impacted stakeholder segments. This plane contains the over-arching impact of PEER’s research program—specifically, the development of assessment and design methodologies that integrate the seismic-tectonic, infrastructure, and socio-economic components of earthquake engineering into a system that can be analyzed and on which rational decisions can be made.[55]
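
The PBEE methodology that this integration supports is commonly summarized in the earthquake engineering literature (as background here, not drawn from the PEER documents cited in this chapter) by the PEER framing equation, which chains the hazard, structural response, damage, and loss assessments described above:

```latex
% PEER performance-based earthquake engineering framing equation
% IM  = ground-motion intensity measure
% EDP = engineering demand parameter (e.g., interstory drift)
% DM  = damage measure for structural and nonstructural components
% DV  = decision variable (e.g., repair cost, downtime, casualties)
\lambda(DV) = \iiint G(DV \mid DM)\,
              \bigl|\,dG(DM \mid EDP)\,\bigr|\,
              \bigl|\,dG(EDP \mid IM)\,\bigr|\,
              \bigl|\,d\lambda(IM)\,\bigr|
```

Each conditional distribution corresponds to one layer of the strategic plan: the knowledge base plane supplies the ground motion and component models, the enabling technologies plane supplies the simulation tools that evaluate them, and the integration plane combines them into the decision variables on which owners and policy-makers act.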

In PEER, “central to the enabling technologies are analytical models, ground motion libraries, and assessment criteria to simulate the performance of buildings and bridges. These are integrated through the OpenSees … software platform, which enables nonlinear simulations and visualization of response.”[56] OpenSees models have been validated with data from laboratory tests and data recorded during past earthquakes. The Open System for Earthquake Engineering Simulation (OpenSees) facilitates the development and implementation of models for structural behavior, soil and foundation behavior, and damage measures. In addition to improved models for reinforced concrete structures, shallow and deep foundations, and liquefiable soils, OpenSees was designed to take advantage of the latest developments in databases, reliability methods, scientific visualization, and high-end computing. It offered greater flexibility in combining modules to solve classes of simulation problems and allowed researchers from different disciplines to combine their perspectives in an integrated implementation.[57]

The MAE ERC developed a decision tool, called MAEViz, designed for public policy-makers to use to determine the impact of decisions to retrofit buildings, transportation systems, and bridges to withstand an earthquake, based on consequence-based risk management. “Consequence-based Risk Management is a new paradigm for seismic risk reduction across regions or systems that incorporates identification of uncertainty in all components of seismic risk modeling and quantifies the risk to societal systems and subsystems.”[58] “MAEViz provides decision tools for public policy makers and others; at the request of the public policy advisors to the EERC, the outcomes of those tools are presented through visualization to enable policy makers who are not engineers to comprehend the consequences of risk reduction decisions.”[59] Thus, MAEViz used a visually based, menu-driven system to generate damage estimates from scientific and engineering principles and data, test multiple mitigation strategies, and support modeling efforts to estimate higher-level impacts of earthquake hazards, such as impacts on transportation networks and on social or economic systems. It enabled policy-makers and decision-makers to ultimately develop risk reduction strategies and implement mitigation actions. At the same time, it became a focal point for the Center’s ability to drive the integration of disciplines and to facilitate the management and funding of the research within the Center.
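
As an illustration only (a toy sketch, not MAEViz code or its actual models; every name and number below is hypothetical), the core logic of consequence-based risk management can be thought of as propagating probabilities from hazard scenarios through damage states to losses, so that alternative mitigation strategies can be compared by their expected consequences:

```python
# Toy sketch of consequence-based expected-loss comparison.
# All scenarios, damage states, and cost fractions are invented for illustration.
from dataclasses import dataclass

@dataclass
class Scenario:
    annual_rate: float   # annual probability of this level of shaking
    damage_probs: dict   # damage state -> conditional probability given the shaking

# Repair cost for each damage state, as a fraction of building value (hypothetical).
REPAIR_COST = {"none": 0.0, "moderate": 0.2, "severe": 0.7, "collapse": 1.0}

def expected_annual_loss(scenarios, building_value):
    """Sum over scenarios: rate x expected loss fraction x value."""
    total = 0.0
    for s in scenarios:
        loss_fraction = sum(p * REPAIR_COST[ds] for ds, p in s.damage_probs.items())
        total += s.annual_rate * loss_fraction * building_value
    return total

def retrofit_benefit(scenarios_before, scenarios_after, building_value):
    """Expected annual loss avoided by a retrofit that shifts damage probabilities."""
    return (expected_annual_loss(scenarios_before, building_value)
            - expected_annual_loss(scenarios_after, building_value))
```

A regional tool like MAEViz layers building inventories, network models, and visualization on top of this kind of computation; the sketch only shows why quantifying each conditional probability, with its uncertainty, is central to comparing risk reduction decisions.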

Eventually, OpenSees and MAEViz formed the computational engines for the Computational Modeling and Simulation Center for Natural Hazards Engineering, headquartered at the University of California, Berkeley.

Attached here is a 2008 summary of NSF’s investment in EERCs and its impacts, written by NSF ERC Program Director Dr. Vilas Mujumdar.[60]

5-C       Gen-3

5-C(a)    Gen-3 Awards (2008–2012)

The Gen-3 period spans from 2008 through 2017. Since Lynn Preston retired in 2014 before the Class of 2015 was awarded, this document will focus only on the 12 ERCs awarded in the first three Gen-3 classes—2008, 2011, and 2012—with which she has familiarity. Three new Gen-3 ERCs were also awarded in 2015 and four more in 2017. Thus, the listings in this section are current as of 2014.

The 12 Gen-3 ERCs in the first three classes and their systems visions are summarized in the file “Gen-3 Systems Visions.”

5-C(b)    Cross-disciplinary Research Platforms Built by the Gen-3 ERCs



CASE STUDY: The Gen-3 ERCs functioning in this technology sector continued to build new research spaces between engineering, biology, and medicine. They benefitted from the lessons learned about how to build and maintain these partnerships that were transmitted to them by the older, more experienced Gen-2 ERCs in the ERC Annual Meetings’ working groups, which were organized by sector. One, the Center for Sensorimotor Neural Engineering (CSNE) at the University of Washington (UW), was breaking new ground by forming new partnerships among engineering, neuroscience, and ethics. At the ERC Annual Meeting the CSNE team took away lessons learned from Ted Berger, leader of the cortical implant team at the Biomimetic MicroElectronic Systems (BMES) ERC at USC, and from Christof Koch, one of the PIs of the Center for Neuromorphic Systems Engineering (CNSE) ERC at Caltech, which rested on a partnership between engineers and neuroscientists; and, in ethics, from the biological risk efforts of the Synberc ERC at UC Berkeley. In addition, Tom Daniel, who served as the interim CSNE Director, had been a member of the NSF site visit review team to Caltech’s CNSE prior to developing the CSNE proposal to NSF.

At times an ERC would become a role model and stimulant for new cross-disciplinary partnerships among faculty in engineering and the sciences. This was more prevalent when a strong partnership was formed between engineers and scientists in the ERC, so that the innovation-driven approach of engineers was transferred to the scientists. An example occurred at CSNE. Tom Daniel, a biologist, took over the leadership of the Center at start-up when the PI, Yoky Matsuoka, left the university shortly after the award was made. After Tom stabilized the ERC over a few years, two ERC faculty members, Rajesh Rao and Chet Moritz, took over its leadership. Tom returned to his biology roots; but his experience with the ERC way of thinking and of leading research teams with the ERC strategic planning approach led him to head up the UW Institute of Neuroengineering (UWIN), in partnership with Adrienne Fairhall of the Physiology and Biophysics department. UWIN is a privately funded program that provides new matching funds to the CSNE as well as a host of other programs promoting neuroengineering and computational neuroscience on campus, including the new Air Force Center of Excellence on Nature Inspired Flight Technologies (NIFTI), which Tom also directs, and a WRF-Moore-Sloan Data Science Institute. As he puts it:

“The most critical thing to realize is that the CSNE at the UW led to many new programs, ranging from new degree options to new private funding and new attention to the domain. Even more exciting was the recruitment of three new female faculty in this space: Bingni Brunton (Data Science and Biology), Azadeh Yazdan (BioE and EE) and Amy Orsborn (BioE and EE). We now boast 50 faculty members who consider themselves core to the UW Neuroengineering community. None of this would have come about without the CSNE. I must say that the lessons we learned from running an ERC and from the framing and structure it provided for coordinating complex systems of collaborations among scientists, engineers—and even philosophers!—were instrumental in our ability to keep the CSNE going, to leverage entire new programs (UWIN, NIFTI, eScience), and to begin the new broad interests at the UW and nationally in this domain.”[61]


The four Gen-3 ERCs in this sector are:

  • Center for Biorenewable Chemicals, Iowa State University, joined biochemical engineering, chemical engineering, biology, chemistry, genetics, life-cycle analysis, and economics.[62]
  • ERC for Revolutionizing Metallic Biomaterials, North Carolina A&T University, joined biomedical engineering, materials science and engineering, medicine (orthopedics, neurosurgery), and clinical pathophysiology.[63]
  • NSF Engineering Research Center for Sensorimotor Neural Engineering (later renamed Center for Neurotechnology), University of Washington, joined bioengineering, electrical engineering, mechanical engineering, computer science, biology, neural engineering, neural ethics, neurobiology, neurological surgery, physiology and biophysics, radiology, rehabilitation medicine, speech and hearing, and statistics.[64]
  • Nanosystems ERC for Advanced Self-Powered Systems of Integrated Sensors and Technologies, North Carolina State University, joined electrical engineering with biomedical engineering, chemical and biomolecular engineering, computer engineering, mechanical engineering, behavioral health, and computer science.[65]


  • ERC for Quantum Energy and Sustainable Solar Technologies, Arizona State University, joined electrical and optoelectronic engineering with mechanical engineering, materials science and engineering, energy and environmental policy, social science, and physics.[66]
  • ERC for Re-Inventing America’s Urban Water Infrastructure, Stanford University, joined civil engineering with environmental engineering, mechanical engineering, architecture and urban design, earth sciences, political science, and urban water policy.[67]
  • Future Renewable Electric Energy Delivery and Management (FREEDM) Systems Center, North Carolina State University, joined electrical/power engineering with mechanical and chemical engineering and computer science.[68]
  • ERC for Ultra-wide Area Resilient Electric Energy Transmission Networks, University of Tennessee–Knoxville, joined electrical/power engineering with computer science.[69]


  • Nanosystems ERC for Nanomanufacturing Systems for Mobile Computing and Mobile Energy Technologies, The University of Texas at Austin, joined chemical engineering with mechanical engineering, computer engineering, and materials science and engineering.[70]


  • Smart Lighting ERC (later renamed Lighting Enabled Systems & Applications), Rensselaer Polytechnic Institute, joined electrical engineering with optoelectronic engineering, chemical engineering, industrial systems engineering, mechanical engineering, economics, management, materials science and engineering, and physics.[71]
  • Nanosystems ERC for Translational Applications of Nanoscale Multiferroic Systems, University of California, Los Angeles, joined electrical engineering with mechanical engineering and materials science and engineering. (Translational research is defined as research that explores issues involved in moving a technology from the proof-of-concept stage through the early phases of product development—see following section.)[72]

5-C(c)     Innovations in Research in Gen-3: Translational Research

During the Gen-2 period it became apparent that in some fields industry was not ready to take the risk of moving the results of ERC proof-of-concept testbeds into the next phase of development. Earlier, that phase of product development had been the role of a firm’s R&D laboratory. However, as those laboratories were phased out and funding for higher-risk, long-term explorations in product development within a firm diminished, industry and the country looked to the emerging small R&D business sector to step into the void. A culture of innovation began to emerge in academia by the late 1990s, and some professors and graduate students saw opportunities to go outside the academic culture to further develop technologies with academic origins. There was often a gap between the exploration of technology development possible through an academic proof-of-concept testbed in an ERC and the further testing and development needed to reach a product phase. Realizing this need, the ERC Program began to support supplements to ongoing ERCs for translational research. The aim was for the ERC to partner with a small R&D business or foster a start-up to carry out that phase of research and product development. However, the intent was not to foster a culture of product development within an academic laboratory. The role of translational research in R&D has been characterized as research to bridge the “Valley of Death,” which Preston preferred to label the Valley of the Shadow of Death, to give it a more optimistic twist.

These supplements began in the early 2000s with awards to some of the biological engineering ERCs, where these types of high-risk efforts were not readily supported by industry. These were followed by a formal partnership with NSF’s Small Business Innovation Research (SBIR) Program in support of translational research between ERCs and small R&D firms. The role of this effort is shown in Figure 5-14.

Figure 5-14:  Translational Research and the Innovation Spectrum (Credit: Angus Klingon)

A fuller discussion of this effort and the intellectual property issues that arise from it can be found in Chapter 6(E).

5-C(d)    2009 ERC Innovation Awards to Stimulate the Economy

As a result of the American Recovery and Reinvestment Act (ARRA) of 2009, an initiative of then-President Barack Obama also known as the Stimulus Act, NSF received increased funding in 2009 to be used to advance discovery and innovation. The awards made in the summer and fall of 2009 were:

  • Translational Research Platforms—Carry out research needed to span the gap between ERC-generated research outcomes and commercial products:
  • Rutgers/CSOPS—Continuous pharmaceutical processing: $1.8M for 36 months.

The goal of the project was to assemble a coalition of technology suppliers, led by a system integrator, to carry out proof-of-concept testbed research and development to bring to market commercial-grade integrated technology for continuous manufacture of pharmaceuticals. This testbed project expanded the output scope of the ERC from uncoated tablets made by compression or granulation to enable continuous manufacture of both coated and uncoated tablets and capsules, as depicted in Figure 5-15. The proposal abstract noted: “A high level of interest exists at the present time in this technology, both by the US FDA and by large pharmaceutical manufacturers, many of which are CSOPS members. Many technology suppliers that are also members of C-SOPS have also indicated a keen interest in addressing this market need. The key missing element needed for successful commercialization is that, at the present time, no single technology supplier has all the necessary capabilities required to address this commercial opportunity. Thus, the main goal of this proposal is to assemble a coalition of technology suppliers, led by a systems integrator, and to enable them, by knowledge transfer and technical support, to commercialize fully integrated ‘turnkey’ manufacturing systems.”[73]

By the third year of the project, the ERC reported at its annual site visit that it had demonstrated a direct compaction line running continuously in new dedicated CSOPS laboratories at Rutgers and Purdue. Most importantly, their industrial partner, Johnson & Johnson (J&J), had established the first continuous pharmaceutical manufacturing process in North America: Project INSPIRE. This project commanded a total investment of $15M to focus on the development of a continuous commercial facility for the manufacture of J&J’s then-new HIV drug, darunavir (commercial name “Prezista”). The facility was constructed in 2012 in Gurabo, Puerto Rico, and went live in May 2013. The project involved four other firms that were already CSOPS members or became members because of this project. In addition, the project stimulated Bristol Myers Squibb to support the implementation of continuous manufacturing technology at its site in New Brunswick, NJ.[74]

Clearly, that NSF investment of $1.8M over three years had a major stimulus impact on the pharmaceutical industry and on the U.S. and Puerto Rico. (See section 5-D(b)iii for more detail.)

Figure 5-15: Commercialization of Continuous Pharmaceutical Manufacturing Technology. (Source: C-SOPS)

  • “Professors of Practice”—Hire experienced industrial personnel for up to three years to bring knowledge of industrial practice to ERCs, enriching ERC testbed projects with practical industrial experience and otherwise bringing knowledge of industry to the ERCs’ research and education programs. Three awards of this type were made:
  • Princeton/MIRTHE—Electronic packaging expertise for mid-IR technology: $859,506 for 36 months.[75] The goal of the project was to appoint a small group of industrially experienced researchers to the MIRTHE ERC to assist in establishing cost-effective, reliable, and state-of-the-art packaging capabilities for the ERC’s foundational research work on high-performance quantum cascade lasers, laser subsystems, and sensors. These researchers would allow the ERC to rapidly establish packaging capabilities on-site in the shared facilities of the ERC under the guidance of experienced industry experts. The lack of packaging capacity at MIRTHE had been pointed out by its site visit team as a weakness both for the ongoing research and technology development and for the education of MIRTHE engineering students.

Figure 5-16: MIRTHE Professors of Practice (Source: MIRTHE)

MIRTHE technology required packaging expertise to achieve the size and complexity needed for practical use. Two Professors of Practice, Igor Trofimov and Michael Lange, from at-risk small firms, brought that expertise to MIRTHE. In addition, three recently graduated engineers whose jobs had been eliminated by the recession came to MIRTHE to work with students until they could apply to graduate school; two were former MIRTHE REU students. (See Figure 5-16.)

  • University of Arizona/CIAN—Partnered with UC Berkeley on electronic packaging expertise for integrated optical and digital networks: $500,000 for 24 months.[76] The goal was to establish a research project on scalable and novel packaging for optoelectronic components. The project was designed to enable the ERC to bring an industrially experienced engineer to CIAN to support research groups focusing on novel devices by designing and fabricating components suitable for advanced characterization in optoelectronic communications systems. The project included two major components of wafer-level, low-cost, hermetic packaging. (See Figure 5-17.)
  • UMass-Amherst/CASA—Electronic packaging expertise for storm-sensing radar systems: $748,965 for 36 months.[77] This project was designed to allow CASA to engage industry in more direct, effective, and innovative ways by bringing industrial personnel onto the campuses of CASA’s university partners to work closely with academic researchers and students on research projects and testbeds. In addition, they were charged with working in the educational programs of the ERC and supporting the school of engineering by developing a Systems Engineering curriculum. The effort involved hiring a research engineer to enhance future development and expansion of the 4-node radar network testbed, as well as hiring two industry experts on successful transfer of technology to the marketplace and other industrial processes and practices, including systems engineering. (See Figure 5-17.)

Figure 5-17: CIAN and CASA Professors of Practice

  • Develop an additional testbed that will help speed the translation of ERC research to technology. The investment involved buying new equipment and hiring technical staff to help develop the testbed. Two awards were made:
  • University of Arizona/CIAN—Testbed for Optical Aggregation Networking: $0.6M for 12 months.[78] The new Testbed for Optical Aggregation Networking (TOAN) would provide the ERC with test sites that would offer cutting-edge performance to enable a new flexible, multi-node, and heterogeneous traffic “network emulator.” (See Figure 5-18.) The emulator would provide much-needed networking-oriented testing capabilities for CIAN researchers as well as for affiliated institutions and industry—e.g., impairment-aware cross-domain traffic engineering and functionality tests. This end-to-end network topology would allow for experimentation with the various trade-offs in the implementation of impairment compensation and switching in dynamic network settings.

CIAN’s Director, Nasser Peyghambarian, noted to Preston in 2018 that: “The CIAN TOAN was built to provide a state-of-the-art 100G flexible, multi-node, and heterogeneous traffic ‘network emulator’ for validating architectures and network protocols in a realistic networking environment. TOAN enabled testing CIAN’s new and innovative optoelectronic chips that would reduce the size, energy consumption, cost, and complexity of the network. The emulator provided networking-oriented testing capabilities for CIAN researchers as well as for affiliated institutions and industry. TOAN enabled CIAN to consistently lead industry on key research themes for metro networks, starting with the CIAN box, which emphasized integration of packet and optical networks for service awareness and the use of optical performance monitoring for greater flexibility. In recent years, CIAN adapted its vision to include inter-data center networks, in particular edge cloud data center networks for 5G mobile and other high-speed network applications. CIAN proceeded to lead the way in the use of software-defined networking (SDN) for optical systems, showing the benefits of software control, which is a must-have in optical systems today. Recently we showed that CIAN chips can at last break down network domain barriers and, with SDN control, enable low-latency, high-bandwidth end-to-end connections—delivering on a key goal of CIAN, which 5G has made more important than ever before.”[79]

Figure 5-18: The TOAN testbed (Source: CIAN)

  • Northeastern University/CenSSIS ERC—Biomedical Imaging Acceleration Testbed: $1.3M for 36 months.[80] The back story for this project arises from a poster session during the third-year renewal review of the CenSSIS ERC in 2003. Preston, having recovered from breast cancer surgery and treatment in 2000, saw the images of lesions in breasts that the early tomosynthesis technology could render and was amazed at their depth and clarity. She challenged the presenting graduate student and Michael Silevitch, the Center Director, saying that she wanted to see that technology in use in the clinic for screening and diagnostic purposes; but they countered that it was too expensive to process the images for those purposes. “Well,” she said, “that’s your challenge to meet and solve!”

In partnership with Massachusetts General Hospital (MGH), the ERC began that quest by combining funds from the base award to the ERC and the support from this ERC Innovation Fund project, which helped to speed up the results. The advantage of the technology comes from “…the ability to scroll through the stack of reconstructed ‘layers of the breast,’ (which) minimizes the impact of overlapping tissue that can mask lesions making them difficult to detect in conventional 2D mammography. The ‘quasi’ three-dimensional format of the reconstructed DBT (Digital Breast Tomosynthesis) images also allows better localization of lesions and improves the conspicuity of both benign and malignant lesion margins.”[81] The goal of the project was to develop a distributed testbed in which each partner would provide biomedical imaging expertise, graphics processing unit (GPU) parallelization expertise, or both. The proposed outcome would include a new set of parallel libraries for the biomedical research community, as well as a testbed model that could be replicated across other research communities that require acceleration using many-core platforms. The goals were: (1) develop a methodology for rapid parallelization of biomedical imaging applications, and then apply best practices in GPU programming; (2) produce a rich library of parallelized biomedical imaging codes; (3) provide the capability to “right-size” a multi-GPU system for any biomedical imaging application; and (4) deliver these capabilities in a web-based framework that would allow a larger community to use the technology available in this testbed. The projects supported by the testbed included: (i) Image Registration for Radiation Oncology; (ii) Iterative Reconstruction for CT Imaging; (iii) Tomosynthesis for Breast Imaging; (iv) Motion Compensation in PET-CT Image Reconstruction; (v) Hyperspectral Imaging for Skin Cancer; and (vi) Image Segmentation for Brain Imaging.

This work directly contributed to the acceleration of the implementation of tomographic imaging technology for breast cancer screening—especially the CenSSIS contributions through algorithms that reduced clutter and produced superior images at much lower processing times and requirements.

The support for this project enabled the team from Northeastern, Rensselaer Polytechnic Institute (RPI), and MGH to focus on speeding the introduction of Digital Breast Tomosynthesis into clinical use to dramatically reduce false-positive mammographic results, or call-backs, by at least 40 percent. At that time, 8 percent of screening exams required the patient to return for further imaging of suspicious-looking tissue. The superior rendition of breast tissue by DBT was projected to cut that in half, to 4 percent. Between 2003 and 2010, CenSSIS investigators collaborated with MGH to decrease the processing time required for DBT by 100-fold, thus enabling its practical use in an NIH-funded 3,000-woman clinical screening trial at MGH. Call-backs fell from 8.1 percent to 5.1 percent, a 38 percent improvement over conventional mammography; and microcalcifications, an early indicator of cancer, were seen equally well or better by DBT in a screening setting. The FDA approved tomosynthesis (DBT) for breast cancer screening in 2011.[82]

  • Develop a design-build “facility” to provide design, fabrication, and prototyping experience for students and faculty to give ERC and other engineering students the experience of going from a design idea through to building early proof-of-concept “products”:
  • University of California, Berkeley/Synberc—BioFab to Support Synthetic Biology Parts Manufacture: $1.4M for 24 months.[83] The goals of the project are displayed in Figure 5-19.

Figure 5-19: Synberc BioFab (Source: Synberc)

The motivation for the BioFab arose from discussions Preston had with Jay Keasling, the Director of Synberc, about the need to develop biological parts for the ERC’s research. Synberc had been asking the graduate students to develop the parts as an element of their research, but the demand for these parts was escalating with the development of the ERC and the field in general. She suggested that they add a new plane to the ERC’s 3-plane strategic research plan: a fabrication plane. The question was how to do that in a research organization. When the ERC Innovation opportunity email was released, she called Keasling to make sure he understood that it would be an opportunity to develop such a facility, if the proposal were of high enough quality to pass through the review process—and it was and it did.

At the end of Synberc’s 10-year award, the ERC created a book to commemorate the ERC’s achievements, which included the following statement about the impact of the BioFab concept: “BioFab…the first facility of its kind…has made freely available standardized, quantitatively characterized parts, with the aim of improving the efficacy of many biological engineering projects. This first facility serves as a proof of concept that such a fabrication lab can have great value to the biological engineering community. Exhaustively isolating, characterizing, and optimizing thousands of genetic elements, and then providing these reliable parts to researchers in accessible form is no small task. BioFabs that undertake this challenge offer an important support system for biological engineers and biologists as they work to make it easier to build biological solutions to many problems.”[84]

  • Post-Doctoral Fellows in Industry for ERC graduates, supporting one to two years of work using a combination of NSF and industry funds:
  • Princeton University/MIRTHE—MIRTHE Postdoctoral Fellowship in Industry: $462,165 for 12 months. A small group of recent MIRTHE graduates was competitively selected to conduct year-long postdoctoral research on-site with MIRTHE industry member companies. The research projects were closely linked to the fellows’ former MIRTHE research and strengthened both the university-industry partnerships arising from the joint work and the network of academic and industrial mentors linked together by the postdoctoral fellows. In addition to full-time work in industry, the postdoctoral fellows were assigned mentors at Princeton; they reported twice yearly on their research to the entire MIRTHE community. The fellowships were competitive, with a strict selection process, and were tightly woven into MIRTHE’s overall formal postdoctoral mentoring program.

The postdoctoral fellows were appointed through Princeton University at the base salary rate; the hosting firm raised the fellows’ salary to the company rate and funded their research expenses, resulting in at least 25% cost-sharing by industry.[85]

  • Establish an “entity” in partnership with other non-NSF sources of funds to generate a range of small business opportunities based on the ERC’s research which is reaching the translational phase:
  • Carnegie Mellon University/QoLT—The Foundry: $1.5M for 36 months.[86] The Quality of Life Technologies (QoLT) ERC, devoted to advancing the technology needed to improve the quality of life of the aging and disabled, was experimenting with a Foundry to identify and assist small start-up firms working on technology arising from the ERC’s research. The Innovation award expanded the Foundry into a broader base for regional economic development, with three objectives: (1) ensure that spin-off companies emerge from the QoLT Center more speedily and on a sounder footing than might otherwise be possible for start-up firms, (2) make the QoLT Foundry a permanent feature of the QoLT ERC, and (3) refine and codify the ERC’s technology transfer processes. The effort was proposed as a partnership among the ERC, Carnegie Mellon University, the University of Pittsburgh, industry, local foundations, and regional economic development organizations. The summary of Foundry achievements prepared for the ERC Best Practices Manual[87] (see case study) points to the impact of the Foundry on speeding technological innovation through start-ups. Figure 5-20 is a schematic of the QoLT Foundry business development process.

Figure 5-20: QoLT ERC Foundry (Source: QoLT)


ERCs can act as a venue for commercial vetting of a broader university research base, as is done by the QoLT Foundry. Although the QoLT ERC is actually a Gen-2 ERC, it has implemented a vibrant innovation-to-commercialization program that is a front-runner among ERCs and could serve well as a Best Practice for Gen-3 ERCs. The QoLT Foundry is focused on the identification, evaluation, and commercial advancement of technologies from core ERC and associated research within Carnegie Mellon University (CMU) and the University of Pittsburgh. Established in 2008 with support from CMU, a local foundation, and an ERC Program Innovation grant, the Foundry has demonstrated remarkable success: 12 companies created since its inception, with more on the way. (See Figure 5-21 for examples.) Rather than waiting for researchers to form start-up companies, QoLT has taken the innovative approach of proactively identifying and cultivating opportunities to form start-ups, reducing the time-to-market for QoLT technologies. The Foundry is led by experienced Entrepreneurs-in-Residence (EIRs) who serve as consultants on time-limited (6-9 month) contracts and are chartered to find their “next new thing” in the form of a spin-off company. Foundry interns—CMU and Pitt students in business, law, and management programs—work with the EIRs to conduct market analyses, assess intellectual property strength, scan competitors, and develop business models. These are presented to potential investors, industry advisors, and innovation partners (regional technology-based economic development organizations) in “Opportunity Meetings” organized twice a year. Because they are a proven success, Foundry elements have been adopted by new campus-wide CMU programs that have broader reach within the university.[88],[89]

The Foundry process also influenced CMU’s Project Olympus, according to Dan Siewiorek, the Director of the ERC.[90] Richard D. McCullough, the PI for the Foundry Project and the CMU Associate VP for Research, points to the Foundry and Project Olympus as the foundation for CMU’s exceptional success in spinning out start-up firms.[91]

Figure 5-21: QoLT Foundry Startups (Source: QoLT)

5-D       Special Topics that Span the Generations

5-D(a)    ERCs and the Interface with Society

Most of the Engineering Research Centers have visions and goals designed to impact systems technology that would be the basis for technological innovation by industry. However, the interface with society is generally indirect, through the application of the technology to achieve efficient processing and manufacturing technology or new products. A few ERCs chose visions and goals that required them to interface more directly with the populations that would use the technology. These ERCs are unique in having faculty and research staff whose technical skills were honed through knowledge of a particular user or patient population. Among those users were first responders to natural hazard threats, architects, physicians, and caregivers of the aging or disabled. There is also the challenge of addressing the risks and ethical issues associated with the emerging field of synthetic biology. Examples of these areas, the challenges they faced, and how they were addressed follow.

i.                     Interface of ERCs with Emergency Preparedness and Response

As background on the interface of engineers and social scientists working in emergency preparedness, Preston asked Dr. Louise Comfort, Professor of Public and International Affairs and Director of the Center for Disaster Management at the University of Pittsburgh, to reflect on her experience with the ERCs funded to work in that arena. This experience was gained when she served as a peer reviewer of incoming proposals and a post-award site visitor. Her reflection follows:

The Interface between Engineering and Social Science: Hazards Research and Systems Theory

As a hazards researcher, I had long been interested in the design of buildings, transportation networks, and lifeline systems, as these systems often failed under the onslaught of extreme events, leaving communities disrupted with devastating losses in lives and property. But I was most concerned about how and where these systems were placed, and who made the decisions regarding the design of the systems, their construction, location, and capacity to withstand severe threats. To me, these were policy decisions, most often made by city planning boards, county commissioners, state legislatures, and federal funding agencies. Most officials in those positions were trained in history and law and, like me, had little to no engineering background.

Yet, to my engineering colleagues, the key issues were engineering decisions. The mantra among engineering professionals in California, subject to seismic risk, was: “Earthquakes don’t kill people; buildings do.” The primary assumption was that if engineers designed buildings, bridges, and utility systems with expert knowledge of the stresses and strains to which they would be exposed from recurring hazards, the systems would not fail. The engineers knew that designing buildings and lifeline systems in regions exposed to seismic risk would cost more, and if schools were not located atop earthquake faults, they were less likely to fail. Yet they expected that public officials who made decisions regarding location of schools, design of utilities, and funding for public infrastructure would accept the engineers’ expert judgment on the increased cost of such infrastructure and agree to go forward, and that voters as taxpayers would accept their recommendations without question.

In fact, these are not easy decisions to make. The interdependencies between the physical environment, built environment, and social/policy environment are such that only a systems perspective can bridge these three different disciplinary perspectives. After taking stock of the toll on lives and property from the 1994 Northridge Earthquake in California and the 1995 Hanshin Earthquake, centered in Kobe, Japan, the National Science Foundation established the Earthquake Engineering Research Centers (EERC) program[92] to fund research on designing resilient systems for regions exposed to seismic risk. This was exactly the policy area on which my own research on decision making under conditions of uncertainty focused. In the early years of the EERC program, I was invited to serve on the peer review committees for all three earthquake engineering research centers, which were funded in California, Illinois, and New York. It was not an easy assignment. Often I was the only social scientist on the review team. The engineers would listen politely to my questions, but essentially dismiss them as irrelevant to the real issues, which, to them, were the design of the structure of the building and the increase in cost for corresponding increases in stability. The fact that policy makers, most with little background in engineering, made the decisions regarding the location of, and allocation of funding for, public infrastructure was an issue in which they, as engineers, took little interest.

Yet, over time, engineering professionals began to listen. It was a learning experience for me as well. I realized that I needed to know more about engineering. I read articles about engineering design; worked (laboriously) through the mathematical formulas that engineers developed for dealing with uncertainty; and realized, to my dismay, that engineers were treating uncertainty as a function that could be quantified, when, in actual hazard events, there is a powerful nonlinear element to the interaction among the components of metropolitan systems that defies systematic quantification. As I learned more about their field, I realized that I could frame my questions more concisely in terms that engineers could more easily understand. As they listened more carefully, the engineers also began to recognize that human decision makers shaped the decision processes for building the infrastructure needed to make communities more resilient to hazards.

In the decades since 1997, NSF has regularly incorporated multidisciplinary requirements into its programs. Importantly, this programmatic change has led to successful innovations such as the NSF-funded CASA (Collaborative Adaptive Sensing of the Atmosphere) ERC, based on a complex adaptive systems design. CASA has designed smaller, easily adaptable radar systems that can identify emerging tornadoes and windstorms more quickly and accurately than the large Doppler radar systems operated by the National Weather Service. This is only one example where a shift to a multidisciplinary systems perspective has advanced not only science, but the practice of alerting and warning community residents to extreme hazards. In the concerted national effort to build more resilient communities, an interdisciplinary systems perspective is fundamental to advancing both science and practice.

Louise K. Comfort

University of Pittsburgh

August 19, 2018

ii.                  CASA and First Responders

The ERC for Collaborative Adaptive Sensing of the Atmosphere (CASA), then led by David McLaughlin and now a graduated Class of 2003 ERC, works at the interface of its novel radar system technology, the National Weather Service (NWS), and first responders to tornadoes and other severe storms. Most tornado warnings are false alarms because the current hazardous weather alert system is a nationwide network of high-powered Doppler radars, which can map the upper and middle regions of the atmosphere but are not effective in observing near-ground weather because of the earth’s curvature. This deficiency limits the accuracy of forecasting and warning for tornadoes and other severe storms. CASA developed Distributed Collaborative Adaptive Sensing (DCAS) networks designed to overcome the limitations of the current radars in observing, predicting, and responding to atmospheric hazards. The vision is a system of small phased-array radars that can be deployed on cellular towers, rooftops, and other infrastructure, spaced several tens of kilometers apart and arranged to operate as a collaborative network. Such a network can provide coverage from the ground up to the tops of storms, thereby overcoming the earth-curvature limitation that restricts the utility of today’s large radar systems. It is a system in which the information needs of the end users drive the allocation of resources in a distributed sensor network. These radars provide high-resolution sampling throughout the troposphere, making efficient use of low-power phased-array radars through distribution and collaboration, via coordinated targeting of multiple radar beams based on atmospheric analysis tools. The system can determine needs and allocate resources such as radiated power, beam position, and communications bandwidth to the regions of the atmosphere where threats exist and where the data needs are the greatest. It can rapidly reconfigure in response to changing conditions in a manner that supports the needs of multiple end users, gathering data to detect and characterize local intense storm cells and forecast their future locations. Thus, the CASA system supports the needs of an emergency manager deploying a team of storm spotters and alerting portions of the public through sirens and other tornado warning communication systems, while gathering data on the storm’s future locations.[93]

As was described earlier in Chapter 5 (Section 5-B(a)), a deployment of the DCAS in “Tornado Alley” in Oklahoma yielded warnings for tornadoes that were three minutes faster than those of the NOAA radar system—a critical time difference. In another testbed, the system spotted a tornado two hours ahead of impact and tracked its path—which saved lives.

This degree of effective collaboration between a technology and its end users was a goal for CASA that grew in sophistication throughout the proposal preparation and pre- and post-award review processes. It was a goal that “stretched” the engineers and meteorologists beyond their comfort zones of dealing only with technology or atmospheric modelling. The CASA team initially thought their end users were those two research communities. Through prodding by NSF and the NSF site visit teams, they came to understand that they needed to redefine their end users and focus directly on designing and operating the CASA DCAS system so that it served the needs of the National Weather Service and emergency managers. Brenda Philips, a member of the leadership team, stepped in to fill that role once CASA began operations and the team realized that it had to have a fully integrated engineered system that would effectively interface with those end users. To achieve that goal, they had to understand the functions of the end users—forecasters and emergency management officials who deliver warnings and sound alerts—and how they made decisions. By the fourth year, the NSF site visit team’s report noted, “What CASA is doing is unique worldwide, connecting the various research thrusts and end-user integration team in work that is leading to important advances in sensing, distributing, predicting, and warning.”[94]

How was this outcome achieved? Brenda Philips, a research scientist at the University of Massachusetts, Amherst, was able to build bridges across fields and sectors. Philips, who has a Master’s degree in Finance from Yale University, served as the CASA Director for Government, Industry and End User Partnerships and came to the task without a technical background in any of the fields underlying the ERC. However, using her intellect and innate ability to work across boundaries, she learned enough of the requisite radar engineering, meteorology, and decision and behavioral science to lead the hazards management team and to communicate directly and effectively with the end users. In this way, the end users came to understand what the envisioned DCAS system would provide for them, and she came to understand how they made decisions and how they behaved in an emergency situation. She then relayed what she learned back to the engineers, computer scientists, atmospheric modelers, and social scientists so the CASA team could develop a system that met the end users’ needs.

The success of this effort was highlighted in the proceedings of the 2010 National Research Council summer workshop titled “When Weather Matters: Science and Services to Meet Critical Societal Needs,” which remarked on the need for more “Collaborative social-physical science or engineering (“end-to-end”) test beds that integrate social science into the development of new meteorological technologies and products. A current example is the NSF CASA Engineering Research Center which has incorporated social science as an equal partner in multiple aspects of its work…Such efforts combine new weather technologies and products, users and their socioeconomic consideration, and social science expertise. In doing so, they provide a focusing mechanism for integrating social sciences and meteorology in ways that meet users’ and meteorologists’ needs.”[95]

Philips voiced to Preston in communications in 2019 that “To be effective, severe weather warning systems must integrate environmental, technological, policy, societal, and human dimensions. However, it is challenging to create direct linkages among these different dimensions. Not only do they involve different disciplines, such as engineering, meteorology, and social and behavioral sciences, but also each discipline has its own language, research methods, and measures. My research focuses on bridging human dimensions, where individual cognitive processes, naturalistic decision-making, and societal factors play an important role, and the environmental and engineering dimensions where normative and Gaussian perspectives prevail.”

She noted how this was achieved:

Through multi-attribute utility analysis, we established user-based utility and trade-off coefficients for the radar optimization algorithm. Once a version of the Meteorological Command and Control (MCC) was operational in the Oklahoma test bed, I conducted forecaster evaluations of the MCC. These evaluations showed a disconnect between the data displayed as a result of the radar scanning optimization algorithm and the data forecasters needed for decision making. Based on these evaluations, the MCC optimization algorithm was modified so that the radar data displayed supported forecaster decision-making practices. The problem identification and solution required socio-technical insight and collaboration among systems engineers, radar engineers, meteorologists, and social scientists.[96]
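The radar-tasking trade-off Philips describes can be illustrated with a toy multi-attribute utility calculation: each candidate scan sector gets a weighted score, and the highest-scoring sectors are scanned first. The attribute names, weights, and sector values below are invented for illustration and do not represent CASA’s actual MCC optimization algorithm or its user-elicited coefficients.

```python
# Toy multi-attribute utility scoring for choosing which atmospheric
# sectors a radar network scans next. All attributes, weights, and
# values are illustrative assumptions, not CASA's actual MCC model.

def utility(sector, weights):
    """Weighted sum of normalized attribute scores (each in [0, 1])."""
    return sum(weights[attr] * sector[attr] for attr in weights)

# Hypothetical trade-off coefficients, e.g. elicited from end users
# (forecasters, emergency managers).
weights = {"storm_intensity": 0.5, "user_priority": 0.3, "data_staleness": 0.2}

sectors = {
    "NW": {"storm_intensity": 0.9, "user_priority": 0.8, "data_staleness": 0.4},
    "NE": {"storm_intensity": 0.2, "user_priority": 0.3, "data_staleness": 0.9},
    "SW": {"storm_intensity": 0.6, "user_priority": 0.9, "data_staleness": 0.1},
}

# Scan the highest-utility sectors first, subject to a per-cycle beam budget.
beam_budget = 2
ranked = sorted(sectors, key=lambda s: utility(sectors[s], weights), reverse=True)
scan_plan = ranked[:beam_budget]
print(scan_plan)
```

The forecaster evaluations Philips mentions correspond, in this sketch, to revising the weights and attributes so that the resulting scan plan matches what decision makers actually need to see.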

CASA II, the extension of CASA after graduation from NSF/ERC support, is led by Brenda Philips and V. Chandrasekar (Colorado State University). In 2012, “CASA and the North Central Texas Council of Governments (NCTCOG) have embarked on a five-year, $10 million project to create the Dallas Fort Worth (DFW) Urban Demonstration Network. This project is centered on the deployment of a network of 8 dual-pole, X-band radars to demonstrate improved hazardous weather forecasts, warnings and response in a densely populated urban environment. These radars provide weather hazard information at spatial and temporal scales that are relevant to urban decision-making and human response. The project goals are to:

1. Develop high-resolution, two and three-dimensional mapping of current and future atmospheric conditions, focusing on the lower atmosphere, to detect and forecast severe wind, tornado, hail, ice, and flash flood hazards.

2. Create impacts-based, urban-scale warnings and forecasts for a range of public and private decision-makers that result in measurable benefit for public safety and the economy.

3. Demonstrate the value of collaborative, adaptive X-band radar networks to existing and future sensors, products, performance metrics, and decision-making; and assess optimal combinations of observing systems.

4. Develop models for federal/municipal/private partnerships that fund new observation technologies and on-going interdisciplinary weather system research.”[97]

The project was jointly funded by the NSF Accelerating Innovation Research Program; the North Central Texas Council of Governments; the National Weather Service, including its Southern Region and Fort Worth Forecast Office; and the City of Fort Worth Storm Water Department.

As an extension of this project, CASA was funded to develop a “living lab” for severe weather warning in north Texas. (See case study in Chapter 11, Section 11-A(c).) The lab brings together the public safety community, weather forecasters, and the public in an end-to-end severe weather warning and response infrastructure where research and live operations can be conducted during actual severe weather events. The warning system operates year-round, disseminating weather products to over 1,500 public safety and industry stakeholders, who use the data for real-time decision making and also collaborate with CASA researchers to improve the science and its application. CASA is changing the warning paradigm from a broadcast paradigm, where everyone gets the same warning message, to a context-aware paradigm, where individuals receive personalized warnings on a mobile app, CASA Alerts. This context-aware warning system directly links weather hazards (heavy rain, high winds), the built and natural environment (roads, watersheds), and people’s precise location to deliver customized alerts. Another innovative mobile app simultaneously provides weather information to the general public and serves as a research tool on the public’s severe weather perceptions and responses.
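The core of the context-aware paradigm is a geospatial match between hazard footprints and each user’s location, instead of one broadcast message for everyone. The sketch below uses simple rectangular footprints and invented coordinates and hazard types as stand-ins for the real polygon and watershed geometry behind a system like CASA Alerts.

```python
# Minimal sketch of context-aware alerting: match each user's location
# against each hazard's footprint and alert only affected users. The
# rectangular footprints and coordinates are illustrative assumptions.

def in_box(lat, lon, box):
    """True if (lat, lon) falls inside a (lat_min, lat_max, lon_min, lon_max) box."""
    lat_min, lat_max, lon_min, lon_max = box
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

hazards = [
    {"type": "flash_flood", "footprint": (32.6, 32.8, -97.4, -97.2)},
    {"type": "high_wind",   "footprint": (32.9, 33.1, -97.1, -96.9)},
]

def alerts_for(lat, lon):
    """Return the hazard types whose footprints contain this location."""
    return [h["type"] for h in hazards if in_box(lat, lon, h["footprint"])]

users = {
    "alice": (32.70, -97.30),   # inside the flash-flood footprint
    "bob":   (33.00, -97.00),   # inside the high-wind footprint
    "carol": (32.50, -96.50),   # outside both footprints
}

for name, (lat, lon) in users.items():
    print(name, alerts_for(lat, lon))
```

A production system would also fold in the road and watershed context mentioned above; this sketch only shows the location-matching step that distinguishes context-aware from broadcast warnings.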

iii.                Earthquake Engineering Research Centers and Decision Makers

As was noted earlier, three earthquake engineering research centers (EERCs) were funded by the National Science Foundation in 1997, separate from the ERC Program, and were transferred to the ERC Program in 1999 to benefit from the ERC Program’s engineered-systems experience and its post-award oversight system. These centers graduated from ERC Program support in 2007. The EERCs had a significant impact on the practice of earthquake engineering, loss assessment due to earthquakes, and the education of students in earthquake engineering, and, more significantly, on developing and implementing an interdisciplinary research methodology incorporating such diverse disciplines as structural engineering, geotechnical engineering, information technology, and the social sciences. Overall, the three centers made significant contributions to understanding the phenomena of major earthquakes and their impact on society, developing technologies to reduce earthquake-related losses, and creating tools to build resiliency in communities to deal with these events. Each of these centers formed teams of faculty across several universities and disciplines, and each worked at the interface of engineering and the social sciences in order to build decision models that would be useful to architects, city and regional planners, and other practitioners.

Lynn Preston and Joy Pauschke, then an ERC Program Director with a technical background in earthquake engineering, managed the transition of the three earthquake centers into the ERC Program’s post-award performance system. They focused the centers on developing effective long-term visions and supporting strategic plans. In addition, they realized that each center needed a more effective interface with end users if the knowledge generated by these centers were to be effectively transitioned into public policies for hazard mitigation. As a consequence, they brought social scientist Dr. Louise Comfort on board as a reviewer of the public policy aspects of each of these centers.

The research, education, and technology transfer achievements of the three EERCs were summarized by Dr. Vilas Mujumdar, the ERC Program Director responsible for oversight of these centers in their last years in the ERC Program. The following summaries are derived from his report.

Multi-disciplinary Center for Earthquake Engineering Research (MCEER) – The State University of New York, Buffalo, NY

This center developed the concept of resiliency in the community and defined its dimensions to provide enhanced seismic resilience of communities through improved engineering and management tools in three areas: (1) critical infrastructure systems (water supply, electric power), (2) acute care hospitals, and (3) emergency management functions.

As shown in Figure 5-22, MCEER developed the advanced knowledge and technologies needed to achieve integrated engineering tools, decision support systems, and related techniques and procedures that can provide cost-effective quantitative enhancements of the seismic resilience of these highly critical infrastructures. These tools and technologies enable more rationally based investments and allocations of finite resources, and also quantify the expected outcomes in forms that can be communicated to the public and policy makers. This newly generated knowledge also helped engineers to better anticipate and adjust the outcomes of their designs for different hazard scenarios and apply appropriate loss-reduction measures, working with their clients. MCEER’s research also makes it possible for emergency management agencies to develop more reliable post-earthquake scenarios and to optimize their response and recovery activities through the use of advanced technologies and decision support systems, enhancing post-disaster response and accelerating the time to recovery after a major earthquake event.[98]

Figure 5-22: The community recovery model emphasizes recovery time, spatial disparities, and linkages between different sectors of a community (Source: MCEER)

The MCEER outcomes that most directly affected public policy were:

  • Two decision support platforms were developed to integrate these findings. The Rehabilitation Decision Analysis Toolbox (RDAT), built on a user-friendly MATLAB interface, provided an integration framework based on a fragility approach. The Evolutionary Aseismic Design and Retrofit (EADR) software used an evolutionary analysis procedure for structural systems, which incorporated advanced protective technologies in an uncertain seismic environment and integrated multiple flexible constraints and rules including non-engineering organizational and socio-economic constraints.
  • In the area of emergency functions, rapid response and recovery were emphasized. MCEER addressed three major topics: (1) new and emerging remote sensing technologies to enhance resilience by producing more accurate building inventories for pre-event loss estimation and by providing more accurate and timely data for post-event damage detection and situation assessment; (2) advanced loss estimation tools that contribute to resilience by improving response and recovery decision making, including decisions involving post-event restoration of lifelines and community systems; and (3) methods for modeling post-earthquake recovery processes. This work produced:
    • A range of remote sensing technologies, including synthetic aperture radar, higher-resolution optical satellite imagery, and GPS-based tools with advanced GIS and improved database management systems.
    • The development of a new post-earthquake reconnaissance tool called VIEWS™ (Visualizing the Impacts of Earthquakes with Satellites), and the fielding of a set of tools called the “Virtual Reconnaissance Survey” (VRS), which allowed researchers to share spatially referenced disaster impact data online through a web browser.
    • Seismic response modification technologies to protect structural and nonstructural systems and components in acute care facilities from the effects of earthquakes. The results are used to provide meaningful input to integrated decision support tools. Studies include development of new materials and technologies for the seismic retrofit of a wide variety of structures and nonstructural components; development of an integrated decision-assisting model to help executives and engineers make informed choices about alternative approaches to improving seismic safety; and formulation and application of an evolutionary theory approach to aseismic design and retrofit, and organizational decision support.

Reflecting MCEER’s reliance on an interface of engineering and social science knowledge, its framework included the following dimensions of resilience, which can be used to help quantify measures of resilience for various types of physical and organizational systems.

  • Technical – the ability of total physical systems (including all components) to perform to acceptable/desired levels when subject to disaster;
  • Organizational – the capacity of organizations—especially those managing critical facilities and disaster-related functions—to make decisions and take actions that contribute to resilience;
  • Societal – consisting of measures specifically designed to lessen the extent to which disaster-stricken communities and governmental jurisdictions suffer negative consequences deriving from loss of critical services due to disaster; and
  • Economic – the capacity to reduce both direct and indirect economic losses resulting from disasters.

Mid-America Earthquake (MAE) Center – The University of Illinois, Urbana-Champaign, IL

This Center focused on establishing a complete framework and application tools for Consequence-based Risk Management (CRM), and on deploying its IT platform, MAEViz (see below), to address the challenges of earthquake impact assessment, mitigation, response, and recovery for the portfolios of its partners in industry and in state, federal, and international agencies and organizations in the Central USA. It is postulated that the balance between annual earthquake hazard and potential losses in the Central USA is similar to that on the West Coast, due to the potentially catastrophic effects of a major earthquake on the New Madrid fault system. The MAE Center developed suitable system-level procedures and application cases that have made a measurable difference to the region and its ability to respond to a catastrophic earthquake.

For the first time, the MAE Center characterized the hazard in the eight-state Central USA region generally known as the New Madrid Seismic Zone (NMSZ), resulting in a significant contribution to understanding the New Madrid fault mechanisms. Overall, the focus of the Center has been in three areas: defining the hazard, generating an inventory of all assets in a specified region, and developing vulnerability functions. Social impacts were modeled, including damage to infrastructure. All of these efforts were bundled into the Mid-America Earthquake Center Visualization Module (MAEViz, Figure 5-23).[99]

Figure 5-23: MAEViz is a software platform for visualization of earthquake effects and impacts. (Credit: University of Illinois)

As is discussed more fully in Chapter 11, Section 11-B(f), the MAEViz platform resulted from a collaboration between the engineers of the MAE Center, computer visualization specialists from the National Center for Supercomputing Application (NCSA) at UIUC, decision scientists at UIUC, and local government hazard response decision makers from Memphis, TN. The initial attempt to communicate with these decision makers was through equations. At one of the NSF site visits, the local government emergency manager organizations’ representatives remarked to the site visit team and the MAE team that while they could understand the implications of the equations for decision making, given that they were engineers, their upper-level decision makers could not. The suggestion was made to use visualization to represent the outcomes of various decision scenarios and to consult with decision scientists to improve the means of communication with decision makers.

This outcome resulted from a collaborative conversation between Preston, Pauschke, Comfort, and the other members of the site visit team. The result was a much greater emphasis on building this type of decision tool with visualization aids, which implied a stronger collaboration with the NCSA and decision scientists than was previously envisioned or budgeted. This necessarily meant a shift in funds away from structural engineering, which Preston and Pauschke explained to the Department Chair at the time, David Daniels. The argument was that if MAE and, in fact, civil engineers in general, were to be successful in practice, they needed to understand how to improve communications with policy makers, and this shift in emphasis at MAE would pave the way for that future.

Pacific Earthquake Engineering Research (PEER) Center – University of California, Berkeley, CA

According to Mujumdar’s summary, the PEER Center concentrated on the development of performance-based earthquake engineering (PBEE) technology for design and evaluation of buildings, lifelines, and infrastructure to meet the diverse seismic performance objectives of individual stakeholders and society.

The PEER Center developed and disseminated procedures and supporting tools and data for PBEE-focused engineers and policy makers to result in cost-effective reduction of earthquake losses, with emphasis on the following areas:

  • Definition of seismic hazard for engineering design applications;
  • Engineering tools for the seismic assessment and design of constructed facilities, with emphasis on geotechnical structures, buildings, bridges, and lifelines;
  • Design criteria to ensure safe and efficient performance of constructed facilities;
  • Methodologies including engineering and public policy instruments for mitigating seismic hazards in existing buildings; and
  • Performance-based approaches for design and evaluation of constructed facilities to provide appropriate levels of safety for occupants and protection of economic and functional objectives for essential facilities and operations.

The overall approach was aimed at improving decision-making about seismic risk by making the choice of performance goals and the trade-offs that they entail apparent to facility owners and society at large. As shown in Figure 5-24, it required the integration of engineering, social, and earth sciences.
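The integration sketched above is often summarized in the broader PBEE literature (not in the source material for this chapter) by what is commonly called the PEER framing equation, which chains the hazard, response, damage, and loss analyses:

  λ(DV) = ∫∫∫ G(DV | DM) |dG(DM | EDP)| |dG(EDP | IM)| |dλ(IM)|

Here IM is a ground-motion intensity measure (the earth-science input), EDP an engineering demand parameter such as story drift (the engineering input), DM a damage measure, and DV a decision variable such as repair cost or downtime (the economic and social input); λ denotes a mean annual rate of exceedance and G a conditional complementary distribution. Each discipline supplies one conditional link in the chain, which is why the multidisciplinary integration shown in Figure 5-24 is essential.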

Figure 5-24: Multidisciplinary Integration in Research Program (Source: PEER)

The approach gained worldwide attention in the 2000s with the broader realization that earthquakes in developed countries impose substantial economic and societal risks above and beyond the potential loss of life and injuries. The PEER Center developed quantitative tools for characterizing and managing these risks to address diverse economic and safety needs. Specifically, three levels of decision-making were addressed:

One level is that of owners or investors in individual facilities (e.g., a building, a bridge) who face decisions about risk management as influenced by the seismic integrity of a facility.

A second level is that of owners, investors, or managers of a portfolio of buildings or facilities—a university or corporate campus, a highway transportation department, or a lifeline organization—for which decisions concern not only individual structures but also priorities among elements of that portfolio.

A third level of decision-making is concerned with the societal impacts and regulatory choices relating to minimum performance standards for public and private facilities.

The overall impact of the PEER Center's work has been summarized by the California Seismic Safety Commission (CSSC), which stated that PEER is the primary earthquake engineering research arm of the State of California; that PEER's efforts have produced cost-effective products that benefit the state; and that these efforts are consistent with the goals and initiatives of the California Earthquake Loss Reduction Plan.

Although the CSSC is concerned with California, the impact of PBEE is national and international, and it can be considered the next generation of earthquake engineering practice, as it allows risk-informed decisions based on the expected performance of buildings and infrastructure rather than on code-specified values of loads and materials.[100]

iv.                 Interface of ERCs with Medical Doctors, Patients and Care Givers

1.      BMES and Patients

The Argus II Retinal Prosthesis System, the “bionic eye” (Figure 5-25), was developed by the Biomimetic MicroElectronic Systems (BMES) Center, an ERC at the University of Southern California (Class of 2003), together with its commercial partner, Second Sight Medical Products, Inc. In February 2013 the Argus II gained approval from the Food and Drug Administration (FDA) for use in the United States to treat patients with retinitis pigmentosa (RP). By fall 2013, Second Sight Medical Products had received Medicare reimbursement approval and began moving the bionic eye into mainstream use at 12 surgical centers across the nation. Previously it had been implanted experimentally in several subjects by the research team, and then in patients in clinical settings across Europe, where surgeons had been trained to implant it and work with patients.

The device relies on a small video camera affixed to a pair of sunglasses, which sends visual data to a microchip implanted on the outside of the eye, which in turn relays the information via a flexible microcable to an electrode array implanted on the retina (i.e., at the back of the eye). The electrodes stand in for the damaged retinal cells, transmitting electrical signals to the remaining retinal neurons, which then relay the information via the optic nerve to the brain. The user receives the information in the form of 60 black-and-white spots (‘pixels’) which when seen as an ensemble convey visual information about images.[101]
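The 60-spot percept described above can be illustrated with a toy downsampling step. This is purely a sketch, not Second Sight's actual image processing; the 6 × 10 grid is assumed from the 60 spots mentioned in the text, and the function name, threshold, and sample frame are all invented.

```python
# Illustrative sketch (not Second Sight software) of reducing a camera frame
# to 60 black-and-white spots, assuming a 6 x 10 grid of stimulation sites.

def downsample_to_electrodes(frame, rows=6, cols=10, threshold=128):
    """Average blocks of a grayscale frame into rows x cols spots,
    then binarize to black/white like the percept described in the text."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // rows, w // cols
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [frame[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            row.append(1 if sum(block) / len(block) >= threshold else 0)
        grid.append(row)
    return grid

# A bright vertical bar on a dark background (12 x 20 toy frame).
frame = [[255 if 8 <= x < 12 else 0 for x in range(20)] for y in range(12)]
percept = downsample_to_electrodes(frame)
print("\n".join("".join("#" if v else "." for v in row) for row in percept))
```

The bar survives as a coarse two-spot-wide stripe, which conveys the flavor of the high-contrast, low-resolution vision patients describe later in this section.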

Figure 5-25: The Argus II retinal prosthesis system (Credit: Second Sight Medical Products, Inc.)

The Argus II was the first FDA-approved long-term therapy for people with advanced RP in the U.S. and a game-changer device for treating blindness. As a result of the retinal prosthesis, patients with chronic degenerative eye disease can regain some vision to detect the shapes of people and objects in their environment. The sight gained is enough to allow patients to navigate independently, which offers greater mobility and confidence.

The pathway to the success of the Argus II was long, exciting, and possible only through the leadership and vision of Dr. Mark Humayun, an ophthalmologist with a Ph.D. in Biomedical Engineering, and the interdisciplinary teams he led at Johns Hopkins University and the University of Southern California. They found patients who were blind or nearly blind from RP who were willing to experiment with a new technology that might give others in the future the ability to see, or even help themselves in the near term. The first patient to receive the first implant was a man from Maryland. After the implant was inserted, Humayun and his team placed a white Styrofoam cup on a black background and asked him if he could see anything. They watched as he moved his head around, scoping out what he might see, since the implant gave him a very narrow, pinpoint field of vision; and then suddenly he exclaimed, according to what Humayun has told Preston and many others: “Oh, is that what you want me to see—a cup?” They all stood behind him shedding tears of joy.

That was the start of a long journey that depended on the interaction of the medical and engineering team members with patients willing to be experimental subjects and entrepreneurs willing to invest in the technology and carry it through clinical trials. Humayun and two of his team members, James Weiland (engineer) and Eugene De Juan (M.D.), moved from Hopkins to USC in 2001, where they found an environment open to stimulating the interface of academic research and commercial development through the Alfred Mann network. Mann was an innovator in biomedical technologies who founded Second Sight in 1998. Second Sight undertook the clinical trials for the Argus in 2002 and 2006.[102]

Humayun’s vision was stimulated by seeing his grandmother go blind while he was in college, heading toward a career in neurosurgery. Mark noted that, “Although—and perhaps naively—I never doubted that it would work, in the back of my mind I always wondered how well it would work. That’s something you can’t tell from preclinical testing. You can test as much as you want on a desktop and in preclinical models, but you will never know what level of vision you will get. That’s the difficult part. The level of vision some patients get with the Argus II amazes me.”[103]

That level of progress depended upon the volunteer patients who stepped up to experiment with these two generations of the Argus and their willingness and capacity to “train their retinas to see.” These include:

  • An early female experimenter with the prototype Argus I, who, as she remarked to NSF site visit teams, saw herself as an experimental subject, willing to contribute to the future, and was pleasantly surprised that she could see the outlines of windows and doors
  • A female subject who could see well enough after implantation to play rudimentary basketball with her grandson (see Figure 5-26)[104]
  • Several patients who could later recognize the shape of large letters, as shown in Figure 5-27[105]
  • A man who experimented with the implant at home and told the NSF site visit teams that he had not seen his sons in several years and asked his sons to dress in black and walk in front of his white garage doors. He was delighted that he could see their silhouettes.
  • A male subject who experimented with Argus I and II and, as a USC publication detailed, “…is able to do things he never thought possible 15 years ago. He takes walks around his neighborhood in Riverside, California, and, with the help of the Argus II, he can identify obstacles like parked cars that at one point would have stopped him in his tracks. Regaining some of his independence and being able to see more than he used to has been a life-changing experience, he says”[106]

Figure 5-26: Grandmother playing basketball with the aid of the Argus II (Source: BMES)

Figure 5-27:  Letter Recognition (Source: BMES)

At the FDA hearings, Preston remembers the riveting testimony of several patients who found that the Argus II had transformed their lives:

  • The grandfather who used to be surprised when his grandchildren jumped all over him, now could control the situation by asking them to dress in white tee shirts so he could see them coming.
  • The woman who worked for a foundation for the blind in New York City, who could now read a few words and could see the outline of her apartment building, so she could tell the taxi driver to let her off in front of the building instead of somewhere near her building, which formerly required her to navigate down the street to find the building through sound recognition only.

All these years of interaction between the volunteers and Humayun’s technical team at the ERC and Second Sight resulted in incremental improvements in Argus I and Argus II, and that impact has been broadened by the use of the implant in clinical practice in the U.S. and Europe. One recent patient with RP, in 2014, “flew from Arizona down to USC for a series of eye tests. In the end, she was chosen not only because she fit the requirements—her blindness had to be severe enough that she could benefit from the device—but also because of her optimism and dedication to learning how to use the Argus II,” said Lisa Olmos de Koo, the eye surgeon at USC who performed the procedure with Mark Humayun. Once the implant was in place, “a few days later, [the patient] turned on the device. At first she could only see dramatic contrast: the edges of sidewalks, the steak on her plate at dinner (she still can’t make out rice). ‘A lot of people think I’m going to put it on and “Wow, you’re going to see again.” It’s nothing like that,’ she says. ‘The contrast is easy, but trying to figure out shapes and letters—I need to work on that more. It’s definitely a whole new way of learning how to see.’”[107] The USC video of this patient speaks eloquently for Argus II and its positive impact on patients.[108]

2.      CISST and Physicians

The Center for Computer Integrated Surgical Systems and Technology (CISST), an ERC formed at Johns Hopkins University in 1998, established one of the most effective partnerships between engineers and computer scientists in the Whiting School of Engineering and surgeons and physicians in the Hopkins School of Medicine, especially the Wilmer Eye Institute, and between MIT (a CISST core partner) and its medical partner, Brigham and Women’s Hospital in Boston, MA. This ERC was envisioned and led by Russ Taylor, often called the “Father of Medical Robotics” because of his invention, while at IBM in the late 1980s, of Robodoc, the first surgical robot to perform hip- and knee-replacement surgeries. There he also pioneered medical imaging and modeling, as well as complete systems for surgical assistance, image-guided surgery, and what he refers to as “Surgical CAD/CAM.” Those innovations formed the basis for the vision of CISST. His colleague at Hopkins, Louis Whitcomb, said: “And he is—with Victor Scheinman, Richard Paul, and a very few others—among the handful of pioneers who created the field of robotics research in the 1970s.”[109]

Taylor’s vision and the culture he and his team created joining engineering and medicine resulted in:

  • Steady-hand surgical assistance tools, which came from the understanding that surgeons are reluctant to cede control of surgery to a tool unless they are allowed to directly manipulate the instruments. Taylor remarked that “we developed a tool (the steady hand tool) where both the robot and the surgeon manipulate a single tool together,” which quells the natural tremors of even the surest surgical hand. “The steady hand robot senses forces exerted by the surgeon on the tool handle and moves to comply with the surgeon’s wishes” by performing the actual motion. The robot can also enforce safety barriers beyond which the surgical tool cannot go, in order to prevent surgical errors.[110]

These advances were achieved by:

  • Creating a multicultural environment by enabling team members to invest the time to understand the problems and mindset of the other participants… engineering staff, faculty, and students developed a thorough understanding of medical requirements and constraints through operating room (OR) visits, frequent discussions with clinicians, and courses such as Surgery for Engineers; and surgeons invested time to understand engineering through close research collaboration and a course, Engineering for Surgeons.
  • Opening the medical school’s surgery training laboratory to engineers, where new tools that were available to surgeons could be explored by the engineers.
  • Building a mock OR at the engineering school to provide a realistic environment for pre-clinical system integration and validation testing.
  • Establishing a culture for successful projects that originated from “clinical pull” or an “engineering push” and had a critical element for adoption in surgery: a physician champion who understands clinical needs and can help steer the technology in the right direction.
  • Supporting technical staff to build testbeds so faculty, students, and physicians could experiment with the physical reality of an idea and the complexity of a surgical engineered system built upon robust modular hardware and software components derived from earlier research.
3.       Engineers Working for the Health and Well-Being of the Aging and Disabled

The Quality of Life Technology (QoLT) Center, an ERC funded at Carnegie Mellon University (CMU) in partnership with the University of Pittsburgh (Pitt) in 2006, was devoted to enhancing body and mind through robotics and other technologies that improve the quality of life of the aging, infirm, and disabled, affording new possibilities and enabling improved independent performance in daily life for those in need. Its goal was to create systems that recognize situations and people’s habits, know when and how to intervene, and interact with people in natural and familiar ways. QoLT systems were designed to operate in a person-system symbiosis in which the human and engineered components are mutually dependent and work together. To achieve this goal, the ERC formed a partnership in which engineers, social scientists, rehabilitation specialists, caregivers, and end users learned to work together throughout the design, development, testing, deployment, and evaluation phases to ensure that technical solutions address cultural factors, privacy concerns, and other factors that govern adoption. Joining the team of engineers, roboticists, and computer scientists was a robust team of social scientists from CMU and Pitt. Together, they defined quality of life technologies as having the following characteristics:

  • Targeted to a functional domain, i.e., the need being addressed, which determines the significance or potential value of the technology and helps identify appropriate outcome measures for determining effectiveness;
  • Technologies that are compensatory, preventative and maintaining, or enhancing to compensate for diminished function or enhance normal function;
  • Passive or interactive depending on the extent of user involvement; and
  • Technologies with system intelligence that perceive, reason, learn, and act in the service of addressing individual needs.[111]

One of the assistive technologies that QoLT focused on was HERB, an assistive robot or “butler” that could function collaboratively with the user as a robotic caregiver. In contrast to robots used in a manufacturing facility that carry out repetitive tasks, HERB had to be designed to function in a constantly changing environment while safely serving a human user. In designing and testing HERB to develop functionality in a human environment and to gain user acceptance, the HERB team, consisting of roboticists, engineers, software engineers, social scientists, and physical therapists, worked with users, clinicians, and caregivers to gain feedback on the design and development of HERB’s hardware and software systems. The testbed system for HERB directly addressed several fundamental barriers. User acceptance was addressed by incorporating users’, clinicians’, and caregivers’ feedback into the design and development of the system. Multiple control modes and interfaces were available for different users, and predictable robot-motion algorithms were developed to address individual differences and unpredictability. Autonomous operation was coupled with teleoperation to create intelligent and customizable human-robot collaboration, addressing complex interactions. Human-robot interactions were tuned using contextual information about the environment and the user to deal with contextual variability.[112]

v.      Synthetic Biology and Risk

The ERC Program funded the Synthetic Biology Engineering Research Center (Synberc), headquartered at the University of California, Berkeley (UCB), in 2006 to develop a fundamental understanding of, and new technologies with the potential to enable, the modular design and assembly of useful new biological entities such as genetic circuits or engineered whole cells. To achieve this vision, biological properties must be formulated into a set of design rules that purposefully specify, design, and realize novel engineered biological systems with predictable functionality. At the time of funding, synthetic biology offered newly engineered biological tools for biological “factories” to produce sustainable, risk-minimized, biorenewable chemicals, biopharmaceuticals, and biofuels in the short term, as well as diagnostics, therapeutics, and regenerative medicines in the longer term.

The knowledge and technological frontiers of Synthetic Biology lie at the convergence of biology (e.g. genomics), computation (e.g. informatics), and engineering (e.g. engineering of biological entities and processes). To ensure that these public benefits and risks are adequately addressed, before funding could be initiated, the ERC Program required Synberc to add a new research thrust, Societal Issues in Synthetic Biology, where a cross-disciplinary team of social scientists, ethicists, biologists, and bioengineers would evaluate risk and integrate the ethical, legal, and regulatory issues associated with Synthetic Biology into the engineering design process. This research thrust was later called Human Practices and then Practices.

Synberc’s approach built on advances in genomics, informatics, and molecular and cell biology and its engineering construct distinguishes it from discovery-oriented molecular and cell biology. Over time, Synberc’s unique contribution to synthetic biology has been the systematization of the field through an engineering paradigm. Synberc’s research program sought to transform biology in the way that integrated circuit design transformed computing: by focusing on modular design and construction of standardizable biological parts including promoters, ribosome binding sites, and genes, that can be assembled into a device (e.g. small circuit). A device might perform a logical operation, sense a chemical, or control the flux of a molecule through a pathway. Parts and devices are integrated into a complex host cell chassis in which performance has been optimized (to avoid interference from host metabolism, for example)—the “factories” of Synthetic Biology. Computational bioinformatics is employed to predict functionality of integrated synthetic biological systems using design-build-test strategies employed in engineering.
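The parts-device-chassis hierarchy described above can be sketched in code. This is a hypothetical illustration of the abstraction only, not Synberc software; the class names, the required part ordering, and the part labels (loosely modeled on BioBrick-style names) are invented for this example.

```python
# Hypothetical sketch of the "parts -> device -> chassis" abstraction described
# in the text; names and ordering rules are invented for illustration.

class Part:
    """A standardized biological part, e.g., a promoter or gene."""
    def __init__(self, name, kind):
        self.name, self.kind = name, kind

class Device:
    """A small circuit assembled from standardized parts in a fixed order;
    the design rule hides sequence-level detail behind a simple check."""
    REQUIRED_ORDER = ["promoter", "rbs", "gene", "terminator"]

    def __init__(self, parts):
        kinds = [p.kind for p in parts]
        if kinds != self.REQUIRED_ORDER:
            raise ValueError(f"parts must be ordered {self.REQUIRED_ORDER}, got {kinds}")
        self.parts = parts

class Chassis:
    """Host cell into which devices are integrated."""
    def __init__(self, name):
        self.name, self.devices = name, []

    def integrate(self, device):
        self.devices.append(device)

# Assemble a hypothetical sensing device and load it into a yeast chassis.
device = Device([Part("pBAD", "promoter"), Part("B0034", "rbs"),
                 Part("gfp", "gene"), Part("T1", "terminator")])
yeast = Chassis("S. cerevisiae")
yeast.integrate(device)
print(f"{yeast.name} carries {len(yeast.devices)} device(s)")
```

The design rule enforced in `Device` stands in for the information-hiding the next paragraph describes: a designer composes named parts without touching the underlying sequence detail.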

Unlike many other areas of engineering, biology is incredibly non-linear and more unpredictable, and there is less knowledge of the parts and how they interact. Hence, the overwhelming physical details of natural biology (gene sequences, protein properties, biological systems) must be organized and recast via a set of design rules that hide information and manage complexity, thereby enabling the engineering of multi-component integrated biological systems.

At each level, biological hazards were analyzed in the Practices thrust by student teams, individual researchers, and industrial collaborators to minimize human and environmental risk. The goal of Thrust 4 (Human Practices) was to assess and mitigate the biological and human-practices risks of the use of synthetic biology. This included (1) assessment of risks from unintended consequences or misdirected use of synthetic-biology-generated products pertaining to biosafety, biosecurity, and biocontainment issues; (2) incorporation of biosafeguards at all successive levels of design, build, test, and manufacture in synthetic biology-based systems; and (3) holding public conversations to define acceptable practices and regulatory frameworks for licensing IP and approving synthetic biology-generated constructs. Thus, the intent of the thrust was to connect the “scientific/technological” thrusts and testbeds by designing interfaces between technical developments and their broader ramifications in security, medicine, energy, and the environment. Dr. Paul Rabinow, an anthropologist at UCB, and Dr. Ken Oye, a political scientist at MIT, a Synberc partner, co-led this thrust at startup.

The thrust leaders set up governance organizations with the goal of establishing a community to accomplish these goals. This community would develop a framework for reviewing and developing policies for research done in synthetic biology. This included establishing a Bio-Ethics and Threats Advisory Committee (BETAC) to work closely with existing institutional review mechanisms to minimize the risk of Synberc researchers creating something that could harm humans or the environment. BETAC met annually to review research and education in the Center. A thrust leader and/or key investigators from each of the first three thrusts also served as members of teams of investigators in each of the research topics in Thrust 4.

Drawing on the work of experts in the synthetic biology and security communities, the MIT group assessed the “delta” of synthetic biology on security. Initial findings suggested that the short-term effects of synthetic biology on biosecurity, over and above the effects of synthesis and ordinary recombinant DNA research are insubstantial. The long-term effects of synthetic biology on bio-offense relative to biodefense, defined in terms of a ten to fifteen-year time frame, are potentially more problematic. The UCB group engaged in ongoing monitoring and critical analysis of the conjunctures among and between synthetic biology, security, and strategies for responding to emerging biological threats. Their approach addressed these deficiencies by conducting comparative inquiry into current laboratory practices, developing analytics of security rationalities, and designing security platforms adequate to reciprocal challenges posed by synthetic biology and the contemporary security environment.

While the thrust area played a leadership role in helping to bring the biological risks of synthetic biology to the forefront, its impact on the design process in Synberc was limited. By the fourth year, Thrust 4 had made significant independent contributions to this emerging field by raising the issue of risk and bringing groups together at the Wilson Center and other venues to define and address it. However, the goal of integrating risk into up-front design considerations had not been achieved. This appeared to be a classic case of poor communication across the cultural divide between Synberc researchers and the UCB anthropology team.[113] As a consequence, by the fourth year NSF requested that a new leadership team be put in place for the Practices thrust. After a nationwide search, Dr. Drew Endy, the strategic director of the ERC, was appointed to lead the thrust, along with co-leader Dr. Megan Palmer, then a postdoctoral scholar and bioengineer. Dr. Endy’s commitment to containing risk in synthetic biology and his knowledge of Synberc led to that choice. Professor Oye continued to participate in the thrust. At the time he was selected, Endy had a demonstrated commitment to issues of science policy and synthetic biology and had been active at the national and international levels in leading this discussion. His evident energy and leadership capacity and his vision for how to work with the ERC to make the thrust a success contributed significantly to that choice, as opposed to bringing in someone from the outside who would have had a steep learning curve. Endy and Palmer developed plans and activities for integrating Practices (ethics, biosafety, and biosecurity) throughout the ERC, in the design of every research project and in educational products from K-12 to graduate levels, consistent with this new vision and with the goals of NSF and the ERC for Thrust 4.

Contributions to the ERC from Thrust 4 include a listening tour by Dr. Palmer and reconceptualization of the strategy for Thrust 4; testimony by several Synberc members in hearings of the Presidential Commission for the Study of Bioethical Issues; participation in a number of policy-relevant conferences; research, presentations, and publications on safety and security, ownership, sharing and innovation, and proactive risk management; research, presentations, and publications addressing the interface of ethics and the life sciences; additional development of the website Ars Synthetica; and posting of a guide to the regulation of synthetic biology, as requested by industry. The Scientific Advisory Board was broadened to include members with national recognition in the areas of bioethics, biosafety and biosecurity, who could specifically contribute to Thrust 4.

The integration of biological and environmental risk into the design process is best illustrated by the design rules to which students competing in the iGEM competitions must adhere. These design rules, published on the White List, define banned organisms, organisms and parts that are approved, and those that require approval for use in iGEM competitions.[114]

The following is an example of how risk was integrated into the research/project design process at Synberc. In year 5, Synberc initiated a yeast testbed. The goal of the testbed was to develop generally useful tools to make it easier to engineer yeast via synthetic biology approaches. More specifically, the goal was to construct new parts and devices for a yeast chassis modified using MAGE—one of Synberc’s tools—and to use these in combination to produce polyketides. Polyketides were chosen because they represent a grand challenge in protein expression/folding and in the production of unnatural secondary metabolites. The design process is shown in Figure 5-28:

Figure 5-28: Testbeds are a primary driver for decisions on parts/devices. (Source: Synberc)

Decisions regarding choices of parts and “chassis” were governed by searches for the most efficient pathways and their inherent safety. The Policy and Practices thrust contributed to this effort by providing information on risk, biosecurity, and biopreparedness through internal Synberc analyses and outreach to the scientific and non-scientific communities.

[At the conclusion of Synberc’s ERC grant period, the Engineering Biology Research Consortium (EBRC) was founded. Many of the key activities established by Synberc have been adopted, improved, and continued by EBRC. EBRC is continuing to develop additional new activities and programs to support and sustain the impact of research, products, discoveries, and ideas from the synthetic biology community.[115]]

5-D(b)    Emerging Fields Catalyzed by ERCs

i.      Role of ERCs in Development of the Field of Bioengineering

This sub-section is embodied in the attached file, a lengthy essay on the development of bioengineering and the important role that ERCs have played in it over the past three decades. The history of this field is lengthier than those that follow in the remainder of this section because the field is broader in scope and more ERCs have been involved in it over a longer period of time.

ii.      Neuromorphic Engineering

A History of the Development of the Field of Neuromorphic Engineering[116]

Neuromorphic Engineering (NE) was initiated at Caltech in the laboratory of Carver Mead in the late 1980s; the word neuromorphic was coined by Mead himself to describe the design and construction of engineered systems that are modeled on, or informed by, biological neural systems. The first work in the field was triggered by an observation by the Nobel laureate Max Delbrück, who pointed out to Mead that the newly discovered field-effect transistors behaved similarly to ion channels in neural membranes.[117] This led Mead and his students to investigate the ways in which CMOS (complementary metal-oxide semiconductor) circuits could be designed to simulate the operation of neural systems. CMOS technology refers to an integrated-circuit process family in which insulated-gate field-effect transistors allow the development of extremely low-power circuits; CMOS is the core semiconductor technology in modern electronics.[118]

Carver Mead’s interest in modeling the nervous system in electronic hardware may have been triggered in the 1960s, when he interacted with Prof. Paul Mueller at the Rockefeller University in New York. At that time, Prof. Mueller had published his famous papers on artificially creating cell membranes (bilayer lipid membranes) and showing that action potentials could be generated by appropriately opening channels in the membrane using electrical signals to allow ions to be transported in and out.[119] He went on to implement the first large-scale neural computer with discrete commercial off-the-shelf electronic components (op-amps, multiplexers, analog switches, etc.) in the late 1970s and early 1980s. By the end of the 1980s, Prof. Mueller had started a collaboration with a CMOS integrated-circuit designer, Prof. Jan Van der Spiegel at the University of Pennsylvania, which resulted in the first large-scale analog neural computer implemented with custom CMOS chips.[120] This work developed in parallel with the work of Mead and his students at Caltech, Andreas Andreou and his students at The Johns Hopkins University in Baltimore, and Eric Vittoz and his students at EPFL in Lausanne, Switzerland. Like Mead, Prof. Mueller continued to work on neuromorphic hardware for speech and visual information processing. The groundbreaking work in Mead’s laboratory mostly focused on sensors—circuits that imitated the mammalian retina and cochlea and the associated neural periphery. Early successes were the first silicon retina (Mahowald)[121]—a now-famous image from which was featured on the cover of Scientific American—and the first silicon neuron.

During the same period, Eric Vittoz at EPFL in Switzerland had recognized the potential utility of CMOS for ultra-low-power circuits when operated in the subthreshold (low voltage) region; his work paralleled Mead’s in many respects. This body of work formed a foundation for European work in neuromorphic engineering and led to the development of low-power quartz watch movements, which were so successful that Vittoz is widely considered to have saved the Swiss watch industry.

The graduating Ph.D. students from Mead’s lab, and from that of his Caltech colleague Christof Koch, seeded new neuromorphic laboratories internationally. A significant boost to the field was the founding of the Institute of Neuroinformatics (INI) at the University of Zurich and ETH Zurich by Mead’s colleague Rodney Douglas. The field was characterized from its early days by a high degree of collaboration and inter-laboratory collegiality, as evinced by the annual Telluride (Colorado) Neuromorphic Engineering Workshop, which was held for the 25th time in 2018.

At the core of the NE agenda is the principle that we cannot be certain that we understand biological neural systems unless we have successfully reproduced their behavior in engineered systems designed according to the prevailing biological theory; in the words of Richard Feynman—who, together with Mead and John Hopfield, created Caltech’s Physics of Computation course—“What I cannot create, I do not understand.” In particular, the use of analog hardware is an important point of departure. Systems built from analog hardware embody real-world randomness, such as electronic noise and structural mismatch between supposedly identical elements, in a way that is difficult to achieve in a numerical simulation.
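
The point about real-world randomness can be made concrete with a toy numerical experiment: ten idealized integrate-and-fire units driven identically produce identical outputs, while the same units with slightly different thresholds—a stand-in for fabrication mismatch, using an assumed ±10% spread—produce a range of firing rates. This is a sketch under invented parameters, not a model from the source:

```python
def lif_spike_count(i_in, threshold=1.0, leak=0.1, steps=200):
    """Spike count of a discrete-time leaky integrate-and-fire unit
    driven by a constant input current (dimensionless toy model)."""
    v, spikes = 0.0, 0
    for _ in range(steps):
        v = v + i_in - leak * v   # leaky integration
        if v >= threshold:        # threshold crossing: spike and reset
            spikes += 1
            v = 0.0
    return spikes

# Idealized simulation: ten "identical" units agree exactly.
ideal = [lif_spike_count(0.15) for _ in range(10)]
# "Fabricated" units: +/-10% threshold mismatch (an illustrative
# figure) spreads the firing rates, as it would in an analog array.
mismatched = [lif_spike_count(0.15, threshold=t)
              for t in (0.90, 0.95, 1.00, 1.05, 1.10)]
print(set(ideal), mismatched)  # -> {18} [22, 20, 18, 16, 15]
```

A digital simulation must inject such mismatch deliberately, as done here; the neuromorphic argument is that analog silicon supplies this variability (and noise) for free, at the level of device physics.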

Since 2008 there has been a significant shift in emphasis in the field, in recognition of the increasing complexity of the systems being built and the field’s convergence with computational and systems neuroscience. The new focus is on the development of truly cognitive systems, rather than isolated neuronally inspired modules, and recent work has accordingly sought to be identified as Cognitive Neuromorphic Engineering (CNE). With the recent explosion of Artificial Intelligence (AI) in everyday life, the CNE community is poised to make contributions that will touch every person by developing hardware and software systems that truly emulate living intelligence, supported by discoveries in neuroscience and through the BRAIN Initiative. Terry Sejnowski, a major figure in AI, machine learning, and computational neuroscience, continues to be a visible proponent of the CNE field through his participation in the annual workshop in Telluride. The workshop, now in its 25th year, continues to be supported by the National Science Foundation along with industry benefactors including Google, Intel, HP, and IBM.

Ralph Etienne-Cummings

Chair, Department of Electrical and Computer Engineering

The Johns Hopkins University

April 2019

1.      Telluride Workshops

“The Telluride Workshops arose out of a strategic decision by the U.S. National Science Foundation to encourage the interface between Neuroscience and Engineering. At the recommendation of the Brain Working Group at NSF, a one-day meeting was held in Washington on August 27, 1993. The recommendations of this group were for NSF to organize and fund a “hands-on” workshop that would draw together an international group of scientists interested in exploring neuromorphic engineering.” It is apparent that NSF saw this as an opportunity to build a community of early pioneers in this emerging field, because very few groups were pursuing this type of research. Individual scientists and engineers, from academia and industrial research groups, were drawn to the field from a variety of disciplines, but they had few common meeting points. NSF support for the Telluride workshop began with the Division of Neurobiology, and the ERC Program also provided support after 1995.

“As a whole, the Telluride Workshops, which have been held annually since 1994, have been a strong factor in establishing the field of Neuromorphic Cognition Engineering world-wide. Some indication of this growth can be seen from the number of centers that now declare their interest in Neuromorphic Engineering. Articles on Neuromorphic Engineering published in Science, Nature, The Economist, Scientific American and IEEE Spectrum are further evidence of this maturation. Since the majority of the websites reflect the research of alumni of Telluride, these workshops are contributing strongly to NSF’s strategic goal of enhancing the interface between Neuroscience, Engineering and, now, Artificial Intelligence.”[122]

2.      ERCs in Neuromorphic Engineering

The first ERC funded in Neuromorphic Engineering was established in 1994 at Caltech as the Center for Neuromorphic Systems Engineering (CNSE). As discussed above, it grew out of the convergence of the growing field of neuroscience research that began in the 1980s through the intersection of the interests of a few eminent and free-thinking professors at Caltech—Richard Feynman, John Hopfield, and Carver Mead—with the fundamental laws of computation, particularly computation in biological systems.[123] Mead was interested in exploring the use of very-large-scale integration (VLSI) systems containing electronic analog circuits to mimic neurobiological architectures present in the nervous system.[124] The joint course “Physics of Computation” (mentioned above), led by Mead and Hopfield, sparked the interest of Caltech neurobiologists who wanted to understand the brain from a computational viewpoint and attracted students from engineering, physics, and neurobiology. This led to a new Ph.D. program, Computation and Neural Systems (CNS), which was established in 1986. It provided a collaboration “space” for physicists, engineers, and neurobiologists to jointly research how the brain functions and how to design computers that could mimic its properties. Carver Mead published a book, Analog VLSI and Neural Systems, in 1989 that explained the field in terms accessible across disciplines, establishing the new field of Neuromorphic Systems Engineering.[125]

The collaboration of engineers and neurobiologists at Caltech was driven by the desire to understand how the brain worked as a “neuro system” and how to design brain-like computers and artificial neural networks. The engineers focused on artificial neural networks and their implementation, learning systems, sensory systems, algorithms, and applications in technology. They and their collaborating neurobiologists addressed questions like: “What, in essence, does a neuron compute? What are the fundamental circuits and functional areas of the brain’s architecture, and how can they be modeled? Do connectivity and parallelism provide a fundamental difference? Can we build and program high-connectivity artificial neural network computers? What are the organizing principles that allow networks in the brain to connect up and compute useful quantities?”[126]

Collaborations to address these questions proved to be so fruitful that a core team of faculty—Yaser Abu-Mostafa, Rodney Goodman, John Hopfield, Christof Koch, Carver Mead, Dimitri Psaltis, and Pietro Perona—decided to submit a proposal to the ERC Program, hoping they could find long-term support to develop the field and explore practical system-driven applications. Given the Caltech culture, Perona told Preston, the team had some trepidation about the center mode of research; however, they saw it as an opportunity to galvanize their cross-disciplinary team and pursue high-risk projects through long-term support.

The team was successful and Caltech received an award in 1994 to start the Center for Neuromorphic Systems Engineering, with Pietro Perona serving as the Director and Dimitri Psaltis as Deputy Director.[127] The goal of the CNSE was “to develop the technology infrastructure for endowing the machines of the next century with the senses of vision, touch, and olfaction which mimic or improve upon human sensory systems.… Although the U.S. is the world leader in neural network research, a quantum leap in technology is needed for this research to manifest itself as innovative processes and products in U.S. industry. The Center will aim to facilitate this leap by focusing on sensory processing, in which the natural parallelism of artificial neural networks and neuromorphic VLSI and optical circuits can provide solutions to problems that are hard for conventional computing. These problems include vision, audition, tactition, and chemical sensing (olfaction). Coupling high-bandwidth arrays of sensors and actuators with the processing power and learning abilities of distributed neural networks will generate a quantum leap in human-machine interaction and machine-environment interaction. The Center will take a multi-disciplinary approach through the tight coupling of sensors and intelligence required to achieve sensory processing. Algorithms and VLSI hardware must be developed together.”[128] The early focus was on designing and building components of the neuromorphic systems—cheap sensors for the signals captured by biological creatures; systems for sensing, adaptation, and learning; and technology for implementing sensory computations in aVLSI (analog VLSI) hardware.

Fred Betz, the NSF Program Director assigned to the Center, began a dialogue with Carver Mead regarding how the center was planning to transfer new technology to industry. According to Mead’s recollections,[129] Betz was making assumptions about technology transfer that rested on his experience with the large, established firms in the other centers for which he was responsible. Mead pushed back, indicating that in such an emerging field, large firms would not take the risk of exploring commercialization of innovations, and the Center would have to rely on spinning out start-up firms or other small firms for early-stage technology development. The ERC pursued this strategy with significant success. Descriptions of the spinoff companies derived from aVLSI technology started by CNSE are shown in Figure 5-29, which was shared with Preston by Pietro Perona in 2006. The strategy’s success influenced the ERC Program as well: at a time when large firms were withdrawing from high-risk investments in new technology, spinoffs provided an alternative path for technology transfer.

Mead and Koch showed that aVLSI circuits were capable of implementing neuron-like computations, and developed system-level aspects like spike-based transmission of information and sensor processing on a chip, resulting in a typical CNSE silicon retina that could perform “the equivalent of a billion floating-point digital operations per second with low power usage,” as demonstrated in the Foveon generation of cameras and other innovations. Other innovations based on aVLSI were the first polymer-based artificial nose (Rod Goodman and Nate Lewis [a chemist]), computational underpinnings for biological olfaction (Gilles Laurent [a physiologist] and Perona), and artificial noses (Lewis, Perona, and Laurent).

In 2006, Microsoft began shipping new products that contained technology developed by DigitalPersona, a CNSE start-up company founded by former CNSE students Vance Bjorn and Serge Belongie. In 1996, when they were undergraduates, Bjorn and Belongie developed “U. are U.,” the first fingerprint identification technology, winning the coveted Best of Comdex award for computer peripherals in 1997. Caltech provided them with guidance on how to start up a firm and find capital. The reader was developed and marketed to banks and Microsoft. The Microsoft products incorporating this technology at that time were: Optical Desktop with Fingerprint Reader, Wireless IntelliMouse® Explorer with Fingerprint Reader, and Microsoft® Fingerprint Reader (Figure 5-30). The company’s DigitalPersona IDentity Engine made fingerprint recognition fast and reliable.[130]

Figure 5-29: CNSE spinoff companies (as of 2004) (Source: CNSE)

Figure 5-30: Microsoft’s fingerprint reader featuring DigitalPersona technology (Source: DigitalPersona)

In addition to these innovations derived from aVLSI, the ERC increasingly moved to systems testbeds that included MEMS and robotics/controls. Yu-Chong Tai’s MEMS lab increased CNSE’s ability to develop integrated sensor hardware and led to miniaturized autonomous vehicles to demonstrate low-power biomimetic sensing. Robotics and control were introduced by Joel Burdick and Richard Murray. There were two driving systems-integration testbeds: swarming robots and neuroprosthetics. The swarming robots demonstrated collective sensing and swarming around a chemical plume. The neuroprosthetics testbed took on a major role in the ERC and, through the efforts of Richard Andersen, Burdick, and Krishna Shenoy (a postdoctoral fellow at Caltech who moved to the faculty at Stanford in 2001), it became one of the four leading efforts in neuroprosthetics worldwide.

The neuroprosthetics work pioneered a unique cognitive approach: decoding goals, intentions, and the cognitive state of the paralyzed patient. The approach was to focus on the parietal cortex, which is associated with vision and motor planning, and in particular the Parietal Reach Region (PRR), which encodes the plan for the next intended reach movement. The team pursued several related research thrusts, including the development of an implantable chip to read signals from the parietal cortex, development of computational models for the neural signals involved, development of an online decoding algorithm for the intended movements, and finally the implementation of real-time control of a robotic arm through a brain/computer interface, or probe. They made significant progress by implanting three rhesus monkeys with an electrode array and training them to move a cursor to a target on a video screen without moving their arms. The PRR research on planning and intention is important because this region is deeper in the brain and might have escaped injury. Potentially, commands generated here could bypass the requirement for a more complex sequence of steps such as “move right, open hand, close hand, move left.” They were one of the first teams to explore the possibility of empowering those who are completely paralyzed to control prosthetics by thinking—a major breakthrough benefiting potentially millions of people worldwide. The testbed benefited from augmented seed funding ($350,000) from the ERC Program in 1998–2000 to build the intellectual and physical infrastructure, and DARPA and NIH provided significant augmented funding for development and testing ($4.6M and $2.5M, respectively). The system was first demonstrated in several macaque monkeys, and later through pilot studies with human patients.
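
The decoding step in such brain-computer interfaces is often introduced with a simple population-coding picture: neurons whose firing rates are cosine-tuned to the intended reach direction, and a linear read-out fitted by least squares. The sketch below is a generic textbook illustration with made-up parameters, not the CNSE team's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy population: 30 neurons cosine-tuned to the intended reach angle
# (a standard simplification of motor/parietal tuning curves).
n_neurons, n_trials = 30, 400
pref = rng.uniform(0, 2 * np.pi, n_neurons)      # preferred directions
angles = rng.uniform(0, 2 * np.pi, n_trials)     # intended reach angles
rates = 10 + 8 * np.cos(angles[:, None] - pref[None, :])
rates += rng.normal(0, 1.0, rates.shape)         # trial-to-trial noise

# Linear decoder: regress (cos θ, sin θ) of the intended reach onto
# the population firing rates, then read the angle back out.
targets = np.column_stack([np.cos(angles), np.sin(angles)])
X = np.column_stack([rates, np.ones(n_trials)])  # add a bias column
W, *_ = np.linalg.lstsq(X, targets, rcond=None)

decoded = np.arctan2(X @ W[:, 1], X @ W[:, 0])
err = np.abs(np.angle(np.exp(1j * (decoded - angles))))
print(f"median decoding error: {np.degrees(np.median(err)):.1f} deg")
```

Real systems decode spike trains online and must adapt to nonstationary signals; this static batch fit only illustrates why a modest neural population suffices to recover intention.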

During the last five years of effort (2001–2005) under NSF support, CNSE focused on systems integration—how to integrate sensing, computation, and algorithms into working autonomous and intelligent systems. Koch began to explore the neurobiological basis of attention, awareness, and, ultimately, consciousness; others began to explore attention and awareness as engineering design principles. Work by Koch and Psaltis suggested that attention and awareness modules in an intelligent system can serve as the glue to integrate conventional feedback control algorithms with logic-oriented control systems. Work by Koch and Perona showed that attention is crucial for learning to recognize complex scenes. They helped to develop a new viewpoint: intelligence is tightly linked with the sensorium of an autonomous system and the behaviors that are enabled by the senses. The senses transform meaningless physical quantities into meaningful tokens and control signals—i.e., they transform raw ‘data’ into useful ‘information.’ This led to inquiries into the computational principles of high-level functions of intelligence.

This ERC and the Telluride Neuromorphic Engineering Workshop spawned two generations of Neuromorphic Engineers and the widespread development of research groups focused on Neuromorphic Engineering around the world.  A few of these innovators are highlighted below:

Kwabena Boahen, a 1997 Ph.D. in Computation and Neural Systems from Caltech in Carver Mead’s laboratory, who was a member of CNSE. Dr. Boahen is currently a Professor of Bioengineering at Stanford University, where his research group is pushing the frontiers of computing through parallel, interconnected architectures that function more like the brain—self-organizing chips that emulate the way the brain is wired, on a mixed analog-digital hardware platform, the Neurogrid, which simulates a million cortical neurons in real time.[131]

Andrew Cassidy, a 2010 Ph.D. in Neuromorphic Computing from Johns Hopkins University under the supervision of Andreas Andreou. Dr. Cassidy is a member of the Brain Inspired Computing Group at IBM Research, the team that developed the TrueNorth neurosynaptic processor. His expertise is in non-traditional computer architecture, with experience in architectural optimization, design, and implementation, particularly for machine learning algorithms and cognitive computing applications.

Tobi Delbruck, a 1993 Ph.D. in Computation and Neural Systems from Caltech who was mentored by Christof Koch, David Van Essen, and Carver Mead. Dr. Delbruck is currently a Professor of Physics and Electrical Engineering at the Institute of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland, where he has been since 1998. He co-organized the Telluride Neuromorphic Cognition Engineering Workshop. He worked on electronic imaging at Arithmos, Synaptics, National Semiconductor, and Foveon (a CNSE spin-off) and has founded three spin-off companies, including one that supports basic R&D on neuromorphic sensory processing.[132]

Shih-Chii Liu, a 1997 Ph.D. in Computation and Neural Systems from Caltech who was mentored by Carver Mead. Dr. Liu is a Group Leader at the Institute of Neuroinformatics at the University of Zurich, Switzerland. Her research interests are in developing event-driven sensors with optimal asynchronous spike representations; embedding knowledge from biological sensory and cortical processing into new event-driven computational neural network models and algorithms; and building hardware accelerators and custom circuits that support these models.[133]

Ralph Etienne-Cummings, a 1994 Ph.D. in Electrical Engineering from the University of Pennsylvania. Dr. Etienne-Cummings is currently a Professor and Chair of the Department of Electrical and Computer Engineering at Johns Hopkins University. He is the founding director of the Institute of Neuromorphic Engineering and has served as a leader of the Telluride Workshop community, where he frequently collaborated with the CNSE faculty. His research focuses on mixed-signal very-large-scale integration systems, computational sensors, computer vision, neuromorphic engineering, smart structures, mobile robotics, legged locomotion, and neuroprosthetic devices.

Jennifer Hasler, a 1997 Ph.D. in Computation and Neural Systems from Caltech and a CNSE student, who is now a Professor of Bioengineering in the School of Electrical and Computer Engineering at Georgia Institute of Technology. Dr. Hasler founded the Integrated Computational Electronics (ICE) laboratory at Georgia Tech, a laboratory affiliated with the Laboratories for Neural Engineering. Her research interests include frameworks for analog numerical analysis for energy-efficient computing on ultra-low-power configurable systems-on-a-chip; analog VLSI physics related to submicron and floating-gate devices; and analog VLSI models of on-chip learning and sensory processing in neurobiology.[134]

Giacomo Indiveri, a Ph.D. in Neuromorphic Engineering, now heads the Institute of Neuroinformatics at the University of Zurich, Switzerland. Dr. Indiveri’s research group focuses on analog/digital VLSI architectures that use the physics of silicon to reproduce the biophysics of biological neural systems, and on multi-chip systems that communicate using asynchronous event-based signals (spikes). Its main contributions to neuromorphic VLSI technology are novel low-power silicon neuron circuits, dynamic silicon synapses, hybrid analog/digital spike-based learning mechanisms, soft winner-take-all networks, and asynchronous digital communication circuits and systems.[135]

Laurent Itti, a 2000 Ph.D. in Computation and Neural Systems from Caltech and a CNSE student, is now a Professor of computer science, psychology, and neuroscience at the Viterbi School of Engineering at the University of Southern California. Dr. Itti’s research focuses on using computational modeling to gain insight into biological brain function by studying biologically plausible brain models and comparing the predictions of model simulations to empirical measurements from living systems. The brain subsystem studied is the visual system, and the overarching goal is to investigate the tasks and conditions under which the biological brain approaches the theoretical limits of information processing.[136]

Andre Van Schaik, a 1998 Ph.D. in Electrical Engineering from the Swiss Federal Institute of Technology, Lausanne (EPFL), Switzerland. Dr. Van Schaik is currently a research professor at Western Sydney University and leader of the Biomedical Engineering and Neuromorphic Systems (BENS) Research Program in the MARCS Institute for Brain, Behaviour, and Development. In 2018, he became the Director of the International Centre of Neuromorphic Engineering. His research focuses on neuromorphic engineering and computational neuroscience—specifically, smart sensors with a built-in brain, created by combining bio-inspired sensors with bio-inspired signal processing. He is one of the pioneers of the field of Neuromorphic Engineering and a recognized world leader in neuromorphic vision and audio sensors.[137]

Timothy Horiuchi, a 1995 Ph.D. in Computation and Neural Systems from Caltech under the supervision of Christof Koch and a CNSE student. Currently an Associate Professor at the University of Maryland, College Park, Dr. Horiuchi conducts research on modeling navigation with models of the hippocampus. He is particularly interested in understanding how bats use echolocation, in combination with their localization and mapping networks, to maneuver in complex and cluttered environments.

Lloyd Watts, 1992 Ph.D. in Computation and Neural Systems from Caltech under the supervision of Carver Mead.  Dr. Watts founded a cochlea modeling company, Audience, Inc.

Chris Diorio, 1995 Ph.D. in Computation and Neural Systems from Caltech under the supervision of Carver Mead.  Dr. Diorio is CEO, Vice Chairman, and Co-Founder at Impinj, an Affiliate Professor of Computer Science and Engineering at the University of Washington, and a Director of the RAIN RFID Alliance.

Erik Winfree, a 1996 Ph.D. in Computation and Neural Systems from Caltech and a CNSE student, is currently a Professor in the Caltech Division of Biology and Biological Engineering. Shortly after Dr. Winfree became an Assistant Professor at Caltech he won several important awards: a MacArthur Fellowship (2000) and a place on the MIT Technology Review’s first TR100 list of “top young innovators” (1999). His work spans theoretical and experimental research in DNA computation, DNA nanotechnology, and molecular programming, including:

  • Algorithmic self-assembly
  • In vitro biochemical circuits and systems
  • Enzyme-free DNA strand displacement circuits
  • DNA-based molecular robotics
  • Molecular self-replicating systems and evolution
  • Multistranded DNA and RNA interaction kinetics
  • Nucleic acid system specification and sequence design
  • Fault-tolerant molecular computing.[138]

The second ERC in Neuromorphic Engineering was awarded to the University of Southern California in 2003 to develop the Biomimetic Microelectronic Systems (BMES) ERC. The Director of the ERC was Mark Humayun, an M.D./Ph.D. (Electrical Engineering), and the Deputy Director for most of the term of NSF support was James Weiland (Biomedical Engineering). This ERC is essentially the second generation of centers in Neuromorphic Engineering, as the complexity of its vision and testbeds rested on the pioneering work that went before them at Caltech. The vision of the ERC was to bring physicians, biologists, and engineers together to explore and advance microelectronic systems that would allow bi-directional communication with tissue, thus enabling implantable/portable microelectronic devices to treat presently incurable diseases such as blindness, paralysis, and the loss of cognitive function. To realize the functionality needed to effectively treat these conditions, the fundamental engineering research focused on mixed-signal systems-on-a-chip, power and data management, intelligent analog circuits to regulate power usage on demand and minimize temperature, interface technology at the nano and micro scales to integrate microelectronic systems with neurons, and new materials designed to prevent rejection. More specifically, key barriers that had to be addressed through research to achieve the BMES vision were:

  • Communication between the chip and neurons through a new class of mixed-signal systems-on-a-chip based on the architecture of living neural systems, using multi-channel analog circuitry to detect and amplify weak biological signals and digital logic to rapidly process the ensemble recordings, enabling extraction of information from data with the low signal-to-noise ratios characteristic of biological signals.

  • Power and data management through a dual-band hybrid power and data link and intelligent analog circuits to regulate power usage based on demand, while balancing high performance with low-power operation to protect tissue.
  • Interface technology at the micro and nano scales to integrate microelectronic systems with neurons, using non-penetrating parylene polymer electrodes for the retina and novel hermetic packaging to produce watertight barrier coatings.[139]

This research was motivated by the barriers to achieving two major testbeds—the retinal testbed and the cortical testbed—which have had significant impacts on the state of the art in neuromorphic engineering and patient care.

The Retinal Testbed focused on a system that would allow patients blinded by retinitis pigmentosa to regain useful vision. The system derived from Humayun’s long-term goal to integrate medicine, biology, and engineering to provide chip-based technology to restore sight. After 20 years of effort—10 of which were through the ERC—the testbed culminated in a clinical trial of the ARGUS II retinal prosthesis from Second Sight Medical Products, Inc. (a BMES clinical partner), producing results showing that electrical stimulation of the retina can produce usable visual input for a blind person. The ARGUS II recipients demonstrated improved letter recognition, form vision, and mobility. The sight gained is enough to allow patients to navigate independently, which offers greater mobility and confidence. Based on the strength of the clinical data, in 2012 Second Sight received a CE Mark that allowed the sale of the ARGUS II in Europe as a medical implant. In February 2013, the implant became the first “bionic eye” to be approved for use in the United States through a Food and Drug Administration market approval.[140]

The goal of the Cortical Testbed was to restore higher cognitive functions that are lost as a result of damage (stroke, head trauma, epilepsy) or degenerative disease (dementia, Alzheimer’s disease). The interdisciplinary team was led by Professor Theodore Berger of USC. The focus was on long-term memory formation, which is supported by the hippocampus and surrounding limbic cortical brain regions. At the beginning of the ERC, NSF viewed this testbed as a long-term, high-risk project where demonstration of proof-of-system-concept might not occur within the 10-year time frame of ERC Program support. The research focused on bi-directional communication with the brain and on advanced modeling methods to uncover the spatio-temporal coding schemes used by the brain to represent long-term memories. Next-generation VLSI “system-on-a-chip” designs were developed for hardware implementation of the prosthesis, as shown in Figure 5-31.

Figure 5-31: Cortical prosthesis model developed at the BMES ERC (Source: BMES)

Progress was swifter than originally envisioned. By the third year they had successfully demonstrated “proof-of-principle” of their goal to replace a damaged hippocampus with a biomimetic device, using a transverse hippocampal slice and replacing one portion of the multi-component circuit (dentate-CA3-CA1) of the in vitro hippocampus with a VLSI-based model that accurately predicted the nonlinear dynamics of that component (CA3) and restored total hippocampal system function. During years 4–6, they built on that accomplishment to develop a multi-circuit, systems-level solution for an in vivo prosthesis. By 2011, the device was demonstrated in a behaving rat, showing that a multi-input-output model of CA3-CA1 nonlinear transformations can generate highly accurate predictions of CA1 output in real time, thus substituting for hippocampal function.[141]
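
The multi-input-output nonlinear modeling idea can be illustrated generically: predict an output region's activity from several input channels through a low-order polynomial (Volterra-style) expansion fitted by least squares. Everything below is synthetic and deliberately simplified; the BMES team's actual hippocampal model is far richer:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in: four "input region" rate channels (CA3-like)
# driving one "output region" rate (CA1-like) through a nonlinear map.
n_in, n_samples = 4, 500
x = rng.normal(0, 1, (n_samples, n_in))
y = 0.5 * x[:, 0] - 0.3 * x[:, 1] * x[:, 2] + 0.2 * x[:, 3] ** 2
y += rng.normal(0, 0.05, n_samples)   # observation noise

def expand(x):
    """Constant, first-order, and second-order (pairwise product)
    terms of each input vector: a truncated Volterra-style basis."""
    quad = np.einsum('ni,nj->nij', x, x).reshape(len(x), -1)
    return np.column_stack([np.ones(len(x)), x, quad])

# Fit the MIMO polynomial model by least squares and score it.
w, *_ = np.linalg.lstsq(expand(x), y, rcond=None)
pred = expand(x) @ w
r2 = 1 - np.var(y - pred) / np.var(y)
print(f"R^2 of MIMO polynomial fit: {r2:.3f}")
```

In the actual prosthesis the mapping was estimated from recorded spike trains and had to predict out-of-sample in real time; this in-sample fit only shows the model structure, i.e., how cross-products and squared terms let a linear solver capture nonlinear input-output transformations.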

That year, the U.S. Defense Advanced Research Projects Agency (DARPA) became interested in the work for its potential to improve memory function impaired by traumatic brain injury from combat and funded the team for $16M over four years, enabling them to move to successful tests in non-human primates in 2013. By March 2018, a team from Wake Forest Baptist Medical Center, which had collaborated on the rat studies, and Berger’s BMES ERC team at USC “demonstrated the successful implementation of a prosthetic system that uses a person’s own memory patterns to facilitate the brain’s ability to encode and recall memory.” In the pilot study, published in the Journal of Neural Engineering, participants’ short-term memory performance showed a 35 to 37 percent improvement over baseline measurements. The DARPA-funded research[142] rests on the long-term collaboration at the systems level supported by the ERC.

The third ERC in this field, the ERC for Sensorimotor Neural Engineering (CSNE), was established at the University of Washington in 2011, in partnership with the Massachusetts Institute of Technology (MIT) and San Diego State University. The Center has long-term partnerships with the Allen Institute for Brain Science and an outreach collaboration with a neuromorphic engineering faculty member at Caltech. The first director, Yoky Matsuoka, left the University for industry shortly after the award was initiated. Thomas Daniel, a leader at the university working at the interface of biology and engineering, took over the ERC leadership as interim director to stabilize the new Center and “repair” relationships with NSF, which had been damaged by Matsuoka’s departure. Daniel has a personal connection to the CNSE team at Caltech: he was a Bantrell Postdoctoral Fellow in Engineering Sciences there, where he studied biofluid dynamics with Ted Wu until 1984. That experience had a lasting impact on Daniel’s research, as his laboratory currently focuses on the control and dynamics of movement in biology and on reverse-engineering flight control in flying animals and insects.[143] Once the Center was stabilized, leadership of the ERC transitioned in 2013 to a permanent director, Rajesh Rao, a computer scientist and engineer who works at the interface of engineering and neurobiology. In 2017, Chet Moritz, an electrical engineer who works in rehabilitation medicine and physiology, was promoted to co-Director to support the rehabilitation goals of the ERC.

Throughout the initial period, the vision of the ERC was to become a global hub for delivering neural-inspired sensorimotor devices that assist people with neurological disabilities. The devices envisioned would utilize neural-inspired, closed-loop control between the human and the device; be fail-safe, robust and reliable; and incorporate human values, ethics, and security in device design. These innovative devices are envisioned to serve a community of individuals with deficits anywhere in the sensorimotor pathway, from amputees, to individuals with nerve injuries, to those with central nervous system deficits such as traumatic head injury, stroke, or Parkinson’s disease.

Under the leadership of Rajesh Rao, the ERC focused its vision on revolutionizing the treatment of stroke, spinal cord injury, and other debilitating neurological conditions using a new approach: engineered neuroplasticity. The approach utilizes electronic devices, designed within a neuroethical framework, to interact directly with the nervous system to rewire connections and restore lost sensory and motor function.[144] To emphasize the role of neurotechnologies in its vision, the name of the ERC was changed in 2018 to the Center for Neurotechnology (CNT).

The CNT ERC’s testbeds drive the research from a systems perspective and integrate the disciplines of biomedical engineering, computer science, neurobiology and neuroethics. These testbeds include:

Co-adaptation Testbed to develop computational models and mathematical algorithms designed to help a brain-computer interface co-adapt with the brain itself in a closed-loop neural stimulation system.[145]
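To make the closed-loop idea concrete, the following toy sketch shows a decoder weight adapting online from error feedback, the basic mechanism a co-adaptive brain-computer interface builds on. This is a minimal illustration under stated assumptions, not the CNT's actual algorithms; all names, signals, and parameters here are hypothetical.

```python
import random

def lms_coadapt(steps=2000, lr=0.05, seed=1):
    """Toy closed-loop decoder: learn the mapping from simulated 'neural'
    activity to intended movement via an LMS update driven by feedback.
    Purely illustrative; not a model of any real CNT system."""
    rng = random.Random(seed)
    true_gain = 2.0          # hypothetical mapping from neural firing to intent
    w = 0.0                  # decoder weight, adapted online
    errors = []
    for _ in range(steps):
        firing = rng.uniform(-1.0, 1.0)   # simulated neural activity
        intent = true_gain * firing        # user's intended movement
        decoded = w * firing               # decoder output drives the device
        err = intent - decoded             # feedback closes the loop
        w += lr * err * firing             # LMS update: decoder adapts
        errors.append(abs(err))
    return w, sum(errors[-100:]) / 100

w, recent_err = lms_coadapt()
```

In a real co-adaptive system the brain side also changes its activity in response to the device's behavior; the point of the sketch is only the closed loop itself, in which decoding error continuously reshapes the decoder.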

Significant technology achievements include:

Sensation in a Prosthetic Limb – An international team of researchers from the ERC and the Karolinska Institute in Stockholm, Sweden, stimulated part of the brain’s sensory cortex to create the sense of a hand being touched while the subject watched a rubber hand being stroked. The researchers tapped into 64-electrode grid arrays implanted in the brains of two epilepsy patients to monitor seizures (Figure 5-32). Positioned over the somatosensory cortex—the region of the brain that processes touch—the arrays enabled the team to focus on the region corresponding to the subjects’ fingers. Researchers hid a volunteer’s hands behind a screen, placed a prosthetic hand in front of the screen, and then used a touch probe linked to a cortical stimulation device to simultaneously touch the rubber hand with a brushstroke and stimulate the patient’s somatosensory cortex. With the brushstroke and stimulation synchronized, participants were more likely to agree with the statement “It feels as if the rubber hand were my hand.” Subjects later experienced something called proprioceptive drift—when blindfolded and asked to point to their finger, they tended to point to the rubber hand and not their own.[146]

Figure 5-32: Stimulation of areas of the brain’s sensory cortex can create the sensation of touch in a prosthetic limb. (Credit: CSNE/CNT)

In this case, the array was located on the right side of a subject’s brain. The red dots represent where electrodes stimulated an area associated with feeling in fingers; green dots represent control stimulation sites.

Using A Brain-Computer Interface to Control Both Arms with the Undamaged Side of a Brain – CNT researchers demonstrated that one side of the brain can multitask, simultaneously controlling one arm through a brain-computer interface (BCI) and another through normal neural pathways. The ability to control two arms with one side of the brain promises to greatly expand the reach of BCIs, which had been limited in human trials to demonstrating function in people with spinal cord injury and brain-stem stroke. The number of people with these conditions is small compared to those with strokes that affect the brain surface and disable one side of the brain. The innovation of using the remaining, functional side of the brain to control both the arm it normally would and another arm through the BCI arose from a collaboration among electrical engineers, neurophysiologists, and rehabilitation researchers. The concept was demonstrated by a monkey that learned to play a videogame in which it simultaneously controlled brain activity through the BCI and the movements of the opposite hand. The experiment showed that one brain area could effectively differentiate its activity to control the BCI while independently moving the hand naturally controlled by that brain area.

Interestingly, accomplishing the dual-task control did not depend on the type of neurons selected for the BCI. But under certain circumstances, neurons closely related to hand function used more roundabout pathways to achieve the videogame’s goals. Thus, selecting neurons less related to natural hand function may permit enhanced control of a BCI for people recovering from stroke.[147]

In collaboration with more than 30 industry partners, CNT researchers are translating their research on engineered neuroplasticity in human clinical trials and demonstrating significant improvements in the quality of life of patients. In collaboration with Medtronic, CNT researchers have demonstrated the first closed-loop deep brain stimulation (DBS) system for essential tremor patients. This system is aimed at reducing the side effects seen in current commercial open-loop DBS systems, while at the same time increasing battery life of the implant. A second collaboration with Neurorecovery Technologies has provided the first demonstration of significant improvements in hand and arm function in spinal cord injury patients using a completely noninvasive electrical stimulation system.

CNT has also spun off five startup companies, including two with considerably broader applications beyond neural engineering. Initially developed for neural implants as part of an ERC-funded student project, the wireless power technology now being commercialized by CNT startup WiBotic, Inc. has applications in powering not only medical pacemakers but also robotic systems such as drones. Similarly, the concept of battery-free wireless communication through ambient backscatter of radio waves was first developed as a CNT-funded student project but is now being commercialized for the consumer electronics market by Jeeva, Inc., a startup founded by CNT students and their advisors.

These three ERCs and the community-building activities of the Telluride Workshop demonstrate how effectively engineers, computer scientists, neurobiologists, and rehabilitation scientists have catalyzed the field of neuromorphic engineering and its impacts on artificial intelligence, medicine, and rehabilitation medicine. This impact was achieved through research that spanned a continuum from fundamental inquiries to proof-of-concept testbeds involving animals and humans as well as the training of a wide range of neurobiologists and neuromorphic engineers who have spread across the country to develop programs in neuromorphic engineering in universities and hospitals. 

                                        iii.         C-SOPS and New Approaches to Pharmaceutical Manufacturing[148]

The Center for Structured Organic Particulate Systems (C-SOPS), an Engineering Research Center (ERC), received NSF-ERC Program support between 2005 and 2016. It was headquartered at Rutgers University, in partnership with the New Jersey Institute of Technology (NJIT), Purdue University, and the University of Puerto Rico at Mayaguez (UPRM). The C-SOPS leaders are Rutgers Professor Fernando Muzzio, the Director, and Purdue Professor Gintaras (Rex) Reklaitis, the Deputy Director. C-SOPS continues to function as an active center after “graduation” from ERC Program funding because of its research competence, its broad impact on industry, and its committed industrial partners.

The C-SOPS ERC focuses on the use of engineering principles to relate an in-depth analysis of particle properties to pharmaceutical product performance and to develop continuous manufacturing processes that integrate knowledge of particle behavior with mechanisms to sense and control that behavior. C-SOPS partners with most of the major U.S. pharmaceutical companies as well as firms that specialize in building equipment to support pharmaceutical product production, process sensing and control, ingredients suppliers, and process modeling. These industrial partners are unanimously enthusiastic about C-SOPS research in general and the production of continuous automated manufacturing systems specifically, and continue to support the ERC.

Research in an ERC extends to the proof-of-concept phase in testbeds, but usually does not continue further into the technology development and commercialization phase. In 2009, the ERC Program’s Innovation Fund invested in C-SOPS along with its industrial partner, Johnson and Johnson (J&J), to carry out translational research and technology development needed to bring a continuous pharmaceutical process for tablet manufacture, already well under way at C-SOPS, to a point that would enable a vigorous commercialization effort. The C-SOPS continuous pharmaceutical manufacturing technology employs direct blending, dry granulation, and wet granulation processes for the production of pharmaceutical tablets (uncoated and coated) and capsules. In contrast to the traditional batch manufacturing processes, continuous manufacturing using engineering-based process design methods can enable significant improvements in product quality, process robustness and productivity, and overall economic performance of the pharmaceutical tablet and other manufacturing processes.

At the time of that investment, there was a high level of interest in this technology on the part of both the U.S. Food and Drug Administration (FDA) and the large pharmaceutical manufacturers, many of which are C-SOPS members. Many technology and equipment suppliers, also members of C-SOPS, expressed a keen interest in addressing this market need. The key missing element for successful commercialization at that time was a single technology supplier with all the capabilities required to address this commercial opportunity. Therefore, the main goal of the joint NSF/J&J investment was to assemble a coalition of technology suppliers, led by a systems integrator, and to enable them, through knowledge transfer and technical support, to commercialize fully integrated “turnkey” pharmaceutical manufacturing systems.

The FDA had indicated that modernizing pharmaceutical manufacturing through the development of continuous manufacturing capabilities, yielding higher product quality and efficiencies, was highly desirable and that the implementation of these systems would be fast-tracked through the U.S. regulatory process. This was validation that the C-SOPS endeavor could yield important benefits for end consumers—the American public—as well as for the manufacturers. The project had the potential to significantly influence the pharmaceutical industry on a global scale.

The C-SOPS/J&J testbed project established the first commercial continuous pharmaceutical manufacturing process in North America and was an acclaimed success. J&J’s $15 million investment in its Project “INSPIRE” from 2012 to 2016 focused on the development of a continuous manufacturing commercial facility for J&J’s then-new HIV drug, darunavir (commercial name “Prezista”). J&J provided the ERC with more than $1,600,000 in cash and $300,000 in personnel, plus an additional $250,000 in materials during the project. A facility was built in Gurabo, Puerto Rico, for manufacturing Prezista. By 2014, three other ERC members (Glatt, K-Tron, and Siemens), as well as non-member Bruker, were also involved in the project. FDA granted approval for continuous manufacturing of Prezista in 2016, and to date (October 2018) has approved four additional products to be continuously manufactured by Eli Lilly, Vertex, and Pfizer (all C-SOPS members). Moreover, J&J expanded its collaboration with C-SOPS, granting more than $6 million in funding to support development and implementation of continuous manufacturing processes for six additional products.

Since then, collaborations in continuous manufacturing have been initiated with many other C-SOPS members, as well as some non-member companies.

As a consequence of C-SOPS’ research, this demonstration testbed, and the potential for biopharmaceuticals in disease treatment, biopharmaceutical manufacturing is experiencing unprecedented innovation after decades of stagnation. By 2018, the biopharmaceutical industry and its technology suppliers had embraced a worldwide transformation from traditional, inefficient batch methods to continuous manufacturing, which C-SOPS and its industrial partners have shown greatly reduces both the time and the cost of developing and manufacturing new medicines while enabling significant improvements in the quality and reliability of the final product. In addition, continuous manufacturing of small molecules and biologics has become a priority for biopharmaceutical companies, their technology suppliers, the FDA, the Biomedical Advanced Research and Development Authority (BARDA) in the U.S. Department of Health and Human Services, and the U.S. Pharmacopeia (USP). Dozens of biopharmaceutical, equipment, and instrumentation companies are actively engaged in this major reinvention of the manufacturing platform. Companies such as J&J, Merck, Sanofi, Bayer, GlaxoSmithKline, Novartis, Eli Lilly, Vertex, and Pfizer have declared corporate goals of converting more than half of their total production volume to continuous manufacturing in the next few years. Many other companies are following suit.

The annual biopharmaceutical market exceeds a trillion dollars worldwide. IMS Quintiles, in their recent report “Outlook for Global Medicine through 2021 (Dec 2016),” estimates that this market will expand from $1.1 trillion in 2016 to $1.5 trillion by 2021. Within the next decade, we will witness a worldwide conversion to continuous manufacturing, which at maturity could reach 50% or more of total output. This means that at technological maturity for continuous manufacturing, biopharmaceuticals worth over $750 billion per year could be manufactured using the new continuous methods. Countries that are able to implement these methods effectively will capture much of this activity. Many other related industries, such as supplements, cosmetics, catalysts, and battery manufacture, will benefit as well. The total direct investment in equipment, instrumentation, and facilities required to implement the new manufacturing platforms could easily exceed $100 billion worldwide over the next 15 years. Specialty ingredients will need to be developed and commercialized to facilitate continuous manufacturing. An entire new generation of scientists, engineers, and technicians will need to be trained to implement, optimize, carry out, and regulate the new manufacturing methodologies required to manufacture products continuously. This innovation by the C-SOPS ERC clearly represents a revolutionary advance in a vital sector of the world’s industrial economy.

                                         iv.         Optoelectronics and ERCs


When the ERC Program began in 1985, photonics and optoelectronics were emerging fields for engineering research and new technology. These new directions were pioneered by the invention of laser technology and fiber optics. The advances that led to the laser began in the 1950s—on April 26, 1951, to be specific—when Charles Hard Townes, of Columbia University in New York, conceived his “maser” (microwave amplification by stimulated emission of radiation) idea while sitting on a park bench; he demonstrated the first maser in 1954 at Columbia. In 1957, Gordon Gould, a graduate student at Columbia University, coined the term “laser” (light amplification by stimulated emission of radiation) in his lab notes (which he had notarized), describing how to construct one. In 1957, Townes sketched an early optical maser in his lab notes as well. In 1958, Townes, working as a consultant for Bell Labs, and his brother-in-law, Bell Labs researcher Arthur L. Schawlow, showed in a joint paper published in Physical Review Letters that masers could be made to operate in the optical and infrared regions and proposed how this could be accomplished. At the Lebedev Institute in Moscow, Nikolai G. Basov and Alexander M. Prokhorov were also exploring the possibilities of applying maser principles in the optical region. On May 16, 1960, Theodore H. Maiman, a physicist at Hughes Research Laboratories in Malibu, California, constructed the first laser by shining a high-powered flash lamp on a ruby rod with silver-coated surfaces.[149]

In 1961 Elias Snitzer, of American Optical Corporation, published a theoretical description of single-mode fibers: fibers with a core so small that they carry light in only one waveguide mode. Snitzer’s fiber was suitable for medical instruments that look inside the body, but its light loss of one decibel per meter was far too high for communications, which needed to operate over much longer distances with a loss of no more than 10 or 20 decibels per kilometer. By 1964, Charles K. Kao had identified that figure, 10 to 20 decibels of loss per kilometer, as the critical theoretical specification for long-range communications devices, and it became the standard. Kao also showed that a purer form of glass would be needed to reduce light loss. In 1970, Corning Glass researchers Robert Maurer, Donald Keck, and Peter Schultz, experimenting with fused silica (a material capable of extreme purity, with a high melting point and a low refractive index), invented optical fiber, or “Optical Waveguide Fibers” (patent #3,711,262), capable of carrying 65,000 times more information than copper wire. This fiber allowed information carried by a pattern of light waves to be decoded at a destination even a thousand miles away. The team had solved the problems presented by Kao, spawning the revolution in optical fiber communications technology that ensued.[150]
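The gulf between Snitzer's fiber and Kao's target can be made concrete with a short calculation: attenuation in decibels maps to a remaining power fraction as P_out/P_in = 10^(-dB/10). The loss values below come from the text; the calculation itself is just the standard decibel formula.

```python
def power_fraction(db_loss):
    """Fraction of optical power remaining after db_loss decibels of attenuation."""
    return 10 ** (-db_loss / 10)

# Snitzer-era fiber: ~1 dB of loss per meter is 1000 dB over one kilometer,
# leaving effectively no light at all at the far end.
early_fiber = power_fraction(1000)

# Kao's target of 20 dB per kilometer: 1% of the launched light survives 1 km,
# a level workable for long-range communication.
kao_target = power_fraction(20)
```

A loss of 1 dB per meter thus attenuates a signal by a factor of 10^100 over a kilometer, which is why the kilometer-scale decibel budget, not the material alone, defined the path to practical fiber communications.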

In the 1960s, at the University of Illinois, Joseph Verdeyen developed the coupled-cavity laser interferometer, a major advance in plasma and laser diagnostics. Interferometers are used to make ultra-sensitive measurements of the refractive index of a plasma or a material, which allows one to determine physical parameters that are critical to optical devices such as lasers or lamps. Verdeyen also was the first to publicly demonstrate a thermally pumped carbon-dioxide laser, publishing a paper in Applied Physics Letters in 1969 showing that the CO2 laser, which operates in the infrared, could be pumped by heat alone.[151] He later became the Director of the ERC for Compound Semiconductor Microelectronics at the University of Illinois, Urbana-Champaign.

NSF/ERC Involvement

During the 1970s, NSF provided some support for engineering research in photonics and lightwave technology through the physics division, but with the creation of the Directorate for Engineering in 1985, a new Lightwave Technology Program was instituted that provided support sufficient to cover both single-investigator and small-group research projects. The ERC Program, also begun in 1985, provided an opportunity for academics working in optics, photonics, and optoelectronics to join with industry to explore emerging high-risk research and technological opportunities in the field, together with a long-term opportunity to experiment at the systems level with proof-of-concept testbeds and expand the technical workforce in optical technology. Early ERCs, funded between 1985 and 1987, explored the use of optics as a means for transmitting signals in intelligent optical networks, optical interconnect systems, and computing systems. Later, ERCs funded between 2000 and 2010 focused on optics technology in medicine, extreme ultraviolet radiation technology, quantum cascade laser technologies, state-of-the-art lighting, and optoelectronic technologies for high-bandwidth, low-cost, widespread access and aggregation networks.

1.      Optics and Communication Networks

The Center for Telecommunications Research (CTR), funded at Columbia University in 1985, had a broad mandate to pursue research into many aspects of telecommunications in an interdisciplinary way. One of its five thrust areas was optoelectronics, making it the first ERC with an optoelectronics focus. Its vision was to advance optically based broadband communications networks to link people and machines worldwide. The first Director was Mischa Schwartz; Anthony Acampora took over the leadership of the center in its sixth year. The ERC focused on multi-wavelength lightwave networks, as opposed to point-to-point systems, to exploit the enormous potential of optical communications (terabits per second). The state of the art at that time focused on applications of optical technology to telecommunications—more appropriately characterized as lightwave transmission systems than lightwave networks. This was an incremental approach that would not fully exploit the potential of photonics. In contrast, the CTR focused on “revolutionary” new network architectures that might provide significantly improved telecommunications infrastructure to support broadband applications over local, metropolitan, and wide-area service regions. The vision was not simply to replace copper wiring and electronic equipment with their optical counterparts. Rather, the Center sought to exploit the enormous bandwidth potential of the optical medium itself, a potentially accessible bandwidth of 100,000 gigahertz. The speed mismatch between the potential bandwidth of the medium and the capabilities of existing electronic and optoelectronic devices (semiconductor lasers, etc.) gave rise to an electro-optical bottleneck, whereby no user could individually access more than a tiny fraction of the available spectrum at that time.
This bottleneck, coupled with other limitations of state-of-the-art lightwave devices (e.g., gain flatness and spectral width of optical amplifiers) presented unique constraints, requiring unique new systems approaches for tapping the multiterahertz spectrum.
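The scale of that bottleneck is simple arithmetic. The 100,000 GHz spectrum figure is the one cited above; the per-transceiver electronic rate below is an illustrative assumption, since exact device speeds varied across the period.

```python
# Back-of-envelope sketch of the electro-optical bottleneck described above.
optical_spectrum_ghz = 100_000   # potential fiber bandwidth cited by CTR
electronic_channel_ghz = 10      # assumed bandwidth one electronic transceiver can handle

# A single user tapping the fiber electronically reaches only a tiny slice:
fraction_per_user = electronic_channel_ghz / optical_spectrum_ghz   # 0.01% of the spectrum

# Multiplexing many such channels onto different wavelengths, as in CTR's
# lightwave networks, is what could recover the rest of the spectrum:
max_wavelength_channels = optical_spectrum_ghz // electronic_channel_ghz
```

Under these assumed numbers, one electronic transceiver sees a ten-thousandth of the medium's capacity, which is why CTR pursued multi-wavelength network architectures rather than faster point-to-point links.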

The team joined together skills in electrical engineering, physics, and computer science. The Center’s ACORN project (Advanced Communications Organization for Research Networks) provided a joint university/industry proof-of-concept testbed to experiment in leading-edge lightwave-based telecommunications infrastructure. The approach was to let optical media do what optics does best (move vast amounts of information across large geographical distances), while letting electronics do what electronics does best (process, store, and route information). The testbed systems supported both circuit and self-routing packet switching. The underlying cross-disciplinary research accomplishments in optoelectronics included: (1) optically powered, efficient circuit-switching linear lightwave architecture for connecting pairs of access stations via high-speed channels which are wavelength-multiplexed onto the optical medium; (2) the rearrangeable multihop architecture to support fast packet switching on top of a traffic-adaptive circuit switching lightwave network; and (3) implementation of the Teranet testbed to study technical feasibility.[152]

These systems-driven innovations advanced the state of the art in optics and optoelectronics by demonstrating, for the first time, that the vast capacity of optical fibers could be unleashed to support broadband multimedia communications over a wide area packet-based network.

The Center for Compound Semiconductor Microelectronics, established at the University of Illinois, Urbana-Champaign in 1987, focused on solving the interconnect problem in high-speed, high-density digital systems with rapidly advancing photonic and optoelectronic technologies. The ERC was initially led by Gregory Stillman; then by Joseph Verdeyen, who strengthened its systems focus; and finally by Stephen Bishop, who led it from its third year until graduation. Throughout the Center’s eleven years of effort as an ERC, the research evolved from its roots in devices and materials to a focus on optoelectronic integrated circuits (OEICs) and systems. Letting go of their single-discipline approach to research, the team came to recognize that implementing optical interconnects in high-performance digital systems would require the integration of photonic and electronic devices in monolithic OEICs, which could offer high functionality and low-cost chip and packaging technology. With the encouragement of their site visit team members and their Technical Advisory Committee, the team reached a system-level goal, achievable only in a center-level setting because it required the integration of knowledge across several disciplines and systems-level testbeds. The initial testbed was a system-within-the-chip, which evolved to a fully systems-driven approach that was controlled by the role of the OEICs within a higher-level optical interconnect system. This was the “chip-within-the-system,” which recognized that the design constraints imposed on the OEIC chip are increasingly dominated by the functionality, performance, packaging, and manufacturing cost considerations of the specific systems within which the chips must function.

To overcome the limitations of point-to-point optical interconnects and focus on systems-driven devices and packaging, which were bottlenecks to the growing optoelectronics industry, the ERC streamlined its systems focus with a lightwave systems testbed—iPOINT (Illinois Pulsar-based Optical Interconnect)—to fully exploit the potential of broadband fiber optics links, Asynchronous Transfer Mode (ATM) protocol, software/hardware co-design, computer-aided optimization of components specifications, and application software for multimedia services.

The ERC’s system design and analysis architecture was designed to guide optoelectronic devices and materials development projects through realistic assessment of device potentials and the determination of fundamental limits of devices and subsystems. A model-independent optoelectronic circuit simulator, iSMILE, was used to assess optoelectronic and semiconductor devices and circuits by a range of industrial partners, including McDonnell-Douglas, Hewlett-Packard, Texas Instruments, and the Boeing Advanced Technology Center. Boeing acknowledged the critical role of iSMILE in its photoreceiver development, and the student who developed the simulator, A.T. Yang, founded a business, Anagram, in Silicon Valley.[153]

2.      Optoelectronic Computing

Light has natural advantages over electronics: it does not share electronics’ time-response limitations, needs no insulators, and can transmit hundreds of photon signal streams simultaneously using different color frequencies. Optical signals are immune to electromagnetic interference and free from electrical short circuits. In addition, they offer low-loss transmission and large bandwidth, capable of carrying several channels in parallel without interference. Photonic materials can propagate signals within the same or adjacent fibers with essentially no interference or crosstalk; they are compact, lightweight, and inexpensive to manufacture, as well as more facile with stored information than magnetic materials. By replacing electrons and wires with photons, fiber optics, crystals, thin films, and mirrors, researchers anticipated opportunities to build a new generation of computers.[154]

To exploit these advantages of optics, the vision of the Optoelectronic Computing Systems Center (OCS) at the University of Colorado (CU), in partnership with Colorado State University, was to combine the advantages of photonics and electronics in exploring optoelectronic devices and systems for computing, signal processing, and artificial intelligence. The ERC was initially led by W. Thomas Cathey, a CU Professor of Electrical Engineering who specialized in pattern recognition, holography, and laser arrays. Multidisciplinary research thrusts included: (1) the design and fabrication of a bit-serial optical computer as a testbed for determining device reliability and functional complexity; (2) fabrication of novel devices using new materials and processes; (3) the development of a general methodology for identifying optoelectronic signal processing systems; and (4) the design of symbolic computers and associative memories for optical artificial intelligence.[155]

This ERC was started in 1987, at a time when such goals were high-risk but carried a potentially high long-term payoff—just the type of area an ERC should focus on.[156] Several unique research challenges arose from these goals, including new neural network algorithms tailored specifically to the application of optics in the design of neural network computing machines; speed-of-light digital computer architectures in which the information is in constant motion and is never static or stored in a matching memory; and ferroelectric liquid crystal materials tailored for use in spatial light modulators and optoelectronic neural computing systems. The research program involved a team of computer scientists, physicists, semiconductor device engineers, electrical engineers focused on digital signal processing, and mechanical engineers.

The systems challenges were structured to explore architectures and algorithms that exploit a different potential advantage of optics: high connectivity, speed, and parallelism. The three systems research programs were: (1) optoelectronic connections to develop the computing possibilities of highly interconnected dynamical systems implemented with optoelectronic materials and devices; (2) digital optical computing to explore computer architectures for high-speed and speed-scalable architectures and computing systems using optoelectronic devices; and (3) optical signal processing to explore the parallel processing capabilities of optics. These were supported by the materials and devices research program focused on identifying and exploring new optoelectronic materials and devices that might offer substantial increases in performance. These four programs were integrated by system needs and device capabilities.

The ambitious goals of the research program challenged this interdisciplinary team of faculty and students to undertake pioneering research in optoelectronics, which produced a range of fundamental breakthroughs and new technologies and a new generation of optoelectronic engineers skilled in both optoelectronic systems and devices. An example of a technology breakthrough comes from cross-faculty collaboration by Garrett R. Moddel, a Professor in the CU Physics Department, and Kristina M. Johnson, a Professor in the CU Electrical and Computer Engineering Department, who were awarded a patent in 1990 for an optically addressable spatial light modulator (SLM) as a result of the engineering challenges posed by their collaboration in the ERC.[157] These SLMs became an important tool throughout the ERC’s research program and beyond in the development of optoelectronic technology.

By 1993, Cathey transitioned the leadership to Kristina Johnson, a young faculty member with a Ph.D. in Electrical Engineering, whom he had encouraged to pursue engineering while she was still in high school in Denver. During the 1990s, he focused on stimulating a wide range of photonics startups along the Piedmont east of Boulder. For example, Cathey, R.C. “Merc” Mercure (the Deputy Director of the ERC), and Ed Dowski (one of Cathey’s students) spun out CDM Optics, Inc., in Boulder. “CDM is the exclusive licensee of patented Wavefront Coding technology, which increases the performance of a camera system by increasing the depth of field or correcting optical aberrations of a photographic image. It uses novel optics and innovative algorithms to transform the essential task of focusing a lens from an optomechanical process to one of optical encoding and signal processing. By merging optics design and digital signal processing, Wavefront Coding can, for example, significantly expand the depth of field of an image, meaning that the image is in focus over a much wider range of distances from the lens than is possible using conventional focusing systems. By eliminating motors and actuators, the technology significantly reduces the size and complexity of the auto-focus function on a camera module.”[158] In 2005, OmniVision Technologies Inc., a supplier of CMOS image sensors, acquired CDM Optics for $30 million in cash and stock.

Kristina Johnson was also responsible for spinning out several start-up firms in Boulder from the ERC's technologies, including: Chorum, applying liquid crystals to telecom (market cap over $250M); Colorado Microdisplay (liquid-crystal-on-silicon head-mounted displays); and ColorLink (fast-switching color components for electronic projectors, including TVs and computer monitors). For ColorLink, Johnson and Gary Sharp, a student at the ERC, worked on birefringent materials, which are materials whose refractive index depends on the polarization and propagation direction of light. The most visible applications of birefringent filters have been in the entertainment and electronics industries. In 2007, ColorLink was sold to RealD, a leader in the 3-D imaging field.
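The wavelength dependence that makes birefringent plates useful for color management can be seen in a short numerical sketch. The numbers below are textbook-style illustrative values (a quartz-like birefringence of 0.009), not ColorLink's actual materials or designs:

```python
def retardance_waves(delta_n, thickness_nm, wavelength_nm):
    """Optical path difference between the two polarizations, in waves."""
    return delta_n * thickness_nm / wavelength_nm

delta_n = 0.009                             # n_e - n_o, quartz-like (illustrative)
design_wl = 550.0                           # nm, green light
thickness = 0.5 * design_wl / delta_n       # half-wave plate at 550 nm (~30.6 um)

ret_green = retardance_waves(delta_n, thickness, design_wl)  # exactly 0.5 waves
ret_red = retardance_waves(delta_n, thickness, 650.0)        # below 0.5 waves
ret_blue = retardance_waves(delta_n, thickness, 450.0)       # above 0.5 waves
```

Because the retardance falls below a half wave in the red and rises above it in the blue, stacks of such plates between polarizers pass different colors differently, which is the principle behind fast-switching color filter components.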

3.      Millennial Optics ERCs Funded in the 2000s

There was a long hiatus between the support of the OCS in 1987 and later ERCs with a significant component devoted to optics-based technology. ERCs funded between 2000 and 2010 focused on optics technology in medicine, extreme ultraviolet radiation technology, quantum cascade laser technologies, state-of-the-art lighting, and optoelectronic technologies for high-bandwidth, low-cost, widespread access and aggregation networks.

The first of this generation of ERCs was the Center for Subsurface Sensing and Imaging Systems (CenSSIS), funded in 2000. This ERC was headquartered at Northeastern University (NU), in partnership with Boston University (BU), the University of Puerto Rico at Mayaguez (UPRM), and Rensselaer Polytechnic Institute (RPI). With Michael Silevitch (electromagnetics) in the lead role as Center Director and Bahaa Saleh (photonic sensors and physics-based imaging) from BU serving as the Deputy Director, the CenSSIS mission was to revolutionize the existing technology for detecting and imaging biomedical, environmental, or geophysical objects or conditions that lie underground or underwater or are embedded in the human body. The Center’s unified, multidisciplinary approach combined expertise in wave physics (photonic, ultrasonic, electromagnetic, etc.), sensor engineering, image processing, and inverse scattering to create new sensing modalities and prototypes that were to be transitioned to industrial, clinical, and government partners for further development.

While the ERC used several imaging modalities, from optics to X-rays to radar, an important contribution lay in the development of image optimizing algorithms to improve detection of objects, cells, or masses from fewer images. “A novel idea of signal processing that emerged during the lifetime of CenSSIS was the concept of compressed sensing. In compressed sensing, the fundamental idea is that one can reconstruct ‘sparse’ signals using many fewer measurements than would normally be predicted by sampling theory.”[159]
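The compressed-sensing idea quoted above can be demonstrated in a few lines: a sparse signal is recovered exactly from far fewer random measurements than its length. This is a hedged sketch using Orthogonal Matching Pursuit, one standard recovery algorithm, with arbitrary illustrative dimensions; it is not the specific method CenSSIS deployed:

```python
import numpy as np

rng = np.random.default_rng(7)
n, m, k = 128, 48, 4        # signal length, number of measurements (m << n), sparsity

# Ground truth: a k-sparse signal
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.choice([-3.0, -2.0, 2.0, 3.0], size=k)

# Random Gaussian measurement matrix and compressed measurements y = A @ x
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily pick the column most correlated
    with the residual, then re-fit by least squares on the chosen support."""
    residual = y.copy()
    idx = []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_rec = omp(A, y, k)
rel_err = np.linalg.norm(x_rec - x) / np.linalg.norm(x)
```

Here 48 measurements suffice to reconstruct a length-128 signal because only 4 entries are nonzero, which is the sense in which sparse signals need "many fewer measurements than would normally be predicted by sampling theory."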

This resulted in new approaches to image reconstruction with sparse arrays of sensors, applied in several domains. One of these is the Semi-Analytic Mode-Matching (SAMM) method, which allows quick simulation of rough-interface buried-object scattering from either plane wave or borehole sources. It is a first-pass algorithm that produces fairly accurate forward models orders of magnitude faster than comparable finite-difference algorithms; a 3D SAMM simulation can take two to three minutes on a laptop, compared to two hours for an alternative method using an Alpha supercomputer.[160] Most importantly, the capacity of CenSSIS researchers to develop high-speed image processing algorithms led to a partnership between the CenSSIS team and radiologists in the Massachusetts General Hospital (MGH) breast imaging group. The ERC developed a parallelized version of the serial maximum-likelihood reconstruction algorithm, reducing its execution time from four hours to less than five minutes. After clinical trials in partnership with General Electric, the FDA approved a commercial product, GE SenoClaire 3-D breast tomosynthesis, in the U.S. That equipment and similar systems from other manufacturers are now widely used in breast imaging.

Technologies developed in the CenSSIS ERC to capture optical signals were:

  • Quantum Optical Coherence Tomography (Q-OCT), an enhancement of optical coherence tomography (OCT), which uses light sources endowed with special spatiotemporal correlation properties called entanglement to improve the resolution of OCT—even in biological tissue—offering a higher signal-to-background ratio and axial resolution greater by a factor of two for the same source bandwidth.[161]
  • Acousto-optic imaging (AOI), a dual-wave method for biomedical imaging that promises to improve image resolution at greater depth by combining diffuse laser light with focused ultrasound; it can reveal optically relevant physiological information while maintaining ultrasonic spatial resolution. A patent was awarded in 2010.[162]

Combining funds from the ERC and the Keck Foundation, the ERC developed an optical imaging testbed: a unique, state-of-the-art 3-D Fusion Microscope (3DFM) that images specimens using multiple sensors simultaneously to perform true three-dimensional imaging. The microscope offers the ability to compare images from different modalities and to fuse them into single images offering greater information content. For the first time, non-invasive optical cell counting in complex multi-celled embryos was possible, supported by algorithms that can accurately count cells in embryos almost up to the blastocyst stage. Carol Warner and her team indicate that: “New imaging data obtained from the Keck 3DFM, combined with genetic and biochemical approaches, have the promise of being able to distinguish healthy from unhealthy oocytes and embryos in a non-invasive manner. The goal is to apply the information from our mouse model system to the clinic in order to identify one and only one healthy embryo for transfer back to the mother undergoing an ART (Assisted Reproductive Technology) procedure. This approach has the potential to increase the success rate of ART and to decrease the high, and undesirable, multiple birth rate presently associated with ART.”[163]

These systems-driven innovations advanced the state of the art in optics and optoelectronics by demonstrating the effectiveness of these emerging methods in real-world applications.

The ERC for Extreme Ultraviolet Science and Technology (EUV) was established in 2003 as a partnership between Colorado State University (CSU), the University of California, Berkeley (UCB), and the University of Colorado (CU), Boulder. The Director, Jorge Rocca, a CSU Professor of Electrical Engineering, had been a member of the OCS and the Deputy Director, Margaret Murnane, is a Professor of Physics at CU.

The vision of the EUV ERC, which graduated from ERC Program funding in 2013, was to: (1) advance the technology of small-scale, cost-effective coherent EUV sources and (2) demonstrate their utility by integrating them into testbed applications such as high-resolution imaging, materials metrology and characterization, elemental and molecular spectro-microscopy, ultrafast X-ray science, nanoscience, and nanofabrication. The research was focused on a fundamental understanding of optical science, light generation, and optical instrumentation technologies in the extreme ultraviolet spectral region. EUV wavelengths are 10-100 times shorter than those of visible light, enabling technology that can “see” smaller features, “write” smaller patterns, and “generate” shorter pulses. Photon energies of tens to hundreds of eV interact strongly with matter, are well matched to atomic resonances in most elements, and allow element- and chemically-specific spectroscopies. Thus, with these advances in EUV technology, optical measurement techniques that over the past century had relied heavily on radiation in the visible region of the spectrum could be extended to the EUV using tabletop sources invented by members of the proposing team. These tabletop EUV sources have tremendous potential for engineered systems in lithography, metrology, microscopy for imaging nanodevices and materials surfaces, nanomachining, and materials and biological imaging at very high spatial resolutions of 20 nm and beyond. The EUV ERC proposed to generate and exploit tabletop, high-repetition-rate EUV (5-50 nm) sources with spatially coherent average powers previously available only at large synchrotron facilities. Prior to the ERC, research in EUV coherent radiation was mostly limited to a handful of large national facilities.
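The relationship between the quoted EUV wavelengths (5-50 nm) and photon energies in the tens to hundreds of eV follows directly from E = hc/λ. A quick check, including the 13.5 nm wavelength commonly used in EUV lithography:

```python
HC_EV_NM = 1239.841984  # Planck constant times speed of light, in eV·nm

def photon_energy_ev(wavelength_nm):
    """Photon energy in eV for a vacuum wavelength given in nm."""
    return HC_EV_NM / wavelength_nm

# Long-wavelength edge, the lithography wavelength, and the short-wavelength edge
energies = {wl: photon_energy_ev(wl) for wl in (50.0, 13.5, 5.0)}
```

This gives roughly 25 eV at 50 nm, 92 eV at 13.5 nm, and 248 eV at 5 nm, consistent with the tens-to-hundreds-of-eV range cited in the text.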

Throughout its ten years of operation under NSF support the EUV ERC made many significant achievements in fundamental science and technology. These include:

  • At the time of the EUV ERC's inception in 2003, only a handful of basic science experiments using coherent EUV light had been conducted outside of large national light source facilities. The wavelength range of compact coherent EUV sources was very limited and corresponded mostly to the long-wavelength limit of the EUV spectrum (30-50 nm). Breakthroughs in both EUV lasers and High Harmonic Generation sources at the NSF EUV ERC greatly expanded their spectral coverage down to 1 nm, increased their average power by several orders of magnitude, and in some cases reduced the source size to desktop scale. Tabletop EUV lasers were demonstrated for the first time at wavelengths below 10 nm, with sufficient pulse energy to render single-shot images with nanoscale resolution, and produced a record average power of 0.1 mW in the 13-18 nm spectral region for applications. High harmonic sources achieved full phase matching at wavelengths of <8 Å, coherently combining >5000 mid-IR photons to generate bright soft X-ray beams with coherent bandwidths sufficient to support isolated 2.5-attosecond pulses.
  • Imaging experiments with the tabletop coherent EUV sources improved the resolution down to 22 nm and demonstrated movies of nanoscale dynamic interactions for the first time using a tabletop setup. In the area of metrology, high harmonic pulses were used to measure the limiting demagnetization speed in widespread magnetic alloys and multilayer systems with <10 fs time resolution, yielding many surprising results. In acoustic nano-metrology, the first method to characterize the mechanical properties of very thin (<50 nm) films and to probe heat flow in 1D and 2D structures below 30 nm was demonstrated. Transient attosecond absorption allows the direct measurement of quantum material processes on attosecond timescales. A new EUV laser ablation nanoprobe was developed to map the chemical composition of samples in three dimensions with nanoscale resolution; this probe has the potential to map the chemical composition of biological specimens at the sub-cellular level. In the area of nanoscale patterning, error-free printing was demonstrated by coherent illumination of a mask with a compact EUV laser, and dense patterning at a record small 15 nm half-pitch was achieved in a chemically amplified resist using synchrotron light. By offering X-rays, the ultimate “probe light,” coherent X-ray beams promise revolutionary new capabilities for understanding and controlling how the nanoworld works on its fundamental time and length scales. This knowledge is highly relevant to next-generation electronics, data storage devices, and diagnostics of cells at the nanoscale. The unique ability of ultrafast X-rays to probe functioning materials is uncovering new understanding of how electrons, spins, and phonons behave at spatio-temporal limits.
  • The Center has supported industry in the development of new manufacturing technologies based on EUV light, crucial to the nation's economy. In particular, Center graduates have made key contributions to the development of EUV sources for lithography, a multi-billion-dollar business, for the manufacturing of the most advanced computer processors and memory.[164] The leading EUV source manufacturer alone hired eight Ph.D. Center graduates. A primary roadblock for EUV lithography had been the silicon chip manufacturers' requirement of 250 watts of source power to achieve a throughput of 125 wafers per hour. The lithography vendor ASML and Cymer, which ASML acquired in 2013, had been pushing the technology to hit that mark, and in 2017 ASML announced that it had reached that milestone: 250 watts of source power, a 10-fold improvement over the 25 watts available in 2012. These advances enabled volume manufacturing of semiconductor chips with EUV light to become a reality in 2018. Collaboration between Cymer and the EUV ERC researchers, along with the work of several Center graduates now in industry, helped make this advance possible.[165]
  • Center graduates have made crucial contributions to the development and implementation of EUV technologies in manufacturing and have received awards from their companies for their achievements; many are now in leadership positions. The Center contributed to the creation and growth of new small companies. Compact coherent sources developed at the Center are now commercially available and are making an impact in institutions worldwide.

The Mid-infrared Technologies for Health and the Environment (MIRTHE) ERC was established in 2006 at Princeton University (PU) with the following partner institutions: City College of New York, Johns Hopkins University, Rice University, Texas A&M University, and the University of Maryland Baltimore County (UMBC). The Director was Claire M. Gmachl and the Deputy Director was Anthony M. Johnson (UMBC). MIRTHE first initiated, then continued, the development of a new platform of trace gas sensor systems that provide unprecedented performance and cost-effective sensing capabilities. These sensor systems are based on mid-infrared Quantum Cascade (QC) laser spectroscopy and excel through their sensitivity, specificity, compactness, autonomy, networking capability, and fast time response. They fulfill the application requirements of trace chemical sensing on the individual, urban sensor network, regional, and global scales. In doing so, MIRTHE addresses the important societal challenges of securing a clean, safe, sustainable, and healthy environment; clean air to breathe; and accessible healthcare on the national and global scale. In addition to environmental sensing at various length scales, several other system-level sensor applications pose similar challenges: industrial process control, automotive monitoring, homeland security, safeguarding of public spaces, and public and personal health. For these applications MIRTHE has been developing trace-gas sensing technologies that are high-performance, cost-effective, and field-deployable, with high temporal resolution, and hence capable of addressing the important system-level needs.[166]

The research at MIRTHE rested on foundational work encompassing the research and development of mid-IR light sources and detectors, together with the materials research necessary to enable advances in QC lasers and other mid-IR devices and the related knowledge that supports the systems and testbeds. The key research areas were: high-performance QC lasers; III-V growth; QC lasers in new materials (II-VI, nitrides); highly sensitive detectors; integrated mid-IR photonics, chalcogenides, and novel materials; and ultra-fast and nanoscale characterization.[167]

Technology applications include:

  • An open-path QCL (QCLOPS) system deployed and tested during the 2008 Olympic period in Beijing, China. The Beijing project also included deployment of a point sensor for nitrous oxide and implementation of WRF-Chem[168] as a component of weather and air quality forecasting for the Olympic period. Broadly tunable external cavity quantum cascade laser (EC-QCL) systems proved to be excellent sources for spectroscopic applications and trace gas sensing. For the first time, MIRTHE demonstrated broadband thermal infrared heterodyne spectro-radiometry over a frequency range of more than 100 cm^-1, using an EC-QCL operating at 8.4 µm as a tunable local oscillator. A first prototype EC-QCL system was developed, and wavelength tuning rates up to 5 kHz were demonstrated for a pulsed EC-QCL operating at 10.2 µm.
  • An open-path, QC laser-based methane sensor for long-path (> 1 km) integrated measurements was developed and deployed over a two-week period in the Arctic. The system demonstrated stable and precise (0.5%) measurements of CH4 even in rapidly changing field conditions at Toolik Lake. Elevated methane emissions were observed over the lake.
  • Compact sensor platforms were developed using state-of-the-art, mid-infrared, thermoelectrically cooled (TEC), continuous wave (CW), distributed feedback (DFB) interband cascade lasers (ICLs) for the detection of greenhouse gases, pollution field monitoring, and natural gas leak detection of methane (CH4) and ethane (C2H6). This was accomplished by focusing on two novel sensing technologies: tunable direct laser absorption spectroscopy (TDLAS) and quartz-enhanced photoacoustic spectroscopy (QEPAS). Significant achievements include the sensitive and selective detection of CH4 at 3.29 µm (3039.5 cm^-1) and C2H6 at 3.36 µm (2976.2 cm^-1) at ppbv (parts per billion by volume) concentration levels.
  • A new type of lens, based on “super-oscillation” of waves, that enables measurement of an object with sub-wavelength accuracy in all three dimensions. Major results included improved lens design and improved object/beam control. Actively mode-locked external cavity mid-IR QCLs operating in the whole dynamic range of a laser were demonstrated. A compact, ultra-broadband terahertz polarizer based on macroscopically long and aligned carbon nanotubes was fabricated and demonstrated.[169]
  • The Tunable Acoustic Gradient Index of Refraction Lens (TAG Lens) was developed at MIRTHE and licensed in 2017 to a MIRTHE start-up company, TAG Optics, which was acquired by Mitutoyo. The acquisition assures further market penetration of the lens. This lens is transforming how optical focusing is performed. Traditional optical systems were unable to rapidly change focal position or control depth-of-field independently of magnification. The TAG Lens is a computer-controlled adaptive lens that works with existing optical assemblies and can rapidly change focus for any user-defined applications, including high-resolution imaging, high-throughput industrial/biomedical scanning, and laser processing. It can be integrated into machine vision, microscopy, or laser processing optical systems. The lens increases throughput and reduces downtime due to mechanical system failure, while leveraging existing capital equipment.[170]
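The wavelength/wavenumber pairs quoted above for the methane and ethane detection lines follow from the standard spectroscopic conversion λ[µm] = 10,000 / ν̃[cm^-1]:

```python
def wavelength_um(wavenumber_cm1):
    """Vacuum wavelength in micrometers for a spectroscopic wavenumber in cm^-1."""
    return 1.0e4 / wavenumber_cm1   # 1 cm = 10^4 micrometers

ch4_um = wavelength_um(3039.5)    # methane line quoted in the text (~3.29 um)
c2h6_um = wavelength_um(2976.2)   # ethane line quoted in the text (~3.36 um)
```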

The Smart Lighting ERC was funded in 2008 and was later renamed the Lighting Enabled Systems & Applications (LESA) ERC. The Center is led by Robert F. Karlicek, Jr.; the Deputy Director until 2013 was Partha S. Dutta, who was succeeded that year by Richard Radke. Graduating from ERC support in 2019, it is headquartered at Rensselaer Polytechnic Institute (RPI) and functions in partnership with Boston University (BU), the University of New Mexico (UNM), and Thomas Jefferson University (TJU). The vision of the ERC over many years has been to develop intelligent digital lighting systems that promote health, productivity, and energy savings. The envisioned smart lighting systems would optically sense the environment to provide energy-efficient, controllable, and comfortable illumination when and where it is needed. Beyond illumination, smart lighting systems will simultaneously provide high-speed data access and scan for biological and biochemical hazards. Furthermore, the same data used to autonomously optimize illumination can inform building management systems, such as heating, ventilation, and air conditioning (HVAC), for more efficient operation. The ERC has undertaken the challenging task of developing smart lighting systems able to provide wireless communication and new sensing capabilities in tandem with revolutionary illumination features. State-of-the-art commercial solid-state lighting (SSL) products have already demonstrated performance improvements in illumination and energy efficiency. The ERC is advancing the technology on several fronts:

  • Smart Spaces – research on photonic sources with improved efficacy, greater spectral control, and greater temporal control of LEDs, leading to the following advances:
    • Successfully demonstrated the first monolithic integration of high electron mobility transistors (HEMTs) and light emitting diodes (LEDs) on the same gallium nitride (GaN) chips, leading to a new class of light emitting power integrated circuits (LEPICs). LEPICs can play a role in cost-effective monolithic integration of electronics and LED technology for new lighting-enabled systems, because they offer improved power and frequency performance characteristics that help operate LEDs more effectively on the common platform enabled by GaN materials. This should make it possible to incorporate drivers, dimming circuits, high-speed digital switching, and other smart control functions in a single reliable and cost-effective package.[171]
    • Patent awarded, with wider applications, for “Growth of Cubic Crystalline Phase Structures on Silicon Substrates and Devices Comprising the Cubic Crystalline Phase.” The invention uses the non-polar facet of GaN in a material system to demonstrate emissions across the visible spectrum: patterning the silicon surface with an array of nanoscale grooves leads to a unique, geometrically driven phase segregation that promotes growth of the cubic phase of GaN. Quantum-well layers are then added to create the light-emitting material used to fabricate cubic LEDs, offering more colorful and efficient displays.[172]
    • Multiple patents awarded for engineering nanocomposites with high refractive indices and high optical clarity for improving the efficiency of LEDs and optical systems used in LED luminaires.[173]
    • Patent awarded for a method of using LED lights instead of cameras to detect the presence, pose, and location of a room's occupants, using time-of-flight (ToF) sensing that converts light travel time into physical distance. This facilitates mapping of the space and determining the locations and heights of individuals and other objects in the room, which can be useful in building management, such as healthcare facility management.[174]
  • Visible Light Communications – low data rate control-enabling visible light optical communications and high data rate visible light-based wireless access—
    • Demonstrated a highly dense wireless network comprising 15 small lighting cells, relying on light's directional characteristics to narrow the coverage area of each access point, thereby increasing wireless access and bandwidth for smart devices while producing far less interference than RF technologies, which are omnidirectional and difficult to control over small coverage areas.[175]
  • Mediation of Human Circadian Rhythms for Health and Wellness—
    • Computer-controlled lighting system designed with products from ERC industry members Telelumen, Heptagon, and Austria Microsystems that mimics the diurnal and seasonal variations of natural outdoor light and can be customized to provide light with qualities to treat sleep-wake disorders and medical problems such as light-deprivation-induced depression.[176]

These systems-driven innovations advanced the state of the art in optics and optoelectronics by expanding the applications of advanced optical materials through precision refractive-index engineering for improved light extraction efficiency and color control, demonstrating new light emitting circuit elements in which diodes and transistors can be integrated in a single high-powered device, and developing novel plenoptic optical sensors capable of simultaneously measuring digitally encoded light across the entire visible spectrum. Beyond energy-efficient illumination, these materials and device innovations are finding new applications in micro-LED display technologies as well as energy-efficient smart building and smart city applications.

The Center for Integrated Access Networks (CIAN) was funded in 2008. The University of Arizona (UA) is the lead institution, in partnership with the University of California at San Diego (UCSD), the University of California at Los Angeles (UCLA), the University of California at Berkeley (UCB), Columbia University, Norfolk State University, Tuskegee University, and the University of Southern California (USC). The ERC is led by Nasser Peyghambarian (UA) as the Director and Yeshaiahu Fainman (UCSD) as the Deputy Director. The research vision is to enable end-user access to emerging real-time, on-demand network services at data rates up to 100 Gbps anytime and anywhere, at low cost and with high energy efficiency. Additionally, the CIAN team's vision led to the development of low-cost optical technologies supporting the enormous bisection bandwidth of data centers.

The continued, rapidly increasing demand for bandwidth has created serious bottlenecks at multiple levels in the network that cannot be solved by simply scaling up the current technology. CIAN recognized this challenge, and its research specifically targets the following bottlenecks in communication networks: increased demand for higher subscriber access rates, scalability requirements for diversified users and applications, reduced latency tolerance for many applications, network management and control for dynamic bandwidth allocation, and demand for reduced energy consumption and, most importantly, lower cost. CIAN's mission is to develop optoelectronic technology—particularly network devices, silicon-based photonic integrated circuits (PICs) that can be manufactured at low cost using CMOS processes, and relevant architecture—to create transformative communication networks that address these emerging and existing bottlenecks both for data aggregation in regional networks and for data centers. The systems-integration testbeds are focused on data centers, optical aggregation, intelligent aggregation networks, and characterization of chip-scale PICs. As shown in Figure 5-33, these drive fundamental research in device physics and enabling technology research in silicon photonics.

Figure 5-33:  CIAN’s Strategic Research Plan (Source: CIAN)

Notable CIAN achievements in optics as of 2019 are:

  • Developed a new hybrid optical/electronic network routing that is the first of its kind to use optics to circuit switch large flows of data inside a data center, removing existing bottlenecks with higher speed and more advanced optical capabilities such as wavelength routing with three orders-of-magnitude improvement over the prior system.[177]
  • Discovered a way to establish non-symmetric mode propagation on a silicon chip, using a metallic-silicon optical waveguide system that channels light to travel in different modes depending on its propagation direction—symmetric when traveling forward and asymmetric when reflected backwards along the same path—thereby realizing a long-term goal of combining electronics with photonics to enable chip-scale integrated circuits and systems for scalable, energy efficient, and cost-effective information systems.[178]
  • Developed a new optical coding and modulation technique that could double or triple the speed at which data flows through optical cables by using multiple signal conditions to encode data—a multinary encoding—that goes beyond the on-off encoding of binary modulation, enabling data rates approaching 1 Pb/s at a time when the standard adopted in 2016 was 100 Gb/s Ethernet (100GbE), with potential uses in energy-efficient signal-constellation designs, spectral multiplexing, polarization-division multiplexing, and orthogonal-division multiplexing to produce serial optical transmission rates greater than 1 Pb/s.[179]
  • The CIAN team was the first to demonstrate holographic telepresence by recording 3D data in one location, transferring that data to a different location using the Internet and displaying the 3D data holographically.
  • CIAN's pioneering work in silicon photonics led to the creation of large programs in chip-scale integration, such as AIM Photonics.
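The "double or triple the speed" claim for multinary encoding above reflects the information carried per symbol: M distinguishable signal levels carry log2(M) bits, so at a fixed symbol rate, 4-level signaling doubles, and 8-level signaling triples, the rate of binary on-off keying. A minimal illustration:

```python
import math

def bits_per_symbol(levels):
    """Bits carried by one symbol that can take `levels` distinguishable states."""
    return math.log2(levels)

binary = bits_per_symbol(2)      # on-off keying: 1 bit per symbol
quaternary = bits_per_symbol(4)  # 2 bits per symbol: double the binary rate
octal = bits_per_symbol(8)       # 3 bits per symbol: triple the binary rate
```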

These systems-driven innovations advanced the state of the art in optics and optoelectronics by creating a new class of nanoscale laser sources and Si-photonics devices and circuits with novel functionality that can be manufactured with CMOS-compatible materials and processes. The technology developed by CIAN serves as a foundation for the development of scalable and cost-effective quantum communications and quantum sensing circuits and systems, essential to the competitiveness of the U.S. economy in future decades.

The other optoelectronics ERCs also advanced the field: the ERC based on extreme ultraviolet radiation technology created new short-wavelength and X-ray sources; the one based on quantum cascade laser technologies created novel mid-infrared semiconductor sources; and the one based on lighting produced new light-emitting diodes and sources for state-of-the-art lighting systems.

5-D(c) Research Team Development and Management

i. Forming the Team and Gaining Commitment

ERC research programs are designed to apply academic resources to achieve a technological vision. This requires building a team of faculty who are motivated by that vision to channel their creativity to carry out research to support and achieve that vision. Merely collecting a team of faculty who voice some interest in gaining support from the ERC to round out the support of their individual research programs will not achieve the goals of an ERC research program; those faculty should be avoided, no matter how prestigious their reputation. Rather, the faculty team member has to be willing to channel his or her curiosity and capability to address the challenges and barriers that stand in the way of the ERC achieving its vision. Care has to be taken at the proposal stage to engage potential faculty members in helping to design the vision and strategic plan so they can see how their research fits with the research plan and how they can gain new knowledge and new cutting-edge ideas from working at the interface of the disciplines that will be involved in the ERC’s research program.

The proposal process itself—as drawn out as it is through a pre-proposal, a formal proposal, a site visit, and a briefing to the final ERC Blue Ribbon Panel—can be used as both a commitment process and a weeding-out process for faculty. The vision should be challenging enough to generate a long-term commitment to the new ERC, while at the same time open to evolution and growth so that it remains at the cutting edge. The core faculty team should understand that the ERC is not a long-term sinecure that they alone can benefit from, but rather should be aware that as the research progresses, new barriers will continually arise that need to be addressed by new faculty, so that some of the early core research will necessarily be phased out.

ii. New Hires to Address the ERC’s Vision

Both in the proposal stage and through time, the ERC team needs to be strengthened by the addition of new faculty to the core or partner universities. The ERC Best Practices Manual recommendation regarding new faculty is that: “The Director of an ERC may resolve to build an effective faculty intake mechanism into the center, select the new team members with great care, and choose research management structures that allow the newcomers to share power and resources on an equal footing with all other participants. The intent here is to ensure that the ERC survives beyond the ten-year time horizon by accommodating growth and preventing stagnation.”[180]

Faculty are hired by departments, not by ERCs. However, the Center Director and often the Dean are engaged with the departments in these hires so that the new faculty members will serve the needs of both the departments and the ERC.

iii. Funding

Funding for faculty and student time spent on ERC research projects is provided from the ERC’s budget to the source departments’ accounts and allocated in support of those efforts. It is not provided directly to the faculty outside the university’s financial system.

Faculty may also be part of the ERC team through the allocation of resources from projects funded by sources outside the ERC budget. These are called “associated projects,” many of which are key to achieving larger testbed goals of the center, since this funding can sometimes be substantial.

iv. Obligations

Faculty receiving funding from the ERC, or including some or all of an associated project in the ERC’s research portfolio, are obligated to work under the ERC Strategic Plan and contribute to its evolution. They are also obligated to provide (1) data on resources used and outcomes achieved to the ERC database and achievements systems, and (2) text for the ERC’s annual reports and renewal proposals.

i. Rewarding and Valuing the ERC’s Cross-Disciplinary Culture

The Center Director, along with the associated dean(s) and department chairs, should establish a culture that values and rewards inter- and cross-disciplinary research. Preston presented these guidelines to the deans of engineering at the 2004 ASEE Deans’ Meeting in Washington, DC. See Figure 5-34.[181]

Figure 5-34: Rewarding and valuing cross-disciplinary research often requires a change in culture. (Source: Preston (2004), slide 14)

ii. Terminating Projects and Faculty

A robust research program that is guided by a strong and evolving strategic research plan can be the basis for decisions to terminate projects that are no longer productive in terms of their continued contributions to the plan. Input from advisory boards and the NSF site visit team will contribute to these decisions.

The decision to remove a faculty member from the research team can follow from the decision to terminate a project, if that faculty member has no new projects to propose to advance the strategic plan. At times a faculty member may need to be removed because he or she proves not to be an effective member of the team and does not share the goals of the ERC in terms of its vision, cross-disciplinary research culture, research plan, and student development. To maintain objectivity and keep such decisions from appearing arbitrary, it is best to set out “Rules of Participation” or the equivalent at the start of the ERC, providing consistent evaluation criteria.

The MIT Biotechnology Process Engineering Center (BPEC), in its first year in 1985, developed a “report card” governing the role of faculty in the ERC. It defined the “rules of participation” to include contributing to the vision of the ERC, working collaboratively on interdisciplinary teams, respecting input from industry, and the like. Today it would also include contributing to and working under the strategic plan. Since this ERC was at the forefront of integrating the disciplinary perspectives of biochemical engineering and biology to strengthen bioprocess engineering for the then-nascent biotechnology industry, it was breaking new ground at the interface of these two major disciplines. One very respected biologist joined the team but showed little respect for engineering and soon fell back into the familiar mold of his disciplinary culture, preferring to work alone. At the end of the first or second year, the Director of the ERC, Daniel I.C. Wang, decided to apply the structure of the ERC’s report card to all faculty and found that this faculty member received a failing grade. Wang decided to “fire” him from the BPEC team. A year or so later, as Wang told Preston, the faculty member returned to plead to be let back on the team: he had come to realize that the work he had been able to do with ERC funding was at the forefront of a new interdisciplinary field, and that only the ERC would support such innovative, high-risk research. He was returned to the team, participated effectively, and over time became one of the leaders of the field that BPEC was pioneering, biological engineering.

An excellent short essay by a successful ERC director on these and other matters is here.

5-E       Lessons Learned

The structure of the ERC research program that emerged over 30 years is a model for a broad range of programs of research that are focused on advancing innovative technologies and creating and sustaining cross-disciplinary research cultures that can generate future innovations. The ERC Program has developed principles of research management that have been tested over time and have proven successful across a wide range of technologies and disciplinary cultures.

These principles can be summarized as follows:

  • Teams should form an ambitious vision for next-generation technologies that are needed to address major economic or societal problems.
  • In some countries where R&D partnerships that merge academe and industry are a part of the economic culture, as opposed to the type of separations that exist in the U.S. economy, the product development phase can be carried out in closer partnership with the center.
  • Taking a systems approach to that vision provides a structure that is most relevant to real-world issues, as it requires that the integration of the technology and its commercialization be addressed along with associated societal and environmental issues.
  • Research goals and more detailed research objectives, organized into coordinated research thrusts, derive from strategic planning in light of the center’s overarching vision.
  • The 3-plane strategic planning construct provides a powerful visual tool for organizing and communicating the research needed to fulfill the vision; it must remain flexible over time to accommodate unforeseen developments and opportunities.
  • The research program must integrate basic and applied research with proof-of-concept testbeds to be able to address the complexities of the vision and deal with the barriers that necessarily arise as the testbeds uncover insufficiencies in knowledge or technology.
  • The research team must be formed from the vision and systems construct requirements, not from the research interests of faculty interested in support for their own work under the center’s “umbrella.”
  • Research team management in an academic setting is as much art as it is craft, and demands a careful balance between top-down control and freedom, especially in recruitment and weeding-out of faculty who do not fully support the center’s vision.
  • Inclusion of students as integral members of the research team eventually magnifies the center’s impact on the competitiveness of the industry on which its vision is focused.
  • ERC site visits have revealed a problem with the integration of students at partner institutions into the life of the center. Often there are only a handful of research projects at some of the partner and particularly affiliate institutions. To keep students from these institutions from feeling disconnected and “on their own” with independent projects, it is up to the center and its Student Leadership Council (SLC) to ensure that these students are integrated into the center.
  • Time and space need to be developed to enable faculty from diverse disciplines to learn each other’s research epistemologies, engage in joint research and even joint teaching, and find a shared space for collaboration in research, technology, and publication.
  • Center leadership should avoid the tendency for some partner institutions to be selected to “check the box” on diversity efforts and not to take full advantage of their research strengths by integrating their projects into the fabric of the center’s strategic plan.
  • Eventual users should be part of the cultural sphere of the research—industry that will develop future technology and users that would be impacted, especially in biomedical technology areas.
  • Collaborations with the commercial sector should be encouraged and translational research partnerships should be supported, especially with R&D-intensive small firms and start-ups, to help ensure the technology has support to bridge the high-risk development stages.

[1] National Academy of Engineering (1984). Guidelines for Engineering Research Centers: A Report to the National Science Foundation. Washington, DC: National Academy Press, p. 1.

[2] Ibid., p. 3.

[3] National Research Council (1986). Report to the National Science Foundation Regarding the Systems Aspects of Cross Disciplinary Engineering Research. Cross-Disciplinary Engineering Research Committee. Washington, DC: National Academy Press, pp. 8-9.

[4] NSF (1984). Program Announcement, Engineering Research Centers, Fiscal Year 1985, April 1984. Washington, DC: National Science Foundation, Directorate for Engineering, p. 1.

[5] Currall, Steven C., Toby Stuart, Sara Jansen Perry, and Emily Hunter (2007). Engineering Innovation: Strategic Planning in NSF-funded ERCs (NSF Award No. EEC-0345195). Houston, Texas: Rice University Press.

[6] Steven C. Currall, Ed Frauenheim, Sara Jansen Perry, and Emily M. Hunter (2014). Organized Innovation: A Blueprint for Renewing America’s Prosperity. Oxford, England: Oxford University Press.

[7] Ibid., p. 51.

[8] Fair, Richard and Lynn Preston, Guest Editors (1993). Engineering Research Centers: Goals and Results. Proceedings of the IEEE 81(1):3-9, 112.

[9] Ibid., p. 107.

[10] National Research Council (1986). Report to the National Science Foundation Regarding the Systems Aspects of Cross Disciplinary Engineering Research. Cross-Disciplinary Engineering Research Committee. Washington, DC: National Academy Press, p. 1.

[11] Ibid., p. 7.

[12] Ibid., p. 8.

[13] Fair and Preston, op. cit., p. 3.

[14] Ibid., p. 11.

[15] Ibid., p. 12.

[16] National Research Council (1986). Management of Technology, the Hidden Competitive Advantage. Cross-Disciplinary Engineering Research Committee. Washington, DC: National Academy Press.

[17] Betz, Frederick (1998). Managing Technological Innovation: Competitive Advantage from Change. New York: John Wiley & Sons, Inc.

[18] National Research Council (1986), op. cit., p. 2.

[19] Original source:

[20] National Research Council (1986), op. cit., pp. 5-6.

[21] National Research Council (1986). The New Engineering Research Centers: Purposes, Goals, and Expectations. Cross-Disciplinary Engineering Research Committee, summary of a symposium, April 29-30, 1985, National Research Council. Washington, D.C.: National Academy Press, p. 104.

[22] Ibid., pp. 117-119.

[23] Ibid., p. 4.

[24] Suh, Nam P. (1987). The ERCs: What we have learned? Engineering Education, October 1987, p. 17.

[25] Fair and Preston, op. cit., p. 4.

[26] GAO (1988). Engineering Research Centers: NSF Program Management and Industry Sponsorship—Report to Congressional Requesters (August 1988). Washington, D.C.: General Accounting Office Report GAO/RCED-88-177, p. 24.

[27] Ibid., p. 7.

[28] Engineering Centers Division (1991). The ERCs: A Partnership for Competitiveness, Report of a Symposium, February 28-March 1, 1990 (NSF 91-9). Washington, DC: National Science Foundation, p. 7.

[29] Fair and Preston, op. cit., pp. 72-73.

[30] Ibid., p. 4.

[31] Davis, Robert (2012). “Thrust Leader’s Role in Strategic Planning” (slide 7). Presentation at the 2012 ERC Program Annual Meeting, Bethesda, Maryland.

[32] NSF (1998). Program Announcement, Engineering Research Centers, Partnerships with Industry and Academe for Next-Generation Advances in Knowledge, Technology, and Education (NSF: 98-146). Directorate for Engineering, National Science Foundation, p. 4.

[33] Ibid., p. 3.

[34] Ibid., p. 4.

[35] Preston, Lynn (2004). “Mentoring Young Faculty to Success: Rewarding and Encouraging Involvement in Cross-Disciplinary Research.” Plenary Address to the ASEE Engineering Research Council, slide 4.

[37] Preston, op. cit., slide 4 talking points.

[38] Ibid., slide 7.

[42] Moudgil, Brij (2005). Particle Engineering Research Center. Final Report. Gainesville, FL: University of Florida.

[43] Currall (2007), op. cit., p. 6.

[44] Ibid., p. 52.

[45] Currall, op. cit., pp. 6-7.

[46] Ibid., p. 23.

[47] Ibid., p. 24.

[48] Ibid.

[49] Ibid.

[50] McLaughlin, David (2013). CASA, Engineering Research Center for Collaborative Adaptive Sensing of the Atmosphere, Final Report. University of Massachusetts-Amherst, p. 14.

[51] Pauschke, Joy and Lynn Preston (2001). National Science Foundation Engineering Research Centers. Paper prepared for the ASEE. p. 2-3.

[52] Moehle, Jack (2000). EERC Presentation to NSF.

[53] Moehle, Jack, Greg Deierlein, and Yousef Bozorgnia (2006). FY 2006 Annual Report, Pacific Engineering Research Center. University of California at Berkeley, pp. 1-13.

[54] Ibid.

[55] Ibid., pp. 2-6, 2-12.

[56] Ibid., pp. 2-13.

[57] and Moehle et al., op. cit.

[59] Ibid.

[60] Mujumdar, V. (2008). National Science Foundation Investment in Earthquake Engineering Research Centers (EERCs). White paper prepared for Engineering Education and Centers Division, NSF, Nov. 1, 2008.

[61] Daniels, Thomas (2017). Personal communication with Lynn Preston.

[74] CSOPS (2013). NSF Engineering Research Center for Structured Organic Particulate Systems. Seventh-Year Annual Report, 2013. Section on Research Programs Testbeds, unnumbered pages 1-2.

[79] Contributed by Nasser Peyghambarian, May 21, 2018.

[81] Conant, Emily F. MD, FSBI and Liane Philpotts, MD, FSBI (2017). Digital Breast Tomosynthesis for Screening and Diagnostic Imaging. Society for Breast Imaging, White Paper, September 2017.

[82] Silevitch, Michael B. (2010). The Bernard M. Gordon Center for Subsurface Sensing and Imaging Systems (CenSSIS). Preliminary Final Report. Northeastern University. March 2010. Approved for use in this history by Michael Silevitch in 2018.

[84] Costa, Kevin and Shaila Kotadia (2016). Synberc–Building the Future with Biology. See

[87] (subsection 5.3.3)

[88] NSF ERC Best Practices Manual, Chapter 5 (2013). Page 5-60. (sec. 5.3.3)

[89] Quality of Life Technology Center (2012). Presentation to ERC Program annual meeting. Slides 17-18.

[91] McCullough, Rick (2011). Opportunities and Challenges in Technology Transfer and Startup Creation in a Research University, Slides 6 and 10.

[92] Originally established under the Division of Civil and Mechanical Systems, in 1999 the three EERCs were brought into the ERC Program in the Engineering Education and Centers Division.

[93] McLaughlin (2013), op. cit., pp. 9-10.

[94] National Research Council (2010). When Weather Matters: Science and Service to Meet Critical Societal Needs. Washington, DC.: The National Academies Press.

[95] Ibid., pp. 13-14.

[96] For further detail see: Pepyne, D., D. Westbrook, B. Philips, E. Lyons, M. Zink, & J. Kurose (2008). Distributed collaborative adaptive sensor networks for remote sensing applications. In American Control Conference, 2008, IEEE (pp. 4167-4172); Philips, B., D. Pepyne, D. Westbrook, E. Bass, J. Brotzge, W. Diaz, K. Kloesel, J. Kurose, D. McLaughlin, H. Rodriguez, and M. Zink (2007). “Integrating End User Needs into System Design and Operation: The Center for Collaborative Adaptive Sensing of the Atmosphere (CASA).” Preprints, 16th Conf. Applied Climatology, American Meteorological Society Annual Meeting, San Antonio, TX; B. Philips, D. Westbrook, D.L. Pepyne, J. Brotzge, E.J. Bass, and D.J. Rude (2008). “User Evaluations of Adaptive Scanning Patterns in the CASA Spring Experiment 2007.” In Proceedings of IGARSS, 2008 5:156-159.

[98] Mujumdar (2008), op. cit., pp.1-2, 8-9.

[99] Ibid., p. 2-3.

[100] Ibid., pp. 3, 22-23.

[101] Gross, Rachel E. (2014). “A bionic eye that restores sight by bridging the gap between eye and brain–a new device has the capacity to help the blind regain their vision.” An Interview with Dr. Mark Humayun, August 21, 2014, published in Glaucoma Today, May/June 2018.

[102] Currall, Steven C., Ed Frauenheim, Sara Jansen Perry, and Emily M. Hunter (2014). Organized Innovation, A Blueprint for Renewing America’s Prosperity. New York, NY: Oxford University Press, pp. 107-108.

[103] Gross (2014), op. cit.

[104] Humayun, Mark and James Weiland (2012). “Biomimetic MicroElectronic Systems,” slides prepared for Lynn Preston, who presented them at 2012 NSF ERC Annual Meeting 2012, Bethesda, MD, slide 5.

[105] Ibid., Slide 6.

[107] Gross, Rachel E. (2014).

[109] Meyers, Andy (2014). Surgical Precision. Johns Hopkins Engineering, a Century of Innovation, Celebrating our Past and Defining the Future. Baltimore, MD: Whiting School of Engineering, Johns Hopkins University Press, Winter 2014, pp. 12-17.

[110] Ibid., p. 14.

[111] Schulz, Richard (ed.) (2013). Quality of Life Technology Handbook. Boca Raton, FL: CRC Press, pp. xi-xii.

[112] Srinivasa, Siddhartha (2013). HERB: Personal Assistive Mobile Manipulator in the Home. In S. Schulz, ibid., pp. 181-195.

[113] This article lays out the issues across that divide.

[116] This history was contributed by Professor Ralph Etienne-Cummings, Chair, Department of Electrical and Computer Engineering, the Johns Hopkins University, Baltimore, Maryland. April 2019.

[117] Chen, Shirley K. (1996). Interview with Carver A. Mead, Pasadena, California, July 17, 1996. Oral History Project, California Institute of Technology Archives. Retrieved April 22, 2019 from p. 20.

[118] Mead, Carver and Lynn Conway (1980). Introduction to VLSI Systems. Reading, Mass.: Addison-Wesley.

[119] Mueller, P. and D.O. Rudin (1967). Action potential phenomena in experimental bimolecular lipid membranes. Nature, Vol. 213, pp. 603–604 (11 February 1967); doi:10.1038/213603a0

[120] Van der Spiegel, J., P. Mueller, D. Blackman, P. Chance, C. Donham, R. Etienne-Cummings and P. Kinget (1992). An analog neural network with modular architecture for real-time dynamic computations. IEEE J. Solid-State Circuits, Vol. 27, pp. 82–92.

[121] Mahowald M. (1994). The Silicon Retina. In: An Analog VLSI System for Stereoscopic Vision. The Springer International Series in Engineering and Computer Science (VLSI, Computer Architecture and Digital Signal Processing), vol. 265. Boston, MA: Springer.

[122] Etienne-Cummings, Ralph. Personal communication, April 2019.

[123] This section is synthesized from Perona, Pietro and Joel Burdick (2006). Center for Neuromorphic Systems Engineering, Final Report, Pasadena, CA, May 17, 2006, pp. 8-9.

[126] Perona, op. cit., p. 9.  

[127] This and the CNSE sections below, unless otherwise indicated, were synthesized from Perona and Burdick, op. cit., pp. 9-14, 21-22.

[129] Chen, op. cit.

[130] Lewis, Courtland (2010). Engineering Research Centers Innovations—ERC-Generated Commercialized Products, Processes, and Startups. Melbourne, FL: SciTech Communications, pp. 38-39.

[134] and

[139] Unless otherwise noted, the BMES sections below are synthesized from Humayun, Mark, S. and James D. Weiland (2012). BMES ERC Annual Report. Year 9, Volume 1. Los Angeles, CA: University of Southern California. May 18, 2012. Executive Summary and pp. 1-3.

[140] See Chapter 11, section 11-B(a) of this History for a fuller discussion of the impact of Argus II.

[141] Humayun, ibid., Executive Summary.

[148] Written collaboratively by Fernando Muzzio and Lynn Preston, October 2018.

[152] Synthesized from Acampora, Anthony S. (1993). “Intelligent Optical Networks: Research, Education, and Industrial Programs at the Center for Telecommunications Research,” in Fair, Richard and Lynn Preston, Guest Eds. (1993). Engineering Research Centers: Goals and Results. Proceedings of the IEEE 81(1):111-131.

[153] Synthesized from University of Illinois (1998). Final Report to NSF of the Center for Compound Semiconductor Microelectronics, April 1986-May 1998.

[154] Goswami, Debabrata (2003). Optical Computing, Optical Components and Storage Systems. Resonance, June 2003, p. 1.

[156] This section is synthesized in part from Cathey, W. Thomas and R.C. Mercure, Jr., “Mastering the Challenges of Optoelectronic Computing,” in Fair, Richard and Lynn Preston, Guest Editors (1993). Engineering Research Centers: Goals and Results. Proceedings of the IEEE 81(1):95-110.

[157] Patent number: 4941735.

[158] (2005). OmniVision acquires CDM Optics. Photonics Media, March 2005.

[159] Silevitch, Michael (2009). Looking into Hidden Worlds: Gordon-CenSSIS, the Bernard M Gordon Center for Subsurface Sensing and Imaging Systems—A Supplement to the Summative Review Final Report. Boston, MA: Northeastern University, p. 31.

[160] Rappaport, Gary (2009). “Wave-Based Computational Modeling for Detection of Tumors, Buried Objects and Subcellular Structures,” in Silevitch, ibid., p. 29.

[161] Abouraddy, Ayman F., Magued B. Nasr, Bahaa E.A. Saleh, Alexander V. Sergienko, and Malvin C. Teich (2002). Quantum-optical coherence tomography with dispersion cancellation. Phys. Rev. A 65, 053817, 8 May 2002.

[162] Roy, R.A., Lei Sui, C.A. DiMarzio, and T.W. Murray (2006). “Shedding light on sound: The fusion of acousto-optic and B-mode ultrasound imaging.” 3rd IEEE International Symposium on Biomedical Imaging: Nano to Macro. IEEE Explore Digital Library.

[163] Warner, C.M., J.A. Newmark, M. Comisky, S.R. DeFazio, D.M. O’Malley, M. Rahadhyaksha, D.J. Townsend, S. McKnight, B. Roysam, P.J. Dwyer, and C.A. DiMarzio (2004). Genetics and imaging to assess oocyte and preimplantation embryo health. Reproduction, Fertility and Development, 16(7):729-41.

[164] For example, in late 2018 Samsung announced that it had begun wafer production of its new process node, 7LPP, a 7 nm LPP (Low Power Plus) using EUV lithography technology developed in partnership with the EUV ERC.

[166] Gmachl, Claire (2016). MIRTHE Final Report to NSF, Princeton, NJ: Princeton University, pp. 11-12.

[167] Ibid., p. 17.

[168] WRF-Chem is the Weather Research and Forecasting (WRF) model in conjunction with chemistry.

[169] Ibid., synthesized from pp. 17-33.

[180] ERC Best Practices Manual, Section “Research Management.”

[181] Preston, Lynn (2004). “Mentoring Young Faculty for Success: Rewarding and Encouraging Involvement in Cross-Disciplinary Research.” Presentation at the ASEE Engineering Research Council Summit, Washington, DC, 2004.