The history of electronics research is, in large part, a history of electronics research in industry. It was at Bell Labs in December 1947 — at that time the research and development division of the American Telephone and Telegraph Company — that William Shockley, John Bardeen and Walter Brattain developed the first transistor. It was at Texas Instruments in September 1958 that Jack Kilby demonstrated a working integrated circuit made from a single piece of germanium. And it was at Fairchild Semiconductor a few months later that Robert Noyce proposed a monolithic integrated circuit based on silicon.

A tabulating machine — the IBM 402 accounting machine — from the 1950s. Credit: Allan Cash Picture Library / Alamy Stock Photo.

This role of research in industry is also reflected in our Reverse Engineering articles — the section of the journal dedicated to the history of influential technologies. Here, for example, we have seen Robert Dennard recount his work at IBM in 1966 on the creation of dynamic random access memory (DRAM) (ref. 1). We have seen Federico Faggin describe his work at Intel in 1971 on the development of the first commercial microprocessor — the Intel 4004 (ref. 2); Faggin had also previously worked at Fairchild Semiconductor, helping to develop silicon gate technology. And we have seen Radia Perlman detail her work at Digital Equipment Corporation in 1983 on the spanning tree protocol, an essential component of today’s Ethernet (ref. 3).

Some of the companies featured have since faded; others remain key players in research today. Their stories of success, and of survival, are a revealing reflection of evolving technological demands. Take IBM, which is the subject of a new book by James Cortada, IBM: The Rise and Fall and Reinvention of a Global Icon, reviewed in this issue of Nature Electronics by Christophe Lécuyer.

The company emerged in 1911 as the Computing-Tabulating-Recording Company through the amalgamation of a number of smaller firms, and was renamed International Business Machines Corporation (IBM) in 1924. With a strong sales culture driven by CEO Thomas J. Watson Sr, it became a dominant force in tabulating machines, an electromechanical technology for analysing information stored on punched cards. Later, under CEO Thomas J. Watson Jr, it made a successful transition to electronic computing.

The shift from mainframe computing to distributed computing in the 1980s and early 1990s proved challenging, and the company came close to collapse. But more recently it has reoriented towards software and IT services, and it remains a sizeable operation — employing, at the start of 2018, an estimated 378,000 people spread over 170 countries (ref. 4) — and one at the forefront of various technological endeavours.

Of course, the development of new technology also relies on academic research, both fundamental and applied. And for the semiconductor industry in particular, which is entering uncertain times due to the demise of Moore’s law, academia may need to play an increasingly prominent role if advances in transistor technologies — and the social and economic progress they have brought about — are to be maintained. Erica Fuchs and colleagues at Carnegie Mellon University have, for example, argued that significantly increased public funding is now required because the incentives that previously fuelled collective action in industrial research and development have been eroded (ref. 5).

While the relationship between academia and industry is often constructive, their goals are not the same, and this can cause conflict. Concerns have, for instance, been raised about the potentially dominant role industry is playing in defining the research and regulation of artificial intelligence (AI) (ref. 6). The superior salaries and resources available in AI and robotics companies, and thus their ability to lure researchers away from academia, have also been highlighted as an issue (refs 7,8), which could, in the long run, affect transparency and fundamental research in the field. Such ‘brain drains’ are not necessarily anything new for academia, however: the parallels with semiconductor research in the twentieth century have, in particular, been noted (ref. 7). The history of electronics research could thus provide some helpful lessons.