Insights

Corporate survival in the tech world

IBM’s lesson in ‘everything innovation’ and longevity

By Dr Anthony Gandy

Corporate survival is a rare thing. In 1997, Leslie Hannah, of the London School of Economics and Political Science, noted that only 20% of the top 100 global enterprises of 1912 were still in the top 100 by 1995. What’s more, 29% had been in some form of receivership, and the vast majority had been subsumed by other firms. These were the largest and best-resourced firms in the world, and only a few maintained their position.

Faced with uncertainty, Paul J DiMaggio and Walter W Powell, authors of The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields, argued that firms are likely to undergo a process of ‘isomorphism’, where fear of the unknown leads them to increasingly ape the strategies of others in the market. The problem is that, as Hannah points out, companies that survive and remain top-100 enterprises over the long run have corporate architectures that are “complex and difficult to identify, describe and copy, for, if that were not the case, their value would be competed down by emulators”.

Longevity relies on the ability to innovate, change, adapt and, at least some of the time, lead these processes. Simply knowing that a firm has survived is, however, of little use to others. What is interesting is understanding how such firms manage to renew themselves. One case we can follow is that of IBM, a firm that, in the early 1960s, undertook what we can call a programme of ‘everything innovation’. It was a process that would make everything it had done before obsolete – from the products on which its success was built, to the way that it made them.

Taking on the big guys

In the late 1950s and early 1960s, IBM had gained a controlling position in the market for commercial data-processing systems. Its two most important products were transistor-based, second-generation computers. The biggest-selling computer in the world was IBM’s 1401 series, aimed at the lower-performance, commercial data-processing market, while its complex 700/7000 series led the market for powerful, science-orientated computing. Simply stating this shows that IBM had already been through periods of enormous rebirth. Computer technology was very distinct from its heritage as a manufacturer of electro-mechanical tabulators, but, vitally, many of the customers were the same.

“Longevity relies on the ability to innovate”

Potentially the most serious threats to IBM’s domination were the North American electrical and electronics behemoths, General Electric (GE) and the Radio Corporation of America (RCA). Both were vast producers of complex defence electronics. RCA was also the major incumbent in the market for electronic components, and knew how to mass produce electronics like no other business thanks to its leadership in broadcast and consumer electronics. GE not only made electronic devices for the military and others, but – because of its interests in nuclear engineering and gas turbines, and the complex modelling these required – it was also a major user of IBM’s 700/7000 series of powerful computers.

IBM had leadership in computing, but these rivals had the tools to take it on. They knew as much – indeed, much more – about the underlying electronic technologies, and they had the manufacturing scale. 

A reimagining 

Management organisation was another area in which IBM was at the forefront. With the help of business school gurus and management consultants, it had arranged itself into a new multidivisional – or M-form – structure, in which each division was tasked with managing the firm’s activities in a different industrial sector.

Multidivisional structures resulted in a multiplicity of computer innovation. One division might develop computers with an eye to improving military and civil telecommunications, while another would be working on smaller systems to improve local data processing. Each division had its own take on what the product should look like. Similar structures meant that GE and RCA also ended up with a multitude of computer systems, and it left IBM with ranges of incompatible computers.

Decentralised innovation had created an enormous technological capability. By the late 1950s, however, it was clear that the computer industry was growing to such a scale that IBM’s rivals had the opportunity to reimagine their computer operations – not as support operations for other activities, but as separate entities within the M-form structure. They created divisions with total control over nearly all their computing resources (some industrial control systems being left within other electronic capital goods divisions) and set them the task of taking on the market incumbent, IBM. In such a fast-growing market, they should surely find space to succeed. They had the tools to achieve this.

The timing seemed fortuitous. In an example of isomorphism, the next generation of computers was expected to be built on a new underlying technology: the integrated circuit (IC). The IC would herald the dawn of what we now call third-generation computing. It was the collective understanding within the electronics firms that it would not be economically viable to mass produce ICs until the very end of the 1960s, but GE and RCA expected to be leaders in these components. This gave the new computer divisions of GE and RCA time to build their approach to the next generation of computing and truly compete against IBM. 

Innovate at pace

IBM was not on the same page, and was not going to stick to the assumed industry timetable for third-generation computing. Its cultural focus was on its salesforce, not just for its ability to sell, but also for its ability to sense the market. IBM had people embedded in its customers’ premises who could report back about what those customers wanted. The insight they gave was that what really mattered for the next generation of computing was not the components, but the family approach, where large and small systems could, approximately, run the same software. Customers would be able to develop software on small machines and roll it out to large production systems. Small operations could use small systems, large divisions giant systems – all running the same software. 

These insights led to one of the great corporate reboots of all time – initiated by the SPREAD committee, which IBM established in 1961. Wonderfully, the end report from this group is available online. It made obsolete all of what IBM had done before, from the computers themselves to the organisation of the firm and the manufacturing processes. It was not a product innovation strategy; it was an ‘everything innovation’ strategy.

SPREAD outlined how IBM’s 1401 and 7000 ranges of computers should be replaced by a single, modern, 32-bit architecture, covering all levels of small to large computing devices. To do this, it was necessary to remove the siloed approach of the past. The 20 or so engineering groups involved in processor design were to be made subservient to Corporate Processor Control. Further, SPREAD outlined the need for a new components division. This would eventually be tasked with introducing fast, powerful and less power-hungry integrated circuits. In the meantime, it would create what was termed Solid Logic Technology (SLT). The SLT building blocks sounded like integrated circuits, but were, in fact, a hybrid technology using a tightly packaged version of discrete transistors. IBM did not care – it was not circuits that its customers bought, but whole systems. 

In April 1964, IBM was ready to announce the System/360, and a new structure for the firm. The S/360 would dominate mainframe computing for decades, as the new 32-bit architecture could take advantage of integrated circuits and other advances as they became available. In 1970, IBM updated the S/360 range as the new, IC-based System/370. It delivered these just after GE and RCA had expected to deliver their first third-generation families. IBM was running at a pace that put it half a generation ahead of its rivals.

GE and RCA would leave the computer industry by the early 1970s. They never caught up.

Customer first

GE and RCA knew what the third generation of computers would be – they had their own internal SPREAD-like reports outlining the future. However, their focus was technological – the integrated circuit was the cue for what would come next; they were listening to their own narrative of this technology. IBM’s listening was focused outside of the firm: it was customer-led. 

Of course, IBM had another factor that made it exceedingly difficult to replicate, at least for GE and RCA. IBM was the leader in an industry experiencing stratospheric growth. GE and RCA were trying to ride many high-growth sectors, all demanding resources and finance. IBM did not have this internal competition; it did not even have to compete with shareholders, who were more than happy to allow IBM to reinvest profits into its burgeoning business. That truly is hard to replicate.

IBM’s innovation advantages in the early 1960s were many. However, none was more important than listening to the customer and focusing on one sector. Is this complex and difficult to replicate? Maybe not. ‘Everything innovation’ is possible. 

Dr Anthony Gandy is a passionate researcher in the history of computer technology markets, and has worked in the financial services sector for more than 30 years.

This article is adapted from a feature first published in the autumn 2024 issue of Edge.