ENIAC anniversary: What 75 years of computer technology have delivered

Source is ComputerWeekly.com

On 15 February 1946, the University of Pennsylvania’s Moore School of Electrical Engineering unveiled the Electronic Numerical Integrator And Computer (ENIAC). The machine, developed between 1943 and 1946 as part of the US war effort, is widely recognised as the world’s first general-purpose electronic digital computer.

Computer firm Unisys can trace its lineage back to Univac, a commercial machine built by ENIAC inventors J Presper Eckert and John Mauchly.

Unisys ClearPath Forward CTO Jim Thompson is an ENIAC history enthusiast, and ahead of the 75th anniversary, he spoke to Computer Weekly about why people are fascinated with old hardware. “People are way more into computer history than me,” said Thompson. “It’s generational. My parents grew up without computers. My generation has seen a lot of change. History has a pull and roots are always important.”

Thompson says his children, who are in their 30s, have always had computers in their lives and when they were young, “they were writing programs – it’s like TV for my generation”.

For Thompson, the difference between the generation brought up with computers and the previous generation, for whom television was the primary form of entertainment, is that computing has become stealthy and ubiquitous in people’s lives. “People think of TV as a thing you do,” he said.

According to Wikipedia, the cost of ENIAC in 1946 was $487,000, which in today’s money is about $6.53m. But throughout the history of computing, many have raised questions over the economic value of computer technology.

Since the mid-1960s, Moore’s Law has been the formula the industry has used to drive progress. Every 18 months to two years, the industry is able to sell devices with twice as much computing power for the same price. This is measured by the number of transistors that can be put on a chip, and more transistors generally means more computing power.
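
To make the doubling arithmetic concrete, below is a minimal Python sketch. The clean 24-month doubling period and the 1971 starting point of roughly 2,300 transistors (the Intel 4004) are illustrative assumptions rather than figures cited in this article.

# Minimal sketch of idealised Moore's Law scaling.
# Assumptions (illustrative only): counts double every 24 months,
# starting from the Intel 4004's roughly 2,300 transistors in 1971.
def transistors_per_chip(year, base_year=1971, base_count=2_300, doubling_years=2.0):
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for y in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{y}: ~{transistors_per_chip(y):,.0f} transistors per chip")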

But Thompson has a theory about this and says: “Moore’s Law has been overplayed or misplayed. It’s really about transistor technology and the relentless pursuit of more transistors on integrated circuits. But there is a corollary that as we make things small, cost goes up and up.”

Last year, a Computer Weekly reader got in touch about the first computer he worked on in the early 1970s, an IBM System/360. At the time, it could be leased for about £8,000 a year, which is equivalent to £87,339 a year in today’s money.

A business requiring a mainframe in 1973 would probably have been quite large. That figure of about £87,339 a year is now what relatively small businesses need to spend on IT.
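
For readers who want to check that conversion, a minimal Python sketch of the arithmetic is below; it uses only the figures quoted above rather than an official price index.

# Minimal sketch of the inflation arithmetic behind the figures above,
# derived only from the article's own numbers (no official index data).
lease_1973 = 8_000     # annual System/360 lease cost, early 1970s
lease_today = 87_339   # the article's present-day equivalent

multiplier = lease_today / lease_1973
print(f"Implied price multiplier since the early 1970s: {multiplier:.2f}x")
print(f"Check: £{lease_1973:,} x {multiplier:.2f} = £{lease_1973 * multiplier:,.0f} a year")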

Programmers Betty Jean Jennings (left) and Fran Bilas (right) operate ENIAC’s main control panel

Although, in real-world terms, the cost of technology has increased, Thompson believes investment in technology still creates value. “The work the institutions are doing on enterprise hardware today is not the same as it was 50 years ago,” he said. “We are putting banking on a phone, for example.”

This needs software-powered services that wrap around the core banking system to provide a mobile-friendly user interface, he added. “It takes a lot of computing power to do the wrapping around a core banking application.”
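
As a rough illustration of the kind of wrapping Thompson describes, the hypothetical Python sketch below shows a thin service layer that translates a mobile-friendly JSON request into a call against a legacy core banking function and reshapes the result for the app; every name and value in it is invented for illustration.

# Hypothetical sketch of a "wrapper" layer around a core banking system:
# parse the mobile app's JSON request, call the core system, and reshape
# the answer into the JSON the app expects. All names are invented.
import json

def core_banking_balance(account_id: str) -> int:
    # Stand-in for the legacy core system; returns a balance in pence.
    return 123_456  # fixed value purely for illustration

def mobile_balance_handler(request_body: str) -> str:
    # The wrapping layer the mobile app talks to.
    account_id = json.loads(request_body)["accountId"]
    pence = core_banking_balance(account_id)
    return json.dumps({"accountId": account_id, "balance": f"£{pence / 100:.2f}"},
                      ensure_ascii=False)

print(mobile_balance_handler('{"accountId": "12345678"}'))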

Losing programming skills

Although such advances make it possible for everyone to have access to advanced computer systems, Thompson believes programming skills are being eroded. “We are not teaching people how to do things any more,” he said.

One example is the trend for using low-code/no-code tooling to reduce the skills needed to create useful business applications. But, says Thompson: “Tasks get automated that shouldn’t be.” In other words, automation does not necessarily help to grow programming skills.

Glen Beck (background) and Betty Snyder (foreground) program ENIAC

Unlike a modern computing architecture, the ENIAC had no stored-program memory; instead, it comprised a series of modules for performing calculations. Recalling his experiences of programming, Thompson says: “I remember working on mainframes with 6MB of memory. We needed to code in obscure ways to get the most from the machine.

“Today’s programmers do less coding than before. Memory is limitless, I/O [input/output bandwidth] is not a problem. In a traditional sense, we are not doing data processing any more. Programmers today are building applications.”

The fact that IT resources are now considered limitless and, thanks to Moore’s Law, get faster with each new generation has made some programmers take technology for granted, said Thompson.

He is concerned that programmers may lose the skills that were necessary to program resource-constrained hardware, and the ability to understand low-level coding concepts such as the architecture of operating systems.

“We went through the bubble of operating systems,” he said. “But now there are a couple of mainframe operating systems, Unix, Linux and Windows. Programmers are not interested in building an operating system.”

In the past, low-level coding techniques and an understanding of the operating system were usually prerequisites to getting a piece of code to “work”. But today, said Thompson, it is all about the app being built, and any functionality required can usually be downloaded from the cloud.
