
Development of Computer: Past, Present and the Future

Paper Type: Free Essay | Subject: Information Technology
Wordcount: 3,952 words | Published: 23rd September 2019


2 – 3D Computing

Table of Contents

  • Introduction
  • Individual Section – Ciaran Mangan – Computing in the Past
  • Individual Section – James Nugent – Computing in the Present
  • Individual Section – David Blair – Computing in the Future
  • Conclusions
  • Glossary
  • References
  • Bibliography
  • Appendix 1
  • Appendix 2



Introduction

Moving from rooms full of vacuum tubes, wires and men in white coats to the devices that sit on our desks or in our pockets today, the development of computing, in a relatively short space of time, is one of the few undoubted successes and achievements in the history of mankind.

The drive to make computers smaller, faster and less expensive than their previous iterations has seen prices plummet, speeds rocket skywards and devices shrink to fit in our pockets. This report details the timeline from when bugs literally meant bugs to the present-day computer, the plateau in processing power expected with current Integrated Circuits, and the potential solutions that could see the exponential growth in processing power continue.

Computing in the Past

One has to go back to the 1940s to understand how the computer processor came to be developed and become so commonplace in modern computing. Until this point, computing required large rooms which housed machines made up of thousands of vacuum tubes, resistors, capacitors, relays and switches, and millions of soldered joints connecting them all together.

John Mauchly and John Presper Eckert’s Electronic Numerical Integrator and Computer, or “ENIAC”, construction of which began in 1943, is considered one of the world’s first computers. “ENIAC contained 17,468 vacuum tubes, 70,000 resistors, 10,000 capacitors, 1,500 relays, 6,000 manual switches and 5 million soldered joints. It covered 1,800 square feet of floor space and weighed 30 tonnes. ENIAC could calculate 5000 additions, 357 multiplications or 38 divisions in one second and relied on technicians to maintain the machine 24 hours a day. Any re-programming of ENIAC required weeks to complete. The machine was also very expensive, very big and used huge amounts of electricity in order to operate.” (Theworkplacedepot.co.uk, 2018)


The 1940s, and World War II in particular, saw Alan Turing’s ‘Bombe’ machine used to great effect in deciphering the German Enigma machine’s code. This machine, and subsequent versions of it, were funded by various government agencies, as the need to compute more operations in less time became essential to staying ahead of the opposition, irrespective of the expense. As would be shown many more times in history, technological developments tended to come from advancements in military technology.

John von Neumann first outlined, in 1946, the structure of a device which would become the essential element of what we now know as a computer. The architecture, which incorporates a binary number system, organised memory to store programs and data, a controller to fetch instructions from memory and an ALU (Arithmetic Logic Unit) for computations, has become integral to computing. “The paper, entitled, ‘Preliminary Discussion of the Logic Design of an Electronic Computing Instrument’, advanced the concept of the stored program and introduced the idea of the program counter. In this paper and in subsequent writings von Neumann laid out some of the fundamental concepts inherent in the design of computer systems.” (Anderson, 1994)
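The fetch–execute cycle just described can be sketched in a few lines of Python. The three-instruction machine below is purely illustrative – the opcode names and memory layout are invented for this example – but it shows the von Neumann essentials: a single memory holding both program and data, a program counter driving the fetch–decode–execute loop, and ALU-style arithmetic on an accumulator.

```python
# Illustrative von Neumann-style machine: one memory holds both the
# program (opcode, operand pairs) and the data it operates on.

def run(memory):
    """Fetch-decode-execute loop with a program counter and accumulator."""
    acc = 0   # accumulator (the ALU's working register)
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]      # fetch the instruction at pc
        pc += 1
        if op == "LOAD":          # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":         # ALU operation: acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":       # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Stored program computing memory[4] = memory[5] + memory[6]:
mem = [("LOAD", 5), ("ADD", 6), ("STORE", 4), ("HALT", 0), 0, 2, 3]
print(run(mem)[4])  # 5
```

Re-programming such a machine means rewriting the contents of memory rather than re-wiring hardware – the advance that separated stored-program computers from machines like ENIAC.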

Developments over the next ten years saw the emergence of the transistor and the second generation of computers, in which transistors rapidly replaced vacuum tubes as components. Until then, vacuum tubes were used to control the flow of electric current through a device; transistors, however, were a fraction of the size and cost, required less power to operate and were not as fragile as the large, glass vacuum tube. Transistors allowed computers to become smaller, faster and cheaper. Alongside this development, the first high-level programming languages emerged: FORTRAN (developed in 1957) for scientists and engineers, and COBOL (developed in 1960) for business applications. These allowed programmers to move away from assembly language to a more approachable way of programming.

Transistors did not remain the cutting edge for long: in 1958, Jack Kilby, then working for Texas Instruments, demonstrated the first IC, or Integrated Circuit. There is some dispute as to whether Kilby is the originator of the IC as we know it, as its implementation in silicon, the semiconducting material which allows it to act as either a conductor or an insulator, was devised by Robert Noyce at Fairchild Semiconductor in 1959. However you look at it, Kilby certainly presented the idea first, and in 2000 was awarded the Nobel Prize in Physics for his invention.

Irrespective of the patent wars which followed throughout the 1960s, Fairchild Semiconductor proceeded to develop many more iterations of the IC and did much to commercialise the product. Its emergence saw punched cards and printouts replaced with keyboards and monitors as the Input/Output methods used. Operating systems (OS) were also written, allowing multiple programs and applications to run at the same time, which was hitherto impossible.

The fourth generation of computers began with the embedding of thousands of ICs on a single silicon chip, forming the first microprocessor. Dr Gordon Moore had posited in a 1965 paper that the number of transistors held in an Integrated Circuit would double roughly every two years; this would become known as ‘Moore’s Law’ (Chart 1 – Appendix 1). Intel introduced the 4004 chip in 1971, which contained the CPU, memory and I/O controls all on one chip. It is at this point that the exponential growth in processing power starts and the strides in personal computer development begin.
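Moore’s Law is simple compound doubling, which makes rough projections easy to check. The Python sketch below is a back-of-the-envelope illustration (the 2,300-transistor starting figure is the Intel 4004 count quoted later in this report):

```python
# Moore's Law as arithmetic: transistor counts double roughly every two
# years, i.e. compound growth of 2^(elapsed_years / 2).

def moores_law(start_count, start_year, year, doubling_period=2):
    """Projected transistor count for `year`, doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Twenty doublings between 1971 and 2011:
print(f"{moores_law(2300, 1971, 2011):,.0f}")  # 2,411,724,800 - about 2.4 billion
```

The projection lands in the same order of magnitude as the billion-transistor chips that actually shipped around 2010, which is why the ‘law’ held such predictive power for four decades.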

It can be said that IBM’s release of the first home computer (PC) in 1981, and Apple’s release of the first Macintosh in 1984, were only possible due to the strides made in the development of microprocessors and their capabilities over the preceding decades.

“Present day computers are less easily distinguished from preceding generations. There are some striking and important differences, however. The manufacture of integrated circuits has become so advanced as to incorporate millions of active components in the area of an inch, leading to levels called large-scale integration (LSI) and very large-scale integration (VLSI). This has led to small-size, lower-cost, large-memory, ultrafast computers ranging from the familiar personal computers (PCs) to the high-performance, high-priced supercomputers.” (Thomas C., 1991)

The Intel 4004 processor, launched to the public in 1971 and utilising the silicon gate technology Federico Faggin had developed at Fairchild Semiconductor in 1968, contained 2,300 transistors on a 10-micron chip and ran at 108 kHz. A year later the Intel 8008 processor was introduced, comprising 3,500 transistors and capable of 62,000 instructions per second. In the decade that followed, Intel’s chips saw their transistor counts increase from 3,500 to 275,000 on the Intel 386 processor in 1985. The release of the Intel 486 processor in 1989, however, was the first to break the one million mark: the 25 MHz, 1-micron chip comprised 1.2 million transistors and was the first instance of the monumental leaps in processing power to come.

In the following decade, Intel released five chips, each one delivering a dramatic increase in processing power and a consistent decrease in size. The Intel Pentium III processor of 1999 held 9.5 million transistors on a 0.25-micron chip. The following year, however, the Intel Pentium 4 processor broke new ground: it held 42 million transistors on a marginally smaller 0.18-micron chip, and processing speed jumped from 600 MHz to 1.5 GHz.

2006 saw the release of the 2.4 GHz Intel Core 2 Duo processor, containing 410 million transistors on a dramatically smaller 45-nanometre (nm) chip. Dr Gordon Moore’s prediction, which sounded like science fiction when he made it in his 1965 paper, was proving to be reality. In 2010 the one billion transistor mark was broken with the launch of the 2nd Generation Intel Core processor, holding 1.6 billion transistors on a 32 nm chip and providing 3.8 GHz of processing power.
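Working backwards from the figures in this section gives a quick sanity check on how closely Intel’s actual progress tracked Moore’s prediction. The sketch below assumes the two endpoint counts quoted above (2,300 transistors in 1971; 1.6 billion in 2010):

```python
import math

# Implied doubling period between the Intel 4004 (2,300 transistors, 1971)
# and the 2nd Generation Intel Core processor (1.6 billion, 2010).
doublings = math.log2(1.6e9 / 2300)                    # ~19.4 doublings
years_per_doubling = (2010 - 1971) / doublings         # 39 years elapsed
print(f"{years_per_doubling:.1f} years per doubling")  # 2.0 - right on Moore's estimate
```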


Computing in the Present

Since 2010 the advances in computer processors have not halted in any way, shape or form. During and beyond 2010 Intel continued their advances, with the 2nd Generation Intel Core processor moving into the era of one billion transistors on a single chip. This was also the time when Intel’s dominance of the processor market would be seriously challenged. For the previous two decades Intel had been market leaders and had contributed greatly to the development and enhancement of the computer processor.

The introduction of serious competition into what had been a heavily dominated market was a revelation for customers and companies across the world: prices fell, and customers now had the option of moving to a cheaper competitor.

AMD released their first Phenom II X6 (hex/six-core) desktop processor on 27 April 2010. The Phenom II line had been in production since 2008 and was viewed as a serious competitor to Intel’s processors of the time. AMD described it as “The backbone of the VISION Black technology platform, the new AMD Phenom™ II X6 six-core processors are the fastest CPUs AMD has ever created” (AMD.com, 2018).

From that moment on, Intel had very serious competition in a sector they had once dominated.

While science and technology continued to progress at explosive speed, competition was now widespread. AMD were competing in all areas of the computer processor industry, producing equal, if not better, server-based products, along with innovative new features in their products, such as TV viewing capabilities.

Improvements to their platforms, and the consistency introduced by Direct Connect Architecture 2.0, meant that their hardware was quickly adopted by both home users and large corporations. While it could be said that AMD could never seriously threaten such a strong market leader as Intel, they certainly provided a product that could deliver comparable processing power in the home and across large industry sectors.


With two strong market leaders competing for position, the global tech industry saw continuous improvements across the science and technology sectors. Dr Gordon Moore’s observation was that, as more transistors fit into smaller spaces, processing power increases and energy efficiency improves (Intel.com, 2018). With two leading tech companies and thousands of dedicated scientists working continuously, we could see some very exciting developments within the industry in the very near future.

Little did Dr Gordon Moore know, back in 1965, that his statement would still hold firm, more so than ever, in today’s society. Made possible by the evolution of microprocessors, we are surrounded by new devices packed with more computing power than Moore himself could have imagined. “Whatever has been done, can be outdone” (Moore, 2015).

From 2010 to the present day (2018), the advances in computer processing are obvious for everyone to see. What was once a dedicated music player (the personal Walkman), reading from tape and weighing approximately 750 g, has given way to today’s smartphone, which can deliver a practically unlimited music library while weighing in at 174 g. Today’s generation have so much choice from a single device that they can be overwhelmed by some of the technologies available to them.

Smartphones and next-generation mobile computing allow today’s business traveller to spend 24 hours in transit and still accomplish a full day’s work, as noted by Ni (2010). Meetings can be attended remotely, emails answered and phone calls made, all via a smart device. Extras include listening to music, researching via web browsers, weather monitoring and many more apps and features, which can be used simultaneously. In tandem with the development of microprocessors, smart devices are continuously improving, along with battery life and usability. People in general have become accustomed to, and almost expect, technology improving every year. Mobile phone manufacturers benefit greatly from the advances in processing performance within every device sold: the faster a phone performs, along with extras such as internal storage and battery life, the more units a retailer will sell.

The smart age is now: every man and woman, along with the majority of teenagers, is part of it whether they like it or not. Regardless of whether you buy into the smartphone idea, home desktops and laptops are benefiting from modern computer architecture, becoming more powerful and more affordable to the majority of people. Modern homes are also becoming interconnected through technology, and the ability to manage them all from one convenient location is readily available. It is almost hard to believe that, with these advancements, we can control the heating in our homes, the lights, the gates, security camera systems and many more home comforts from anywhere in the world via our smartphones.

The technology is now so much better and more reliable that ever more critical services are being managed by computers. The medical and hospital industry depends more and more on computer-driven systems to complete routine jobs, and even to preserve life in some cases. Flying planes, launching rockets and driving autonomous cars are among the technologies born from the last ten years of research and development that will soon become part of everyday living. If people are comfortable sitting into an autonomous vehicle and being driven to a destination of their choice by computer, this may well be the way of our future.

Everything is becoming big: Big Data, data centres, near-infinite storage, ever-greater transmission speeds, Teradata and analytics. Cloud computing, cloud storage and cloud analytics are all very current, and here today. Many of the mobile services that we have come to depend on every day are hosted within these new cloud environments, and companies are moving to these platforms to ensure uninterrupted service to their clients at a lower running cost than maintaining their own hardware. The improvements in computing and technology over the past ten years have made it all seem effortless for the end user. It is hard to know what technologies will develop and present themselves over the next ten years.

Computing in the Future: 3D Computing


As embedded intelligence finds its way into ever more areas of our lives, fields ranging from autonomous driving to personalised medicine are generating huge amounts of data. But just as the flood of data is reaching massive proportions, the ability of computer chips to process it into useful information is stalling.

With all the advancements in technology described above, companies like Intel and AMD realised that they needed to step up their game to beat the competition. Enter 3D computing…

Researchers at Stanford and MIT have developed and built a new computer chip to help overcome the limitations of current chips.

Computers today comprise different chips cobbled together: a chip for computing and a separate chip for data storage, with limited connections between the two. As applications analyse increasingly massive volumes of data, the limited rate at which data can be moved between chips is creating a critical communication “bottleneck”.



Glossary

  • ALU – Arithmetic Logic Unit. A circuit that performs mathematical calculations and logical operations. The central part of the microprocessor.
  • Assembly Language – A low level programming language utilising instructions usually in one to one correspondence with the machine language instructions of the microprocessor.
  • Binary – A numbering system with a base of 2 that expresses quantities using a combination of the digits 1 and 0.
  • CPU – Central Processing Unit. Also known as the microprocessor.
  • GUI – Graphical User Interface. Software that interacts with users through the use of icons, menus and windows instead of text alone.
  • I.C. – Integrated Circuit. A collection of circuits on a silicon chip.
  • I/O – Input / Output. The communication of information between the computer and a peripheral device.
  • Intel – An American multinational corporation and technology company. It is the second largest semiconductor chip maker.
  • Memory – Integrated circuits used to store information for immediate use in computers.
  • Micron – An SI derived unit of length equal to 1 × 10⁻⁶ metre, that is, one millionth of a metre or one thousandth of a millimetre.
  • Moore’s Law – A prediction made by Dr Gordon Moore, one of Intel’s founders, in 1965, that the number of transistors per silicon chip would double, every two years.
  • OS – Operating System. The software required to manage the hardware and logical resources of a personal computer, including device handling, process scheduling and file management.
  • PC – Personal Computer. What we now perceive as a computer.
  • Microprocessor – A computer processor that incorporates the functions of a central processing unit on a single integrated circuit.
  • Smart Phones – Smart Phones are a type of mobile phone device which can execute multi-purpose applications on request from an end user.
  • Smart Homes – Homes which can execute or accommodate multi-purpose applications on request from an end user.
  • Autonomous Cars – A self-driving car which is controlled without the input from a human.
  • Big Data – A term for data sets too large or complex to be processed by traditional methods.
  • Data Centre – A Data centre is a dedicated space used to house computer systems and associated components, such as telecommunications and storage systems.
  • Teradata and Analytics – Teradata is a provider of large-scale data warehousing and analytics platforms.
  • Cloud Computing – The on-demand delivery of shared computer system resources and services, which can be rapidly set up or shut down, via the internet.
  • Cloud Storage – A model in which pooled computer storage resources are managed and accessed via the internet.
  • Cloud Analytics – Cloud analytics is a term for analysis of a company’s data, using cloud computing.


References

  • Theworkplacedepot.co.uk. (2018). How have computers developed and changed? [online] Available at: https://www.theworkplacedepot.co.uk/news/2013/03/22/how-have-computers-developed-and-changed/ [Accessed 28 Nov. 2018].
  • Anderson, A. (1994). Foundations of Computer Technology. 1st ed. London: Chapman & Hall, p.399.
  • Thomas C., B. (1991). Computer Architecture And Logic Design. 3rd ed. Boston: McGraw-Hill, p.3.
  • Intel.com (2018) 50 Years of Moore’s Law – Economic Impact [Online] Available at: https://www.intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html [Accessed: 06 December 2018].
  • AMD.com (2018) AMD Phenom™ II Key Architectural Features [Online] Available at: https://web.archive.org/web/20100514013729/http://www.amd.com/us/products/desktop/processors/phenom-ii/Pages/phenom-ii-key-architectural-features.aspx [Accessed: 06 December 2018].


Bibliography

  • How Microprocessors Work. Gregg Wyant and Tucker Hammerstrom. Ziff-Davis Press. 1994
  • Foundations of Computer Technology. A John Anderson. Chapman & Hall. 1994
  • Computer Architecture and Logic Design. Thomas C Bartee. McGraw-Hill Inc. 1990.
  • https://www.intel.com/content/www/us/en/history/history-intel-chips-timeline-poster.html
  • Smart Phone and Next Generation Mobile Computing. Ni, L. Morgan Kaufmann. 2010

Appendix 1:


