
Fifth Generation of Computers


Since Charles Babbage's invention of the computer, technology has advanced in an enormous way. This progress, and with it the improvement of computer systems, is grouped into generations. Each generation of computers brought significant changes in function and far greater capability than the generation before it, which is why a generation is often described as a step change in the technology. There are five generations of computers, and they differ from one another in architecture, physical size, programming language, specifications, and the operations they can perform. Following is the list of computer generations:

1. First Generation of Computers (1940 – 1956): The period from 1940 to 1956 was the era of first-generation computers. They were based on vacuum tubes, which served as the basic components for memory and for the circuitry of the CPU (Central Processing Unit). Examples: UNIVAC-1 and ENIAC.

2. Second Generation of Computers (1957 – 1963): This generation used two kinds of devices in its systems: transistors and magnetic cores. Examples: IBM 1401, IBM 1620, etc.

3. Third Generation of Computers (1964 – 1971): Integrated circuits replaced individual transistors in third-generation computers. An integrated circuit packs many transistors, capacitors, and resistors onto a single chip, which made third-generation computers smaller, more efficient, and more reliable. Examples: CDC 1700, IBM 360 series, etc.

4. Fourth Generation of Computers (1972 onward): This generation used VLSI (Very Large Scale Integration) circuits, better known as microprocessors. A microprocessor chip combines thousands of integrated circuits built on a single silicon chip. The use of personal computers (PCs) grew in this generation, and the first Personal Computer (PC) was developed by IBM. Examples: Apple, CRAY-1, etc.

5. Fifth Generation of Computers (Present and Future): This generation is based on artificial intelligence (AI) software. Artificial intelligence concerns the means and methods of making computers behave like people: the way humans think, the way humans act, and so on. It is an emerging field with wide scope for research. Examples: PARAM 10000, IBM notebooks, etc.

Fifth Generation Computers

Fifth-generation computers followed the fourth generation. Also known as modern computers, they are still in the development stage and are based on artificial intelligence. In 1982, Japan launched the FGCS (Fifth Generation Computer System) project. Computers of this generation are based on microelectronic technology with high computing power and parallel processing.

This is the most recent and technologically advanced computer generation. Modern high-level languages such as Python, R, C#, and Java are used to program these machines. They are highly dependable and use Ultra Large Scale Integration (ULSI) technology. They combine parallel processing hardware with artificial intelligence software.
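The idea behind the parallel processing mentioned above, splitting one job across several workers that run at the same time, can be sketched in a few lines of Python. This is an illustrative example only, not actual fifth-generation hardware; the worker pool here merely stands in for parallel processing units.

```python
# Illustrative sketch of parallel processing: one large job (summing
# 1..1000) is split into chunks, and several workers handle the chunks.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Each worker sums one slice of the data."""
    return sum(chunk)

data = list(range(1, 1001))                       # 1 + 2 + ... + 1000
chunks = [data[i:i + 250] for i in range(0, 1000, 250)]

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))    # combine partial results

print(total)  # → 500500
```

Real parallel hardware runs the workers on separate processors; the structure of the program, divide, compute in parallel, combine, is the same.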

These computers are at the cutting edge of modern scientific computation and are being used to develop artificial intelligence (AI) software. Artificial intelligence is a popular discipline of computer science that studies the meaning of, and methods for, programming computers to behave like humans. It is still in its infancy.

In the fifth generation of computers, all the major high-level languages are employed. The primary goal of the fifth generation is to create machines that can learn and organize themselves. Artificial intelligence and parallel processing hardware are at the heart of this generation of computers, and artificial intelligence encompasses fields like robotics, neural networks, etc.
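The "machines that can learn" mentioned above can be illustrated with the oldest learning algorithm in AI, the perceptron, an ancestor of today's neural networks. The sketch below (illustrative only, in plain Python) trains one on the logical AND function from examples rather than explicit rules:

```python
# Minimal perceptron: adjusts its weights from labeled examples until
# its predictions match the targets (here, the logical AND function).
def train_perceptron(samples, epochs=20, lr=0.1):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - out            # 0 when the prediction is right
            w0 += lr * err * x0           # nudge weights toward the target
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, b = train_perceptron(samples)
predict = lambda x0, x1: 1 if w0 * x0 + w1 * x1 + b > 0 else 0
print([predict(x0, x1) for (x0, x1), _ in samples])  # → [0, 0, 0, 1]
```

Nothing in the code states the AND rule; the program organizes its own weights from the data, which is the essence of machine learning.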

The fundamental goal of this generation is to advance artificial intelligence and incorporate it into a new generation of extremely powerful computers that the average person can use. AI-based systems are employed in a variety of real-world applications and offer a variety of benefits. Where a specific body of knowledge and skills is required, properly trained systems can perform well in scenarios a human would face. They do not, however, fit situations that demand tacit knowledge, the kind a human acquires through natural-language conversation, or that hinge on subtleties of form and speech recognition.

The use of AI, which helps make computers more powerful, is one of the defining elements of fifth-generation computers. AI applications can be found everywhere, from navigation to web browsing. AI is also used for video analysis, image processing, and other tasks, and is projected to automate practically every element of computing.

Even though they are still in development, fifth-generation computers are more powerful, more functional, and faster, benefits that come in part from ULSI (Ultra Large-Scale Integration) technology. They employ AI (artificial intelligence) techniques, including expert systems, game playing, and more; this technology lets these machines interpret human language and recognize graphs and images. Fifth-generation computers are being developed to tackle extremely difficult tasks, such as working with natural language. They are expected to use more than one CPU and to cost less than the current generation, and they are relatively simple to move from one location to another. Examples of fifth-generation computers include PARAM 10000, IBM notebooks, Intel P4-based PCs, laptops, etc.

Features of Fifth-generation Computers

Following are some features of fifth-generation computers:

  • The ULSI (ultra large scale integration) technology is used in this generation of computers.
  • Natural language processing is under development in this generation.
  • In this generation’s computers, artificial intelligence has progressed.
  • Parallel processing has advanced on these computers.
  • The fifth-generation computer includes more user-friendly interfaces and multimedia functions.
  • These PCs can be purchased for a lower price.
  • Computers are more portable and powerful.
  • Computers are dependable and less expensive.
  • It’s easier to manufacture in a commercial setting.
  • Desktop computers are straightforward to operate.
  • Mainframe computers are extremely efficient.

Advantages of Fifth Generation of Computer

Following are some advantages of fifth-generation computers:

  • These computers are far quicker than previous generations.
  • These computers are simpler to repair.
  • These computers are substantially smaller in size than other generation computers.
  • They are lightweight and easy to move.
  • True artificial intelligence is being developed.
  • Parallel Processing has progressed.
  • Superconductor technology has progressed.

Disadvantages of Fifth Generation of Computer

Following are some disadvantages of fifth-generation computers:

  • They are usually sophisticated and can be difficult to use.
  • They can give businesses additional power to monitor your activities and potentially infect your machine.

Sample Questions

Question 1: What is the counting machine developed by Charles Babbage, known as the father of the computer, called?

Charles Babbage developed a counting machine called the Difference Engine.

Question 2: Which generation of computers uses integrated circuits?

Third-generation computers, the enhanced successors of second-generation computers, used integrated circuits.

Question 3: What are the key technologies used in the fifth generation of computers?

VLSI architecture, parallel processing (such as data-flow control), logic programming, knowledge bases built on relational databases, and applied artificial intelligence and pattern processing appear to be the key features of fifth-generation computers.

Question 4: Which generation supports AI?

Fifth-generation computers support AI (Artificial Intelligence).

Question 5: Which generation of computers supports the operating system and other application software?

Third-generation computers supported operating systems and other application software.


Digitalworld839.com

Generations of Computer 1st to 5th Explained with Pictures.

The history of computer technology is often told through the different generations of computers. From first to fifth, each computer generation is characterized by a significant technological development in its components, memory, and elements, which fundamentally changed the way these devices work.

Over the years, each successive generation advanced this technological evolution, leading to today's modern computer with greater complexity, power, capability, and functionality.

Introduction to Computer Generations

This development of electronic computing technology over distinct periods is called computer generations. Five generations of computers have been identified, although a sixth generation may already be in development in the early 21st century.

During the evolutionary timeline, each generation of computers has improved a lot by undergoing considerable changes in their size, type, and functionality.

By analyzing them, one can trace the evolution of computer technology, see how the computer industry has changed over the years, and appreciate how much capability and software progress humankind has achieved in under a hundred years through these successive generations.

At present, the computer plays a significant part in human existence, because today's digital computer is used for work in every field. If an issue occurs in a computer, or a server goes down, all work stops. That is how significant computers have become!

In this article, I will introduce you to all the generations of computers, with pictures, explaining their characteristics, names, components, and examples.

Generations of Computer From 1st to 5th


Let’s discover the series of computer generations in the following list:

1st Generation of Computer (1940-1956)

This first generation of computers was based on vacuum tube technology, used for calculation, storage, and control; the vacuum tube was invented in 1904 by John Ambrose Fleming. Vacuum tubes and diode valves were the chief components of first-generation computers.


First-generation computers relied on the lowest-level machine language in order to perform operations, and could only solve a single problem at a time.
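To get a feel for what "lowest-level" programming meant, the sketch below (a modern Python illustration, not actual first-generation code) performs addition the way hardware circuits do, using only bitwise operations, something a high-level language hides behind a single `+`:

```python
# Addition built from nothing but AND, XOR, and shift -- the kind of
# elementary bit-level step early machines carried out in hardware.
def add_bitwise(a, b):
    while b:                  # repeat until no carry bits remain
        carry = a & b         # positions where both inputs have a 1
        a = a ^ b             # sum of the bits, ignoring carries
        b = carry << 1        # carries move one place to the left
    return a

print(add_bitwise(19, 23))    # → 42; a high-level language just writes 19 + 23
```

Programming a first-generation machine meant expressing every operation at roughly this level of detail, in binary, by hand.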

Magnetic drums, which were very slow, were used as the memory in these computers. Punched cards and magnetic tapes handled the computer's input and output, with results delivered as printouts, even though the results were not always 100% accurate.


Note that first-generation computers predated the microprocessor entirely; all of their logic was built from individual vacuum tubes.

The disadvantages of first-generation computers were that they were enormous and heavy (built from thousands of vacuum tubes), occupying large rooms. Once installed in one place, they were difficult to move. Other drawbacks included the use of a decimal number system and a maze of switches and cables.

In addition, they were very expensive to operate and consumed a large amount of electricity. The vacuum tubes produced a great deal of heat, so air conditioning was required for proper functioning; otherwise the heat could cause malfunctions.

The advantage of first-generation computers was that they could calculate in milliseconds (about five thousand additions per second).

First-generation computers were used in fields such as weather forecasting, solving mathematical problems, energy tasks, space research, the military, and other scientific work.

In the first generation of computers, the famous early machine "ENIAC" (Electronic Numerical Integrator and Computer) was built by John Mauchly and J. Presper Eckert between 1943 and 1945.

ENIAC used panel-to-panel wiring and switches for programming, occupied more than 1,000 square feet, used about 18,000 vacuum tubes, and weighed 30 tons.


Characteristics of the 1st Generation of Computer:

  • Vacuum tubes and diode valves were used as the main electronic components in first-generation computers.
  • Punched cards and paper tape were utilized for input and output operations.
  • Magnetic drums were used for storage.
  • Huge in size and weight, with very high power consumption.
  • Very expensive, and not reliable.
  • Programmed in low-level machine language, with low operating speed.

Examples of the first generation of computers are ENIAC (Electronic Numerical Integrator and Computer), UNIVAC (Universal Automatic Computer), EDSAC (Electronic Delay Storage Automatic Calculator), EDVAC (Electronic Discrete Variable Automatic Computer), IBM 701, and IBM 650.

ENIAC was the first general-purpose electronic digital computer. It used about 18,000 vacuum tubes for calculation, resulting in a huge size: it occupied more than 1,000 square feet and weighed 30 tons. Machines like it were the harbingers of today's digital computers. This first computing machine was designed by J. Presper Eckert and John W. Mauchly.

2nd Generation of Computer (1956-1964)

The second generation of computers replaced vacuum tubes with a more reliable component, the transistor, invented by William Shockley and his Bell Labs colleagues in 1947.


The transistor revolutionized the computer field: it increased the performance and operating speed of second-generation computers (hundreds of thousands of operations per second) while decreasing their electricity consumption.

Transistors were far superior to vacuum tubes, allowing computers to become faster, cheaper, and more energy-efficient. They made it possible to shrink computing equipment, which ultimately reduced heat and improved reliability.

Second-generation computers are characterized by the use of the first high-level programming languages, which allowed programmers to specify instructions in words. Early versions of the COBOL, ALGOL, SNOBOL, and FORTRAN languages were developed at this time.

These were the first computers to store their instructions in memory, which moved from magnetic-drum to magnetic-core technology. During this period, the first computer game, "Spacewar", appeared on a PDP-1 computer.


Do you know? The abacus, the oldest computing device, was designed to calculate thousands of years ago and is still used in schools today for calculations.

The concepts of the Central Processing Unit (CPU), multiprogramming operating systems, programming languages, memory, and input/output (I/O) units were also developed in the era of second-generation computers.

The major disadvantages of second-generation computers were that they still relied on punched cards for input and hard copies for output, and they remained difficult to move because they were still quite large; some even needed air conditioning.


Second-generation computers were first used in fields such as the atomic energy industry and nuclear power plants, as well as other commercial fields.

Characteristics of the 2nd Generation of Computer:

  • Computers based on transistors instead of vacuum tubes.
  • Magnetic tape was used to store data.
  • Relatively small in size, with reduced weight and lower energy consumption than first-generation computers.
  • Faster, more reliable, and less expensive than the first generation.
  • Use of storage devices, printers, operating systems, etc.
  • Higher-level languages like COBOL, ALGOL, SNOBOL, and FORTRAN were developed and used.

Examples of the second generation of computers include IBM 1620, CDC 1604, IBM 7094, UNIVAC 1108, IBM 620, CDC 3600, IBM 4044, Honeywell 400, IBM 1401 Mainframe, and PDP-1 minicomputer. IBM was actively working, producing transistor versions of its computers.

3rd Generation of Computer (1964-1971)

The third generation appeared in the form of integrated circuits, invented by Jack Kilby in 1958 and developed through 1964. An IC (integrated circuit) consists of many small transistors mounted on a semiconductor chip.


This chip became an important foundation for third-generation computers: scientists fit hundreds of transistors into one circuit, producing a more powerful electronic component, the integrated circuit.

Multiprogramming, keeping several executable programs in memory at once, was implemented, while manufacturing costs fell at the same time. In the mid-1960s, IBM popularized the term "computer architecture", and by the end of the 1960s, minicomputers had appeared.
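The multiprogramming idea, several jobs resident in memory with the machine interleaving their execution, can be sketched with threads standing in for the jobs (the job names here are hypothetical, chosen for illustration):

```python
# Illustrative sketch of multiprogramming: two "programs" are resident
# at once, and the scheduler interleaves their execution steps.
import threading

log = []
lock = threading.Lock()

def job(name, steps):
    """One resident program: records each step it completes."""
    for i in range(steps):
        with lock:
            log.append(f"{name}:{i}")

jobs = [threading.Thread(target=job, args=(name, 3))
        for name in ("payroll", "inventory")]
for t in jobs:
    t.start()
for t in jobs:
    t.join()

print(len(log))  # → 6 steps in total, from both jobs, interleaved by the scheduler
```

On a third-generation machine the operating system performed this interleaving so the expensive processor never sat idle while one job waited on slow input/output.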

This revolutionary innovation allowed the processing capacity and memory of the machines to expand.

Instead of punched cards and printouts, users interacted via keyboards and monitors, working through an operating system that allowed the device to run several applications at once under a central program that managed memory.


The first computer monitors appeared in this third generation of computers. The invention belongs to IBM, which in 1964 released the commercial display station IBM 2250.

It was used with the System/360 series. The model had a monochrome vector display measuring 12×12 inches, with a resolution of 1024×1024 pixels and a refresh rate of 40 Hz. This invention paved the way for today's different types of monitors, including LCD, LED, and OLED monitors.

The invention of the IC dramatically decreased the size of computers and made them easy to transport from one place to another. The working speed and efficiency of this generation of computers were much higher than in the previous generation, and the machines were even cheaper.

High-level languages such as PASCAL, BASIC, FORTRAN II through IV, COBOL, and ALGOL were developed in this generation.

For the first time, computers reached a mass audience: because they were smaller and cheaper, they penetrated different spheres of human activity. They also became more specialized; there were different computers for different tasks.

The third generation of computers was the first move toward miniaturization, and it quickly expanded their scope: control systems, automation of scientific experiments, data transmission, and more, in addition to their use in the manufacture of radios, TVs, and similar devices.

Characteristics of the 3rd Generation of Computer:

  • Computers in this generation were based on integrated circuits, which were more powerful than individual transistors.
  • The computers were also smaller, because an IC is much smaller than an equivalent circuit built from discrete transistors.
  • More reliable, inexpensive, faster, energy-efficient, and much lighter than second-generation computers.
  • The first computer mouse and keyboard appeared and were used in the third generation of computers.
  • Use of new versions of high-level languages like BASIC, COBOL, FORTRAN, PASCAL, and ALGOL.
  • Available to a mass audience, making general-purpose usage possible.

Some of the most popular models of the third generation of computers were the ICL 2903, ICL 1900, TDC-B16, IBM 360 and 370, Honeywell 6000, UNIVAC 1108, PDP-8, and PDP-11, which surpassed previous generations in multiprocessing capability, reliability, and flexibility.

4th Generation of Computer (1971-2010)

The microprocessor brought in the fourth generation of computers: thousands of integrated circuits, equivalent to millions of transistors, were assembled on a single small chip, bringing the entire central processing unit and other fundamental elements of the machine onto one microprocessor fitted into the CPU socket.


These computers used Very Large Scale Integration, also called VLSI technology. After its invention, the microprocessor was used in the computing machines of the fourth and fifth generations.

Within this generation, in 1971, the first microprocessor appeared as an unexpected result of Intel's work on calculator circuits and of the further development of minicomputers (such as the PDP-11).


The first personal computer, a microcomputer, was the "Altair", developed by the company MITS in 1974. The first microprocessor was the Intel 4004, manufactured in 1971, initially for an electronic calculator. Whereas first-generation computers filled an entire room, a fourth-generation microprocessor fits in the palm of the hand.

This generation of computers used operating systems based on the graphical user interface (GUI), which made mathematical and logical tasks very easy to perform on these machines.

Computers began to use high-speed memory systems on integrated circuits, with capacities of several megabytes. Computer performance increased significantly, to hundreds of millions of operations per second.
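Figures like "hundreds of millions of operations per second" can be estimated on any modern machine with a simple timing loop. The sketch below measures roughly how many additions per second a Python interpreter manages; the exact number varies by machine (and interpreted Python is far slower than the hardware itself), so no fixed output is shown:

```python
# Rough throughput measurement in the spirit of the "operations per
# second" figures quoted for each computer generation.
import time

n = 1_000_000
start = time.perf_counter()
total = 0
for i in range(n):
    total += i                     # one addition per loop iteration
elapsed = time.perf_counter() - start

ops_per_second = n / elapsed
print(f"~{ops_per_second:,.0f} additions per second")
```

Running the same kind of benchmark across decades of hardware is exactly how the generation-to-generation speedups described in this article are quantified.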

High-level languages like C, C++, Java, PHP, Python, and Visual Basic were used to write programs for fourth-generation computers.


The advent of the first personal computers in the mid-1970s gave every ordinary user the same computing resources that enormous machines had offered in the 1960s. These computers were smaller, faster, and less expensive, and could easily be placed on a table or desk, which marked the beginning of the so-called era of personal computers.

Peripheral devices such as mice, joysticks, and handheld devices were developed during this fourth generation. Computers could be connected together in networks to share information with each other, which played an important role in the birth and development of LANs, Ethernet, and the Internet.


The most prominent chip companies in the world, Intel and AMD, rose in this era. Meanwhile, Microsoft and Apple introduced their operating systems, Windows and Macintosh, during this generation, and the age of multimedia began.

This is the era in which the personal computer was born, an idea that persists today. It was also the generation of DEC's (Digital Equipment Corporation) minicomputers.

Characteristics of the 4th Generation of Computer:

  • Computers based on microprocessors and VLSI technology.
  • Fourth-generation computers were small, lightweight, and almost portable.
  • The integration of multiple cores in processors (dual-core, octa-core, etc.) began.
  • The processing speed of this generation was much faster and more reliable than the previous three generations.
  • The size and cost of power supply units were reduced.
  • Use of languages like C, C++, .NET, Java, PHP, Python, and Visual Basic.
  • Use of GUI-based operating systems with more memory capacity.
  • Access to the Internet.
  • Due to their low cost, these computers were available to every common person.

Desktops, Laptops, Workstations, Tablets, Chromebooks , and Smartphones, are examples of the fourth generation of computers.

Good to know: Alan Turing, the father of modern computing, was born in England in 1912.

5th Generation of Computer (2010-At Present)

Artificial intelligence is the hallmark of the fifth and latest generation of computers, which is based on ULSI (Ultra Large Scale Integration) technology: the process of integrating millions of transistors on a single silicon microchip.


Computing in the fifth generation is versatile: portable, powerful, lightweight, innovative, comfortable to use, and low in electricity consumption. Thanks to the advantages of the Internet, its range of uses has extended to limits never before imagined.

The main objective of the latest, fifth generation of computing, and the focus of computer researchers, is to make machines smart by incorporating artificial intelligence: to develop devices that respond to natural-language input and are capable of learning and self-organizing. Even in 2022, this remains under development.

This new information technology has greatly increased the capacity and working ability of the microprocessor, which has encouraged the use of computers in fields such as entertainment, accounting, educational institutes, film-making, traffic control, business applications, hospitals, engineering, research, defense, etc.

That’s why a computer of the 5th generation is also known as the AI (Artificial Intelligence) generation of computers.

Some computers are being designed to do all their work themselves: to act, behave, and communicate like a human. The best-known example of an AI-based computing machine in the fifth generation is "Sophia", a humanoid robot.


Characteristics of the 5th Generation of Computer:

  • The main focus is on AI-based computers.
  • Computers made of microprocessors based on ULSI (Ultra Large Scale Integration) technology.
  • Processing speed is quite high: billions of calculations per second.
  • Computers are portable, cheap, reliable, fast, and available in various forms and sizes, such as desktops, laptops, smartphones, smartwatches, etc.
  • Modern operating systems such as Windows, Macintosh, and the ChromeOS of Chromebooks.
  • Multimedia has evolved in this generation, combining sound, graphics or pictures, and text.
  • Development of the Internet of Things.

Fifth-generation computers are being made to think like us, which requires continuous advancement of technologies like artificial intelligence, the Internet of Things, robotics, etc. Examples of AI computing software in use today include chatbots, Windows Cortana, Google Assistant, Apple Siri, and speech recognition.
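The chatbots named above descend from simple rule-based programs. The sketch below is a hypothetical minimal version (the keyword rules and replies are invented for illustration, nothing like the real assistants' logic):

```python
# A toy rule-based chatbot: match a keyword in the user's message and
# return a canned reply -- the simplest ancestor of modern assistants.
RULES = {
    "hello": "Hello! How can I help you?",
    "time": "Sorry, I have no clock yet.",
    "bye": "Goodbye!",
}

def reply(message):
    """Return the reply for the first keyword found in the message."""
    for keyword, answer in RULES.items():
        if keyword in message.lower():
            return answer
    return "I don't understand that yet."

print(reply("Hello there"))  # → Hello! How can I help you?
```

Modern assistants replace the keyword table with machine-learned language models, but the outer loop, take natural-language input and produce a response, is the same.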

Classification of the computer by generations

Factors/reasons for the development of computer generations:

The following general factors are associated with the development of, and changes in, the generations of electronic computers:

  • Improvement of the underlying component base
  • Downsizing
  • Technological progress (increased performance, speed, and memory)
  • Reduced cost
  • Development of software
  • Changes in architecture and expansion of the range of tasks solved by computers
  • Simplification and standardization of hardware
  • Changes in the way the user and the computer interact

How many generations of computers have there been?

There have been five computer generations so far: vacuum tubes, transistors, integrated circuits, microprocessors, and, most recently, artificial intelligence. A sixth generation is yet to come; it may take the form of quantum computers, or of existing artificial-intelligence technology developed to a far greater extent.

What is the 6th generation of computers?

Electronic computers are usually divided into five generations. A sixth generation is still in development; it may take the form of quantum computing.

Which is the current modern generation of computers today?

Technologies based on artificial intelligence constitute the current and latest generation of computers (5th gen) today.

What is the historical development of computers according to generation?

According to the standard methodology for classifying computer technology, the first generation comprised vacuum-tube computers; the second, transistor computers; the third, computers on integrated circuits; the fourth, computers using microprocessors; and the fifth generation of computers is based on artificial intelligence.
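The classification above can be expressed as a simple lookup table, a small sketch of how such a generation-to-technology mapping might be coded:

```python
# The five-generation classification as a lookup table.
GENERATIONS = {
    1: "vacuum tubes",
    2: "transistors",
    3: "integrated circuits",
    4: "microprocessors (VLSI)",
    5: "artificial intelligence (ULSI)",
}

def technology_of(generation):
    """Return the defining technology of a computer generation."""
    return GENERATIONS.get(generation, "unknown")

print(technology_of(3))  # → integrated circuits
```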

What is the generation of a colossus computer?

The Colossus computer belonged to the first generation. It was developed and designed by Tommy Flowers at Bletchley Park in 1944 for the purpose of cracking German wartime codes.

A sixth generation will also emerge in the future, since the current generation has technological flaws that will be resolved in the one to come.



Computer - Fifth Generation

The fifth generation spans from 1980 to the present. In this generation, VLSI technology evolved into ULSI (Ultra Large Scale Integration) technology, enabling the production of microprocessor chips with around ten million electronic components.

This generation is based on parallel processing hardware and AI (Artificial Intelligence) software. AI is a branch of computer science that explores ways of making computers think like human beings. High-level languages such as C, C++, Java and .NET are used in this generation.

Fifth Generation

AI includes −

  • Neural Networks
  • Game Playing
  • Development of expert systems to make decisions in real-life situations
  • Natural language understanding and generation
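As an illustration of the expert-system idea listed above, here is a minimal rule-based decision sketch in Python. The rules and facts are invented placeholders for demonstration, not taken from any real expert system:

```python
# Minimal forward-chaining expert system sketch.
# Rules are (premises, conclusion) pairs; facts is a set of known statements.
# All rule and fact names here are hypothetical, for illustration only.

RULES = [
    ({"has_fever", "has_cough"}, "suspect_flu"),
    ({"suspect_flu", "short_of_breath"}, "see_doctor"),
]

def infer(facts, rules):
    """Repeatedly apply rules until no new conclusions can be drawn."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

result = infer({"has_fever", "has_cough", "short_of_breath"}, RULES)
print(sorted(result))
```

Real expert systems of the era worked on the same principle at much larger scale, with hundreds or thousands of rules encoding human expertise.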

The main features of fifth generation are −

  • ULSI technology
  • Development of true artificial intelligence
  • Development of Natural language processing
  • Advancement in Parallel Processing
  • Advancement in Superconductor technology
  • More user-friendly interfaces with multimedia features
  • Availability of very powerful and compact computers at cheaper rates

Computers of this generation include desktops, laptops, notebooks, ultrabooks and Chromebooks.


A Comprehensive Guide to Generations of Computers

There are five generations of computers, and a sixth generation is emerging. Over the past decades, computers have evolved significantly, with each generation introducing new capabilities, improved performance, and enhanced features. The development of computers across generations is a fascinating tale of innovation, progress, and technological advancement. In this guide, we will delve into the various generations of computers, highlighting their characteristics, key advancements, and the impact they had on shaping the digital landscape.


Generations of Computers

There are five generations of computers.

  • First generation computers used vacuum tubes.
  • Second generation computers used transistors.
  • Third generation computers used integrated circuits (ICs).
  • Fourth generation computers used microprocessors.
  • Fifth generation computers are the modern machines in common use today.

Finally, a sixth generation of AI-powered supercomputers is emerging and evolving today, though it is not yet an officially and widely accepted category.


1. First Generation Computers – Vacuum Tubes

The first generation of computers, spanning the 1940s to the early 1950s, represents the initial foray into electronic computing. These machines were huge, expensive and marked by the use of vacuum tubes as their primary electronic component. Here are key aspects of the first generation of computers, along with notable examples.

Vacuum Tubes – Characteristics

Vacuum tubes are glass tubes containing electrodes used to control electrical current. They were the heart of early computers, performing functions like amplification and switching. The first generation marked the shift from mechanical calculating devices to electronic computing. This transition laid the foundation for subsequent generations to build upon. First generation computers processed data in binary code, using ones and zeros to represent information. These computers were primarily designed for scientific and mathematical calculations, often related to military or defense applications.
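The binary representation described above is unchanged today: every value a computer handles is ultimately ones and zeros. A quick illustration in Python:

```python
# Every value in a computer is ultimately stored as ones and zeros.
# Python can display the binary form of an integer directly.
for n in [5, 42, 255]:
    print(f"{n:>3} in binary is {n:08b}")

# Converting back from binary text to an integer:
assert int("00101010", 2) == 42
```

First generation machines did exactly this kind of encoding, but the bits were held in vacuum tube circuits rather than transistors.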

Vacuum Tube

Programming Challenges & Other Issues

Programmers in the first generation had to physically wire the machine to perform specific tasks. This process was time-consuming and required a deep understanding of the machine’s architecture. Debugging and correcting errors in the programs were complex tasks due to the lack of high-level programming languages and debugging tools.

Vacuum tubes generated a considerable amount of heat, were prone to failure and consumed significant amounts of power. This made the machines large, cumbersome and challenging to maintain. Despite being revolutionary at the time, these computers were relatively slow by today’s standards and their applications were limited compared to modern computing.

Interaction with these computers was minimal and users often had to physically reconfigure the machine for different tasks. Skilled operators played a crucial role in the operation of first generation computers, handling tasks like loading programs and managing hardware components.

Examples of First Generation Computers

  • ENIAC (Electronic Numerical Integrator and Computer): Completed in 1945, ENIAC was one of the earliest electronic general-purpose computers. It contained 17,468 vacuum tubes and occupied a large room.
  • UNIVAC I (Universal Automatic Computer): Developed in the early 1950s, UNIVAC I was the first commercially produced computer in the United States. It used vacuum tubes and magnetic tape for data storage.

ENIAC

Moving to Second Generation

First generation computers quickly became outdated as technology evolved. The rapid pace of advancements in subsequent generations rendered these machines obsolete within a relatively short time frame. Understanding the challenges and innovations of the first generation of computers provides valuable insights into the monumental strides made in subsequent generations. The transition from vacuum tubes to transistors in the second generation marked a pivotal moment in the history of computing, paving the way for smaller, more reliable and efficient machines.

2. Second Generation Computers – Transistors

The second generation of computers, spanning the late 1950s to the early 1960s, marked a significant leap forward in terms of technology and design compared to the first generation. The key innovation defining this era was the replacement of vacuum tubes with transistors, leading to improvements in size, reliability and efficiency. Here are some crucial aspects of the second generation, along with notable examples.

Transistor

Prominent Features

The most defining feature of second generation computers was the use of transistors as electronic components, replacing the bulky and less reliable vacuum tubes. Transistors were smaller, faster, more durable and consumed less power than vacuum tubes. This transition resulted in more compact and efficient computer systems. It also made them more affordable and accessible to a broader range of organizations and businesses.

  • Magnetic Core Memory – Second generation computers replaced the drum memory used in the first generation with magnetic core memory. This type of memory was faster, more reliable and allowed for random access to data. Magnetic core memory improved the overall performance and efficiency of computers, making them more suitable for a wider range of applications.
  • Printed Circuit Boards – Second generation computers saw the adoption of printed circuit boards, which simplified the construction of electronic circuits and contributed to the overall reliability of the systems. The use of printed circuit boards allowed for easier maintenance and troubleshooting.
  • Speed & Processing – Second generation computers demonstrated substantial improvements in processing speed compared to their predecessors, allowing for more complex calculations and data processing. These computers found applications in scientific research, business data processing and military operations, reflecting the growing versatility of computing technology.

Programming & Processing

With the advent of assembly languages and high-level programming languages like FORTRAN and COBOL, programming became more accessible and less reliant on low-level machine code. This shift allowed for more efficient programming, making it easier for developers to write and debug code.

Second generation computers often operated in batch processing mode, where a series of jobs were submitted for processing together. This mode improved the overall efficiency of computing tasks.
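The batch-processing mode described above can be sketched in a few lines of Python: jobs are collected first, then run one after another with no user interaction. The job functions here are hypothetical stand-ins for the punched-card programs of the era:

```python
# Batch processing sketch: queue all jobs up front, then execute sequentially.
# The job functions are invented placeholders for illustration.
from collections import deque

def payroll():   return "payroll done"
def inventory(): return "inventory done"
def billing():   return "billing done"

batch = deque([payroll, inventory, billing])  # jobs submitted together

results = []
while batch:                 # the operator starts the batch; no interaction until done
    job = batch.popleft()
    results.append(job())

print(results)
```

The efficiency gain came from keeping the expensive machine busy: the next job started the moment the previous one finished.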

Examples of Second Generation Computers

  • IBM 1401 and CDC 1604 are examples of second generation computers that were widely used for batch processing applications.
  • IBM 7090 and UNIVAC 1107 were examples of second generation computers that were smaller and more commercially viable.

IBM 1401 Computer

Moving to Third Generation

The second generation marked the beginning of the end of the punched card era. While punched cards were still used for input and output, magnetic tapes and disks became more prevalent, offering faster and more efficient data storage solutions. The transition to transistors and other technological advancements during the second generation laid the groundwork for subsequent developments in computing. The improvements in size, speed and reliability set the stage for further innovation in the third generation, which would see the integration of integrated circuits and bring about a new era in computing.

3. Third Generation of Computers – Integrated Circuits

The third generation of computers, spanning the 1960s to the 1970s, marked a significant evolution in computing technology, introducing integrated circuits (ICs) and bringing about improvements in performance, reliability and versatility. This era witnessed a shift from discrete transistors to integrated circuits, enabling more powerful and compact computer systems. Here are key aspects of the third generation, along with notable examples.

Integrated Circuits (ICs)

The defining feature of third generation computers was the use of integrated circuits, which incorporated multiple transistors and other electronic components onto a single semiconductor chip. Integrated circuits significantly reduced the size of computers, enhanced reliability and improved overall performance. The miniaturization allowed for the creation of smaller, more efficient and cost-effective systems.

Microprocessor Chip

Advancements with Third Generation

  • Graphics – Third generation computers started to incorporate basic graphics capabilities, paving the way for the development of graphical user interfaces (GUIs) in subsequent generations. Graphics capabilities found applications in scientific visualization, engineering and early computer-aided design (CAD).
  • High-level Programming Languages –  The use of high-level programming languages continued to evolve in the third generation. Languages such as COBOL, FORTRAN and ALGOL gained popularity, making programming more accessible and efficient. The availability of high-level languages allowed programmers to focus on problem-solving rather than dealing with the complexities of machine code, fostering greater productivity and software development.
  • Time-Sharing Systems – Third generation computers introduced more sophisticated operating systems, facilitating better management of resources and scheduling of tasks. Time-sharing systems emerged, enabling multiple users to access a computer simultaneously. This marked a departure from batch processing, allowing for interactive computing and improved resource utilization.
  • Input/Output Devices – The third generation saw improvements in input/output devices. The use of terminals and displays became more widespread, enhancing user interaction and making computing more user-friendly.
  • Remote Data Access – With improvements in communication technology, third generation computers began to support remote data access. This facilitated the sharing of information across different locations and laid the groundwork for the interconnected computing environments of the future.
  • Magnetic Tape and Disk Storage – While magnetic tapes were still used for data storage, third generation computers witnessed the increased adoption of magnetic disk storage. Disk storage allowed for faster access to data and became a standard feature in computer systems.
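The time-sharing idea in the list above, where several users each receive short slices of processor time in turn, can be sketched as a simple round-robin scheduler. The user names and job lengths are invented for demonstration:

```python
from collections import deque

def round_robin(jobs, quantum):
    """jobs: list of (name, time_needed) pairs. Each job runs for at most
    `quantum` time units per turn, mimicking a time-sharing system."""
    queue = deque(jobs)
    order = []
    while queue:
        name, remaining = queue.popleft()
        slice_used = min(quantum, remaining)
        order.append((name, slice_used))
        if remaining > slice_used:          # unfinished jobs rejoin the queue
            queue.append((name, remaining - slice_used))
    return order

schedule = round_robin([("alice", 3), ("bob", 2)], quantum=2)
print(schedule)
```

Because each turn is short, every user sees the machine respond interactively, even though the processor is strictly doing one thing at a time.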

Examples – Mainframe & MiniComputers

Third generation computers saw the widespread adoption of mainframe computers, which became the backbone of large-scale data processing for organizations and businesses. IBM System/360, introduced in 1964, was a groundbreaking series of mainframe computers that offered a range of compatible models for different applications. The System/360 architecture set a standard for compatibility across various models and paved the way for future computing systems.

Third generation also saw the rise of minicomputers, which were smaller, more affordable and suitable for medium-scale computing tasks. DEC PDP-11, introduced in 1970, was a highly successful minicomputer that found applications in research, education and industrial control systems.

Mainframe Computer

Moving to Fourth Generation

The third generation of computers represented a significant step forward in terms of technology, with integrated circuits revolutionizing the design and capabilities of computing systems. The adoption of high-level programming languages, sophisticated operating systems and advancements in storage and communication set the stage for the continued evolution of computers in the fourth generation and beyond.

4. Fourth Generation Computers – Microprocessors

The fourth generation of computers, spanning the late 1970s through the 1980s and into the 1990s, witnessed transformative advancements in technology, introducing microprocessors, personal computers and a shift towards user-friendly interfaces. This era marked a departure from the large, centralized mainframe systems of the previous generations. Here are key aspects of the fourth generation, along with notable examples.

Microprocessor

Features & Advancements

  • Microprocessors – The most significant development of the fourth generation was the integration of microprocessors. Microprocessors combined the central processing unit (CPU) onto a single semiconductor chip, bringing unprecedented computing power to smaller, more affordable systems. Microprocessors enabled the creation of compact, powerful and energy-efficient computers. This innovation paved the way for the personal computer revolution.
  • Personal Computers (PCs) – The fourth generation saw the rise of personal computers, making computing accessible to individuals and small businesses.
  • Storage Advancements – Fourth generation computers saw the widespread adoption of hard disk drives (HDDs) for mass storage. Hard drives offered larger capacities and faster access to data than previous storage technologies. The introduction of CDs as a storage medium for software distribution and multimedia content became prominent during this era.
  • Parallel Processing and Supercomputers – The fourth generation saw advancements in parallel processing, enabling computers to perform multiple tasks simultaneously.
  • Graphical User Interfaces (GUIs) – GUIs became a standard feature in the fourth generation computers, providing users with visual interfaces, icons and point-and-click interactions. GUIs made computers more user-friendly and accessible to individuals with limited technical expertise, contributing to the democratization of computing.
  • Software Development – Fourth generation computers saw a proliferation of software applications for various purposes, including word processing, spreadsheets, databases and entertainment. The availability of commercial software expanded, providing users with a wide range of options to enhance productivity and creativity.

Networking and the Internet

The fourth generation saw the expansion of computer networking, laying the groundwork for the development of the internet.

  • TCP/IP Protocol – The adoption of the TCP/IP protocol standardized communication on the emerging internet, facilitating global connectivity.
  • ARPANET – The precursor to the internet, ARPANET, continued to evolve during this era, connecting research institutions and paving the way for the information age.
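TCP/IP remains the protocol suite the bullets above describe. A minimal loopback example using Python's standard `socket` module shows a client sending bytes to a server over TCP on a single machine:

```python
import socket
import threading

# Minimal TCP echo over the loopback interface: the same TCP/IP stack
# that standardized internet communication, demonstrated locally.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def echo_once():
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo back whatever arrives

threading.Thread(target=echo_once).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, internet")
    reply = client.recv(1024)

server.close()
print(reply)
```

The same API, pointed at a remote host instead of 127.0.0.1, is how applications have talked across the internet since the fourth generation.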

Examples of Fourth Generation Computers

The fourth generation witnessed the development of portable computers and laptops, providing users with mobility and flexibility.

  • Personal Computers – Introduced in 1981, the IBM PC became a standard for personal computing. Its open architecture allowed for the use of third-party hardware and software, contributing to the widespread adoption of PCs.
  • Portable Computers – The Osborne 1 (1981) and the IBM ThinkPad (1992) were early examples of portable computers that contributed to the evolution of mobile computing.
  • Apple Macintosh – Launched in 1984, the Macintosh brought a graphical user interface (GUI) to personal computers, enhancing user interaction and making computing more intuitive.
  • Supercomputers – High-performance computing became more accessible, with the development of supercomputers like the Cray-2 (1985) and the Connection Machine (1987).

Apple’s Macintosh System Software (the predecessor of today’s macOS) and Microsoft Windows were prominent examples of operating systems with graphical user interfaces.

Moving to Fifth Generation

The fourth generation of computers revolutionized the landscape by making computing power available to individuals, fostering a new era of accessibility and innovation. The integration of microprocessors, the rise of personal computers and the development of user-friendly interfaces laid the foundation for the diverse and interconnected computing ecosystem we experience today.

Apple Macintosh

5. Fifth Generation of Computers

The fifth generation of computers represents a period of computing that extends from the late 20th century into the early 21st century. This era is characterized by advancements in parallel processing, artificial intelligence (AI) and the development of novel computing architectures. While the exact timeline of the fifth generation can vary, it generally covers the period from the mid-1980s to the present day. Here are key aspects of the fifth generation, along with notable examples.

  • Parallel Processing – Fifth generation computers embraced parallel processing, the simultaneous execution of multiple tasks to enhance computational speed and efficiency. Parallel processing allowed for the development of supercomputers and high-performance computing clusters capable of tackling complex problems in fields like scientific research, weather modeling and cryptography.
  • Artificial Intelligence (AI) – The fifth generation is often synonymous with the integration of artificial intelligence into computing systems. Advanced programming languages, expert systems and neural networks became integral tools in the development of AI applications, supporting areas like natural language processing, image recognition and expert systems for decision-making.
  • Knowledge-Based Systems – Knowledge-based systems, also known as expert systems, were developed during the fifth generation. These systems used human knowledge to make decisions and solve complex problems.
  • Natural Language Processing (NLP) – Fifth generation computers focused on improving the ability to understand and respond to human language. NLP applications included language translation, voice recognition and text understanding.
  • Massive Parallelism and Distributed Computing – The fifth generation witnessed a shift towards massive parallelism and distributed computing architectures.
  • Quantum Computing (Emerging) – Towards the latter part of the fifth generation and into the sixth generation, quantum computing emerged as a groundbreaking field. Quantum computers leverage the principles of quantum mechanics to perform computations at speeds that classical computers cannot achieve.
  • Personal Computing Evolution – The fifth generation saw the continued evolution of personal computing, with advancements in hardware, software and user interfaces.
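The parallel-processing idea in the list above, splitting one large task across several workers, can be sketched with Python's standard library. A thread pool is used here so the example runs anywhere; heavy CPU-bound work would typically use processes instead:

```python
# Parallel processing sketch: split one large summation across workers.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Divide [0, n) into chunks, sum each chunk in a worker, combine."""
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(1_000_000))
```

Supercomputers apply the same divide-and-combine pattern across thousands of processors rather than four threads.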

Fifth Generation Computer Systems (FGCS) & Internet

The Japanese government launched the Fifth Generation Computer Systems project in the 1980s, aiming to develop advanced computer systems with AI capabilities. The project was focused on parallel processing, knowledge-based systems and natural language processing. While it didn’t achieve all its ambitious goals, it contributed to advancements in AI research.

The fifth generation witnessed the widespread adoption of the internet as a global communication and information-sharing platform. The development of the World Wide Web in the early 1990s transformed how information is accessed and shared, leading to the interconnected digital world we experience today.

Examples of Fifth Generation Systems

  • IBM’s Deep Blue, which defeated a world chess champion in 1997, is a notable example of AI achievements during this era.
  • Systems like IBM’s Watson, known for winning Jeopardy! in 2011, showcased advancements in natural language processing.
  • Distributed computing projects, like SETI@home, utilized the power of networked computers worldwide to analyze radio signals from space in the search for extraterrestrial intelligence.

The proliferation of personal computers, laptops and the eventual rise of smartphones and tablets exemplify the ongoing evolution of computing devices. Companies like IBM, Google and startups like Rigetti and D-Wave are actively working on quantum computing research and development.

IBM Watson

Moving to Sixth Generation

The fifth generation of computers represents a period of profound transformation, with a focus on AI, parallel processing and the development of technologies that continue to shape the digital landscape. As technology continues to advance, the fifth generation sets the stage for ongoing innovations in computing, including the exploration of quantum computing and the continued integration of AI into various aspects of our lives.

6. Sixth Generation of Computers

The sixth generation of computers is still in its early stages of development, and concrete examples have not yet been fully realized. Predictions and expectations for the sixth generation generally involve advancements in technologies such as quantum computing, artificial intelligence (AI) and the further integration of computing into daily life. Here are key concepts associated with the potential characteristics of the sixth generation.

AI Chips

  • Quantum Computing – Quantum computing represents a paradigm shift in computing, utilizing the principles of quantum mechanics to perform calculations at speeds that surpass classical computers. Quantum computers have the potential to solve complex problems, such as optimization tasks, cryptography and simulations, at a pace that was previously unimaginable.
  • Biocomputing and Neuromorphic Computing – The sixth generation may explore the integration of biological components into computing systems. This includes the use of DNA computing and other biologically-inspired computing approaches. Drawing inspiration from the human brain, neuromorphic computing aims to create processors that mimic the brain’s architecture, potentially leading to more efficient and powerful computing systems for tasks like pattern recognition and learning.
  • AI Integration – The sixth generation is expected to witness the development of even more advanced and sophisticated AI systems , capable of complex reasoning, problem-solving and decision-making. AI may become further integrated into various aspects of daily life, from autonomous vehicles and smart homes to personalized healthcare and virtual assistants.
  • Advanced Robotics – Sixth generation computers may contribute to the development of more advanced and autonomous robotic systems. These could find applications in fields like healthcare, manufacturing and space exploration.
  • Brain-Computer Interfaces (BCIs) – The integration of computers with the human brain through BCIs could become more sophisticated in the sixth generation, allowing for direct communication between the brain and computing systems.
  • Augmented and Virtual Reality – Advances in augmented and virtual reality technologies may further enhance the integration of computing into human experiences. Spatial computing devices like Apple Vision Pro are expected to take computer technology to a new level.
  • Green Computing and Sustainability – The sixth generation may prioritize sustainability and energy efficiency in computing, exploring new technologies to reduce the environmental impact of large-scale computing systems.
  • Edge Computing – This involves processing data closer to the source rather than relying on centralized cloud servers. The sixth generation may see further developments in edge computing for faster data processing and reduced latency.
  • Hybrid Architectures – Hybrid computing architectures that leverage a combination of classical computing, quantum computing and other specialized computing technologies may become prevalent in the sixth generation.
  • Advanced Encryption – With the growing importance of cybersecurity, the sixth generation is likely to bring advancements in encryption and security measures to protect sensitive data.

It’s essential to note that the predictions for the sixth generation are speculative and the timeline for its full realization may extend well into the future. Ongoing research and development in various fields, including quantum computing, AI and biotechnology, will play a crucial role in shaping the characteristics of the sixth generation of computers.

Sixth Generation Computers

The evolution of computers across different generations reflects the relentless pursuit of innovation and improvement in the field of computing. Each generation has left an indelible mark on the digital landscape, shaping the way we work, communicate and live. As we look to the future, the ongoing advancements in technology continue to redefine the possibilities of computing, promising a world where the sixth generation and beyond will unlock new frontiers in computational capabilities.


Fifth generation of computers

The fifth generation computer project, also known by its English acronym FGCS (Fifth Generation Computer Systems), was a Japanese initiative begun in 1981. Its goal was to develop a new class of computers that would use artificial intelligence techniques and technologies at both the hardware and software levels, using the PROLOG language at the machine-language level, and that would be able to solve complex problems such as machine translation between natural languages (Japanese to English, for example). Performance was measured in LIPS (Logical Inferences Per Second) executed during the various programmed tasks. Different types of VLSI (Very Large Scale Integration) architectures were used in its development.
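The LIPS metric mentioned above simply counts logical inferences completed per second. A rough, purely illustrative measurement in Python (the inference here is a trivial modus ponens step, not a faithful PROLOG benchmark):

```python
import time

# Toy benchmark for LIPS (Logical Inferences Per Second): apply a trivial
# modus-ponens step repeatedly and count completions per second.
def modus_ponens(p, p_implies_q):
    return p and p_implies_q          # from p and p -> q, conclude q

def measure_lips(duration=0.2):
    count, end = 0, time.perf_counter() + duration
    while time.perf_counter() < end:
        assert modus_ponens(True, True)
        count += 1
    return count / duration

print(f"roughly {measure_lips():,.0f} inferences per second")
```

The FGCS project aimed at machines performing hundreds of millions of such inferences per second on genuine logic programs.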

The project lasted eleven years but did not achieve the expected results, and conventional computers prevailed. There are many cases in which a program either cannot be parallelized at all, or gains nothing from parallelization, or, in the worst case, loses performance. To write a parallel program, one must first identify the parts of it that can be executed separately on different processors, and a program written sequentially usually needs extensive modification before it can run in parallel; it is therefore worth asking whether that effort is actually compensated by the improvement in the task's performance after parallelizing it.

History and development of the project

Background and design

Throughout the generations since the 1950s, Japan had been a follower, advancing and building computers based on models developed in the United States and the United Kingdom. In the mid-1970s, Japan, through its Ministry of International Trade and Industry (MITI), decided to break with this pattern of following the leaders and began forging its own path toward a future in the computer industry. The Japan Information Processing Development Center (JIPDEC) was commissioned to draw up a plan for the project. In 1979, at the request of Hazime Hiroshi, a three-year contract was offered for more in-depth studies, with the joint participation of technology companies and academic institutions. It was during this period that the term "fifth generation computer" began to be used.

In 1982, at the initiative of MITI, an international conference was held at which Kazuhiro Fuchi announced the research program, and on April 14, 1982, the government decided to launch the project officially, creating the Institute for New Generation Computer Technology (ICOT) under the direction of Fuchi, whom Tohru Moto-Oka would later succeed as director of the institute, and with the participation of researchers from various Japanese hardware and software companies, including Fujitsu, NEC, Matsushita, Oki, Hitachi, Toshiba and Sharp.

The main fields for research in this project initially were:

  • Technologies for knowledge processing.
  • Technologies for processing massive databases and knowledge bases.
  • High-performance workstations.
  • Distributed functional computing.
  • Supercomputers for scientific calculation.

Institutional and international impact

Because of the shock caused by Japan's success in the electronics sector during the 1970s, and by its doing much the same in the automotive sector during the 1980s, the fifth generation project acquired a great deal of prestige in other countries.

Such was its impact that parallel projects were created. In the United States there were the Microelectronics and Computer Technology Corporation and the Strategic Computing Initiative; in the United Kingdom there was ALVEY; and in the rest of Europe the response was known as ESPRIT (European Strategic Programme for Research in Information Technology).

International popularity

Apart from the reactions at the institutional level, at a more popular level the project began to be known in the West through books that discussed or cited it more or less directly, but mainly through articles in magazines aimed at computer enthusiasts. For example, the August 1984 issue of the American magazine Creative Computing published an article that dealt extensively with the subject, "The fifth generation: Japan's computer challenge to the world". In the Spanish-speaking world, MicroHobby magazine published in July 1985 an interview with Juan Pazos Sierra, a PhD in Computer Science then affiliated with the Faculty of Computer Science of the University of Madrid, in which he briefly described the project as:

...a Japanese project with curious and special features; first, the aim is to build a computer based on VLSI technology, with a non-Von Neumann architecture, that would have logic programming, the PROLOG language, as its software core, in order finally to build Expert Systems on top of all this.

Regarding its potential results, he expressed a relatively optimistic opinion, in line with the predictions of the project's own promoters. Thus, when asked whether it had yet produced any results, he replied:

Right now, nothing. Much is going to be developed: new technologies will appear, and new systems and research will be greatly enhanced by the tremendous injection of money that the fifth generation project has meant for Artificial Intelligence.

For his part, Román Gubern, in his 1987 essay The Computerized Ape, considered that:

...the fifth generation computer is a real attempt at technological duplication of the intellect of Homo sapiens.

Main events and completion of the project

  • 1981: the international conference that outlines and defines the objectives and methods of the project is held.
  • 1982: the project begins, funded in equal shares by industry and government.
  • 1985: the first hardware developed by the project, known as the Personal Sequential Inference machine (PSI), appears, together with the first version of its operating system, the Sequential Inference Machine Programming Operating System (SIMPOS). SIMPOS was programmed in Kernel Language 0 (KL0), a concurrent Prolog variant with extensions for object-oriented programming, and in the ESP metalanguage. Shortly after the PSI machines, the CHI machines (Co-operative High-performance Inference machines) were developed.
  • 1986: the Delta machine, based on relational databases, is delivered.
  • 1987: a first prototype of the hardware, called the Parallel Inference Machine (PIM), is built using several networked PSI machines. The project receives funding for five more years. A new version of the proposed language, Kernel Language 1 (KL1), is developed; it is very similar to Flat GHC (Flat Guarded Horn Clauses), influenced by subsequent developments in Prolog and directed toward parallel computing. The SIMPOS operating system is rewritten in KL1 and renamed the Parallel Inference Machine Operating System, or PIMOS.
  • 1991: work on the PIM machines is completed.
  • 1992: the project is extended one year beyond the original plan, which ended this year.
  • 1993: the fifth generation computer project is officially completed, although a new two-year project, the FGCS Follow-on Project, is launched to disseminate its results. The source code of the PIMOS operating system is released under a public-domain license, and KL1 is ported to UNIX systems, resulting in KLIC (KL1 to C compiler).
  • 1995: all institutional initiatives linked to the project are wound up.

As one of the final products of the project, five Parallel Inference Machines (PIM) were built, called PIM/m, PIM/p, PIM/i, PIM/k and PIM/c, one of whose main features was 256 network-coupled processing elements. The project also produced tools that could be used with these systems, such as the Kappa parallel database management system, the HELIC-II legal reasoning system, the programming language Quixote (a hybrid deductive object-oriented database and logic programming language), and the automatic theorem prover MGTP.

Eleven years after the start of the project, the large sums of money, infrastructure and resources invested in it had not produced the expected results, and it was terminated without having met its objectives. William Zachman criticized the project a year before its completion, arguing:

It prejudices the development of AI applications; with AI, the system does not matter so long as there are no powerful inference mechanisms. There are already plenty of AI-type applications, and I am still waiting for the arrival of the powerful inference engine, which is why the fifth generation computer is a mistake.

The proposed hardware and its software developments had no place in a computing market that had evolved since the project was launched, and in which general-purpose systems could now handle most of the tasks proposed as the initial objectives of the fifth generation machines. Something similar had happened to the potential market for Lisp machines: systems for creating rule-based Expert Systems, such as CLIPS, implemented on ordinary computers, had made those expensive machines unnecessary and obsolete.

Moreover, within the disputes between the different branches of artificial intelligence, the Japanese project started from the paradigm of logic programming and declarative programming, dominant after the 1969 publication by Marvin Minsky and Seymour Papert of the book Perceptrons, which would gradually recede in favor of Artificial Neural Network (ANN) programming after the 1986 publication by McClelland and Rumelhart of the book Parallel Distributed Processing. This shift, together with the project's meager results, contributed to the fifth generation project falling into oblivion when it came to an end in 1993.

In 1995 the Institute for New Generation Computer Technology (ICOT) was renamed the Research Institute for Advanced Information Technology (AITEC), a center that was closed in 2003, all of its resources passing to the Advanced IT Research Group (AITRG), under the Research Department of JIPDEC.

First stage

Sequential machines PSI (Personal Sequential Inference machine) and CHI (Co-operative High-performance Inference machine):

  • PSI-I: 30 KLIPS (kilo Logical Inferences Per Second)
  • PSI-II: PSI-I with a VLSI CPU
  • CHI-I: 285 KLIPS

Parallel machine PIM (Parallel Inference Machine):

Relational database machine:

Second stage

Sequential machines:

  • CHI-II: 490 KLIPS

Parallel machines:

Third stage

  • PIM/p: 512 RISC microprocessors, 256 MB memory
  • PIM/m: 256 CISC microprocessors, 80 MB memory
  • PIM/c: 256 CISC microprocessors, 160 MB memory
  • PIM/k: 16 RISC microprocessors, 1 GB memory
  • PIM/i: 16 RISC microprocessors (LW type), 320 MB memory

From Fifth Generation Computing to Skill Science

A Biographical Essay of Koichi Furukawa

  • Open access
  • Published: 25 April 2019
  • Volume 37, pages 141–158 (2019)


  • Tomonobu Ozaki (ORCID: orcid.org/0000-0001-7769-4504)
  • Randy Goebel
  • Katsumi Inoue


Professor Koichi Furukawa, an eminent computer scientist and former Editor-in-Chief of the New Generation Computing journal, passed away on January 31, 2017. His passing was a surprise, and we were all shocked and saddened by the news. In his memory, this article reviews the great career and contributions of Professor Furukawa, focusing on his research activities on the foundations and applications of logic programming. Professor Furukawa had both a deep understanding of and a broad impact on logic programming, and he was always gentle but persistent in articulating its value across a broad spectrum of computer science and artificial intelligence research. This article introduces his research along with his insightful and unique philosophical framework.


Biography Summary

Professor Koichi Furukawa (1942–2017) was born in Manchuria, the present northeastern part of China, and returned to Japan when he was 3 years old. He was recognized as a talented young man from an early age, and was especially good at mathematics, science, and music. Even as a young man, he was also socially active, e.g., he participated in a choral group.

In 1961, he entered the University of Tokyo. In addition to studies in statistical mathematics, he also joined the university orchestra group, and was eagerly practicing cello playing in his college years. It is not a surprise that both mathematics and music became a consistent thread throughout his entire research career.

After graduation, he worked on a time-sharing system at the public national Electro Technical Laboratory (ETL), which eventually became part of the National Institute of Advanced Industrial Science and Technology (AIST) in 2001. He received a Ph.D. in Engineering from the University of Tokyo in 1980, based on a dissertation on the topic of query answering in deductive databases, implemented in LISP.

Furukawa spent 1976 in California at the Stanford Research Institute (now SRI International) where he first encountered the idea of logic programming in the form of a Prolog interpreter, written by Alain Colmerauer. He was very excited about this implementation of computational logic, and returned to ETL the following year, and introduced the idea to his colleagues at ETL. The response was very positive at ETL, which then acquired David H.D. Warren’s DEC-10 implementation of Prolog. Furukawa wrote a Prolog program for database indexing, and quickly discovered that one could use logic programming to both specify and implement very efficient and high-level systems, including the first production system interpreter shown in Fig.  1 . According to [ 6 ], Furukawa described this experience as follows.

I wrote a program to solve the Rubik cube problem in DEC-10 Prolog [ 12 ]. It ran efficiently and solved the problem in a relatively short time (around 20 seconds). It is a kind of expert system in which the inference engine is a Production System realized efficiently by a tail-recursive Prolog program. From this experience, I became convinced that Prolog was the language for knowledge information processing.

David H. D. Warren mentioned that one can easily take this message to mean that Furukawa had fallen in love with Prolog. He was one of the earliest researchers to exploit the clarity and power of Prolog for building meta-interpreters, where sophisticated reasoning (e.g., abduction, analogy, higher-order predicate logic) could be pursued with both ease and clarity. Since that early work on Prolog, logic programming became a foundation of both his future theoretical and application work, which carried on until the very end of his life.

figure 1

Abstract specification of a production system in Prolog (meta-interpreter), extracted from [ 12 ]

The Japanese national project on Fifth Generation Computing Systems (FGCS) began in 1982, and Furukawa joined the project as a research director at the Institute for New Generation Computer Technology (ICOT). His friend and colleague from the University of Tokyo and ETL, Professor Kazuhiro Fuchi, was a "born leader" and helped convene both public and industrial funders to create the joint private–public 10-year project. The overall project had the goal of advancing research in the area of parallel inference machines, their applications and even operating systems, based on logic programming. Furukawa became the deputy director, recruited to lead the foundational research on logic programming. Further details of his role in major achievements in advancing logic programming at ICOT are described in "Research at Institute for New Generation Computer Technology".

In 1992, after the completion of the Fifth Generation Computing project, Furukawa returned to academia, to the Graduate School of Media and Governance at Shonan Fujisawa Campus (SFC), Keio University. There he became a mentor for many graduate students, and led further work on Inductive Logic Programming (ILP) [ 36 ] and data mining. ILP is a subfield of machine learning which uses logic programming for inductive inference. He had already started to work on ILP when he was in ICOT, and continued to conduct both theoretical and practical projects with colleagues and students while at Keio.

At this time, his strong interest in human activities was revived, especially skill in playing musical instruments. With a focus on musical instrument performance, he began intensively developing a scientific framework for the verbalization of human tacit knowledge, especially to make implicit musical performance skills sufficiently explicit to enable performers to improve. This subsequently led to the creation of a new research field called skill science. In this new field, Furukawa pursued the development of knowledge-based systems to acquire explicit and understandable knowledge, by exploiting abductive and analogical reasoning implemented in logic programming. In 2010, he moved to Kaetsu University, where he continued his intensive research on skill science.

He finally published a summary of his skill science work, including a handbook for cello players co-authored with Toshiki Masuda, published in 2016 [34]. He intended to translate the book into English, but that project remained uncompleted at the time of his death. He also planned a more comprehensive monograph entitled "Abduction and Induction—A logic programming approach," but in the middle of that dream, he passed away suddenly, on the very day he was to meet with a publishing company.

In the following sections, Furukawa’s research achievements are introduced in chronological order, that is, (1) the results in the FGCS project (“ Research at Institute for New Generation Computer Technology ”), (2) the basic research on ILP (“ Research on Inductive Logic Programming ”), and (3) introduction and advancement of skill science (“ Research on Skill Science ”).

A particular focus is put on the relationship with logic programming, and it is shown that all these achievements are tightly connected.

Research at Institute for New Generation Computer Technology

In 1982, the Fifth Generation Computer Systems (FGCS) Project began, motivated by a Japanese national policy of developing as a technologically advanced nation. The Institute for New Generation Computer Technology (ICOT) was established to run the project, and Furukawa joined as a research director. The main aim of the project was research and development of new computer technologies for knowledge information processing on parallel inference machines based on logic programming. Until the project finished successfully in 1992, Furukawa engaged in working on various research challenges in logic programming, including concurrent logic programming, partial evaluation, and advanced reasoning such as abduction and induction. In addition, he was in charge of international services, where he not only worked hard to deliver results to conferences and institutes around the world, but also recruited young researchers to ICOT.

Parallel Inference and Logic Programming

At the beginning of the project, ICOT decided to make logic programming the foundational principle of the project. However, at that time the programming language Prolog was the only implementation, and it could not explicitly express concurrent processes, which were required as a basic component of modern operating systems. So ICOT explored the possibility of concurrent logic programming, focusing on the framework for concurrent programming within logic programming proposed by Keith Clark and Steve Gregory [1]. Both Parlog [2] and Concurrent Prolog [47, 48] were examined as candidates; finally, ICOT consolidated all those ideas in the development of its own concurrent logic programming language, named Guarded Horn Clauses (GHC) [51]. One noticeable feature of GHC is the ability to control parallel computation with a very simple mechanism called a guard. It is of note that Furukawa's broad knowledge of computing found the relationship between Dijkstra's "guarded commands" for programming based on predicate transformation and their realization in GHC. The development of ICOT's own kernel language is significant, since it made it possible to advance the development of hardware and software at the same time. As a consequence, the hardware design of a parallel inference machine (PIM) as well as a parallel operating system named PIMOS were developed, making this one of the few projects to embrace the development of hardware and software simultaneously.
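The connection to Dijkstra's guarded commands can be made concrete: a guarded-command loop repeatedly fires any one command whose guard is enabled, and stops when none is, which is also the committed-choice intuition behind GHC's guards. A hypothetical Python sketch (Euclid's gcd written as two guarded commands; the encoding is invented for illustration):

```python
import random

def guarded_do(state, commands):
    """Dijkstra-style guarded loop: repeatedly pick, at random, any
    command whose guard holds on the state; stop when no guard is
    enabled.  Each command is a (guard, action) pair over a dict."""
    while True:
        enabled = [act for guard, act in commands if guard(state)]
        if not enabled:
            return state
        random.choice(enabled)(state)

# gcd as two guarded commands:  x > y -> x := x - y  ;  y > x -> y := y - x
state = {'x': 252, 'y': 105}
guarded_do(state, [
    (lambda s: s['x'] > s['y'], lambda s: s.__setitem__('x', s['x'] - s['y'])),
    (lambda s: s['y'] > s['x'], lambda s: s.__setitem__('y', s['y'] - s['x'])),
])
print(state['x'])  # prints 21, the gcd of 252 and 105
```

Whichever enabled command fires at each step, the loop terminates with x = y = gcd(x, y), illustrating how correctness can be independent of the scheduling choice.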

As a concrete application of parallel knowledge information processing, Furukawa encouraged Ryuzo Hasegawa and his colleagues in the development of a theorem prover for full first-order predicate logic as a powerful extension of Prolog's inference mechanism. They took as a target the Prolog technology theorem prover SATCHMO [33], written in only eight clauses of Prolog, and created an implementation in GHC. The resulting interpreter of SATCHMO in GHC is shown in Fig. 2. This attempt led to a parallel theorem prover named MGTP (Model Generation Theorem Prover) [16, 17]. MGTP was one of the principal research outcomes of ICOT. It achieved very high performance in theorem proving through parallel speedup (it ran 220 times faster on a parallel inference machine with 256 processors), and solved a number of open problems in finite algebra. The MGTP achievement was a promising demonstration to the whole world of the potential of parallel inference.

figure 2

A part of SATCHMO interpreter in GHC
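The model-generation idea behind SATCHMO and MGTP can be shown in miniature for ground clauses: forward-chain on any rule whose antecedents are all in the current model, and split the search when the consequent is a disjunction; a rule with an empty consequent closes that branch. A toy Python sketch (the clause encoding is invented for this illustration, not ICOT's GHC code):

```python
def generate_models(rules, model=frozenset()):
    """Return all models of ground rules (antecedents, alternatives):
    if every antecedent is in the model, at least one alternative must
    be too.  An empty alternative list means 'false' (branch closed)."""
    for ante, alts in rules:
        if all(a in model for a in ante) and not any(c in model for c in alts):
            if not alts:                  # violated integrity constraint
                return []
            models = []                   # split on each disjunct
            for c in alts:
                models.extend(generate_models(rules, model | {c}))
            return models
    return [model]                        # every rule satisfied: a model

# toy theory:  true -> p.   p -> q or r.   r -> false.
rules = [((), ['p']), (('p',), ['q', 'r']), (('r',), [])]
print(generate_models(rules))
```

The branch that adds r is closed by the constraint, leaving the single model {p, q}; MGTP explored such branches in parallel.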

Partial Evaluation in Logic Programming

Furukawa and his colleagues also worked on a variety of techniques known as partial evaluation of logic programs [ 32 ]. The idea is that, since all computation with logic programs is inference, even partially executed logic programs can be interpreted and provide the basis for other kinds of reasoning. As noted below, this has been tried for both conventional systems and compilation, but also as an implementation technique for diagnostic, analogical, and higher-order reasoning.

Partial evaluation was first described as an optimization technique for fast computation of meta-interpreters. A simple partial evaluator in Prolog is shown in Fig.  3 . As a promising application of the partial evaluator for knowledge-based systems, Furukawa developed a self-applicable partial evaluator with an incremental compilation method in Prolog, which was used for compiler generation and compiler generator generation [ 3 ]. Using similar and extended techniques of partial evaluation, Furukawa, together with Randy Goebel and David Poole, proposed a theory formation system that provided a reformulation of rule-based diagnosis systems [ 15 ]. Furthermore, he developed partial evaluation techniques for GHC programs [ 4 ]. It should be noted that partial evaluators developed by Furukawa are based on meta-interpreters of Prolog or GHC programs.

figure 3

Partial evaluator in Prolog
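The core idea of partial evaluation transfers directly to other languages: given the statically known part of a program's input, unfold the program at specialization time and leave a residual program for the remaining input. A small hypothetical sketch in Python, specializing exponentiation for a fixed exponent (not the ICOT partial evaluator itself):

```python
def specialize_power(n):
    """Partially evaluate power(x, n) for a statically known n: the
    recursion over n is unfolded at specialization time, leaving a
    residual function that only multiplies."""
    if n == 0:
        return lambda x: 1
    rest = specialize_power(n - 1)
    return lambda x: x * rest(x)

cube = specialize_power(3)   # residual program: x * x * x * 1
print(cube(5))               # prints 125
```

The residual `cube` no longer tests or decrements `n` at run time; the same unfolding, applied to a meta-interpreter, is what removes interpretation overhead.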

Artificial Intelligence and Logic Programming

Furukawa’s work on theory formation with Goebel et al. [ 15 ] is also considered as an early work on abductive logic programming [ 27 ], which performs abduction in logic programming. In fact, Furukawa conducted at ICOT several projects on advanced reasoning in logic programming. Furukawa had noticed that artificial intelligence (AI) or intelligent computing systems would need these advanced reasoning methods in addition to deductive technologies.

An integrated framework emerged in the work on knowledge assimilation, which maintains and updates knowledge bases whenever a new piece of knowledge or information is acquired. Integrity constraints play an important role in knowledge assimilation, and all modes of reasoning, i.e., deduction, induction and abduction, are involved in the update process. The first knowledge assimilation system was developed in Prolog at ICOT [35], and hypothetical reasoning systems, including a version of Theorist [15], were developed at ICOT thereafter. These attempts were indeed ahead of their time in the mid-1980s, and their importance was only recognized much later, in the 1990s [27].

As a natural extension of Furukawa’s depth of understanding of AI through logic programming, researchers at ICOT developed an interest in nonmonotonic reasoning , constraint ( logic ) programming , and inductive logic programming (ILP). These logic programming paradigms could be specified and implemented with some techniques of meta-programming and partial evaluation, providing both clarity of their scopes and exposing practical challenges in their implementation.

It is remarkable that Furukawa was involved in early work on abductive and inductive reasoning as contributions to the foundations of AI. He encouraged David Poole, Randy Goebel, Stephen Muggleton and others in their pursuit of the creation of new AI systems such as Theorist [ 45 ] and Progol [ 39 ]. Furukawa also invited many other AI researchers to ICOT, including Mark Stickel for theorem proving and abductive reasoning and Nicolas Helft for ILP and nonmonotonic reasoning. Katsumi Inoue enjoyed discussions and collaboration, which resulted in the successful development of a consequence finding system for first-order full clausal theories called SOL resolution [ 19 ]. SOL resolution was later implemented in SOLAR [ 42 ], and Furukawa used SOLAR for meta-level abduction in his later work on skill science.

In the late 1980s, researchers in logic programming and AI around the world began collaborating to extend the class of definite logic programs by introducing negation and abducibles in the bodies of program rules, to allow users to declaratively express defaults or hypotheses. Logic programs with negation became the basic form of nonmonotonic reasoning, and later led to answer set programming in the 21st century. On the other hand, logic programs with abducibles were soon called abductive logic programs, and were integrated into ILP [41]. Around the final period of the FGCS project, Hasegawa, Inoue and their colleagues implemented both nonmonotonic reasoning [24] and hypothetical reasoning [25] on top of MGTP, thereby showing the effect of parallelism on these reasoning systems on PIM machines. Bob Kowalski acknowledged, in his perspective on FGCS [31], that this last achievement was very important, saying that "ICOT has been a significant participant in these developments." Furukawa's own retrospective [6] concluded as follows:

However, most research output, including the knowledge assimilation system was not integrated into the concurrent logic programming framework. Therefore, very little was produced for final fifth-generation computer systems based on PIM. An exception is a parallel bottom-up theorem prover called MGTP, and application systems running on it. An example of such applications is hypothetical reasoning. I expect this approach will attain my original goal of “high-speed knowledge information processing based on parallel computation mechanisms and specialized hardware” in the near future.

Managing International Affairs Services

In the 1980s, Furukawa took a leadership role at ICOT in connecting the world to the Japanese initiative in logic programming and artificial intelligence. He organized many bilateral (UK–Japan, France–Japan, and USA–Japan), trilateral (Italy–Japan–Sweden) and international workshops. During his many visits to research laboratories, he invited many top-level researchers to ICOT: J. A. Robinson, Robert A. Kowalski, Ehud Shapiro, Keith Clark, Randy Goebel, David Poole, Mark Stickel, Donald W. Loveland, Wolfgang Bibel, Gerd Brewka, Ray Reiter, Vladimir Lifschitz, and Stephen Muggleton, among many others. He was always noted as a gracious intellectual host, constantly seeking ideas and methods that could strengthen and complement the research goals of those visitors. In his final lecture at Keio University, he said that his "total number of business trip abroad reaches to about sixty in eleven years." So in addition to providing tireless research leadership within ICOT, he was also a tireless missionary for logic programming all around the world.

Within the logic programming network he created around the world, he was a central figure in organizing the series of International Conference on Logic Programming (ICLP). He was the official program chair of the Eighth International Conference on Logic Programming (ICLP 1991) in Paris [ 5 ].

Within the research community on logic programming, he was also famous as the cello player in the "Logic Programming Trio" (Fig. 4), formed with J. A. Robinson and Jacques Cohen. After their first concert at the Joint International Conference and Symposium on Logic Programming in 1992, they went on to give many more wonderful musical performances at international conferences.

figure 4

Logic Programming Trio: Jacques Cohen, J. Alan Robinson, Koichi Furukawa, circa 1996 ( http://jc.cs.brandeis.edu/?gallery=music-2 )

Research on Inductive Logic Programming

Inductive logic programming (ILP) [ 36 , 40 , 41 ] is an interdisciplinary research area that combines logic programming and machine learning methods for inductive inference. Furukawa already had a great interest in ILP while he was in ICOT, and invited Nicolas Helft as a researcher to ICOT and Stephen Muggleton as a guest researcher. Furukawa participated in the first international workshop on Inductive Logic Programming [ 37 ] held in Portugal, then organized the second workshop [ 38 ] with Fumio Mizoguchi in Tokyo in 1992.

As a theoretical research pursuit within the scope of ILP, Furukawa made an effort to make Inverse Entailment [ 39 ] complete. This idea is highly appealing, for it provides the basis to understand the formal relationship between facts stated about some domain, and those hypotheses which could minimally cover those facts, much like the formation of “best” or minimal hypotheses in the framework of scientific reasoning. Inverse Entailment [ 39 ] was formulated as an inductive inference rule, which effectively uses the most specific hypothesis to derive hypotheses for given positive examples and background knowledge. Right after the incompleteness of Inverse Entailment was articulated [ 53 ], Furukawa proposed a sufficient condition for existence of the most specific hypothesis [ 11 ] as well as procedures for constructing a complete most specific hypothesis for recursive programs [ 9 , 13 ].
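The pivot behind Inverse Entailment is the equivalence B ∧ H ⊨ E iff B ∧ ¬E ⊨ ¬H, which turns the search for hypotheses covering the examples into deduction from B ∧ ¬E. In the propositional case the equivalence can be checked by brute force; the bird/penguin predicates below are invented purely for illustration:

```python
from itertools import product

def entails(premises, conclusion, atoms):
    """Brute-force propositional entailment: premises |= conclusion iff
    every truth assignment satisfying all premises also satisfies it."""
    for bits in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, bits))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

atoms = ['bird', 'penguin', 'flies']
B = lambda v: not v['penguin'] or v['bird']    # background: penguin -> bird
E = lambda v: not v['penguin'] or v['flies']   # example:    penguin -> flies
H = lambda v: not v['bird'] or v['flies']      # hypothesis: bird -> flies

# B /\ H |= E  holds exactly when  B /\ not-E |= not-H
left = entails([B, H], E, atoms)
right = entails([B, lambda v: not E(v)], lambda v: not H(v), atoms)
print(left, right)  # prints True True
```

In first-order ILP the literals derivable from B ∧ ¬E form the most specific (bottom) clause, and candidate hypotheses are drawn from its generalizations; the incompleteness results cited above concern exactly when this bottom clause fails to capture all valid hypotheses.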

Furukawa was always a fearless intellectual, and in addition to specifying sufficient conditions for the most specific hypothesis, he also built on that foundation for higher-order concept formation, perhaps the most elaborate form of ILP to date, e.g., [44]. He also pointed out the relationship between the inverse entailment of inductive inference and consequence finding, and in 1995 he encouraged Katsumi Inoue to use a consequence finding procedure like SOL resolution for computing inverse entailment. In 2001, Inoue presented a complete algorithm called CF-induction [20] based on this suggestion.

Furukawa applied his general understanding of ILP to a variety of real-world problems including business applications. For example, he developed an expert system named AUTOMAIL, to support consumer product call center operators to promptly and accurately prepare near optimal responses to their customers’ questions [ 50 ]. In that system, Shimazu and Furukawa used ILP to construct rules, as shown in Fig.  5 , to classify call center inquiries into 86 classes. This practical system greatly reduced the burden on the operator’s work in real environments.

figure 5

Rules in Prolog for classifying a given inquiry into prototypical question no. 85

Furukawa was also devoted to the dissemination of ILP in Japan. He prepared a lecture course on ILP in the graduate school at Keio University, and published the first textbook on the subject written in Japanese [14].

Research on Skill Science

After moving to Keio University, Furukawa's creation of the field of "Skill Science" started out as research on the "verbalization of human tacit knowledge." He believed, as with the use of logic programming for parallel systems building and for induction, that logic programming could also provide the basis for articulating the tacit knowledge behind physical skills such as playing an instrument or performing as an athlete. This intellectual motivation combined well with his passionate interest in music, especially playing the cello. He always believed that the performance skills of highly skilled musicians could be captured in some logical form and then conveyed to other musicians to improve their performance; his own focus was on cello playing, and on improving his own performance as a competent amateur cellist. The scientific framework he developed included a variety of methods: inductive logic programming, time-series data mining, abduction, and analogical reasoning. He was an early advocate of research on "human-like computing," and he and his colleagues developed a new research field called skill science. Skill science is a multidisciplinary research area drawing on artificial intelligence, cognitive science, sports science, biomechanics and kinesiology, and ecological psychology [52]. The overall goal was to bring skills of tacit embodied expertise onto the podium of scientific exploration, to understand deeply the processes of skill acquisition, and to explore ways of designing good pedagogical environments. In this sense, Furukawa sought to focus all his research skills and background on understanding how to build the knowledge needed to be a better teacher of skills.

Induction in Skill Science

In the early stages of his skill science study, Furukawa organized the “Keio International Workshop on Verbalization of Tacit Knowledge using Inductive Inference” [ 7 ] (Fig. 6), inviting experts with different backgrounds, including computer scientists, cognitive scientists, and professional musicians. As the title of the workshop suggested, he considered inductive inference, especially inductive logic programming, to be a central technology of the project. This underlying conviction is confirmed in the preface of the workshop proceedings, which concludes with the following sentence:

In particular, we wish to discuss and explore the possibility of using inductive logic programming as a common tool to uncover the structures of tacit knowledge across different aspects of human activities.

Fig. 6: Proceedings of the Keio International Workshop on Verbalization of Tacit Knowledge using Inductive Inference (left), and poster of the International Symposium on Skill Science 2007 (right)

In the research pursuit of articulating human tacit knowledge, Furukawa employed ILP for the analysis of human respiration during musical performance [ 18 ]. While respiration control is one of the most important factors for effective musical performance, it is often difficult even for experts to clearly explain the role and control of respiration during performance. In his research approach, Furukawa sought to capture rules that showed repeatability and regularity in a performer’s respiration patterns. These rules were successfully extracted with ILP from sensor data together with musical/performance background knowledge, such as harmonic progression and bowing direction. It was also discovered that players tend to exhale at the beginning of new large musical structures, and to inhale immediately before key changes.
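The rule-extraction setting described above can be illustrated with a toy sketch: given performance events annotated with background attributes, a candidate rule is accepted only if it covers some events and mispredicts none. The event data and attribute names below are invented for illustration; the actual study induced first-order rules with ILP over sensor data and musical background knowledge.

```python
# Toy sketch of rule evaluation against annotated performance events.
# Attributes ("new_section", "before_key_change") are invented examples
# of the kind of background knowledge described in the text.

EVENTS = [
    {"beat": 1,  "new_section": True,  "before_key_change": False, "breath": "exhale"},
    {"beat": 9,  "new_section": True,  "before_key_change": False, "breath": "exhale"},
    {"beat": 12, "new_section": False, "before_key_change": True,  "breath": "inhale"},
    {"beat": 5,  "new_section": False, "before_key_change": False, "breath": None},
]

def covers(rule_cond, rule_breath):
    """A rule is accepted if every event matching its condition shows the
    predicted breath action, and it matches at least one event."""
    matched = [e for e in EVENTS if e[rule_cond]]
    return bool(matched) and all(e["breath"] == rule_breath for e in matched)

print(covers("new_section", "exhale"))        # True: exhale at new structures
print(covers("before_key_change", "inhale"))  # True: inhale before key changes
```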

Apart from the improvements of his own cello performance, Furukawa concentrated on language acquisition and concept formation as a central requirement to capture complex human tacit knowledge. For example, he proposed a computational model for children’s language acquisition process using ILP [ 30 ]. His system named WISDOM (Fig.  7 ) succeeded in reproducing a part of a co-evolution mechanism of acquiring concept definitions for words and in the development of concept hierarchies by incorporating cognitive biases.

Fig. 7: Configuration of WISDOM (from Figure 1 in [ 30 ])

In 2007, Furukawa organized the International Symposium on Skill Science [ 8 ] (Fig. 6). The symposium had three invited lectures, one of which was given by the professional violinist Ikuko Mizuno, with the title “How to play the violin - Controlling the body and mind.” Also included was a panel discussion entitled “Three important problems of skill science for the next five years,” with Tsutomu Fujinami, Masaki Suwa, J. A. Robinson, Jacques Cohen, and Koichi Furukawa. This symposium demonstrated the truly interdisciplinary nature of skill science, and research presentations were made from a broad variety of areas. In addition, all the members of the musical group known as the Logic Programming Trio participated in the panel, so we can see that logic programming was indeed considered one of the fundamental techniques in skill science research. As a digression, we enjoyed the concert by the “Logic Programming Trio++”, i.e., the Logic Programming Trio with Ikuko Mizuno (violin) and Maya Okamoto (viola), during the symposium. In fact, many of Furukawa’s interactions with his colleagues included some kind of musical performance.

Abduction in Skill Science

Furukawa investigated the application of abductive reasoning to skill science because he believed that the framework of abductive reasoning for generating hypotheses was necessary to explain the basis of amazing observed performance facts; this exactly matched his idea of the mechanism of skill acquisition.

He first employed the abductive logic programming system ProLogICA [ 46 ] to find appropriate hypotheses to explain incorrect cello performances by both professionals and amateurs [ 29 ]. An example of abductive logic programming for cello playing is shown in Fig. 8. The program in Fig. 8 finds a cello playing technique for the rapid position shift, which can be accomplished by two possible methods: (1) move the elbow up and down by adduction and abduction of the shoulder and elbow joints, or (2) move the hand position up and down by incycloduction and excycloduction of the upper arm. In the course of his experiments, the abductive logic program derived only the second method, because of the specified integrity constraints.

Fig. 8: Abductive logic programming for explaining the proposition rapidPositionShift (a method of cello playing accompanied by a high-speed position shift)
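The selection between the two candidate methods can be illustrated with a minimal propositional sketch of abduction under integrity constraints. The rule base, abducible names, and the particular constraint below are assumptions for illustration, not the actual ProLogICA program of Fig. 8.

```python
# Minimal propositional abduction: find consistent sets of abducibles
# that explain a goal. Loosely modeled on the cello example; the
# constraint (ruling out the elbow method) is an invented assumption.

from itertools import combinations

# Rules: head <- body (a set of atoms). Either method can explain the goal.
RULES = {
    "rapidPositionShift": [
        {"move_elbow_up_down"},   # method (1): shoulder/elbow joint movement
        {"rotate_upper_arm"},     # method (2): incyclo-/excycloduction
    ],
}
ABDUCIBLES = {"move_elbow_up_down", "rotate_upper_arm"}

def consistent(hypothesis):
    # Illustrative integrity constraint: elbow movement is ruled out.
    return "move_elbow_up_down" not in hypothesis

def explains(hypothesis, goal):
    """True if the goal is derivable from the hypothesis via RULES."""
    if goal in hypothesis:
        return True
    return any(all(explains(hypothesis, atom) for atom in body)
               for body in RULES.get(goal, []))

def abduce(goal):
    """Enumerate minimal consistent sets of abducibles explaining the goal."""
    found = []
    for r in range(len(ABDUCIBLES) + 1):
        for subset in map(set, combinations(sorted(ABDUCIBLES), r)):
            if consistent(subset) and explains(subset, goal) \
                    and not any(h <= subset for h in found):
                found.append(subset)
    return found

print(abduce("rapidPositionShift"))  # only method (2) survives the constraint
```

As in the experiment described above, only the second method is derived, because the integrity constraint excludes every hypothesis containing the first.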

In fact, during his study of skill science using abduction, Furukawa experienced a sudden skill improvement in his own cello playing, after his final lecture concert at Keio University (March 2008). Footnote 1 This improvement arose by simply keeping his right arm close, that is, keeping his elbow close to the side of his body. This method not only increased the sound volume, but also helped keep the bowing stable and supported maximum bow usage. As noted in the comments about making tacit knowledge explicit for teaching, Furukawa believed that the reproduction of good skill requires explanation, which helps make the skill robust and tolerant of situation changes. In fact, his belief was that no explicit discovery of tacit knowledge is useful if it cannot be explained. The process of explanation led further to another important finding, which connected abduction and ILP to the general notion of scientific discovery. With respect to skill science and, specifically, musical instrument performance, Furukawa focused on what he called “knack discovery,” and worked hard to formulate knack discovery using a bold combination of ILP, abduction, and analogical reasoning, which he called “meta-level abduction.”

Meta-level abduction [ 21 , 23 ] performs abduction on meta-level axioms: it abduces rules that explain empirical rules by means of hidden rules, possibly containing unknown nodes as new predicates. The combination of rule abduction and fact abduction is possible using conditional query answering in the consequence finding system SOLAR [ 42 , 43 ]. Meta-level abduction was applied to knack discovery [ 23 ] as well as to biological network completion [ 22 ]. From the axioms and domain-specific background knowledge shown in Fig. 9, a hypothesis containing an unknown intermediate node is successfully inferred to explain the goal caused(inc_sound, keep_arm_close): that goal represents the alleged causality that keeping one’s arm close ( keep_arm_close ) makes the sound increase ( increase_sound_volume ). This use of meta-level abduction confirmed that a knack represented by an \(\exists {\mathtt{X}}\) statement can be automatically discovered and verified.

Fig. 9: An example of knack discovery in meta-level abduction
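A toy rendition of the idea: explaining caused(inc_sound, keep_arm_close) abduces either a direct causal link or a chain through a fresh unknown node. The representation below is illustrative and far simpler than the first-order meta-level axioms used with SOLAR.

```python
# Toy meta-level abduction: explain caused(effect, cause) by abducing
# missing links, possibly through a fresh unknown node "X0" (combining
# rule abduction and fact abduction). The empty link set is an
# illustrative assumption, not SOLAR's actual input.

KNOWN_LINKS = set()  # no object-level causal links are given initially

def abduce_chain(effect, cause, max_new_nodes=1):
    """Return candidate link sets explaining caused(effect, cause):
    either a direct link, or a chain through a hypothesized node."""
    hypotheses = [{(effect, cause)}]                  # direct-link abduction
    for i in range(max_new_nodes):
        x = f"X{i}"                                   # fresh unknown node
        hypotheses.append({(effect, x), (x, cause)})  # chain through X
    return [h - KNOWN_LINKS for h in hypotheses]

for h in abduce_chain("inc_sound", "keep_arm_close"):
    print(sorted(h))
```

The second candidate corresponds to the ∃X form of a knack: some unnamed intermediate skill X links keeping the arm close to the increased sound volume.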

While the above form of meta-level abduction can abduce missing links, the abduced rules must be interpreted in some external way. That is, it is not easy to find language concept equivalents in which to convey the meta-level abduced structures. As a potential solution to this hypothesis interpretation problem, Furukawa developed an extension of meta-level abduction named analogical rule abduction [ 10 , 28 ] by adding axioms for analogical inference. Analogical rule abduction makes it possible to exploit a vocabulary of analogical relations to provide an understandable interpretation of the introduced predicates and rules; analogical inferences across vocabularies can create appropriate cross-vocabulary language concepts. The axioms for analogical rule abduction are shown in Fig. 10. In those axioms, causalities in the source and target worlds of the analogy are represented by the predicates b_caused/3 and t_caused/3, respectively. Three kinds of connections in the target world, represented by the predicate t_connected/2, are considered: (1) observed connections (connections without assumption), (2) connections by abduction, and (3) connections by analogy. The proposed framework was applied to the problem of explaining the difficult cello playing techniques of spiccato and rapid cross-string bow movement.

Fig. 10: Axioms in analogical rule abduction
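The third kind of connection, justification by analogy, can be sketched as follows. The source world (throwing a ball) and the analogy mapping below are invented for illustration and are not the axioms of Fig. 10.

```python
# Sketch of the "connection by analogy" case in analogical rule abduction:
# a target-world link t_connected(a, b) may be justified because the
# corresponding link holds in the source world under an analogy mapping.
# Worlds and mapping are illustrative assumptions.

SOURCE_LINKS = {("ball_speed", "relaxed_wrist")}          # e.g. throwing a ball
ANALOGY = {"bow_speed": "ball_speed", "relaxed_bow_hand": "relaxed_wrist"}

OBSERVED = set()   # case (1): observed connections
ABDUCED = set()    # case (2): connections by abduction

def t_connected(a, b):
    """Target link holds if observed, abduced, or supported by analogy."""
    if (a, b) in OBSERVED or (a, b) in ABDUCED:
        return True
    return (ANALOGY.get(a), ANALOGY.get(b)) in SOURCE_LINKS  # case (3)

print(t_connected("bow_speed", "relaxed_bow_hand"))  # True, via analogy
```

The point of the extension is that the analogy mapping supplies readable vocabulary (here, the ball-throwing terms) for links that plain meta-level abduction could only introduce as anonymous nodes.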

In fact, in this nearly final work, Furukawa integrated all of his background on logic programming, inductive logic programming, abduction, analogical reasoning, and higher-order reasoning into one grand demonstration of how the knacks of performance could be identified and articulated as outputs of his skill science framework.

On the strength of this final research achievement, Furukawa was invited to the 25th International Conference on Inductive Logic Programming (ILP 2015), held in Kyoto, Japan, as a special invited speaker. It was his first return to an ILP conference since 2001. The editor of the special issue on ILP 2015 in the Machine Learning journal [ 26 ] drew the following conclusion:

Dr. Furukawa contributed to establish the field of ILP and will be missed by the entire ILP community. In ILP 2015 he was very pleased that ILP conferences were held for 25 years. We would like to dedicate this special issue to the memory of Dr. Furukawa.

Katsumi Inoue also gave a speech in memory of Koichi Furukawa at ILP 2017, held in Orléans, France, on September 5th, 2017, which was organized together with memorials for Alain Colmerauer and Alan Robinson given by Frédéric Benhamou and Stephen Muggleton, respectively.

This article has reviewed Professor Koichi Furukawa’s research activities from Fifth Generation Computing to Skill Science, focusing on the logic programming point of view. As his history shows, he contributed to research on logic programming over a long period. Even in his unpublished book, mentioned in “ Biography Summary ”, he was planning to include a small section explaining a major advantage of logic programming: its ability to support the explicit understanding and explanation necessary for handling human knowledge and skills. We believe that this attribute was one of the big reasons that Furukawa was such a strong advocate of logic programming.

Logic programming played a significant role in his research activity. He also reviewed his own research history in his final lecture at Keio University and in a special lecture at the 28th Annual Conference of the Japan Society of Artificial Intelligence, where he made the following comments (translated from Japanese):

Logic programming was always there in the background, even when my research changed greatly from the fifth generation computer systems project to skill science.

Divergent lines of research, from the fifth generation project and data mining to skill science, are converging within myself.

Research on meta-programming, inductive reasoning, and abductive reasoning in the fifth generation project era helped the research on skill science.

A significant personal attribute of Koichi Furukawa was that he was never afraid of tackling very difficult fundamental problems, and he always kept in mind how one should document research progress so that others could understand and exploit it. As a mentor, his style was to ask simple, deep questions about how an idea was working (or not), and to encourage his students and colleagues alike to broaden their perspective whenever they reached an apparent research challenge.

We conclude this essay by noting that we expect his research legacy will continue to lead to further development of logic programming and skill science. May he rest in peace.

His final lecture concert, which was itself a part of his research results on skill science, can be seen at https://www.youtube.com/watch?v=r1qoyQIYt6s . A full version is available at http://gc.sfc.keio.ac.jp/cgi/video_gc/view_video_gc.cgi?2007_gc00001+10+1 using a Flash player (start from about 38:00).

Clark, K.L., Gregory, S.: A relational language for parallel programming. In: Proceedings of the 1981 Conference on Functional Programming Languages and Computer Architecture, pp. 171–178. ACM, Boston (1981). https://doi.org/10.1145/800223.806776

Clark, K.L., Gregory, S.: PARLOG: a parallel logic programming language. Research Report DOC 83/5, Department of Computing, Imperial College, London (1983)

Fujita, H., Furukawa, K.: A self-applicable partial evaluator and its use in incremental compilation. New Gener. Comput. 6 (2–3), 91–118 (1988). https://doi.org/10.1007/BF03037133


Fujita, H., Okumura, A., Furukawa, K.: Partial evaluation of GHC programs based on UR-set with constraint solving. ICOT Technical Report, TR-344 (1988)

Furukawa, K. (ed.): Proceedings of the Eighth International Conference on Logic Programming. MIT Press, Cambridge (1991)

Furukawa, K.: Contribution in [49], pp. 60–65 (1993)

Furukawa, K. (ed.): Proceedings of the Keio International Workshop on Verbalization of Tacit Knowledge Using Inductive Inference. Keio University, Tokyo (1996)

Furukawa, K. (ed.): Proceedings of the International Symposium on Skill Science. Keio University, Tokyo (2007)

Furukawa, K.: On the completion of the most specific hypothesis computation in inverse entailment for mutual recursion. In: Proceedings of the First International Conference on Discovery Science. Lecture Notes in Computer Science, vol. 1532, pp. 315–325. Springer, Berlin (1998). https://doi.org/10.1007/3-540-49292-5_28

Furukawa, K., Kinjo, K., Ozaki, T., Haraguchi, M.: On skill acquisition support by analogical rule abduction. In: International Workshop on Information Search, Integration, and Personalization, Revised Selected Papers, Communications in Computer and Information Science, vol. 421, pp. 71–83. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-08732-0_6

Furukawa, K., Murakami, T., Ueno, K., Ozaki, T., Shimazu, K.: On a sufficient condition for the existence of most specific hypothesis in Progol. In: Proceedings of the 7th International Workshop on Inductive Logic Programming. Lecture Notes in Computer Science, vol. 1297, pp. 157–164. Springer, Berlin (1997). https://doi.org/10.1007/3540635149_44

Furukawa, K., Nakajima, R., Yonezawa, A., Goto, S., Aoyama, A.: Problem solving and inference mechanisms. In: Moto-oka, T. (ed.) Fifth Generation Computer Systems, pp. 131–138. Elsevier (1982). https://doi.org/10.1016/B978-0-444-86440-6.50008-6

Furukawa, K., Ozaki, T.: On the completion of inverse entailment for mutual recursion and its application to self recursion. In: Work-in-Progress Reports of the 10th International Conference on Inductive Logic Programming, CEUR Workshop Proceedings 35 (2000)

Furukawa, K., Ozaki, T., Ueno, K.: Inductive Logic Programming. Kyoritsu Shuppan, Tokyo (2001) (in Japanese)


Goebel, R., Furukawa, K., Poole, D.: Using definite clauses and integrity constraints as the basis for a theory formation approach to diagnostic reasoning. In: Proceedings of the Third International Conference on Logic Programming. Lecture Notes in Computer Science, vol. 225, pp. 211–222. Springer, Berlin (1986). https://doi.org/10.1007/3-540-16492-8_77

Hasegawa, R., Fujita, H., Fujita, M.: A parallel theorem prover in KL1 and its application to program synthesis. ICOT Technical Report, TR-588 (1990)

Hasegawa, R., Koshimura. M., Fujita, H.: MGTP: a parallel theorem prover based on lazy model generation. In: Proceedings of the 11th International Conference on Automated Deduction. Lecture Notes in Computer Science, vol. 607, pp. 776–780. Springer, Berlin (1992). https://doi.org/10.1007/3-540-55602-8_223

Igarashi, S., Ozaki, T., Furukawa, K.: Respiration reflecting musical expression: analysis of respiration during musical performance by inductive logic programming. In: Proceedings of the Second International Conference on Music and Artificial Intelligence. Lecture Notes in Computer Science, vol. 2445, pp. 94–106. Springer, Berlin (2002). https://doi.org/10.1007/3-540-45722-4_10

Inoue, K.: Linear resolution for consequence finding. Artif. Intell. 56 (2–3), 301–353 (1992)


Inoue, K.: Induction, abduction, and consequence-finding. In: Proceedings of the 11th International Conference on Inductive Logic Programming. Lecture Notes in Computer Science, vol. 2157, pp. 65–79. Springer, Berlin (2001). https://doi.org/10.1007/3-540-44797-0_6

Inoue, K.: Meta-level abduction. IfCoLog J. Log. Appl. 3 (1), 7–36 (2016)


Inoue, K., Doncescu, A., Nabeshima, H.: Completing causal networks by meta-level abduction. Mach. Learn. 91 (2), 239–277 (2013). https://doi.org/10.1007/s10994-013-5341-z

Inoue, K., Furukawa, K., Kobayashi, I., Nabeshima, H.: Discovering rules by meta-level abduction. In: Proceedings of the 19th International Conference on Inductive Logic Programming (ILP ’09). Lecture Notes in Computer Science, vol. 5989, pp. 49–64. Springer, Berlin (2009). https://doi.org/10.1007/978-3-642-13840-9_6

Inoue, K., Koshimura, M., Hasegawa, R.: Embedding negation as failure into a model generation theorem prover. In: Proceedings of the 11th International Conference on Automated Deduction (CADE-11). LNCS, vol. 607, pp. 400–415. Springer, Berlin (1992). https://doi.org/10.1007/3-540-55602-8

Inoue, K., Ohta, Y., Hasegawa, R., Nakashima, M.: Bottom-up abduction by model generation. In: Proceedings of the 13th International Joint Conference on Artificial Intelligence (IJCAI-93), pp. 102–108 (1993). http://ijcai.org/Proceedings/93-1/Papers/015.pdf

Inoue, K., Ohwada, H., Yamamoto, A.: Special issue on inductive logic programming. Mach. Learn. 106 (12), 1863–1865 (2017). https://doi.org/10.1007/s10994-017-5679-8

Kakas, A.C., Kowalski, R.A., Toni, F.: The role of abduction in logic programming. Handb. Log. Artif. Intell. Log. Program. 5 , 235–324 (1998)

Kinjo, K., Ozaki, T., Furukawa, K., Haraguchi, M.: On skill acquisition support by analogical rule abduction. Trans. Jpn. Soc. Artif. Intell. 29 (1), 188–193 (2014). https://doi.org/10.1527/tjsai.29.188 (in Japanese)


Kobayashi, I., Furukawa, K.: Modeling physical skill discovery and diagnosis by abduction. Trans. Jpn. Soc. Artif. Intell. 23 (3), 127–140 (2008). https://doi.org/10.11185/imt.3.385

Kobayashi, I., Furukawa, K., Ozaki, T., Imai, M.: A computational model for children’s language acquisition using inductive logic programming. In: Progress in Discovery Science, Final Report of the Japanese Discovery Science Project. Lecture Notes in Computer Science, vol. 2281, pp. 140–155. Springer, Berlin (2002). https://doi.org/10.1007/3-540-45884-0_7

Kowalski, R.A.: Contribution in [49], pp. 54–60 (1993)

Lloyd, J.W., Shepherdson, J.C.: Partial evaluation in logic programming. J. Log. Program. 11 (3–4), 217–242 (1991). https://doi.org/10.1016/0743-1066(91)90027-M

Manthey, R., Bry, F.: SATCHMO: a theorem prover implemented in Prolog. In: Proceedings of the 9th International Conference on Automated Deduction. Lecture Notes in Computer Science, vol. 310, pp. 415–434. Springer, Berlin (1988). https://doi.org/10.1007/BFb0012847

Masuda, T., Furukawa, K.: You Approach the Cello, and the Cello Approaches You—An Approach Based on Skill Science. Doremi Publishing, Tokyo (2016)

Miyachi, T., Kunifuji, S., Kitakami, H., Furukawa, K., Takeuchi, A., Yokota, H.: A knowledge assimilation method for logic databases. New Gener. Comput. 2 (4), 385–404 (1984). https://doi.org/10.1007/BF03037329

Muggleton, S.: Inductive logic programming. New Gener. Comput. 8 (4), 295–318 (1991). https://doi.org/10.1007/BF03037089

Muggleton, S.H. (ed.): Proceedings of the First International Workshop on Inductive Logic Programming (1991)

Muggleton, S.H. (ed.): Proceedings of the Second International Workshop on Inductive Logic Programming (1992)

Muggleton, S.: Inverse entailment and Progol. New Gener. Comput. 13 (3–4), 245–286 (1995). https://doi.org/10.1007/BF03037227

Muggleton, S., De Raedt, L.: Inductive logic programming: theory and methods. J. Log. Program. 19 (20), 629–679 (1994). https://doi.org/10.1016/0743-1066(94)90035-3

Muggleton, S., De Raedt, L., Poole, D., Bratko, I., Flach, P., Inoue, K., Srinivasan, A.: ILP turns 20—biography and future challenges. Mach. Learn. 86 (1), 3–23 (2012). https://doi.org/10.1007/s10994-011-5259-2

Nabeshima, H., Iwanuma, H., Inoue, K.: SOLAR: a consequence finding system for advanced reasoning. In: Proceedings of International Conference on Automated Reasoning with Analytic Tableaux and Related Methods. Lecture Notes in Computer Science, vol. 2796, pp. 257–263. Springer, Berlin (2003). https://doi.org/10.1007/978-3-540-45206-5_22

Nabeshima, H., Iwanuma, K., Inoue, K., Ray, O.: SOLAR: an automated deduction system for consequence finding. AI Commun. 23 (2–3), 183–203 (2010). https://doi.org/10.3233/AIC-2010-0465


Padmanabhuni, S., Goebel, R., Furukawa, K.: Curried least general generalization: a framework for higher order concept learning. In: Learning and Reasoning with Complex Representations, PRICAI’96 Workshops on Reasoning with Incomplete and Changing Information and on Inducing Complex Representations, Selected Papers. Lecture Notes in Computer Science, vol. 1359, pp. 45–60. Springer, Berlin (1998). https://doi.org/10.1007/3-540-64413-X_27

Poole, D., Goebel, R., Aleliunas, R.: Theorist: a logical reasoning system for defaults and diagnosis. In: Cercone, N., McCalla, G. (eds.) The Knowledge Frontier: Essays in the Representation of Knowledge, pp. 331–352. Springer, Berlin (1987) [Also Research Report CS-86-06, Department of Computer Science, University of Waterloo (1986)]

Ray, O., Kakas, A.C.: ProLogICA: a practical system for abductive logic programming. In: Proceedings of the 11th Workshop on Nonmonotonic Reasoning, pp. 304–314. Institut für Informatik, Technische Universität Clausthal, Clausthal-Zellerfeld (2006)

Shapiro, E.Y.: A subset of concurrent Prolog and its interpreter. ICOT Technical Report, TR-003 (1983)

Shapiro, E., Takeuchi, A.: Object oriented programming in concurrent Prolog. New Gener. Comput. 1 , 25–48 (1983). https://doi.org/10.1007/BF03037020

Shapiro, E., Warren, D.H.D. (eds.): Launching the new era—personal perspectives of Fifth Generation Computer Systems project. Commun. ACM 36 (3), 47–101 (1993)

Shimazu, K., Furukawa. K.: DAISY, an RER model based interface for RDB to ILP. In: Proceedings of the 22nd International Conference on Conceptual Modeling. Lecture Notes in Computer Science, vol. 2813, pp. 390–404. Springer, Berlin (2003). https://doi.org/10.1007/978-3-540-39648-2_31

Ueda, K.: Guarded horn clauses. ICOT Technical Report, TR-103 (1985)

Ueno, K., Furukawa, K., Bain, M.: Motor skill as dynamic constraint satisfaction. Linköp. Electron. Artic. Comput. Inf. Sci. 5 (2000), nr36 (2000). http://www.ep.liu.se/ea/cis/2000/036/ . Accessed 10 Apr 2019

Yamamoto, A.: Which hypotheses can be found with inverse entailment? In: Proceedings of the 7th International Workshop on Inductive Logic Programming. Lecture Notes in Computer Science, vol. 1297, pp. 296–308. Springer, Berlin (1997). https://doi.org/10.1007/3540635149_58


Author information

Authors and Affiliations

College of Humanities and Sciences, Nihon University, Tokyo, Japan

Tomonobu Ozaki

Department of Computing Science, University of Alberta, Edmonton, Canada

Randy Goebel

National Institute of Informatics, Tokyo, Japan

Katsumi Inoue

Department of Informatics, SOKENDAI (The Graduate University for Advanced Studies), Tokyo, Japan

Department of Computer Science, School of Computing, Tokyo Institute of Technology, Tokyo, Japan


Corresponding author

Correspondence to Tomonobu Ozaki .


About this article

Ozaki, T., Goebel, R. & Inoue, K. From Fifth Generation Computing to Skill Science. New Gener. Comput. 37 , 141–158 (2019). https://doi.org/10.1007/s00354-019-00058-y


Received : 07 February 2019

Accepted : 10 April 2019

Published : 25 April 2019

Issue Date : 01 April 2019



Keywords: Institute for New Generation Computer Technology · Logic programming · Inductive logic programming · Skill science


Unit 7. Evolution of computers

Topic A: Computer generations


Basic Terms


Vacuum tube – an electronic device that controls the flow of electrons in a vacuum. It is used as a switch, amplifier, or display element in many older-model radios, televisions, computers, etc.


Transistor – an electronic component that can be used as an amplifier or as a switch. It is used to control the flow of electricity in radios, televisions, computers, etc.


Integrated circuit (IC) – a small electronic circuit printed on a chip (usually made of silicon) that contains many circuit elements of its own (e.g. transistors, diodes, resistors, etc.).


Microprocessor – an electronic component on an integrated circuit that contains a computer’s central processing unit (CPU) and other associated circuits.


CPU (central processing unit) – often referred to as the brain or engine of a computer; it is where most of the processing and operations take place (the CPU is part of a microprocessor).


Magnetic drum – a cylinder coated with magnetic material, on which data and programs can be stored.

Magnetic core – uses arrays of small rings of magnetized material called cores to store information.


Machine language – a low-level programming language comprised of a collection of binary digits (ones and zeros) that the computer can read and understand.

Assembly language – like machine language, a low-level language that a computer can understand, except that assembly language uses abbreviated words (e.g. ADD, SUB, DIV) in place of numbers (0s and 1s).
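The correspondence between the two languages can be illustrated with a toy assembler: each mnemonic is a direct shorthand for a binary opcode. The 4-bit opcode table below is invented for an imaginary machine, not taken from any real instruction set.

```python
# Toy assembler: translate a mnemonic plus operand into a binary machine
# word. Opcodes are invented for a hypothetical 8-bit machine.

OPCODES = {"ADD": "0001", "SUB": "0010", "DIV": "0011"}

def assemble(mnemonic, operand):
    """Translate e.g. 'ADD 5' into opcode bits followed by a 4-bit operand."""
    return OPCODES[mnemonic] + format(operand, "04b")

print(assemble("ADD", 5))  # 00010101
```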


Artificial intelligence (AI) – an area of computer science that deals with the simulation and creation of intelligent machines or intelligent behavior in computers (machines that think, learn, work, and react like humans).

First Generation of Computers

Classification of generations of computers.

The evolution of computer technology is often divided into five generations.

The main characteristics of the first generation of computers (1940s-1950s):


  • Main memory – magnetic drums and magnetic tapes
  • Programming language – machine language


  • Speed and size – very slow and very large in size (often taking up an entire room).
  • Input/output devices – punched cards and paper tape.
  • Examples – ENIAC, UNIVAC 1, IBM 650, IBM 701, etc.
  • Quantity – about 100 different vacuum tube computers were produced between 1942 and 1963.

Second Generation of Computers

The main characteristics of the second generation of computers (1950s-1960s):

  • Memory – magnetic core and magnetic tape / disk


  • Power and size – low power consumption, generated less heat, and smaller in size (in comparison with the first generation computers).
  • Speed – improvement of speed and reliability (in comparison with the first generation computers).
  • Input/output devices – punched cards and magnetic tape.
  • Examples – IBM 1401, IBM 7090 and 7094, UNIVAC 1107, etc.

Third Generation of Computers

The main characteristics of the third generation of computers (1960s-1970s):


  • Memory – large magnetic core, magnetic tape / disk


  • Size – smaller, cheaper, and more efficient than second generation computers (they were called minicomputers).
  • Speed – improvement of speed and reliability (in comparison with the second generation computers).


  • Examples – IBM 360, IBM 370, PDP-11, UNIVAC 1108, etc.

Fourth Generation of Computers

The main characteristics of the fourth generation of computers (1970s-present):


  • VLSI – thousands of transistors on a single microchip.
  • RAM (random-access memory) – a type of data storage (memory element) used in computers that temporarily stores programs and data (volatile: its contents are lost when the computer is turned off).


  • A mix of both third- and fourth-generation languages
  • Size – smaller, cheaper and more efficient than third generation computers.
  • Speed – improvement of speed, accuracy, and reliability (in comparison with the third generation computers).


  • Network – a group of two or more computer systems linked together.
  • Examples – IBM PC, STAR 1000, APPLE II, Apple Macintosh, etc.


Fifth Generation of Computers

The main characteristics of the fifth generation of computers (the present and the future):


  • ULSI – millions of transistors on a single microchip
  • Parallel processing method – use two or more microprocessors to run tasks simultaneously.
  • Language – understand natural language (human language).
  • Power – consume less power and generate less heat.
  • Speed – remarkable improvement of speed, accuracy and reliability (in comparison with the fourth generation computers).
  • Size – portable and small in size, and have a huge storage capacity.


  • Example – desktops, laptops, tablets, smartphones, etc.
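The parallel processing method listed above can be illustrated with a short, generic Python sketch that spreads independent CPU-bound tasks across processor cores; it is not tied to any particular computer generation or system.

```python
# Parallel processing sketch: independent CPU-bound tasks are distributed
# across cores with a process pool, then the results are collected.

from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    """Count primes below `limit` by trial division (CPU-bound work)."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Each chunk runs on its own core; results are combined at the end.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes, [10_000, 20_000, 30_000]))
    print(results)
```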


The computer – this amazing technology went from a government/business-only technology to being everywhere, from people’s homes and workplaces to people’s pockets, in less than 100 years.


Key Concepts of Computer Studies Copyright © 2020 by Meizhong Wang is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.

Scholarly Community Encyclopedia
The Fifth Generation Computer Systems (FGCS) initiative was begun in 1982 by Japan's Ministry of International Trade and Industry (MITI) to create computers using massively parallel computing and logic programming. It was to be the result of a government/industry research project in Japan during the 1980s, and it aimed to create an "epoch-making computer" with supercomputer-like performance and to provide a platform for future developments in artificial intelligence. (There was also an unrelated Russian project named the fifth-generation computer; see Kronos (computer).) Ehud Shapiro, in his "Trip Report" paper (which focused the FGCS project on concurrent logic programming as the software foundation for the project), captured the rationale and motivations driving the project.

The term "fifth generation" was intended to convey the system as being advanced. In the history of computing hardware, computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.

The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new "sixth generation" project would begin. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.

1. Information

From the late 1960s until the early 1970s, there was much talk about "generations" of computer hardware, usually "three generations".

  • First generation: Thermionic vacuum tubes. Mid-1940s. IBM pioneered the arrangement of vacuum tubes in pluggable modules. The IBM 650 was a first-generation computer.
  • Second generation: Transistors. 1956. The era of miniaturization begins. Transistors are much smaller than vacuum tubes, draw less power, and generate less heat. Discrete transistors are soldered to circuit boards, with interconnections accomplished by stencil-screened conductive patterns on the reverse side. The IBM 7090 was a second-generation computer.
  • Third generation: Integrated circuits (silicon chips containing multiple transistors). 1964. A pioneering example is the ACPX module used in the IBM 360/91, which, by stacking layers of silicon over a ceramic substrate, accommodated over 20 transistors per chip; the chips could be packed together onto a circuit board to achieve unprecedented logic densities. The IBM 360/91 was a hybrid second- and third-generation computer.

Omitted from this taxonomy is the "zeroth-generation" computer based on metal gears (such as the IBM 407) or mechanical relays (such as the Mark I), and the post-third-generation computers based on Very Large Scale Integrated (VLSI) circuits.

There was also a parallel set of generations for software:

  • First generation: Machine language.
  • Second generation: Low-level programming languages such as Assembly language.
  • Third generation: Structured high-level programming languages such as C, COBOL and FORTRAN.
  • Fourth generation: "Non-procedural" high-level programming languages (such as object-oriented languages) [ 1 ]

Throughout these multiple generations up to the 1970s, Japan built computers following U.S. and British leads. In the mid-1970s, the Ministry of International Trade and Industry stopped following western leads and started looking into the future of computing on a small scale. They asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered a three-year contract to carry out more in-depth studies along with industry and academia. It was during this period that the term "fifth-generation computer" started to be used.

Prior to the 1970s, MITI guidance had successes such as an improved steel industry, the creation of the oil supertanker, the automotive industry, consumer electronics, and computer memory. MITI decided that the future was going to be information technology. However, the Japanese language, particularly in its written form, presented and still presents obstacles for computers. [ 2 ] As a result of these hurdles, MITI held a conference to seek assistance from experts.

The primary fields for investigation from this initial project were:

  • Inference computer technologies for knowledge processing
  • Computer technologies to process large-scale data bases and knowledge bases
  • High performance workstations
  • Distributed functional computer technologies
  • Super-computers for scientific calculation

The project imagined an "epoch-making computer" with supercomputer-like performance using massively parallel computing/processing. The aim was to build parallel computers for artificial intelligence applications using concurrent logic programming. The FGCS project and its vast findings contributed greatly to the development of the concurrent logic programming field.

The target defined by the FGCS project was to develop "Knowledge Information Processing systems" (roughly meaning, applied artificial intelligence). The chosen tool to implement this goal was logic programming. The logic programming approach was characterized by Maarten van Emden, one of its founders, as: [3]

  • The use of logic to express information in a computer.
  • The use of logic to present problems to a computer.
  • The use of logical inference to solve these problems.

More technically, it can be summed up in two equations:

  • Program = set of axioms.
  • Computation = proof of a statement from the axioms.

The axioms typically used are universal axioms of a restricted form, called Horn clauses or definite clauses. The statement proved in a computation is an existential statement. The proof is constructive and provides values for the existentially quantified variables: these values constitute the output of the computation.
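The two equations can be made concrete with a tiny forward-chaining prover over Horn clauses. This is a Python sketch using propositional atoms as plain strings, with invented family facts for illustration; real FGCS languages such as KL1 worked with full first-order clauses and unification:

```python
# Each Horn clause is (head, [body atoms]); facts are clauses with an empty body.
# The atom names are hypothetical examples.
axioms = [
    ("parent(tom,bob)", []),
    ("parent(bob,ann)", []),
    ("grandparent(tom,ann)", ["parent(tom,bob)", "parent(bob,ann)"]),
]

def prove(goal, axioms):
    """Forward-chain: derive atoms from the axioms until the goal
    appears or nothing new can be derived."""
    known = set()
    changed = True
    while changed:
        changed = False
        for head, body in axioms:
            if head not in known and all(atom in known for atom in body):
                known.add(head)   # every body atom is proved, so the head is too
                changed = True
    return goal in known          # the computation *is* a proof search

print(prove("grandparent(tom,ann)", axioms))  # True
print(prove("grandparent(ann,tom)", axioms))  # False
```

Here the program literally is the set of axioms, and running the computation amounts to constructing a proof of the goal from them.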

Logic programming was seen as something that could unify various branches of computer science (software engineering, databases, computer architecture, and artificial intelligence). It seemed that logic programming was a key missing link between knowledge engineering and parallel computer architectures.

The project imagined a parallel processing computer running on top of large databases (as opposed to a traditional filesystem) using a logic programming language to define and access the data. They envisioned building a prototype machine with performance between 100M and 1G LIPS, where one LIPS is one Logical Inference Per Second. At the time, typical workstation machines were capable of about 100k LIPS. They proposed to build this machine over a ten-year period: 3 years for initial R&D, 4 years for building various subsystems, and a final 3 years to complete a working prototype system. In 1982 the government decided to go ahead with the project, and established the Institute for New Generation Computer Technology (ICOT) through joint investment with various Japanese computer companies.
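As a quick sanity check on those figures, the prototype target implies a three-to-four order-of-magnitude speedup over a contemporary workstation:

```python
workstation_lips = 100_000              # ~100k LIPS: a typical early-1980s workstation
target_low, target_high = 10**8, 10**9  # the 100M-1G LIPS prototype goal

print(target_low // workstation_lips)   # 1000  (three orders of magnitude)
print(target_high // workstation_lips)  # 10000 (four orders of magnitude)
```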

In the same year, during a visit to ICOT, Ehud Shapiro invented Concurrent Prolog, a novel programming language that integrated logic programming and concurrent programming. Concurrent Prolog is a logic programming language designed for concurrent programming and parallel execution. It is a process-oriented language, which embodies dataflow synchronization and guarded-command indeterminacy as its basic control mechanisms. Shapiro described the language in ICOT Technical Report 003, [4] which presented a Concurrent Prolog interpreter written in Prolog. Shapiro's work on Concurrent Prolog inspired a change in the direction of the FGCS from focusing on a parallel implementation of Prolog to focusing on concurrent logic programming as the software foundation for the project. It also inspired the concurrent logic programming language Guarded Horn Clauses (GHC) by Ueda, which was the basis of KL1, the programming language that was ultimately designed and implemented by the FGCS project as its core programming language.

1.1. Implementation

The Fifth-Generation project's belief that parallel computing was the future of all performance gains produced a wave of apprehension in the computer field. After having influenced the consumer-electronics field during the 1970s and the automotive world during the 1980s, Japan had developed a strong reputation. Soon parallel projects were set up: in the US, the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC); in the UK, Alvey; and in Europe, the European Strategic Program on Research in Information Technology (ESPRIT), as well as the European Computer-Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

Five running Parallel Inference Machines (PIM) were eventually produced: PIM/m, PIM/p, PIM/i, PIM/k, PIM/c. The project also produced applications to run on these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, and the automated theorem prover MGTP, as well as applications to bioinformatics.

1.2. Failure

The FGCS Project did not meet with commercial success, for reasons similar to those of the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines). The project did produce a new generation of promising Japanese researchers. But after the FGCS Project, MITI stopped funding large-scale computer research projects, and the research momentum developed by the FGCS Project dissipated. However, MITI/ICOT embarked on a Sixth Generation Project in the 1990s.

A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem solving language for AI applications. This never happened cleanly; a number of languages were developed, all with their own limitations. In particular, the committed choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages. [ 5 ]

Another problem was that existing CPU performance quickly pushed through the barriers that experts perceived in the 1980s, and the value of parallel computing dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially.

The project also failed to maintain continuous growth. During its lifespan, GUIs became mainstream in computers; the internet enabled locally stored databases to become distributed; and even simple research projects provided better real-world results in data mining. Moreover, the project found that the promises of logic programming were largely negated by the use of committed choice.

At the end of the ten-year period, the project had spent over ¥50 billion (about US$400 million at 1992 exchange rates) and was terminated without having met its goals. The workstations had no appeal in a market where general-purpose systems could now replace and outperform them. This parallels the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary. [6]

1.3. Ahead of Its Time

Although the project produced little direct success, many of the approaches seen in it, such as logic programming distributed over massive knowledge bases, are now being re-interpreted in current technologies. For example, the Web Ontology Language (OWL) employs several layers of logic-based knowledge representation systems. It appears, however, that these new technologies reinvented rather than leveraged approaches investigated under the Fifth-Generation initiative.

In the early 21st century, many flavors of parallel computing began to proliferate, including multi-core architectures at the low-end and massively parallel processing at the high end. When clock speeds of CPUs began to move into the 3–5 GHz range, CPU power dissipation and other problems became more important. The ability of industry to produce ever-faster single CPU systems (linked to Moore's Law about the periodic doubling of transistor counts) began to be threatened. Ordinary consumer machines and game consoles began to have parallel processors like the Intel Core, AMD K10, and Cell. Graphics card companies like Nvidia and AMD began introducing large parallel systems like CUDA and OpenCL. Again, however, it is not clear that these developments were facilitated in any significant way by the Fifth-Generation project.

In summary, it is argued that the Fifth-Generation project was revolutionary, yet it still had its failures. [7]

  • "Roger Clarke's Software Generations". http://www.rogerclarke.com/SOS/SwareGenns.html. 
  • J. Marshall Unger, The Fifth Generation Fallacy (New York: Oxford University Press, 1987)
  • Van Emden, Maarten H., and Robert A. Kowalski. "The semantics of predicate logic as a programming language." Journal of the ACM 23.4 (1976): 733-742. https://www.researchgate.net/profile/Maarten-Emden/publication/234779982_The_Semantics_of_Predicate_Logic_as_a_Programming_Language/links/0c96052857d2634345000000/The-Semantics-of-Predicate-Logic-as-a-Programming-Language.pdf
  • Shapiro E. A subset of Concurrent Prolog and its interpreter, ICOT Technical Report TR-003, Institute for New Generation Computer Technology, Tokyo, 1983. Also in Concurrent Prolog: Collected Papers, E. Shapiro (ed.), MIT Press, 1987, Chapter 2.
  • Carl Hewitt. Inconsistency Robustness in Logic Programming ArXiv 2009. https://arxiv.org/abs/0904.3036
  • Hendler, James (1 March 2008). "Avoiding Another AI Winter". IEEE Intelligent Systems 23 (2): 2–4. doi:10.1109/MIS.2008.20. http://csdl2.computer.org/comp/mags/ex/2008/02/mex2008020002.pdf. 
  • Odagiri, Hiroyuki; Nakamura, Yoshiaki; Shibuya, Minorul (1997). "Research consortia as a vehicle for basic research: The case of a fifth generation computer project in Japan" (in en). Research Policy 26 (2): 191–207. doi:10.1016/S0048-7333(97)00008-5. https://linkinghub.elsevier.com/retrieve/pii/S0048733397000085. 

Essay on Generation of Computer

Students are often asked to write an essay on Generation of Computer in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on Generation of Computer

Introduction.

Computers are essential parts of our lives. They have evolved over time, leading to five generations. Each generation is defined by a significant technological development.

First Generation (1940-1956)

The first generation of computers used vacuum tubes for circuitry and magnetic drums for memory. They were large, slow, expensive, and produced a lot of heat.

Second Generation (1956-1963)

Transistors replaced vacuum tubes in the second generation. Computers became smaller, faster, cheaper, more energy-efficient, and reliable.

Third Generation (1964-1971)

The third generation introduced integrated circuits, combining many transistors onto a single chip. Computers became even smaller, faster, and more reliable.

Fourth Generation (1971-Present)

The fourth generation brought microprocessors, with thousands of integrated circuits on a single chip. This led to personal computers, making computers accessible to the public.

Fifth Generation (Present and Beyond)

The fifth generation focuses on artificial intelligence and aims to create computers that can process natural language and have capabilities of learning and self-organization.

250 Words Essay on Generation of Computer

The evolution of computers has been a transformative journey. From the rudimentary first generation to the sophisticated fifth generation, computers have drastically changed, shaping society and technology along the way.

First Generation (1940-1956): Vacuum Tubes

The first generation of computers used vacuum tubes for circuitry and magnetic drums for memory. These machines were enormous, consumed massive power, and required constant cooling. However, they laid the foundation for modern computing, introducing binary code and stored programs.

Second Generation (1956-1963): Transistors

Transistors replaced vacuum tubes, leading to smaller, faster, cheaper, and more reliable computers. This generation also introduced the concept of a programming language, allowing for more complex tasks.

Third Generation (1964-1971): Integrated Circuits

The third generation saw the advent of integrated circuits, miniaturizing transistors by embedding them into silicon chips. This led to further reduction in size and cost while increasing speed and reliability. High-level programming languages like FORTRAN and COBOL were born.

Fourth Generation (1971-Present): Microprocessors

The fourth generation introduced microprocessors, integrating thousands of integrated circuits into a single chip. This enabled the development of personal computers and the internet, revolutionizing the way we interact with technology.

Fifth Generation (Present and Beyond): Artificial Intelligence

The fifth generation, still in its infancy, aims to create computers that can process natural language and have artificial intelligence capabilities. The goal is to develop machines that can understand, learn, and respond like a human.

The generational advancement of computers is a testament to human ingenuity and innovation. Each generation has brought us closer to creating machines that not only augment human capability but also possess the potential to mimic human intelligence.

500 Words Essay on Generation of Computer

The evolution of computers has been a journey marked by rapid progression and revolutionary breakthroughs. From the rudimentary first generation to the advanced fifth generation, each phase of computer development has significantly impacted various facets of society, economy, and science.

The First Generation (1940-1956): Vacuum Tubes

The first generation of computers was characterized by the use of vacuum tubes. These machines were enormous, occupying entire rooms, and were prone to overheating. Their programming was done in machine language, a low-level language. Despite their size and inefficiency, these computers laid the groundwork for modern computing and marked the beginning of the digital age.

The Second Generation (1956-1963): Transistors

Transistors replaced vacuum tubes in the second generation of computers, resulting in smaller, faster, and more reliable machines. This generation also saw the introduction of assembly language, which was easier to understand and use than machine language. Computers became more accessible, leading to increased commercial use.

The Third Generation (1964-1971): Integrated Circuits

The third generation of computers introduced integrated circuits (ICs), which further miniaturized computer design by replacing transistors. ICs led to the development of semiconductors, the backbone of our current digital world. High-level programming languages like FORTRAN and COBOL were introduced, making computers even more user-friendly.

The Fourth Generation (1971-Present): Microprocessors

The fourth generation of computers heralded the era of microprocessors. A single chip now contained thousands of ICs, leading to the development of personal computers. The advent of graphical user interfaces, the mouse, and the internet revolutionized the way we interact with computers. This generation also saw the rise of object-oriented programming, which has become the standard in software development.

The Fifth Generation (Present and Beyond): Artificial Intelligence

The fifth and current generation of computers is characterized by artificial intelligence (AI) and quantum computing. AI enables machines to learn and make decisions, while quantum computing promises to solve complex problems exponentially faster than classical computers. This generation aims to create computers that can understand, learn, and respond to natural language, a significant leap in human-computer interaction.

The journey of computer evolution is a testament to human ingenuity and innovation. From the colossal vacuum tube machines of the first generation to the AI-driven systems of today, each generation of computers has brought us closer to creating machines that can match, and perhaps one day surpass, human cognitive abilities. As we stand on the brink of a new era in computing, the possibilities are as exciting as they are limitless.

That’s it! I hope the essay helped you.


Happy studying!


12 Characteristics and Features of Fifth Generation Computer System in Points

The computer is an electronic device that performs arithmetic and logical operations with incredible speed and outstanding accuracy.

We have witnessed the evolution of computer systems from the 1st generation to the 5th generation.

The development and enhancement of any technology is ever-changing; therefore, we continue to see new inventions and discoveries in the computer industry.

The invention of the computer is considered the greatest invention of all time.

Charles Babbage was a mathematician, philosopher, and inventor, and by profession he was a mechanical engineer.

This development cycle can be witnessed in the computer generations that we will discuss further in this article.

What are Fifth Generation Computer Systems With Examples?

Fifth-generation computers, which primarily began to evolve around 1990, are the current generation.

These computers use artificial intelligence to perform and execute complex tasks with a high level of intelligence and accuracy.


They are more powerful, more compact, and faster than the computers of earlier generations. They primarily use superconductor technology in their processors.

Modern computer processors are manufactured with multiple cores: a dual-core processor contains two separate processing cores, a triple-core processor three, and a quad-core processor four.
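How many cores the operating system actually exposes can be checked from software; in Python, for instance (a minimal sketch):

```python
import os

# Report how many logical cores the operating system exposes; a dual-core
# machine typically reports 2 (or 4 when each core runs two hardware threads).
print(f"Logical CPUs visible to the OS: {os.cpu_count()}")
```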

These modern fifth-generation computers can recognize voice commands and, using advanced algorithms, perform tasks and present results according to the needs of their users.

They have enhanced the learning process, and they are programmed for self-learning.

They can accept requests and commands via a voice-recognition system and act accordingly.

They are proving useful for learning foreign languages with great efficiency and speed.

They are well-equipped, proactive machines with decision-making power, able to make appropriate decisions for a given condition.

With the help of parallel processing, they also possess a capacity for reasoning similar to that of human beings.

With these excellent characteristics and features, fifth-generation computers can communicate with humans through signs, language, speech, and writing.

  • Alexa – a machine that accepts voice commands and derives results according to those commands.
  • Google Assistant.

The technology giant Google has started showing quality results in Google SERPs (Search Engine Result Pages) based on voice commands.

The number of voice searches is increasing daily thanks to this excellent, easy-to-use feature.

Other examples include:

  • PARAM 10000
  • Intel iPSC/1

Examples of Fifth Generation Computer Systems


  • Personal Computers {PCs}.
  • Workstations.
  • Chrome Book.
  • Ultra Book.
  • Supercomputers.
  • Smart Watches.
  • Smart Phones.

Characteristics of Fifth Generation of Computers

  • ULSI (Ultra Large Scale Integration) technology is used in fifth-generation computers.
  • They use parallel processing.
  • They primarily use superconductor technology in their processors.
  • Artificial intelligence is used in these computers and is their most recognized and widely utilized characteristic.
  • Fifth-generation computers use rich GUIs (Graphical User Interfaces) in operating systems, application software, and multimedia, making systems more user-friendly.
  • These computers are more reliable and portable than those of other generations.
  • They are relatively cheaper than their counterparts.
  • They are commercial products.
  • High-level languages can be easily used with them.
  • Advanced input and output devices can be used with them.
  • They play a vital role in the development of the internet and the WWW (World Wide Web).


Features of Fifth Generation Computer in Points

  • Most top-notch products released in the market incorporate artificial intelligence.
  • Today's software and applications take advantage of parallel processing and superconductor technology.
  • High-quality, advanced multimedia features are used.
  • The main features of a fifth-generation computer are its compact size and speed.
  • They can learn on their own and decide according to their intelligence.
  • Because they are small, they can be moved from one location to another; hence they are highly portable.
  • The product designs are easy to use and implement.
  • Incredible data and information storage devices.
  • Chips with built-in magnetic storage.
  • More powerful computers than ever before.
  • Modern language processors and converters.
  • They are used in quantum computing and nanotechnology.
  • Robotics and neural networks.


Benefits and Advantages of Fifth Generation Computer in Points

  • They are high-speed machines with fantastic storage capacity.
  • AI (artificial intelligence) enables them to communicate with humans via sign language, images, graphs, etc.
  • They are relatively cheaper than computers of other generations.
  • They use parallel processing and superconductor technology for better, enhanced performance.
  • They can make their own decisions and reason much as humans do.
  • ULSI (Ultra Large Scale Integration) technology is used.
  • They are reliable and efficient.
  • They are low-maintenance.
  • They are compact and need less space for installation.
  • They are used in research and weather forecasting.
  • They are highly portable.

Drawbacks and Disadvantages of Fifth Generation Computers with Examples

  • They can easily be used for spying.
  • Their waste negatively affects our environment.
  • People become over-dependent on such machines.
  • Health issues have been observed with the extensive use of such computers.
  • Hacking and data-violation cases occur, and misuse of sensitive information has been registered several times.
  • They are extensively used in the manufacturing industries; therefore, the fear of job losses is always on the cards.
Applications of Fifth Generation Computers

  • Applications incorporating artificial intelligence.
  • Interactive software and applications.
  • Forecasting weather and predicting earthquakes and volcanic eruptions.
  • Video games.
  • Voice-recognition software.

What Is the Fifth Generation Computer Based On?

The fifth-generation computers are based on Artificial Intelligence with superconductor technology and parallel processing.

They are used extensively because of their excellent features; as a result, the importance of fifth-generation computers has increased dramatically.

Get In Touch

I have also written and compiled some articles on computers and telecommunications; please go through them.

I hope you enjoy reading them.

Don’t hesitate to get in touch with me, and if you need to add, remove or update anything from the article, please let me know in the comment section or via email.

I will be more than happy to update the article. I am always ready to correct myself.

Please share this article with your friends and colleagues; this motivates me to write more related topics.

Related Posts:

Essay on Computer

500+ words essay on computer.

A computer is an electronic device that performs complex calculations. It is a wonderful product of modern technology. Nowadays, computers have become a significant part of our life. Whether it is in the sector of education or health, computers are used everywhere. Our progress is entirely dependent on computers powered by the latest technology. This ‘Essay on Computer’ also covers the history of computers as well as their uses in different sectors. By going through the ‘Computer’ Essay in English, students will get an idea of writing a good Essay on Computers. After practising this essay, they will be able to write essays on other topics related to computers, such as the ‘Uses of Computer’ Essay.

The invention of the computer has made our lives easier. The device is used for many purposes, such as securing information, messages, data processing, software programming, calculations, etc. A desktop computer has a CPU, UPS, monitor, keyboard, and mouse to work. A laptop is a modern form of computer in which all the components are inbuilt into a single device. Earlier, computers were not so fast and powerful. After thorough and meticulous research and work by various scientists, modern-day computers have come up.

History of Computers

The history of computer development is often used to reference the different generations of computing devices. Each generation of computers is characterised by a major technological development that fundamentally changed the way computers work. Most of the major developments from the 1940s to the present day have resulted in increasingly smaller, more powerful, faster, cheaper and more efficient computing devices.

The evolution of computer technology is often divided into five generations. These five generations of computers are as follows:

Uses of Computers

Computers are used in various fields. Some of the applications are

1. Business

A computer can perform a high-speed calculation more efficiently and accurately, due to which it is used in all business organisations. In business, computers are used for:

  • Payroll calculations
  • Sales analysis
  • Maintenance of stocks
  • Managing employee databases

2. Education

Computers are very useful in the education system. Especially now, during the COVID time, online education has become the need of the hour. There are various ways through which an institution can use computers to educate students.

3. Health Care

Computers have become an important part of hospitals, labs and dispensaries. They are used for the scanning and diagnosis of different diseases. Computerised machines do scans, which include ECG, EEG, ultrasound and CT Scan, etc. Moreover, they are used in hospitals to keep records of patients and medicines.

4. Defence

Computers are largely used in defence. The military employs computerised control systems, modern tanks, missiles, weapons, etc. It uses computers for communication, operation and planning, smart weapons, etc.

5. Government

Computers play an important role in government services. Some major fields are:

  • Computation of male/female ratio
  • Computerisation of PAN card
  • Income Tax Department
  • Weather forecasting
  • Computerisation of voters’ lists
  • Sales Tax Department

6. Communication

Communication is a way to convey an idea, a message, a picture, a speech or any form of text, audio or video clip. Computers are capable of doing so. Through computers, we can send an email, chat with each other, do video conferencing, etc.

7. Banking

Nowadays, to a large extent, banking is dependent on computers. Banks provide an online accounting facility, which includes checking current balances, making deposits and overdrafts, checking interest charges, shares, trustee records, etc. ATMs, which are fully automated, use computers, making it easier for customers to deal with banking transactions.

8. Marketing

In marketing, computers are mainly used for advertising and home shopping.

Similarly, there are various other applications of computers in other fields, such as insurance, engineering, design, etc.

Students can practise more essays on different topics to improve their writing skills. Keep learning and stay tuned with BYJU’S for the latest update on CBSE/ICSE/State Board/Competitive Exams. Also, download the BYJU’S App for interactive study videos.

Frequently asked Questions on Computer Essay

How has the invention of the computer been useful to students?

Easy and ready access to information has been possible (internet) with the invention of the computer.

How to start writing an essay on a computer?

Before writing an essay, first plan the topics, sub-topics and main points which are going to be included in the body of the essay. Then, structure the content accordingly and check for information and examples.

How to use the computer to browse for information on essays?

Various search engines are available, like Google, where plenty of information can be obtained regarding essays and essay structures.


'Fifth Generation' Became Japan's Lost Generation

By Andrew Pollack

  • June 5, 1992

A bold 10-year effort by Japan to seize the lead in computer technology is fizzling to a close, having failed to meet many of its ambitious goals or to produce technology that Japan's computer industry wanted.

After spending more than $400 million on its widely heralded Fifth Generation computer project, the Japanese Government said this week that it was willing to give away the software developed by the project to anyone who wanted it, even foreigners.

Machines That Would Think

That attitude is a sharp contrast to the project's inception, when it spread fear in the United States that the Japanese were going to leapfrog the American computer industry. In response, a group of American companies formed the Microelectronics and Computer Technology Corporation, a consortium in Austin, Tex., to cooperate on research. And the Defense Department, in part to meet the Japanese challenge, began a huge long-term program to develop intelligent systems, including tanks that could navigate on their own.

Now, with a debate in the United States about whether the Government should help American companies compete, the Fifth Generation venture is a reminder that even Japan's highly regarded Ministry of International Trade and Industry can make mistakes in predicting which technologies will be important in the future.

The problem for Japan is that the computer industry shifted so rapidly that the technological path the Fifth Generation took -- which seemed a wise choice in 1982 -- turned out to be at odds with the computer industry's direction by 1992.

In a sense, Japan's ability to stay the course in pursuit of a long-term payoff -- usually considered one of the country's strongest assets -- turned into a liability. A similar challenge for Japan may now be arising in high-definition television. Japan's HDTV system, which has been in development for two decades, is now coming to market just as some engineers believe that a major shift to digital television technology will make the Japanese analog approach obsolete.

Yet interest in joint government-industry projects continues in Japan. Another computer technology program, called the Real World Computing project, is getting under way. Executives here said that such programs lead to valuable results even if no useful products emerge from the pipeline.

A benefit of the Fifth Generation project, for instance, is that it trained hundreds, perhaps thousands, of engineers in advanced computer science. It is this training, rather than any particular piece of software, that will be the project's legacy, as the engineers proceed to apply their skills at their respective Japanese companies.

"I think the side effect is the main effect," said Toshio Yokoi, general manager of the Japan Electronic Dictionary Research Institute, another project backed by the Japanese Government.

An American executive agreed. "The indirect impact of the Fifth Generation project should not be understated," said Mark Eaton, vice president of strategy and development at the American computer consortium. "As usual with a MITI project, it's the bandwagon effect you look at, as well as the project itself." Mr. Eaton monitored the project in a previous job at the United States Consulate in Osaka.

Post-Mortem Conference

The Fifth Generation project, started in 1982, aimed to develop computers with reasoning capabilities, rather than the ability to merely perform calculations. Computers imbued with such "artificial intelligence" could be used to diagnose diseases, analyze lawsuits and understand language. With the project drawing to a close, a big conference is under way here this week to assess its results.
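
The "reasoning" the project aimed at was rule-based inference in the spirit of logic programming: deriving new conclusions from known facts by repeatedly applying if-then rules. A toy forward-chaining sketch in Python (the rules and fact names below are invented purely for illustration, not taken from the project):

```python
def forward_chain(facts, rules):
    # Repeatedly fire any rule whose premises are all known facts,
    # adding its conclusion, until no new fact can be derived.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical diagnostic rules, in the spirit of an expert system.
rules = [
    (["fever", "cough"], "flu_suspected"),
    (["flu_suspected", "fatigue"], "rest_recommended"),
]
print(forward_chain(["fever", "cough", "fatigue"], rules))
```

The Fifth Generation machines were designed to run many such inference steps in parallel, measured in "logical inferences per second" rather than ordinary instructions.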

The Fifth Generation effort did not yield the breakthroughs to make machines truly intelligent, something that probably could never have realistically been expected anyway. Yet the project did succeed in developing prototype computers that can perform some reasoning functions at high speeds, in part by employing up to 1,000 processors in parallel. The project also developed basic software to control and program such computers. Experts here said that some of these achievements were technically impressive.

No Market Appeal

But these days, few people want specialized computers for artificial intelligence, preferring powerful general-purpose machines like those made by Sun Microsystems Inc., a fast-growing Silicon Valley company that did not exist when the Fifth Generation Project was conceived. And a host of scrappy American companies have sprung up to sell massively parallel computers with tens of thousands of processors, far more than the Fifth Generation machines.

The lack of commercial interest in its technology helps account for the announcement by the Ministry of International Trade and Industry this week that it would give the Fifth Generation software to anyone, without charge.

While the ministry said it was making the unusual offer as a donation to world science, some computer scientists read the move as evidence that the project had been transformed from a strategic weapon.

"If it had really caught on, the Japanese companies would not have let it go," said Edward Feigenbaum, a professor of computer science at Stanford University who was the co-author of a 1983 book on the Fifth Generation project. While the project developed some interesting computer designs and software, he said, even in Japan "no one is using the technology."

Most industry executives and computer scientists on both sides of the Pacific agree that the United States still retains the world leadership in computer design and software. And today, most American computer scientists and executives have long since stopped paying attention to the Fifth Generation project.

The failure of the Fifth Generation project to produce much of commercial value also reflects the fact that Japanese computer companies have come a long way on their own and no longer need to rely on the Ministry of International Trade and Industry.

One research manager at a Japanese company, who spoke on the condition he not be identified, said his company had developed one of the specialized computers, known as parallel-inference machines, for the Fifth Generation project but had no interest in turning it into a commercial product. Instead, the company has developed its own, more general-purpose parallel computer, which it is now selling.

The Fifth Generation project did provide a stimulus for the formation of the Microelectronics and Computer Technology Corporation, which in turn inspired the formation of other consortiums, so that it is now not unusual for American companies to cooperate in basic research.

In his opening speech at the conference here, Kazuhiro Fuchi, the director of the Fifth Generation project, made an impassioned defense of his program.

"Ten years ago we faced criticism of being too reckless," in setting too many ambitious goals, he said, adding, "Now we see criticism from inside and outside the country because we have failed to achieve such grand goals."

Outsiders, he said, initially exaggerated the aims of the project, with the result that the program now seems to have fallen short of its goals.

Some American computer scientists say privately that some of their colleagues did perhaps overstate the scope and threat of the Fifth Generation project. Why? In order to coax more support from the United States Government for computer science research.

