A computer is an electronic device designed to accept data, perform prescribed mathematical and logical operations at high speed, and display the results of these operations.
Computer evolution is categorized into generations based on the major technological developments used in the hardware.
| Generation | Period | Core Technology | Characteristics |
|---|---|---|---|
| First | 1940-1956 | Vacuum Tubes | Huge size, high heat, expensive, used Machine Language. |
| Second | 1956-1963 | Transistors | Smaller, faster, more reliable than vacuum tubes. |
| Third | 1964-1971 | Integrated Circuits (IC) | Increased speed and efficiency; used keyboards/monitors. |
| Fourth | 1971-Present | Microprocessors (VLSI) | Handheld devices, personal computers, widespread networking. |
| Fifth | Present & Beyond | Artificial Intelligence (ULSI) | Robotics, Natural Language Processing, Supercomputing. |
The basic architecture of a computer consists of three main units: Input Unit, Central Processing Unit (CPU), and Output Unit.
The CPU is known as the "brain" of the computer and consists of the Arithmetic Logic Unit (ALU), which performs calculations and comparisons, the Control Unit (CU), which directs the flow of data between units, and the Memory Unit, which holds data and instructions during processing.
Information Technology (IT) has revolutionized various industries by automating tasks and improving efficiency.
Computers represent data using different number systems based on a fixed base or "radix".
Base (Radix): The total number of unique digits available in a number system.
To convert from decimal to another base, repeatedly divide the number by the target base and record the remainders.
Example: Decimal 25 to Binary
25 / 2 = 12 rem 1
12 / 2 = 6 rem 0
6 / 2 = 3 rem 0
3 / 2 = 1 rem 1
1 / 2 = 0 rem 1
Result (Bottom to Top): 11001
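The repeated-division procedure above can be sketched in Python. This is an illustrative implementation (the function name `decimal_to_base` is our own choice, not from the notes); it collects remainders and reverses them, matching the "bottom to top" reading.

```python
def decimal_to_base(n, base):
    """Convert a non-negative decimal integer to the given base
    by repeated division, collecting remainders bottom-to-top."""
    if n == 0:
        return "0"
    digits = "0123456789ABCDEF"  # supports bases up to 16
    remainders = []
    while n > 0:
        n, rem = divmod(n, base)  # quotient and remainder in one step
        remainders.append(digits[rem])
    # Remainders come out in reverse order, so flip them.
    return "".join(reversed(remainders))

print(decimal_to_base(25, 2))  # 11001
```

The same function handles any base up to 16, e.g. `decimal_to_base(25, 16)` gives `19`.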
Multiply each digit by 2 raised to the power of its position (starting from 0 at the rightmost digit), then add the products.
Example: Binary 1101 to Decimal
(1 * 2^3) + (1 * 2^2) + (0 * 2^1) + (1 * 2^0)
= 8 + 4 + 0 + 1 = 13
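The positional expansion can likewise be sketched in Python (the helper name `binary_to_decimal` is our own). Each digit is weighted by 2 raised to its position, counted from 0 on the right, and the weighted terms are summed:

```python
def binary_to_decimal(bits):
    """Sum digit * 2**position, with position 0 at the rightmost bit."""
    total = 0
    for position, digit in enumerate(reversed(bits)):
        total += int(digit) * 2 ** position
    return total

print(binary_to_decimal("1101"))  # 8 + 4 + 0 + 1 = 13
```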
Q: What is the main difference between 3rd and 4th generation?
A: The 3rd generation used Integrated Circuits (ICs), while the 4th generation used Microprocessors (VLSI), allowing an entire computer to fit on a single chip.
Q: Why is Hexadecimal used in computing?
A: It is a human-friendly way to represent long binary strings (1 Hex digit = 4 Binary bits).
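The 1-hex-digit-to-4-bits correspondence can be checked directly in Python using its built-in binary and hexadecimal literals (the value `0xDEAD` is just an arbitrary example):

```python
# Each group of 4 bits maps to exactly one hex digit:
# 1101 -> D, 1110 -> E, 1010 -> A, 1101 -> D
value = 0b1101_1110_1010_1101  # underscores group the bits in fours
print(hex(value))    # 0xdead
print(bin(0xDEAD))   # 0b1101111010101101
```

This is why hexadecimal is preferred for memory addresses and byte values: one byte is always exactly two hex digits.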