Introduction to IT
This is the first module of the Technical Support Fundamentals course.
What is IT?
The use of digital technology, like computers and the internet, to store and process data into useful information.
- Digital Divide: The growing gap between people who have digital literacy skills and those who do not.
Role of IT Support Specialist
- Managing
- Installing
- Maintaining
- Troubleshooting
- Configuring
History of Computing
From Abacus to Analytical Engine
Computer
A device that stores and processes data by performing calculations.
Abacus
The oldest known computer, invented around 500 BC to count large numbers.
Mechanical Calculators of the 17th Century
They could perform addition, subtraction, multiplication, and division, but still needed a human to operate their knobs and levers.
The invention of punch cards in the 18th century shaped the world of computing.
Charles Babbage invented the Difference Engine
It was a very sophisticated mechanical calculator that could perform fairly complex mathematical operations, but not much else.
Analytical Engine
Babbage followed the Difference Engine with the Analytical Engine. Inspired by punch cards, it could perform calculations automatically, without human intervention.
Impressive as it was, it was still just a giant mechanical computer.
Invention of Algorithms
A mathematician, Ada Lovelace, realized the true potential of the Analytical Engine. She was the first person to recognize that a machine could be used for more than pure calculation, and she developed the first algorithm for the Engine.
Because of Lovelace's discovery, the Analytical Engine became the first general-purpose computing device in history.
Algorithm
A series of steps that solves a specific problem.
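As an illustration (my own example, not from the course), here is a tiny algorithm written out as explicit steps in code:

```python
def largest(numbers):
    """A simple algorithm: scan the list, remembering the biggest value seen so far."""
    biggest = numbers[0]        # step 1: assume the first number is the biggest
    for n in numbers[1:]:       # step 2: look at every remaining number
        if n > biggest:         # step 3: if it is bigger, remember it instead
            biggest = n
    return biggest              # step 4: the remembered value is the answer

print(largest([3, 41, 7, 19]))  # → 41
```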
Digital Logic
Computer Language
Binary System
The communication system a computer uses, also known as the base-2 numeral system.
- Bit: A single binary digit (a 0 or a 1).
- Byte: A group of 8 bits.
- Each byte can store one character, and a byte has 256 possible values thanks to the base-2 system (2**8 = 256).
Examples:
10100011, 11110011, 00001111
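The example bytes above can be converted to decimal with a quick sketch in Python (`int` with base 2 does the place-value arithmetic for us):

```python
# Convert the example bytes from binary (base 2) to decimal
for bits in ["10100011", "11110011", "00001111"]:
    print(bits, "=", int(bits, 2))
# 10100011 = 163
# 11110011 = 243
# 00001111 = 15
```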
Character Encoding
Assigns our binary values to characters so that we, as humans, can read them.
ASCII
The oldest widely used character encoding standard, covering the English alphabet, digits, and punctuation marks.
UTF-8
The most prevalent encoding standard in use today. It is backward compatible with the ASCII table and uses a variable number of bytes per character.
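A small sketch of both points: ASCII characters fit in a single byte, while UTF-8 spends more bytes on characters outside the ASCII range:

```python
# ASCII: every character maps to a value 0-127 and fits in one byte
print(ord("A"))                    # 65
print("A".encode("utf-8"))         # b'A' -- one byte, identical to ASCII

# UTF-8: a variable number of bytes per character
print(len("é".encode("utf-8")))    # 2 bytes
print(len("€".encode("utf-8")))    # 3 bytes
```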
Binary
As in punch card systems, a hole represents the number 1 and no hole represents the number 0.
In modern hardware, electrical circuits represent the ones and zeros: when current passes through a circuit, the circuit is on and represents 1; when no current flows, the circuit is off and represents 0.
Logic gates
Allow our transistors to do more complex tasks, like deciding where to send electrical signals depending on logical conditions.
AND logic gate
OR logic gate
NOT logic gate
XOR logic gate
NAND logic gate
XNOR logic gate
How to Count in Binary?
| 128 | 64 | 32 | 16 | 8 | 4 | 2 | 1 | Decimal |
|---|---|---|---|---|---|---|---|---|
| | | | | | | | 0 | 0 |
| | | | | | | | 1 | 1 |
| | | | | | | 1 | 0 | 2 |
| | | | | | | 1 | 1 | 3 |
| | | | | | 1 | 0 | 0 | 4 |
| | | | | | 1 | 0 | 1 | 5 |
| | | | | | 1 | 1 | 0 | 6 |
| | | | | | 1 | 1 | 1 | 7 |
| | | | | 1 | 0 | 0 | 0 | 8 |
| | | | | 1 | 0 | 0 | 1 | 9 |
| | | | | 1 | 0 | 1 | 0 | 10 |
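The same counting can be reproduced in Python, where `format(n, "b")` prints a number's binary digits (each column is a power of 2, so 1010 = 8 + 2 = 10):

```python
# Count from 0 to 10 in binary, as in the table
for n in range(11):
    print(f"{n:>2} -> {n:b}")
# e.g. the last line is "10 -> 1010"
```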
Computer Architecture layer
Abstraction
“To take a relatively complex system and simplify it for our use.”
We don’t interact with computers directly in 0s and 1s (well, under the hood we do); instead we use abstraction layers like the keyboard, the mouse, and human-readable error messages in place of raw machine code.
Software layer
How we as humans interact with our computers.
User
A user interacts with the computer. A user can operate, maintain, and even program the computer.