Data Representation


CLASS: SSS Three

TOPIC: Data Representations


Introduction to Data Representation

In computer science, data representation refers to the methods used to store, process, and transmit information within a computer system. All data, whether it's text, numbers, graphics, or sound, must be converted into a format that a computer can understand. This format is the binary system, a sequence of 0s and 1s.


Methods of Data Representation

1. Bits

A bit is the smallest unit of data in computing. It is a single binary digit, either a 0 or a 1. A group of eight bits is called a byte. A byte is a fundamental unit of computer storage.
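As a quick illustration (this example is not part of the syllabus text), Python's built-in format() function can display the individual bits that make up one byte:

```python
# A byte is a group of 8 bits. format() lets us see the bits of a small number.
value = 65                    # a number small enough to fit in one byte
bits = format(value, "08b")   # write it as 8 binary digits
print(bits)                   # 01000001
print(len(bits), "bits")      # 8 bits
```

Counting the printed digits confirms that one byte holds exactly eight bits.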

1 Bit = A single 0 or 1


2. Binary Coded Decimal (BCD)

BCD (Binary Coded Decimal) is a method of representing each individual decimal digit from 0 to 9 with its own group of four binary digits, known as a nibble.

BCD Table

Decimal   BCD (4-bit)
0         0000
1         0001
2         0010
3         0011
4         0100
5         0101
6         0110
7         0111
8         1000
9         1001

Example: Convert $$49_{10}$$ to BCD.

Using the table above, we represent 49 in BCD by finding the 4-bit code for each decimal digit separately:

$$4 = 0100_{BCD}$$

$$9 = 1001_{BCD}$$

Therefore, $$49_{10} = 01001001_{BCD}$$
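The digit-by-digit conversion above can be sketched as a short Python function (the function name decimal_to_bcd is just for illustration):

```python
def decimal_to_bcd(number):
    """Encode a non-negative whole number as a BCD bit string,
    using four binary digits for each decimal digit."""
    return "".join(format(int(digit), "04b") for digit in str(number))

print(decimal_to_bcd(49))   # 01001001
print(decimal_to_bcd(7))    # 0111
```

Each decimal digit is looked up independently, exactly as in the worked example, and the nibbles are joined together.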



3. EBCDIC

EBCDIC: Stands for Extended Binary Coded Decimal Interchange Code. It is an 8-bit scheme developed by IBM, similar in purpose to ASCII but used mainly on IBM mainframe computers. Each character code is made up of two nibbles (4 bits each): a zone nibble that identifies the group the character belongs to, and a digit nibble that identifies the specific character within that group.
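Python happens to include "cp037", one common EBCDIC code page, so we can compare how the same letter is stored in EBCDIC and in ASCII (a small demonstration, not part of the syllabus text):

```python
# 'cp037' is one EBCDIC code page included with Python's codecs.
text = "A"
print(text.encode("cp037"))   # b'\xc1' -> 'A' is C1 (hex) in EBCDIC
print(text.encode("ascii"))   # b'A'    -> 'A' is 41 (hex) in ASCII
```

The same character gets a completely different bit pattern in each scheme, which is why text files must be converted when moving between ASCII and EBCDIC systems.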


4. ASCII

ASCII: Stands for American Standard Code for Information Interchange. It was one of the earliest and most widely used schemes. It originally used 7 bits to represent 128 characters, primarily for English. An extended version uses 8 bits to represent 256 characters.
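In Python, the built-in ord() and chr() functions move between a character and its ASCII code number, so we can see the 7-bit pattern directly:

```python
# ord() gives a character's code number; chr() converts a code back to a character.
print(ord("A"))                  # 65
print(format(ord("A"), "07b"))   # 1000001 -> the 7-bit ASCII pattern for 'A'
print(chr(97))                   # a
```

Note that the 7-bit form explains the original limit of 128 characters: seven bits can hold only $$2^7 = 128$$ different patterns.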

5. Unicode

Unicode: This is the most modern and widely used standard. It was created to solve the limitations of ASCII and EBCDIC by providing a unique number for every character, regardless of language. It supports characters from languages all over the world, including Chinese, Arabic, and Nigerian languages like Yoruba. UTF-8 is the most common form of Unicode.
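Python strings are Unicode by default, so we can inspect the code point of any character and see how many bytes UTF-8 uses to store it (the sample characters below are chosen for illustration):

```python
# Every character has a unique Unicode code point; UTF-8 stores it in 1 to 4 bytes.
for ch in ["A", "é", "ẹ"]:        # English, French, and Yoruba letters
    print(ch, hex(ord(ch)), ch.encode("utf-8"))
# 'A' takes 1 byte, 'é' takes 2 bytes, and 'ẹ' takes 3 bytes in UTF-8
```

Because plain English letters still take only one byte each, UTF-8 remains compatible with old ASCII text while also covering characters like the Yoruba "ẹ".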


Test Your Knowledge

What is the BCD representation of the decimal number 83?
