Computers represent everything as binary numbers. To a computer, there is not much difference between a movie and a book - both are long streams of binary digits. The difference lies in how the user tells the computer to interpret those numbers. For example, by opening a file with a video player, the numbers in that file are interpreted as a sequence of frames with sound - a video. However, in order to store anything as a sequence of binary numbers, we need to come up with an encoding - a way to represent all the data we need in binary form. In the video example, we need ways to store pixels, which form images, which appear in a particular sequence and are played along a sound track. All of this must somehow be embedded in the binary file (the stream of bits) that represents the video.

Text is no different. What are letters to us must be binary numbers to the computer. In the early days of computing, a convention called ASCII was established, which represented all lowercase and uppercase letters of the English alphabet, the digits, and some symbols and special characters as 7-bit numbers. For example, the space character is encoded as 32 (0010 0000 in binary), while the lowercase 'a' is 97 (0110 0001 in binary). However, ASCII is quite limited (only 128 characters). Thus, Unicode was born - a way of representing more than a hundred thousand characters, including letters and symbols from languages around the world. For chess, Unicode contains symbols for the chess pieces, so it might be useful for indicating which piece has moved.
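To make this concrete, here is a small sketch (using Python purely as a convenient way to inspect code points; the same idea applies in any language). `ord()` returns the number behind a character, `chr()` goes the other way, and the Unicode range U+2654 to U+265F holds the chess piece symbols.

```python
# Characters are just numbers to the computer.
print(ord(' '))             # 32
print(bin(ord(' ')))        # 0b100000   (0010 0000)
print(ord('a'))             # 97
print(bin(ord('a')))        # 0b1100001  (0110 0001)

# Unicode goes far beyond ASCII's 128 code points and includes
# chess piece symbols in the range U+2654 to U+265F.
white_king = chr(0x2654)    # '♔'
black_knight = chr(0x265E)  # '♞'
print(white_king, black_knight)
```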