Alibek Jakupov

Information Theory: Step 6 Hamming code

Updated: Nov 19, 2021





This article is the logical continuation of the previous article devoted to error correction. The goal is to build an error-correcting algorithm with the help of parity bits (discussed in the previous article).

Hamming codes are a family of linear error-correcting codes that generalize the Hamming(7,4) code invented by Richard Hamming in 1950.

For each integer r ≥ 2 there is a code with block length n = 2^r−1 and message length k = 2^r−r−1.


(n, k)=(2^r−1, 2^r−r−1)
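
For instance, a couple of lines of Python (just an illustrative sketch, not part of the implementation below) give the (n, k) pairs for the first few values of r:

# Block length n and message length k of a Hamming code built with r parity bits
for r in range(2, 6):
    n = 2 ** r - 1
    k = 2 ** r - r - 1
    print("r = {}: (n, k) = ({}, {})".format(r, n, k))
# r = 2: (3, 1), r = 3: (7, 4), r = 4: (15, 11), r = 5: (31, 26)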


The main goal of a Hamming code is to increase the Hamming distance between codewords (the (7,4) code has a minimum distance of 3, enough to correct a single error) while keeping the code rate high.
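
To see the point about the code rate (the numbers below are standard facts about these codes rather than something from this series): the (7,4) code carries 4 useful symbols out of every 7 transmitted, whereas a naive 3x repetition code, which also corrects a single error, carries only 1 out of every 3.

# Code rate comparison: Hamming (7,4) vs a 3x repetition code (both correct one error)
hamming_rate = 4.0 / 7.0
repetition_rate = 1.0 / 3.0
print("Hamming (7,4): {:.3f}, repetition: {:.3f}".format(hamming_rate, repetition_rate))
# Hamming (7,4): 0.571, repetition: 0.333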


In this article we are going to develop the (7,4) Hamming code. In other words, to correctly encode a message of 4 symbols (k) we need 3 parity bits, which gives a 7-symbol (n) codeword.


Important: with this configuration the algorithm is only able to correct a single error per block. However, the error is corrected anywhere in the encoded word, so even if a parity bit itself was corrupted we will be able to detect and correct it.


Thus the formula for each parity bit is the following:

first parity bit = i1 XOR i2 XOR i3 (where i is a symbol from the original message)

second parity bit = i2 XOR i3 XOR i4

third parity bit = i1 XOR i2 XOR i4
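
For example (an arbitrary 4-bit message picked just for this sketch), the three parity bits are three XORs:

# Parity bits for the 4-symbol message 1011
i1, i2, i3, i4 = 1, 0, 1, 1

r1 = i1 ^ i2 ^ i3   # first parity bit
r2 = i2 ^ i3 ^ i4   # second parity bit
r3 = i1 ^ i2 ^ i4   # third parity bit

print(r1, r2, r3)   # 0 0 0, so the encoded word is 1011000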


We then calculate the error syndrome, which will help us locate the erroneous symbol.

S1 = first parity bit XOR i1 XOR i2 XOR i3

S2 = second parity bit XOR i2 XOR i3 XOR i4

S3 = third parity bit XOR i1 XOR i2 XOR i4


Error syndrome = (S1, S2, S3)
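
Continuing the same sketch, suppose the word 1011000 from above arrives with i2 flipped, so we receive 1111000. The syndrome immediately exposes the damaged symbol:

# Received word: message bits 1111 (i2 was flipped in transit) and parity bits 000
i1, i2, i3, i4 = 1, 1, 1, 1
r1, r2, r3 = 0, 0, 0

s1 = r1 ^ i1 ^ i2 ^ i3
s2 = r2 ^ i2 ^ i3 ^ i4
s3 = r3 ^ i1 ^ i2 ^ i4

print((s1, s2, s3))   # (1, 1, 1): the error is in i2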


And here is the magic: we look up the value of the error syndrome in the table below and find the position of the error in our message.

(0,0,0) → no error
(1,0,1) → error in i1
(1,1,1) → error in i2
(1,1,0) → error in i3
(0,1,1) → error in i4
(1,0,0) → error in r1 (first parity bit)
(0,1,0) → error in r2 (second parity bit)
(0,0,1) → error in r3 (third parity bit)

where i is a symbol in the initial message, and r is a parity bit.
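
Here is a minimal sketch of that lookup in code (the helper name correct and the bit ordering i1 i2 i3 i4 r1 r2 r3 are choices made for this example, not part of the implementation below):

# Non-zero syndromes (S1, S2, S3) mapped to the position of the corrupted symbol
# in a 7-bit word ordered as i1 i2 i3 i4 r1 r2 r3
SYNDROME_TO_POSITION = {
    (1, 0, 1): 0,  # i1
    (1, 1, 1): 1,  # i2
    (1, 1, 0): 2,  # i3
    (0, 1, 1): 3,  # i4
    (1, 0, 0): 4,  # r1
    (0, 1, 0): 5,  # r2
    (0, 0, 1): 6,  # r3
}

def correct(word):
    """Repair at most one flipped bit in a 7-character string of 0s and 1s."""
    bits = [int(c) for c in word]
    i1, i2, i3, i4, r1, r2, r3 = bits
    syndrome = (r1 ^ i1 ^ i2 ^ i3, r2 ^ i2 ^ i3 ^ i4, r3 ^ i1 ^ i2 ^ i4)
    if syndrome in SYNDROME_TO_POSITION:
        bits[SYNDROME_TO_POSITION[syndrome]] ^= 1  # flip the erroneous bit back
    return "".join(str(b) for b in bits)

print(correct("1111000"))   # 1011000: the error in i2 is repaired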


The logic is quite straightforward: we divide our initial message (converted into a sequence of 0s and 1s using the Shannon or Huffman algorithm) into blocks of 4 symbols, add the parity bits and send the result. If there were transmission errors, our system will be able to detect them in the received message and correct them. However, as discussed above, only one error per block can be corrected.


Here's the implementation.


import sys

K = 4

def encode(s):
    """Read in K=4 bits at a time and write out those plus parity bits."""
    while len(s) >= K:
        nybble = s[0:K]
        sys.stdout.write(hamming(nybble))
        s = s[K:]

def hamming(bits):
    """Return the given 4 bits plus parity bits for bits (1,2,3), (2,3,4) and (1,2,4)."""
    t1 = parity(bits, [0, 1, 2])
    t2 = parity(bits, [1, 2, 3])
    t3 = parity(bits, [0, 1, 3])
    return bits + t1 + t2 + t3

def parity(s, indices):
    """Compute the parity bit for the given string s and indices."""
    sub = ""
    for i in indices:
        sub += s[i]
    return str(sub.count("1") % 2)

###################################################################################
# Main

if __name__ == "__main__":
    input_string = sys.stdin.read().strip()
    encode(input_string)
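
The encoder above only produces the protected stream; the decoding side lives in the repository linked below. As a rough sketch of what it does (this version reuses the correct helper from the earlier snippet and is an assumption for illustration, not the repository code):

def decode(s):
    """Read 7-bit blocks, fix at most one error per block and drop the parity bits."""
    out = ""
    while len(s) >= 7:
        block = correct(s[:7])   # repair a single flipped bit, if any
        out += block[:4]         # keep only the 4 information symbols
        s = s[7:]
    return out

# decode("1111000") returns "1011": the original nybble is recovered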

And, as usual, the link to GitHub.



