In a communication system, two kinds of coding are evident. The first is source coding, which provides an efficient digital representation of the source signal. The second is channel coding, chiefly error-control coding, which provides reliable communication over noisy channels (Ashikhmin, Barg & Dimacs 47). In coding theory, entropy and information gain theory plays several roles. First, it is concerned with data reduction, which typically arises when studying a given task and encountering extra material called side information.
This creates the need for data reduction, so applying entropy and information measures is of pronounced value because it lessens the extra information. The theory can also interpret that extra information as context or situational information (Roth 6). Second, entropy and information theory concerns the determination of the uncertainty associated with given information. For example, if the receiver already knows a message before it is transmitted, the transmission conveys no new information; it is the uncertainty removed that measures the information carried.
Entropy is maximized when all messages are equiprobable, which is what makes it a measure of uncertainty (Roth 7). Third, this theory supports intelligence work and the application of secrecy to information. These concepts apply chiefly to cryptography combined with cryptanalysis (Roth 7). The theory exploits the redundancy of plaintext, seeking the least quantity of ciphertext that still ensures unique decipherability. Here information theory makes us certain that it is hard to keep any secret (Golomb, Peile & Scholtz 202).
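The claim that entropy peaks under equiprobable messages can be checked directly. The following is a minimal illustrative sketch (not taken from the cited sources) of the Shannon entropy formula H = -Σ p·log₂(p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).
    Terms with p == 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equiprobable messages: maximum uncertainty, log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A skewed distribution over the same four messages is less uncertain.
print(entropy([0.7, 0.1, 0.1, 0.1]))      # < 2.0
```

For any fixed number of messages n, no distribution exceeds the uniform one's log₂(n) bits, which is why the uniform case serves as the benchmark of maximal vagueness.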
Fourth, entropy and information gain theory gathers results from pure disciplines that have already been investigated and carries them into engineering practice; its applications are very broad, which is what makes coding theory vital. Fifth, this theory is important for "error-correcting codes" in computers with high-speed memories (Cover & Thomas 13). Such codes are vital for enhancing the reliability of computer memories, since computer memories have unusual features that are rarely found in communication applications.
Errors here arise from encoding, decoding, and uncommon error types. When they occur, codes built on entropy and information gain theory are able to detect two errors appearing at the same time while correcting single errors (Cover & Thomas 13). Source coding theory is about the efficient representation of data produced by a given information source (Gray 34). For example, in image coding, source coding can be achieved by exploiting the redundancy within the image. To attain "noiseless source coding", the amount of the given information and its complexity must be measured.
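The "correct single errors, detect double errors" behavior described above is the classic SECDED scheme used in computer memories. As a minimal sketch (my own illustration, not from Cover & Thomas), an extended Hamming(7,4) code with one overall parity bit shows the idea:

```python
def secded_encode(d):
    """Encode 4 data bits as 8 bits: Hamming(7,4) plus an overall
    parity bit, giving single-error correction / double-error detection."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    code = [p1, p2, d1, p3, d2, d3, d4]   # bit positions 1..7
    overall = 0
    for b in code:
        overall ^= b                       # parity over all 7 bits
    return code + [overall]

def secded_decode(r):
    """Return (data_bits, status): 'ok', 'corrected', or 'double'
    when an uncorrectable double error is detected."""
    code, overall = r[:7], r[7]
    s = 0                                  # syndrome = XOR of set positions
    for i, b in enumerate(code, start=1):
        if b:
            s ^= i
    parity_ok = (sum(code) + overall) % 2 == 0
    if s == 0 and parity_ok:
        return [code[2], code[4], code[5], code[6]], 'ok'
    if not parity_ok:                      # odd flips -> single error
        fixed = code[:]
        if s != 0:
            fixed[s - 1] ^= 1              # syndrome names the bad position
        return [fixed[2], fixed[4], fixed[5], fixed[6]], 'corrected'
    return None, 'double'                  # even flips, nonzero syndrome
```

A single flipped bit yields an odd overall parity and a syndrome pointing at the flip; two flips leave parity even but the syndrome nonzero, which is exactly the "detect double, correct single" distinction.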
Building on this, entropy and information theory applies to the determination of total information content. The theory also underlies arithmetic coding combined with statistical modeling (Kannappan 174). Arithmetic coding avoids assigning a fixed bit pattern to each original source symbol; instead, a message is associated with a single code value that determines the order of symbols, and the code words correspond to subintervals whose widths reflect the disparity in symbol probabilities.
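The subinterval idea can be made concrete with a toy encoder (an illustrative sketch of my own, not from Kannappan): each symbol narrows a working interval [low, high) in proportion to its probability, and any number in the final interval identifies the whole message. Production coders use integer arithmetic with renormalization rather than floats.

```python
def arith_encode(message, probs):
    """Toy arithmetic encoder: narrow [low, high) by each symbol's
    probability subinterval; return a number inside the final interval."""
    cum, start = {}, 0.0
    for sym, p in probs.items():           # cumulative start of each subinterval
        cum[sym] = start
        start += p
    low, high = 0.0, 1.0
    for sym in message:
        width = high - low
        high = low + width * (cum[sym] + probs[sym])
        low = low + width * cum[sym]
    return (low + high) / 2

def arith_decode(value, probs, length):
    """Recover `length` symbols by locating `value` in successive subintervals."""
    cum, start = {}, 0.0
    for sym, p in probs.items():
        cum[sym] = start
        start += p
    out = []
    for _ in range(length):
        for sym, p in probs.items():
            if cum[sym] <= value < cum[sym] + p:
                out.append(sym)
                value = (value - cum[sym]) / p   # rescale to [0, 1)
                break
    return ''.join(out)
```

Note that frequent symbols shrink the interval less, so they cost fewer bits in the final representation, which is precisely where the connection to entropy and statistical modeling comes from.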