Keywords: fuzzy set, generalized fuzzy entropy, generalized fuzzy average, godeword length, information Bounds
American Journal of Applied Mathematics and Statistics, 2014 2 (2),
pp 73-76.
DOI: 10.12691/ajams-2-2-4
Received June 14, 2013; Revised September 01, 2013; Accepted February 27, 2014
Copyright © 2013 Science and Education Publishing. All Rights Reserved.
1. Introduction
Fuzzy sets play a significant role in many deployed systems because of their capability to model non-statistical imprecision. Consequently, characterization and quantification of fuzziness are important issues that affect the management of uncertainty in many system models and designs. Zadeh [14] introduced the concept of fuzzy sets in which imprecise knowledge can be used to define an event. A fuzzy set ‘A’ is represented as
where
gives the degree of belongingness of the element
to the set ‘A’. If every element of the set ‘A’ is ‘0’ or ‘1’, there is no uncertainty about it and a set is said to be crisp set. On the other hand, a fuzzy set ‘A’ is defined by a characteristic function
.
The function
associates with each
grade of membership function. The importance of fuzzy set comes from the fact it can deal with imprecise and inexact information. Its application areas span from design of fuzzy controller to robotics and artificial intelligence. Many fuzzy measures have been discussed and derived by Kapur [6], Lowen [8], Pal and Bezdek [10] etc.
The basic noiseless coding theorems [6, 11] give the lower bound for the mean codeword length of a uniquely decipherable code in terms of Shannon’s [12] measure of entropy. Kapur [7] has established relationship between probabilistic entropy and coding. But, there are situations where probabilistic measure of entropy does not work. To tackle such situations, instead of taking the probability, the idea of fuzziness can be explored.
De Luca and Termini [3] introduced a measure of fuzzy entropy corresponding to Shannon’s [12] information theoretic entropy and is given by
 | (1.1) |
Bhandari and Pal [1] surveyed the literature on information measures of fuzzy sets and also gave some new measures. Thus corresponding to Renyi’s [11] entropy of order they suggested that the amount of ambiguity or fuzziness of order should be
 | (1.2) |
Kapur [7] has taken measure of fuzzy entropy corresponding to Havrada Charvat [4] as
 | (1.3) |
Corresponding to Cambell’s [2] measure of entropy, the fuzzy entropy can be taken as
 | (1.4) |
Corresponding to Sharma and Taneja [13] measure of entropy of degree (,), Kapur [6] has taken the following measure of entropy
 | (1.5) |
Corresponding to Kapur [6] measure of entropy of degree (,), Kapur [6] has given measure of entropy for fuzzy sets as
 | (1.6) |
Shannon [12] established the first noiseless coding theorem which states that for all uniquely decipherable codes, the lower bound for the arithmetic mean
lies between
, where
is Shannon’s measure of entropy.
Mathai, A.M. [9] has given the measure of entropy as
 | (1.7) |
Corresponding to this measure, we propose the following average codeword length as:
 | (1.8) |
Corresponding to equation (1.7) we propose the following measure of fuzzy entropy as
 | (1.9) |
And the corresponding average codeword length as
 | (1.10) |
Remark :
(I): When
tends to Shannon’s entropy given as
 | (1.11) |
(II): When
(1.8) tends to average codeword length given by
 | (1.12) |
In section 2, some noiseless coding theorems connected with fuzzy entropy corresponding to Mathai’s [9] entropy have been proved.
2. Fuzzy Noiseless Coding Theorems
Theorem2.1: For all uniquely decipherable codes
 | (2.1) |
Where
Proof: By Holders inequality, we have
 | (2.2) |
Set
;
and
,
.
Thus equation (2.2) becomes
Using Kraft’s inequality, we have
or
or
 | (2.3) |
Dividing both sides by t, we get
Subtracting n from both sides, we get
 | (2.4) |
Taking
and
Thus equation (2.4) becomes
Dividing both sides by α, we get
 | (2.5) |
That is
.
Which proves the theorem.
Theorem 2.2: For all uniquely decipherable codes,
 | (2.6) |
Where,
 | (2.7) |
And 
Proof: Since from (2.5), we have
Multiplying both sides by
, we have
 | (2.8) |
Changing α to β, we have
 | (2.9) |
Subtract (2.9) to (2.8), and divide by
, we get
 | (2.10) |
That is
. This proves the theorem.
Theorem 2.3: For all uniquely decipherable codes
 | (2.11) |
Where
 | (2.12) |
Proof: The result can be easily proved by adding (2.8) and (2.9) and then dividing by
Theorem 2.4: For all uniquely decipherable codes
 | (2.13) |
Where
 | (2.14) |
And
 | (2.15) |
To prove this theorem, we first prove the following lemma.
Lemma 1: For all uniquely decipherable codes
Proof of the Lemma. From equation (2.3), we have
Subtracting ‘n’ from both sides, we get
Taking
, and
, we have
 | (2.16) |
Which proves the lemma
Proof of the theorem 2. 4
Changing α to β in (2.16), we have
 | (2.17) |
Dividing (2.17) to (2.16), we get
Dividing both sides by β − α, we have
 | (2.18) |
⟹
. The R.H.S. is a new exponentiated mean codeword length of order α and type β and is defined as
References
[1] | Bhandari, N. R. Pal, Some new information measures for fuzzy sets, Information Sciences 1993; Vol. 67, No. 3: pp. 209-228. |
| In article | CrossRef |
|
[2] | Campbell, L.L., A coding theorem and Renyi’s entropy, Information and Control 1965; Vol. 8: pp. 423-429. |
| In article | CrossRef |
|
[3] | De Luca, S. Termini, A Definition of Non-probabilistic Entropy in the Setting of fuzzy sets theory, Information and Control 1972; Vol.20: pp.301-312. |
| In article | CrossRef |
|
[4] | Havrada, J. H., Charvat, F., Quantification methods of classificatory processes, the concepts of structural α entropy, Kybernetika 1967; Vol.3: pp. 30-35. |
| In article | |
|
[5] | J.N.Kapur, Measures of Fuzzy Information, Mathematical Science Trust Society, New Delhi; 1997. |
| In article | |
|
[6] | Kapur, J. N., A generalization of Campbell’s noiseless coding theorem, Jour. Bihar Math, Society 1986; Vol.10: pp.1-10. |
| In article | |
|
[7] | Kapur, J. N., Entropy and Coding, Mathematical Science Trust Society, New Delhi; 1998. |
| In article | |
|
[8] | Lowen, R., Fuzzy Set Theory–Basic Concepts, Techniques and Bibliography, Kluwer Academic Publication. Applied Intelligence 1996; Vol. 31, No. 3: pp.283-291. |
| In article | |
|
[9] | Mathai, A.M., Rathie, P.N., Basic Concept in Information Theory and Statistics. Wiley Eastern Limited, New Delhi; 1975. |
| In article | |
|
[10] | Pal, Bezdek, Measuring Fuzzy Uncertainty, IEEE Trans. of fuzzy systems 1994; Vol. 2, No. 2: pp.107-118. |
| In article | CrossRef |
|
[11] | Renyi, A., On measures of entropy and information. Proceedings 4th Berkeley Symposium on Mathematical Statistics and Probability 1961; Vol.1: pp.541-561. |
| In article | |
|
[12] | Shannon, C. E., A mathematical theory of communication. Bell System Technical Journal 1948; Vol.27: pp.379-423, 623-659. |
| In article | CrossRef |
|
[13] | Sharma, B.D., Taneja, I. J., Entropies of typeα, β and other generalized measures of information theory, Matrika 1975; Vol.22: pp. 205-215. |
| In article | CrossRef |
|
[14] | Zadeh, L. A., Fuzzy Sets, Inform, and Control 1966; Vol.8: pp.94-102. |
| In article | |
|