Bibliography
1. ACZÉL, J. and Z. DARÓCZY (1963), "Über verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind", Publicationes Mathematicae Debrecen, 10, 171-190.
2. ACZÉL, J. and Z. DARÓCZY (1975), "On Measures of Information and Their Generalizations", Academic Press, New York.
3. ARIMOTO, S. (1971), "Information-Theoretic Considerations on Estimation Problems", Information and Control, 19, 181-190.
4. ARIMOTO, S. (1975), "Information Measures and Capacity of Order α for Discrete Memoryless Channels", Colloq. on Information Theory, Keszthely, Hungary, pp. 41-52.
5. ARIMOTO, S. (1976), "Computation of Random Coding Exponent Functions", IEEE Trans. on Inform. Theory, IT-20, 460-473.
6. ASH, R. (1965), "Information Theory", Interscience Publ., New York.
7. AUTAR, R. (1965), "On Relative Information Function", Indian J. Pure and Applied Math., 6, 1449-1454.
8. BECKENBACH, E.F. and R. BELLMAN (1971), "Inequalities", Springer-Verlag, New York.
9. BEHARA, M. and J.M.S. CHAWLA (1974), "Generalized Gamma-Entropy", Entropy and Ergodic Theory: Selecta Statistica Canadiana, Vol. II, 15-38.
10. BELIS, M. and S. GUIASU (1968), "A Qualitative-Quantitative Measure of Information in Cybernetic Systems", IEEE Trans. on Inform. Theory, IT-14, 591-592.
11. BEN-BASSAT, M. and J. RAVIV (1978), "Rényi's Entropy and the Probability of Error", IEEE Trans. on Inform. Theory, IT-24, 324-331.
12. BLACHMAN, N.M. (1965), "The Convolution Inequality for Entropy Powers", IEEE Trans. on Inform. Theory, IT-11, 267-271.
13. BLAHUT, R.E. (1987), "Information Theory", Addison-Wesley, New York.
14. BLUMER, A.C. and R.J. McELIECE (1988), "The Rényi Redundancy of Generalized Huffman Codes", IEEE Trans. on Inform. Theory, IT-34, 1242-1249.
15. BOEKEE, D.E. and J.C.A. VAN DER LUBBE (1979), "Some Aspects of Error Bounds in Feature Selection", Pattern Recognition, 11, 353-360.
16. BOEKEE, D.E. and J.C.A. VAN DER LUBBE (1980), "The R-Norm Information Measure", Information and Control, 45, 136-155.
17. BURBEA, J. (1984), "The Bose-Einstein Entropy of Degree α and Its Jensen Difference", Utilitas Mathematica, 25, 225-240.
18. BURBEA, J. and C.R. RAO (1982a), "Entropy Differential Metric, Distance and Divergence Measures in Probability Spaces: A Unified Approach", J. Multivariate Analysis, 12, 575-596.
19. BURBEA, J. and C.R. RAO (1982b), "On the Convexity of Some Divergence Measures Based on Entropy Functions", IEEE Trans. on Inform. Theory, IT-28, 489-495.
20. CAMPBELL, L.L. (1965), "A Coding Theorem and Rényi's Entropy", Information and Control, 8, 423-429.
21. CAMPBELL, L.L. (1972), "Characterization of Entropy of Probability Distribution on the Real Line", Information and Control, 21, 329-338.
22. CAMPBELL, L.L. (1985), "The Relation Between Information Theory and the Differential Geometry Approach to Statistics", Information Sciences, 35, 199-210.
23. CAPOCELLI, R.M. and I.J. TANEJA (1985), "On Some Inequalities and Generalized Entropies: A Unified Approach", Cybernetics and Systems, 16, 341-376.
24. CAPOCELLI, R.M., L. GARGANO, U. VACCARO and I.J. TANEJA (1985), "Generalized Distance Measures and Error Bounds", Proc. IEEE Intern. Conf. on Syst. Man, and Cybern., Arizona, U.S.A., November 12-15, 78-82.
25. CAPOCELLI, R.M. and A. De SANTIS (1988a), "Tighter Upper Bounds on the Entropy Series", Research Report, IBM Research Division, T.J. Watson Research Center, Yorktown Heights, N.Y.
26. CAPOCELLI, R.M., A. De SANTIS and I.J. TANEJA (1988b), "Bounds on the Entropy Series", IEEE Trans. on Inform. Theory, IT-34, 134-138.
27. CHAUNDY, T.W. and J.B. McLEOD (1960), "On a Functional Equation", Proc. Edin. Math. Soc. Edin. Math. Notes, 43, 7-8.
28. COSTA, M.H. (1985), "A New Entropy Power Inequality", IEEE Trans. on Inform. Theory, IT-31, 751-760.
29. CSISZÁR, I. (1974), "Information Measures: A Critical Survey", Trans. of the 7th Prague Conferen., 83-86.
30. CSISZÁR, I. and J. KÖRNER (1981), "Information Theory: Coding Theorems for Discrete Memoryless Systems", Academic Press, New York.
31. DARÓCZY, Z. (1963), "Über die gemeinsame Charakterisierung der zu nicht vollständigen Verteilungen gehörigen Entropien von Shannon und Rényi", Z. Wahrs. und Verw. Geb., 1, 381-388.
32. DARÓCZY, Z. (1964), "Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen", Acta Math. Acad. Sci. Hungar., 15, 203-210.
33. DARÓCZY, Z. (1970), "Generalized Information Measures", Information and Control, 16, 36-51.
34. DEVIJVER, P.A. (1974), "On a New Class of Bounds on Bayes Risk in Multihypothesis Pattern Recognition", IEEE Trans. on Comp., C-23, 70-80.
35. DOWSON, D.C. and A. WRAGG (1973), "Maximum Entropy Distributions Having Prescribed First and Second Moments", IEEE Trans. on Inform. Theory, IT-19, 689-693.
36. FEINSTEIN, A. (1958), "Foundations of Information Theory", McGraw Hill, New York.
37. FERRERI, C. (1980), "Hypoentropy and Related Heterogeneity, Divergency and Information Measures", Statistica, XL, 155-168.
38. GALLAGER, R.G. (1968), "Information Theory and Reliable Communication", J. Wiley and Sons, New York.
39. GALLAGER, R.G. (1978), "Variations on a Theme by Huffman", IEEE Trans. on Inform. Theory, IT-24, 668-674.
40. GOKHALE, D.V. (1975), "Maximum Entropy Characterization of Some Distributions", in Statistical Distributions in Scientific Work, Patil, Kotz and Ord, Eds., D. Reidel, Boston, MA, Vol. 3, 299-304.
41. GUIASU, S. (1977), "Information Theory with Applications", McGraw-Hill Intern. Book Company, New York.
42. GYÖRFI, L. and T. NEMETZ (1975), "f-Dissimilarity: A General Class of Separation Measures of Several Probability Measures", Colloq. on Inform. Theory, Keszthely, Hungary, 309-331.
43. HARDY, G.H., J.E. LITTLEWOOD and G. PÓLYA (1934), "Inequalities", Cambridge University Press, London.
44. HARTLEY, R.V.L. (1928), "Transmission of Information", Bell Syst. Tech. J., 7, 535-563.
45. HATORI, H. (1958), "A Note on the Entropy of a Continuous Distribution", Kodai Math. Sem. Rep., 10, 172-176.
46. HAVRDA, J. and F. CHARVÁT (1967), "Quantification Method of Classification Processes: Concept of Structural a-Entropy", Kybernetika, 3, 30-35.
47. HORIBE, Y. (1973), "A Note on Entropy Metrics", Inform. and Contr., 22, 403-404.
48. HORIBE, Y. (1985), "Entropy and Correlation", IEEE Trans. on Syst. Man, and Cybern., SMC-15, 641-642.
49. JEFFREYS, H. (1946), "An Invariant Form for the Prior Probability in Estimation Problems", Proc. Roy. Soc. Lon., Ser. A, 186, 453-461.
50. JELINEK, F. (1968a), "Probabilistic Information Theory", McGraw Hill, New York.
51. JELINEK, F. (1968b), "Buffer Overflow in Variable Length Coding of Fixed Rate Sources", IEEE Trans. on Inform. Theory, IT-14, 490-501.
52. JELINEK, F. and K. SCHNEIDER (1972), "On Variable Length to Block Coding", IEEE Trans. on Inform. Theory, IT-18, 765-774.
53. KAGAN, A.M., Yu. V. LINNIK and C.R. RAO (1973), "Characterization Problems in Mathematical Statistics", J. Wiley and Sons, New York.
54. KAPUR, J.N. (1967), "Generalized Entropy of Order α and Type β", The Math. Seminar, 4, 78-94.
55. KAPUR, J.N. (1983), "A Comparative Assessment of Various Measures of Entropy", J. Inform. and Optim. Sci., 4, 207-232.
56. KAPUR, J.N. (1986), "Four Families of Measures of Entropy", Indian J. Pure and Applied Math., 17(4), 429-449.
57. KAPUR, J.N. (1987), "Inaccuracy, Entropy and Coding Theory", Tamkang J. Math., 18, 35-48.
58. KAPUR, J.N. (1988a), "On Measures of Divergence Based on Jensen Difference", Nat. Acad. Sci. Letters, 11, 23-27.
59. KAPUR, J.N. (1988b), "Some New Nonadditive Measures of Entropy", Boll. U.M.I., 7, 2-B, 253-266.
60. KAPUR, J.N. (1989), "Maximum-Entropy Models in Science and Engineering", Wiley Eastern Limited, New Delhi.
61. KAPUR, J.N. (1990), "On Trigonometric Measures of Information", J. Math. Phys. Sci., 24, 1-10.
62. KAPUR, J.N. and H.K. KESAVAN (1992), "Entropy Optimization Principles with Applications", Academic Press, New York.
63. KERRIDGE, D.F. (1961), "Inaccuracy and Inference", J. Royal Statist. Society, Ser. B, 23, 184-194.
64. KIEFFER, J.C. (1979), "Variable-Length Source Coding with a Cost Depending Only on the Codeword Length", Inform. and Contr., 41, 136-146.
65. KULLBACK, S. and R.A. LEIBLER (1951), "On Information and Sufficiency", Ann. Math. Statist., 22, 79-86.
66. LAZO, A.C.G. and P.N. RATHIE (1978), "On the Entropy of Continuous Probability Distributions", IEEE Trans. on Inform. Theory, IT-24, 120-122.
67. LINSEMAN, J.H.C. and M.C.A. VAN ZYLEN (1972), "Note on the Generalization of the Most Probable Frequency Distribution", Statistica Neerlandica, 26, 19-23.
68. MANGASARIAN, O.L., "Nonlinear Programming", Tata McGraw Hill, New Delhi/Bombay.
69. MANSURIPUR, M. (1987), "Introduction to Information Theory", Prentice-Hall, Inc., New Jersey.
70. MARSHALL, A.W. and I. OLKIN (1979), "Inequalities: Theory of Majorization and Its Applications", Academic Press, New York.
71. MATHAI, A.M. and P.N. RATHIE (1975), "Basic Concepts in Information Theory and Statistics", Wiley Eastern Ltd., New Delhi.
72. McELIECE, R.J. (1977), "The Theory of Information and Coding", Encyclopedia of Mathematics and its Applications, Vol. 3, Addison Wesley Publishing Company, Reading, Massachusetts.
73. NATH, P. (1968), "On Measures of Error in Information", J. Math. Sci., III, 1-16.
74. NATH, P. (1975), "On a Coding Theorem Connected with Rényi's Entropy", Inform. and Contr., 29, 234-242.
75. NYQUIST, H. (1924), "Certain Factors Affecting Telegraph Speed", Bell Syst. Tech. J., 3, 324-.
76. NYQUIST, H. (1928), "Certain Topics in Telegraph Transmission Theory", AIEE Transactions, 47, 617-.
77. PICARD, C.F. (1979), "Weighted Probabilistic Information Measures", J. Inform. and Syst. Sci., 4, 343-356.
78. RATHIE, P.N. (1970), "On a Generalized Entropy and a Coding Theorem", J. Appl. Probl., 7, 124-133.
79. RATHIE, P.N. and Pl. KANNAPPAN (1972), "A Directed-Divergence Function of Type β", Inform. and Contr., 20, 38-45.
80. RATHIE, P.N. and L.T. SHENG (1981), "The J-Divergence of Order α", J. Comb. Inform. and Syst. Sci., 6, 197-205.
81. RATHIE, P.N. and I.J. TANEJA (1991), "Unified (r,s)-Entropy and Its Bivariate Measures", Information Sciences, 56, 23-39.
82. RÉNYI, A. (1961), "On Measures of Entropy and Information", Proc. 4th Berk. Symp. Math. Statist. and Prob., University of California Press, Vol. 1, 547-561.
83. ROBERTS, A.W. and D.E. VARBERG (1973), "Convex Functions", Academic Press, New York.
84. SAHOO, P.K. and A.K.C. WONG (1988), "Generalized Jensen Difference Based on Entropy Functions", Kybernetika, 24, 241-250.
85. SANT'ANNA, A.P. and I.J. TANEJA (1985), "Trigonometric Entropies, Jensen Difference Divergence Measures and Error Bounds", Information Sciences, 35, 145-155.
86. SHANNON, C.E. (1948), "A Mathematical Theory of Communication", Bell Syst. Tech. J., 27, 379-423, 623-656.
87. SHARMA, B.D. and R. AUTAR (1973), "Characterization of Generalized Inaccuracy Measure in Information Theory", J. Appl. Probl., 10, 464-468.
88. SHARMA, B.D. and R. AUTAR (1974), "Relative Information Functions and Their Type (α, β) Generalizations", Metrika, 21, 41-50.
89. SHARMA, B.D. and H.C. GUPTA (1976), "On Non-Additive Measures of Inaccuracy", Czech Math. J., 26, 584-595.
90. SHARMA, B.D. and D.P. MITTAL (1975), "New Nonadditive Measures of Inaccuracy", J. Math. Sci., 10, 122-133.
91. SHARMA, B.D. and D.P. MITTAL (1977), "New Nonadditive Measures of Relative Information", J. Comb. Inform. and Syst. Sci., 2, 122-133.
92. SHARMA, B.D. and I.J. TANEJA (1975), "Entropy of Type (α, β) and Other Generalized Additive Measures in Information Theory", Metrika, 22, 205-215.
93. SHARMA, B.D. and I.J. TANEJA (1977), "Three Generalized Additive Measures of Entropy", Elec. Inform. Kybern., 13, 419-433.
94. SHIVA, S.S.G., N.U. AHMED and N.D. GEORGANAS (1973), "Order Preserving Measures of Information", J. Appl. Probl., 10, 666-670.
95. SIBSON, R. (1969), "Information Radius", Z. Wahrs. und Verw. Geb., 14, 149-160.
96. STAM, A.J. (1959), "Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon", Inform. and Contr., 2, 102-112.
97. TANEJA, I.J. (1976), "On Measures of Information and Inaccuracy", J. Statist. Phys., 14, 203-270.
98. TANEJA, I.J. (1977), "On a Characterization of Unified Measure of Information Theory", Inform. Sci., 13, 229-237.
99. TANEJA, I.J. (1979), "Some Contributions to Information Theory-I: On Measures of Information (A Survey)", J. Comb. Inform. and Syst. Sci., 4, 253-274.
100. TANEJA, I.J. (1983), "On a Characterization of J-Divergence and Its Generalizations", J. Comb. Inform. and Syst. Sci., 8, 206-212.
101. TANEJA, I.J. (1984a), "Lower Bounds on Exponentiated Mean Codeword Length for the Best 1:1 Code", Matemática Aplicada e Comput., 3, 199-204.
102. TANEJA, I.J. (1984b), "On Characterization of Generalized Information Measures", J. Comb. Inform. and Syst. Sci., 9, 169-174.
103. TANEJA, I.J. (1987), "Statistical Aspects of Divergence Measures", J. Stat. Plann. and Inferen., 16, 137-145.
104. TANEJA, I.J. (1988), "Bivariate Measures of Type and Their Applications", Tamkang J. Math., 19(3), 63-74.
105. TANEJA, I.J. (1989), "On Generalized Information Measures and Their Applications", Chapter in: Adv. Electronics and Electron Physics, Ed. P.W. Hawkes, 76, 327-413.
106. TANEJA, I.J. (1990a), "On Generalized Entropies with Applications", Chapter in: Lectures in Appl. Math. and Inform., Ed. L.M. Ricciardi, Manchester University Press, 107-169.
107. TANEJA, I.J. (1990b), "Bounds on the Probability of Error in Terms of Generalized Information Radius", Inform. Sci., 51.
108. TANEJA, I.J. (1995), "New Developments in Generalized Information Measures", Chapter in: Advances in Imaging and Electron Physics, Ed. P.W. Hawkes, 91, 37-135.
109. TANEJA, I.J., L. PARDO, D. MORALES and M.L. MENENDEZ (1989), "On Generalized Information and Divergence Measures and Their Applications: A Brief Review", 13, 47-73.
110. TOUSSAINT, G.T. (1978), "Probability of Error, Expected Divergence, and the Affinity of Several Distributions", IEEE Trans. on Syst. Man. and Cybern., SMC-8, 482-485.
111. TRIBUS, M. (1969), "Rational Descriptions, Decisions and Designs", Pergamon, New York.
112. TROUBORST, P.M., E. BAKER, D.E. BOEKEE and Y. BOXMA (1974), "New Families of Probabilistic Distance Measures", Proc. 2nd Intern. Joint Conf. on Pattern Recognition, Copenhagen, Denmark.
113. VAJDA, I. (1968), "Bounds on the Minimal Error Probability on Checking a Finite or Countable Number of Hypotheses", Inform. Trans. Problems, 4, 9-19.
114. VAJDA, I. (1989), "Theory of Statistical Inference and Information", Kluwer Academic Press, London.
115. VAN DER LUBBE, J.C.A. (1978), "On Certain Coding Theorems for the Information of Order α and of Type β", Proc. 8th Prague Conf., 253-266.
116. VAN DER LUBBE, J.C.A., Y. BOXMA and D.E. BOEKEE (1984), "A Generalized Class of Certainty and Information Measures", Information Sciences, 32, 187-215.
117. VAN DER LUBBE, J.C.A., D.E. BOEKEE and Y. BOXMA (1987), "Bivariate Certainty and Information Measures", Information Sciences, 41, 139-169.
118. VAN DER PYL, T. (1977), "Propriétés de l'Information d'Ordre α et de Type β", Colloq. Intern. du C.N.R.S., No. 276, Théorie de l'Information, Cachan, France, 4-8 July, 161-171.
119. VARMA, R.S. (1966), "Generalizations of Rényi's Entropy of Order α", J. Math. Sci., 1, 34-48.
120. WIENER, N. (1948), "Cybernetics", The MIT Press and Wiley, New York.
121. WYNER, A.D. (1972), "An Upper Bound on the Entropy Series", Inform. and Contr., 20, 176-181.