
" Maximum Entropy, Information Without Probability and Complex Fractals "


Document Type : BL
Record Number : 579200
Doc. No : b408419
Main Entry : Jumarie, Guy.
Title & Author : Maximum Entropy, Information Without Probability and Complex Fractals : Classical and Quantum Approach / by Guy Jumarie.
Publication Statement : Dordrecht : Springer Netherlands : Imprint: Springer, 2000.
Series Statement : Fundamental Theories of Physics, An International Book Series on The Fundamental Theories of Physics: Their Clarification, Development and Application ; 112
ISBN : 9789401594967
ISBN : 9789048154678
Contents : 1. Introduction -- 2. Summary of Information Theory -- 3. Path Entropies of Non Random Functions -- 4. Path Entropies of Random Functions and of Non-Random Distributed Functions -- 5. Quantum Entropies of Non-Probabilistic Square Matrices -- 6. Complex-Valued Fractional Brownian Motion of Order n. Part I -- 7. Complex-Valued Fractional Brownian Motion of Order n. Part II -- 8. Information Thermodynamics and Complex-Valued Fractional Brownian Motion of Order n -- 9. Fractals, Path Entropy, and Fractional Fokker-Planck Equation -- 10. Outline of Applications.
Abstract : "Every thought is a throw of dice." (Stéphane Mallarmé.) This book is the last of a trilogy reporting part of our research work over nearly thirty years (we set aside our non-conventional results in automatic control theory and applications on the one hand, and fuzzy sets on the other). Its main keywords are Information Theory, Entropy, Maximum Entropy Principle, Linguistics, Thermodynamics, Quantum Mechanics, Fractals, Fractional Brownian Motion, Stochastic Differential Equations of Order n, Stochastic Optimal Control, and Computer Vision. Our obsession has always been the same: Shannon's information theory should play a basic role in the foundations of the sciences, provided it is suitably generalized so that it can address problems which are not necessarily related to communication engineering. With this objective in mind, two questions are of utmost importance: (i) How can we introduce the meaning or significance of information into Shannon's information theory? (ii) How can we define and/or measure the amount of information involved in a form or a pattern without using a probabilistic scheme? Suitable answers to these questions are needed if we want to apply Shannon's theory to science with some chance of success. For instance, its use in biology has been very disappointing, for the very reason that the meaning of information is of basic importance there, yet is not captured by this approach.
Subject : Computer science.
Subject : Coding theory.
Subject : Mathematics.
Subject : Distribution (Probability theory).
Added Entry : SpringerLink (Online service)