Entropy Theory and its Application in Environmental and Water Engineering (eBook)
John Wiley & Sons (publisher)
978-1-118-42860-3 (ISBN)
Entropy Theory and its Application in Environmental and Water Engineering responds to the need for a book that presents the basic concepts of entropy theory from a hydrologic and water engineering perspective and then applies these concepts to a range of water engineering problems. The range of applications of entropy is constantly expanding, and new areas finding a use for the theory are continually emerging. The applications of its concepts and techniques vary across subject areas, and this book aims to relate them directly to practical problems of environmental and water engineering.
The book presents and explains the Principle of Maximum Entropy (POME) and the Principle of Minimum Cross Entropy (POMCE) and their applications to different types of probability distributions. Spatial and inverse spatial entropy are important for urban planning and are presented with clarity. Maximum entropy spectral analysis and minimum cross entropy spectral analysis are powerful techniques for addressing a variety of problems faced by environmental and water scientists and engineers and are described here with illustrative examples.
Giving a thorough introduction to the use of entropy to measure the unpredictability in environmental and water systems, this book will add an essential statistical method to the toolkit of postgraduates, researchers and academic hydrologists, water resource managers, environmental scientists and engineers. It will also offer a valuable resource for professionals in the same areas, governmental organizations and private companies, as well as for students in earth sciences, civil and agricultural engineering, and agricultural and rangeland sciences.
This book:
- Provides a thorough introduction to entropy for beginners and more experienced users
- Uses numerous examples to illustrate the applications of the theoretical principles
- Allows the reader to apply entropy theory to the solution of practical problems
- Assumes minimal existing mathematical knowledge
- Discusses the theory and its various aspects in both univariate and bivariate cases
- Covers newly expanding areas including neural networks from an entropy perspective and future developments.
Vijay P. Singh, Texas A & M University, USA
Chapter 2
Entropy Theory
Since the development of informational entropy by Shannon in 1948, the literature on entropy has grown by leaps and bounds, and it is almost impossible to provide a comprehensive treatment of all facets of entropy under one cover. Thermodynamics, statistical mechanics, and informational statistics lay the foundation for what we now know as entropy theory. Soofi (1994) perhaps best summed up the main pillars in the evolution of entropy for quantifying information. Using a pyramid, he summarized information-theoretic statistics as shown in Figure 2.1, wherein the informational entropy developed by Shannon (1948) occupies the vertex. The base of the pyramid represents three distinct extensions of the Shannon entropy, each a variant of quantifying information: discrimination information (Kullback, 1959), mutual information (Lindley, 1956, 1961), and the principle of maximum entropy (POME) or information (Jaynes, 1957, 1968, 1982). The lateral faces of the pyramid are represented by three planes: 1) the SKJ (Shannon-Kullback-Jaynes) minimum discrimination information plane, 2) the SLK (Shannon-Lindley-Kullback) mutual information plane, and 3) the SLJ (Shannon-Lindley-Jaynes) Bayesian information theory plane. Most information-based contributions can be located on one of the faces or in the interior of the pyramid. The discussion of what we call entropy theory in this chapter touches on aspects of all three faces, though not exhaustively.
Figure 2.1 Pyramid showing information-theoretic statistics.
Entropy theory may be said to comprise four parts: 1) Shannon entropy, 2) the principle of maximum entropy, 3) the principle of minimum cross entropy, and 4) the concentration theorem. The first three are the main parts and are the most frequently used. One can also employ the Tsallis entropy or another type of entropy in place of the Shannon entropy for some problems. Before discussing the four parts, it will be instructive to amplify the formulation of entropy presented in Chapter 1.
2.1 Formulation of entropy
In order to explain entropy, consider a random variable X which can take on N equally likely values. For example, if a six-faced die is thrown, any face bearing the number 1, 2, 3, 4, 5, or 6 has an equal chance of appearing. It is now assumed that a certain value of X (the face of the die that turned up) is known to only one person. Another person would like to determine the outcome of the throw by asking that person questions which can be answered only with yes or no. The number of alternatives for a face to turn up in this case is six, that is, N = 6. It can be shown that the minimum number of questions needed to ascertain the true outcome is

$I = \log_2 N = -\log_2 (1/N)$   (2.1)

where I represents the amount of information required to determine the value of X, 1/N defines the probability of finding the unknown value of X by asking a single question when all outcomes are equally likely, and log is the logarithm to base 2. If nothing else is known about the variable, then it must be assumed that all values are equally likely, in accordance with the principle of insufficient reason.
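As a rough illustration, equation (2.1) can be evaluated directly; the following is a minimal Python sketch, not from the book, and the function name is invented for this example:

```python
import math

def information_bits(n_outcomes: int) -> float:
    """Bits needed to single out one of N equally likely outcomes: I = log2(N)."""
    return math.log2(n_outcomes)

print(information_bits(6))    # six-faced die           -> ~2.585 bits
print(information_bits(4))    # two coin tosses (2**2)  -> 2.0 bits
print(information_bits(100))  # 100 lottery tickets     -> ~6.644 bits
```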
In general,

$I = -\log_2 p_i = \log_2 (1/p_i)$   (2.2)

where $p_i$ is the probability of outcome i, i = 1, 2, …, N. Here I can be viewed as the minimum amount of information required to positively ascertain the outcome of X upon a throw. Stated another way, it is the amount of information gained after observing the event X = x with probability 1/N. In other words, I is a measure of information and is a function of N. The base of the logarithm is 2 because the questions posed (i.e., questions admitting only yes or no answers) are binary. The point to keep in mind when asking questions is to gain information, not assent or dissent; hence in many cases a yes is as good an answer as a no. This information measure, equation (2.2), satisfies the following properties:
If a six-faced die is thrown, any face bearing the number 1, 2, 3, 4, 5, or 6 has an equal chance of appearing. The outcome of the first throw is the number 5, which is known only to person A. How many questions does one need to ask this person, or how much information is required, to positively ascertain the outcome of this throw?
Solution:
In this case, N = 6. Therefore, $I = \log_2 6 = 2.585$ bits. This gives the minimum amount of information needed, or the minimum number of questions to be asked in binary form (i.e., yes or no). The number of questions needed is a measure of uncertainty. The questioning can go like this: Is the outcome between 1 and 3? If the answer is no, then it must be between 4 and 6. The second question can then be: Is it between 4 and 5? If the answer is yes, the next question is: Is it 4? The answer is no, so the outcome has to be 5. In this manner entropy provides an efficient way of obtaining the answer. In vigilance, investigative, or police work, entropy can provide an effective way of structuring interrogation. Another example of interest is a lottery.
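The interval-halving questioning described above can be sketched in a few lines of Python. This is an illustrative sketch, not from the book; `guess_by_halving` is a hypothetical helper:

```python
import math

def guess_by_halving(answer: int, low: int, high: int):
    """Identify `answer` in the range [low, high] with yes/no questions that
    halve the remaining candidates each time; returns the questions asked."""
    questions = []
    while low < high:
        mid = (low + high) // 2
        questions.append(f"Is the outcome between {low} and {mid}?")
        if answer <= mid:        # answer "yes": keep the lower half
            high = mid
        else:                    # answer "no": keep the upper half
            low = mid + 1
    return questions

qs = guess_by_halving(5, 1, 6)   # the outcome of the die throw is 5
for q in qs:
    print(q)
print("Questions asked:", len(qs), "; lower bound log2(6) =", round(math.log2(6), 3))
```

For the die the sketch asks three questions, consistent with rounding 2.585 up to the next whole question.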
Suppose that N tickets are sold for a lottery; the winning ticket must be one of these, that is, the number of possibilities is N. Let N be 100, with each ticket bearing a number between 1 and 100. One person, called Jack, knows which ticket is the winning one. The other person, called Mike, would like to identify the winning ticket by asking Jack a series of questions whose answers are in the form of yes or no. Find the winning ticket.
Solution:
The number of binary questions needed to determine the winning ticket is given by equation (2.1). For N = 100, $I = \log_2 100 = 6.64$, so at most seven yes/no questions are required. To illustrate this point, the questioning might go as shown in Table 2.1, which shows how much information is gained simply by asking binary questions. The best strategy for questioning is to split the remaining candidates in half with each question; a short sketch after Table 2.1 verifies that seven questions always suffice. This halving is similar to the bisection method used in numerical analysis to locate the root of a function.
Table 2.1 Questioning for finding the winning lottery ticket.
| Question number | Question asked | Answer |
| 1 | Is the winning ticket between 50 and 100? | No |
| 2 | Is it between 25 and 49? | No |
| 3 | Is it between 1 and 12? | Yes |
| 4 | Is it between 7 and 12? | Yes |
| 5 | Is it between 7 and 9? | Yes |
| 6 | Is it between 7 and 8? | Yes |
| 7 | Is it 7? | No |
The answer then is: 8.
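A small simulation (an illustrative Python sketch, not from the book; the function name is invented here) confirms that the halving strategy of Table 2.1 never needs more than seven questions for 100 tickets, in line with $\log_2 100 \approx 6.64$:

```python
import math

def questions_needed(winner: int, low: int = 1, high: int = 100) -> int:
    """Count the yes/no questions a halving strategy needs to pin down the winner."""
    count = 0
    while low < high:
        mid = (low + high) // 2
        count += 1
        if winner <= mid:
            high = mid
        else:
            low = mid + 1
    return count

worst = max(questions_needed(t) for t in range(1, 101))
print("log2(100) =", round(math.log2(100), 2))         # 6.64
print("worst-case questions for 100 tickets:", worst)  # 7, i.e., 6.64 rounded up
```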
Consider another case where two coins are thrown and a person knows the outcome. There are four alternatives for how heads or tails can appear on the first and second coins: head and tail, head and head, tail and tail, and tail and head; that is, the number of alternatives is N = 2² = 4. The number of questions to be asked in order to ascertain the outcome is again given by equation (2.1): $\log_2 4 = \log_2 2^2 = 2$.
Consider that the probability of rain on any day of a given week is the same. In that week it rained on a certain day, and one person knows which day it rained. Another person would like to determine that day by asking questions of the person who knows the answer. The answers are to be given in binary form, that is, yes or no. What is the minimum number of questions to be asked in order to determine the day on which it rained?
Solution:
In this case, N = 7. Therefore, the minimum number of questions to be asked is $I = -\log_2 (1/7) = \log_2 7 = 2.807$; in practice, at most three yes/no questions are needed.
In the above discussion the base of the logarithm is 2 because of the binary nature of the answers, and the questioning is done such that the number of alternatives is reduced by half each time a question is asked. If the possible responses to a question are three rather than two, then with n questions one can cover $3^n$ possibilities. For example, an answer to a question about the weather may be hot, cold, or pleasant. Similarly, for a crop, farmers may respond bumper, medium, or poor. In such cases, the number of questions needed to cover N cases is given as

$I = \log_3 N$   (2.3)
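Equation (2.3) generalizes in the obvious way: if each question admits k possible answers, then n questions distinguish $k^n$ cases, so roughly $\log_k N$ questions are needed. The following minimal Python sketch is an illustration of this generalization, not taken from the book:

```python
import math

def questions_for(n_cases: int, n_answers: int) -> int:
    """Whole number of questions with `n_answers` possible replies needed to
    distinguish `n_cases` equally likely alternatives: ceil(log_k(N))."""
    return math.ceil(math.log(n_cases, n_answers))

print(questions_for(6, 2))     # die, yes/no answers         -> 3
print(questions_for(7, 2))     # day of the week, yes/no     -> 3
print(questions_for(100, 2))   # lottery, yes/no answers     -> 7
print(questions_for(100, 3))   # lottery, three-way answers  -> 5 (3**5 = 243 >= 100)
```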
| Published | 10 January 2013 |
|---|---|
| Language | English |
| ISBN-10 | 1-118-42860-9 / 1118428609 |
| ISBN-13 | 978-1-118-42860-3 / 9781118428603 |