This article is or was the subject of a Wiki Education Foundation-supported course assignment. Further details are available on the course page. Assigned student editor(s): KentView.
This is a good article. Don't ruin it with a narrow-minded definition of uncertainty. Heisenberg's principle has little or nothing to do with the general subject. When one can analyze errors one does not have real uncertainty! Dave 21:48, 22 December 2005 (UTC)
There is no definition of uncertainty on this page. What is it? Also, should have some stuff about the Electron Microscope and Heisenberg, no?
This page needs work. It needs to start with a straightforward definition.
Here's a potential definition (STSS): "Uncertainty is the inability to predict the outcomes of future events given what is known. This reflects “humans' poor ability to predict technoscientific and technological futures.” The level of uncertainty is determined by how complex a situation is; many variables affect the statistical chances involved. In the context of innovation, uncertainty is used to describe or quantify the instances in which a technology can have adverse effects on human health, the environment, and social dynamics.
The difference from unintended consequences is that uncertainty usually concerns a predicted, negative and unwanted outcome of a certain action (technology), whereas an unintended consequence is an unpredicted outcome of an action arising from the interaction of technology and society."
[Please leave any critiques.]
Removal of evaluation of uncertainty
I removed the following two paragraphs:
- Evaluating the degree of uncertainty, measuring it, and compensating for it is a basic process which is part of many activities and processes.
"Basic process" is my nemesis.
- As a situation is evaluated and more and more data is gathered and thought is taken on a matter, hypotheses are made, experiments are performed, and gradually the degree of uncertainty is reduced as conclusions are drawn, but never wholly eliminated, except in certain special circumstances as where knowledge is defined by definition.
This is one possible view, but I'm sure some epistemologists would disagree. I don't think this is a universally acceptable statement.
--Ryguasu 01:04 Dec 3, 2002 (UTC)
Uncertainty in physics
How is uncertainty defined/used in Physics? What about error analysis? --Aug 21, 2005
Decomposition of uncertainty by Peter Fisher
Here's an analysis of the concept I picked up (hopefully correctly) from a lecture by Peter Fisher - I'm sure it's in one of his published papers. I'd like to store it somewhere and this seemed like a good place, but maybe it should be discussed a bit first.
Uncertainty can be divided into well-defined and poorly defined parts. The well-defined uncertainty can be analysed with probability theory. The poorly defined can be divided into vagueness and ambiguity. Vagueness can be analysed with fuzzy set theory. Ambiguity can be divided into non specificity and discord. Non specificity can be analysed with possibility theory and discord can be analysed with ontologies.
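The decomposition above forms a small tree, which can be sketched as a nested structure. A minimal illustration in Python (the key names and the helper function are my own, not Fisher's):

```python
# Fisher's decomposition of uncertainty, as described above, with the
# suggested analysis formalism attached to each leaf (illustrative only).
uncertainty = {
    "well_defined": {"formalism": "probability theory"},
    "poorly_defined": {
        "vagueness": {"formalism": "fuzzy set theory"},
        "ambiguity": {
            "non_specificity": {"formalism": "possibility theory"},
            "discord": {"formalism": "ontologies"},
        },
    },
}

def formalisms(tree):
    """Collect the analysis formalism attached to every leaf of the tree."""
    out = []
    for value in tree.values():
        if "formalism" in value:
            out.append(value["formalism"])
        else:
            out.extend(formalisms(value))
    return out
```

This makes the structure of the decomposition explicit: each kind of uncertainty bottoms out in exactly one proposed analysis formalism.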
--Ari Jolma Feb 1, 2007
Let's not reinvent the issue
Douglas Hubbard just wrote the book "How to Measure Anything: Finding the Value of Intangibles in Business", published by John Wiley & Sons. He has a pretty complete discussion of every definition of uncertainty and even addresses the quote by Knight. Uncertainty is simply the lack of complete certainty. As he states it, uncertainty is a state of knowledge where a 100% correct assessment of a state (past, present or future) is not possible. This is consistent with all the main schools of thought about uncertainty, including probability theory, the decision sciences, statistics, information theory and physics.
Contrary to the claims of the person at the top of this discussion page, it does not confuse the topic to show that these uses of the term are quite consistent. Nor, as the same person states, does it mean that if we just removed error, we would have no uncertainty. In modern physics uncertainty is not just a function of error in observation but an intrinsic and irreducible aspect of reality - at least at the subatomic level.
Hubbard also clarifies what Knight found unclear: the difference between uncertainty and risk. In the pragmatic sense of people who do risk analysis all the time, risk is simply uncertainty about a state of affairs where at least one possible state involves a loss (something undesirable). To take it further, Hubbard defines the measurement of uncertainty as the assignment of probabilities to the set of possible states such that p(union of all states) = 1, p(intersection of mutually exclusive states) = 0, and p(each state) > 0. Finally, the measurement of risk is simply a measurement of uncertainty together with an assignment of a loss (or a pdf of losses) for each of the uncertain states. Many states may have a loss = 0, but if no state has a loss > 0 then there is, by definition, no risk. Knight's quote notwithstanding, this actually is the practical use of the term in the risk-measurement industries and professions.
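These two definitions can be sketched directly in code. A minimal illustration in Python, assuming Hubbard's definitions as paraphrased above (the function names and the example states are mine):

```python
# Sketch of the definitions discussed above (illustrative only).
# A "measurement of uncertainty": probabilities over mutually exclusive,
# exhaustive states, each > 0, summing to 1.
def is_uncertainty_measurement(probs):
    # probs: dict mapping each possible state to its probability
    return all(p > 0 for p in probs.values()) and abs(sum(probs.values()) - 1.0) < 1e-9

# "Risk": a measurement of uncertainty where at least one possible state
# carries a positive loss.
def has_risk(probs, losses):
    return is_uncertainty_measurement(probs) and any(losses.get(s, 0) > 0 for s in probs)

# Hypothetical example: uncertainty about a flight, with losses attached.
probs = {"on_time": 0.7, "late": 0.25, "cancelled": 0.05}
losses = {"on_time": 0, "late": 100, "cancelled": 1000}
```

Note how the sketch makes the asymmetry explicit: the same probability assignment is a measurement of uncertainty regardless of losses, but it only constitutes risk once some state has loss > 0.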
If you are wondering why a book with that title would talk so much about uncertainty, it is because Hubbard defines measurement as observations that reduce uncertainty expressed as a quantity. This is also the de facto use of the term "measurement" by all of the empirical sciences, even if they don't explicitly define it that way. All empirical measurements in any peer-reviewed journal must report the error of a measurement, and the error is presumed not to be zero. Even though such measurements don't (usually) eliminate uncertainty, they still count as measurements because there is less uncertainty than there was before (this is also consistent with information theory).
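The information-theoretic reading of "reducing uncertainty without eliminating it" can be made concrete with Shannon entropy. A minimal sketch in Python (the distributions are invented for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before any observation: four equally likely states -> 2 bits of uncertainty.
before = entropy([0.25, 0.25, 0.25, 0.25])

# After an observation that rules nothing out but shifts belief toward
# one state: entropy drops below 2 bits but stays above zero.
after = entropy([0.7, 0.1, 0.1, 0.1])

# after < before: uncertainty was reduced, not eliminated, so in the
# sense described above the observation still counts as a measurement.
```

The point is exactly the one in the paragraph above: the observation counts as a measurement because the remaining uncertainty is smaller, not because it is zero.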
BillGosset 00:23, 20 June 2007 (UTC)
Analyzing uncertainty does not mean you have no "real" uncertainty
Dave said "When one can analyze errors one does not have real uncertainty!". This is not correct. First, simply analyzing errors is not the same as removing errors. Much of statistics is about quantifying the error you have remaining after a set of observations, even if you don't reduce it further, much less eliminate it. When you compute your 90% confidence interval based on a sample, you have literally analyzed uncertainty but you have not removed it. Furthermore, "real" uncertainty still exists even when you analyze error. It's considered classical, not modern, physics to presume that the only uncertainty is from error in observation. In quantum mechanics, uncertainty is a basic property of particles and is not just a function of observation error. BillGosset 00:29, 20 June 2007 (UTC)
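The confidence-interval point can be shown in a few lines. A minimal sketch in Python using the standard library and the normal approximation (the sample data are invented for illustration):

```python
# Quantifying (not removing) uncertainty: a 90% confidence interval
# for the mean of a sample, via the normal approximation.
import statistics

sample = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 10.4]
mean = statistics.mean(sample)

# Standard error of the mean: sample stdev / sqrt(n).
sem = statistics.stdev(sample) / len(sample) ** 0.5

z90 = 1.645  # two-sided 90% z-score
lower, upper = mean - z90 * sem, mean + z90 * sem

# The interval has nonzero width: the error has been analysed and
# quantified, but uncertainty about the true mean remains.
```

The interval is precisely an analysis of the remaining error; its nonzero width is the "real" uncertainty that the analysis leaves in place.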
Remove David Wilkinson reference
Wilkinson is not a notable source on this topic. His book was not based on any prior research in the mathematical understanding of uncertainty and risk. He was trained in education management for law enforcement and later became a management consultant. Wilkinson is simply confused about, or probably unaware of, the mathematically well-defined meanings of these terms. He makes no attempt to reconcile his contradictory definitions with the proper scientific and mathematical use of the terms; he is simply unaware of these established definitions. He is a layman on this topic and should not be cited. Hubbardaie 11:40, 21 June 2007 (UTC)
Punctuation and usage errors
The big table in the middle of the article says "epistemological" where it should say "epistemic", and the hyphens are missing in "knowledge-guided", "rule-guided", and "intuition-guided". Michael Hardy (talk) 00:23, 28 February 2009 (UTC)
This article is all over the place
It ought to be refocused or split into multiple articles. All I was looking for was the equation for "uncertainty of the mean" in physics.
The article says that
"Uncertainty is a term used in subtly different ways in a number of fields, including philosophy, physics, statistics, economics, finance, insurance, psychology, sociology, engineering, and information science."
Remove the entire "Uncertainty and the media" section
Although there is some merit in observing that the public perception of risk is not always rational, the slant given to the "Uncertainty and the media" section shows a significant bias within a hot-topic issue, namely global warming. It presents the view that global warming is 'settled science', which is an oxymoron.
The public may tend to view flying on a commercial airliner as being more risky than driving the same trip in an automobile. The available data do not tend to support that view.
How should the public, and elected officials, evaluate the threat from something that could potentially be catastrophic, like the Earth being hit by a large asteroid, but in any given century is still an unlikely event? How many resources should be dedicated to detecting and diverting such a threat?
There are topics that could be included in this section that do not introduce something as contentious as global warming.
And yes, as an engineer I would never present to the world models that had not been shown to have predictive capability for new data that was not used in making the models, and then recommend drastic and expensive changes in public policy. I would also, as a scientist, never defend my position by saying "it is the consensus opinion of the experts...". That is a very LAME position to take, on ANYTHING, and is an indication that the evidence is not sufficient in and of itself to stand up to scrutiny.
So why use such a contentious topic for explaining something like "Uncertainty and the media"? There are less contentious topics that could be used to more effectively make the same points.
Error in example in Measurements section
I believe that the part
also written 10.5(0.5) and 10.50(5)
should be changed to
also written 10.50(5) and 10.500(5)
to be consistent with the later discussion. It would also be clearer to list the parenthesis notation separately after each example, to make the association clearer. For example,
Thus it is understood that 10.5 means 10.5±0.05, also written 10.50(5), and 10.50 means 10.50±0.005, also written 10.500(5).
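The convention in that corrected sentence can be sketched as a small helper: the digit(s) in parentheses give the uncertainty in units of the last displayed decimal place. A minimal illustration in Python (the function is hypothetical, written only to demonstrate the notation):

```python
# Hypothetical helper illustrating the concise parenthesis notation
# discussed above: the parenthesised digits express the uncertainty
# in units of the last displayed decimal place.
def concise(value, uncertainty, decimals):
    last_place = 10 ** -decimals          # e.g. decimals=2 -> 0.01
    digits = round(uncertainty / last_place)
    return f"{value:.{decimals}f}({digits})"
```

So `concise(10.5, 0.05, 2)` renders 10.5±0.05 as "10.50(5)", and `concise(10.5, 0.005, 3)` renders 10.50±0.005 as "10.500(5)", matching the corrected example above.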
Definition of risk
This article gives a poor (and wrong) definition of risk. Risks can also have a positive outcome. When I play the lottery, no matter how unlikely it is, there is a risk that I actually win and become rich. This risk is due to stochastic uncertainty. This should be covered here. There is a widespread misconception that risks are necessarily negative potential outcomes. — Preceding unsigned comment added by 18.104.22.168 (talk) 11:35, 26 August 2016 (UTC)
'Measurement uncertainty' section duplicates separate 'measurement uncertainty' page
The wikipedia page Measurement uncertainty includes all the detail in the 'measurements' section on this page. That's a bit wasteful of space but more importantly, it adds to the likelihood of inconsistency and duplication and (because measurement uncertainty in metrology is an active research field) leaves the present page with quite an incomplete description. I suggest reducing the 'measurements' section to a single brief paragraph stating that there is a well-established special case in metrology/measurement science and pointing to the measurement uncertainty page. — Preceding unsigned comment added by Slrellison (talk • contribs) 12:14, 31 July 2017 (UTC)
Suggestion
The Measurements section explains the concise notation for stating measurement uncertainties in parentheses. However, there is no example of what to do when the error value crosses the decimal point. So when I have 12.3±4.5, what is the correct expression: 12.3(45) or rather 12.3(4.5)? --Geek3 (talk) 18:11, 17 December 2018 (UTC)