SCIENCE

Good science (IMHO) should include at least the following:
1. Ask questions (impertinent if necessary)
2. Think critically (especially regarding assumptions)
3. Tell the truth (all that is relevant even if somewhat unpalatable)

Who does science? Well, everybody does..

So why is science today falling into disrepute? 'Don't know, don't care' T-shirts etc..

1. Scientific reports can be bought
2. Scientific reports can be ordered (political agendas)
3. Scientists can 'weld' everything to a pet theory (world view)
4. Scientists tend to have big egos which contribute to a reluctance to admit error
5. Scientific institutions (even the Nobel Prize Committee), like any other organisation such as companies and political parties, by definition embody 'self-preservation'; as such they have an 'image' to protect.. and they do so vigorously, even at the expense of hiding uncomfortable truths.

All of the above of course turns up here! What I mean is that even wide publication, peer review and mainstream scientific community acceptance do not guarantee that good science is always at the top of the agenda..
Vh mby (talk) 00:54, 9 November 2010 (UTC)

TRUTH

So what is meant by 'truth'? Simply "stuff we know well enough to use with confidence"; it is, in the end, what constitutes verified knowledge. Does that mean it is 'absolutely true', or is there any room for 'uncertainty'? For us mere mortals, no, and there is always some uncertainty. The concept of 'absolute' truth, while not very useful, is indicated by the nature of truth itself, ie it is exclusive of all other proposals.. we cannot have that level of certainty because we can never say we know everything. However knowledge, or our truth, does exist, because it is how we got from the Wright Flyer to the Saturn V, or from the Sopwith Camel to Concorde.. Engineering (applied science) is based on knowledge which is considered to be true; it is also an essential component of law, ethics, business, community life, family life, information systems, communications etc.. So how may we separate or distinguish this 'truth' from theory, conjecture or speculation?

Let me informally define a FACT as a VERIFIED OBSERVATION.

We now classify observations as either verified or unverified. Unverified contrary observations may be held in tension without destroying our idea (theory); however it takes only one verified contrary observation to FALSIFY our theory. To establish truth we obviously need the verified observations (facts) to outweigh the unverified ones. However even the process of verification can be somewhat 'subjective'.

So now we must include and deal with the problem of UNCERTAINTY. It relates both to the observed facts and to the truth we want to establish.. only then can we say we have a working definition of truth. When a number of individuals with relevant training and/or experience agree on the facts surrounding an issue, collectively act on or use that knowledge, and this is repeated over a period of time by different individuals, we usually conclude we have the truth. However, in keeping with the well established UNCERTAINTY PRINCIPLE, a small uncertainty always remains. I suggest we overcome this very small uncertainty with.. FAITH! It is simply the best expression we have for what we all do when we need to TAKE ACTION based on our perception of what is TRUE. The safety of bungee jumping or aerobatic flight is soundly based on accepted truth about materials, processes, training etc.. but let me suggest they both require the exercise of a considerable degree of faith. So our 'formula' for truth becomes:

         TRUTH = FACTS + FAITH   (FAITH > 0 and FACTS >> 0)

Without facts your truth is just belief; without faith your truth must be absolute, which is impossible. Between these two extremes is a sliding scale of (% facts) vs (% faith), but both are always present.

Truth as defined thus has some interesting characteristics:
1. Since everyone does 'science', anyone may discover 'truth', and so we may conclude that no one actually owns or has exclusive rights to truth.
2. When we have found a truth in these terms we may conclude that all contrary proposals must be considered false until at least one verified contrary fact is observed. This is the concept of falsification, which is necessary for good science.
3. There is no valid alternative to the search for truth and the process of verification or falsification.
4. The search for and process of establishing truth is independent of the subject matter and therefore applies to any area of knowledge.. The only requirement is that verifiable observations (facts) can be established. Even the distinction between natural and supernatural has no bearing on this process.

Vh mby Gyroman (talk) 02:32, 5 November 2016 (UTC) Gyroman (talk) 13:11, 1 June 2018 (UTC)

INFORMATION

Ever since the discovery of the structure of DNA we have known that life may be characterized as 'information rich'. However, like the term complexity, 'information' of the type found in DNA is not very well defined, which may have something to do with the fact that 'life' itself is not very well defined either. It is a crucial definition for any 'information rich' type of complexity, which by the way also includes man-made objects. The main focus of the literature on information is the quality of the signal, not its meaning (see Shannon). Physicists and cosmologists refer to information in the universe, meaning the event history of every particle in it. This however is not 'signal' or 'semantic' information of the type found in DNA and human design. This distinction is the basis of the SETI program, which postulates that only intelligence can be the source of semantic information. Intelligence is associated with life, and we know that intelligent life like ourselves is distinguished by the objects we make and the language we use in communication. In fact the very tools required to make intergalactic transmissions must themselves be specified by semantic information, so we may conclude that there is no difference between the semantic information in an intelligently designed object and that found in DNA, which is also a specification for a living creature.
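
To make the point about Shannon's theory concrete, here is a minimal Python sketch (my own illustration, not taken from any source cited here). It computes the Shannon entropy of a meaningful sentence and of a random shuffle of the same characters, and gets the same number, because the measure depends only on symbol frequencies and is blind to meaning:

    import math
    import random
    from collections import Counter

    def shannon_entropy(text):
        """Average bits per symbol, computed from symbol frequencies only."""
        counts = Counter(text)
        n = len(text)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    message = "THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG"
    scrambled = "".join(random.sample(message, len(message)))  # same symbols, no meaning

    print(shannon_entropy(message))    # some value, e.g. ~3.8 bits/symbol
    print(shannon_entropy(scrambled))  # identical value: frequencies are unchanged

The two values are identical because shuffling preserves the symbol frequencies; whatever meaning the first string carries is invisible to the measure.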

This implies a single definition of semantic information should suffice for all occurrences of it. First, let's describe semantic information correctly..

By observation, Semantic Information is a NON-REPEATING, NON-RANDOM, ORDERED SET with MEANING and PURPOSE

Like trying to define life, semantic information has some universal characteristics which are helpful in closing in on a definition.
1. It is expressed in symbolic codes which come from a finite alphabet
2. The codes are grouped according to a grammatical language
3. There is a method of writing
4. There is a method of reading
5. It has a meaning and a purpose
[Ref: Dr Werner Gitt, 'In the Beginning was Information']
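
As a toy illustration only (the alphabet, codes and 'meanings' below are hypothetical, loosely styled on DNA codons, and not taken from Gitt), the five characteristics can be modelled in a few lines of Python:

    # Toy model of the five characteristics listed above (illustrative only).
    ALPHABET = {"A", "C", "G", "T"}                # 1. a finite alphabet of symbols
    CODE_LENGTH = 3                                # 2. grammar: fixed-length symbol groups
    MEANINGS = {"ATG": "start", "TAA": "stop"}     # 5. an agreed meaning for each code

    def write(words):
        """3. A method of writing: encode meanings as symbol groups."""
        lookup = {meaning: code for code, meaning in MEANINGS.items()}
        return "".join(lookup[w] for w in words)

    def read(signal):
        """4. A method of reading: decode symbol groups back to meanings."""
        assert set(signal) <= ALPHABET, "symbol outside the alphabet"
        groups = [signal[i:i + CODE_LENGTH] for i in range(0, len(signal), CODE_LENGTH)]
        return [MEANINGS.get(g, "?") for g in groups]

    print(read(write(["start", "stop"])))  # ['start', 'stop']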

All the above means that semantic information is fundamentally always a communication between minds. We may note it is possible for a communication to have no meaning (in which case it is not information); however all communication has a purpose. Meaning and purpose are not quantifiable but are binary attributes of information (ie they are present or not).

Since methods for writing and reading imply physical structures with a purpose, they must also be specified by semantic information! This means the proper definition of 'semantic information' as a communication implies it is recursive.. it cannot exist without prior semantic information, which supports an intelligent designer of life, presenting a whole bag full of problems for both the naturalist and the atheist.

Vh mby (talk) 02:44, 16 April 2009 (UTC) Gyroman (talk) 09:35, 16 August 2020 (UTC)

COMPLEXITY

Although this is a very commonly used word it is amazingly poorly defined, and an indicator of a much bigger problem with another word, 'entropy'. Scientifically speaking it is an essential word for a true comparison of states of matter consistent with the 'entropy' of those states. A definition of complexity derived from the absolute entropy of a state of matter is universal, giving the required independence from all the specific details of the system (atoms/molecules/objects, assembly method, purpose etc).

The process of scientific discovery (seeking the truth, ie knowledge) about a subject often requires classifying and arranging things or concepts according to logical patterns or rules. When a basic rule is obtained from the data it is possible to predict characteristics of missing elements or project to unknown data, which guides future searches; examples abound in physics, chemistry, cosmology, biology etc. When we say something is more complex than something else we are really saying it is more rare or difficult to achieve. As such a general definition of complexity is pivotal to the investigation of any postulate involving a natural origin or rise in complexity (ie evolution). The following are of particular relevance:

1. The formation of stars and galaxies from cosmic gas (H & He) and dust
2. The abiogenesis of life from non-living matter.
3. The continued evolution of living species by natural selection.

All these postulate a reduction in entropy and a corresponding increase in complexity, and so depend upon an unambiguous definition of this term. In particular, whether or not something is living should make no difference. Because the entropy of a system is the quantitative measure of the decay of order in a system over time, it must relate to the reduction in complexity. While one effect is the dissipation of energy and the evening out of properties like temperature and pressure, its statistical basis, from Ludwig Boltzmann, reveals that absolute entropy is based on the probability that a particular arrangement may exist.

Improbability is the inverse of probability; for equally likely arrangements it is simply the total number of possible arrangements a system can take. The significance of the Boltzmann equation, in which entropy is proportional to the natural logarithm of the number of microstates (the number of possible arrangements), is that it automatically includes logical improbability as well as physical improbability: not just the number of physical arrangements a system can take but, if you identify each particle, all the possible permutations within each physical arrangement. Broadly, physical arrangements are called macrostates while logical arrangements are called microstates.
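
For reference, the standard statement of that relation, with k Boltzmann's constant and W the number of microstates (possible arrangements), is:

         S = k ln W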

To illustrate, imagine tossing ten numbered child's building bricks into a 10 x 10 grid tray, and assume they are constrained to align with the grid by a funnel. The probability they will end up in a neat row across the centre is small, so let's talk about its inverse, or improbability: this is simply the total number of possible arrangements, of which this is just one. Ignoring which brick is which, it is given by the familiar nCr, the number of combinations of 10 cells from 100, about 1.73e13. This represents the physical improbability of this state: it gives the average number of throws per occurrence of this arrangement (exactly) if you throw the bricks indefinitely. Now examine the improbability of the numbered bricks falling in readable (upright) order from 0 to 9. This last state is a logical improbability: ordering multiplies the count by 10! to give nPr = 6.28e19, grid rotation by 4^10 = 1.05e6 (4 orientations per brick) and face choice by 6^10 = 6.05e7 (6 faces per brick), a total of about 3.98e33 for just 10 bricks; note the logical improbability multiplies with the physical improbability to produce the total improbability. Now replace our bricks with atoms and our tray with molecular arrangements, and it is clear the Boltzmann equation for the entropy of a system at a statistical level provides a direct measure of 'complexity' applicable to any physical arrangement of matter, and we may conclude that:
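
As a cross-check, a short Python sketch (plain standard library, using the counts just described) reproduces the arithmetic:

    from math import comb, perm, factorial

    CELLS, BRICKS = 100, 10

    physical = comb(CELLS, BRICKS)          # unordered placements, ~1.73e13
    ordered = perm(CELLS, BRICKS)           # numbered bricks in order, ~6.28e19
    assert ordered == physical * factorial(BRICKS)  # ordering contributes the 10!

    orientations = 4 ** BRICKS              # 4 grid rotations per brick, ~1.05e6
    faces = 6 ** BRICKS                     # 6 faces per brick, ~6.05e7
    total = ordered * orientations * faces  # total improbability, ~3.98e33

    print(f"{physical:.3e}  {ordered:.3e}  {total:.3e}")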

COMPLEXITY = IMPROBABILITY

Vh mby (talk) 00:53, 9 November 2010 (UTC) Gyroman (talk) 12:06, 17 September 2016 (UTC), Gyroman (talk) 10:30, 12 February 2018 (UTC)

ENTROPY

This is at the very centre of the biggest problem in science today; it is the reason for the foregoing and pertinent to what follows. Even a cursory examination of the articles and talk pages on Entropy, Introduction to Entropy and Entropy (disambiguation) is enough to appreciate there IS a problem. It is not because we do not know the answer. It really only depends on whether you are doing the kind of science that searches for truth, or your science is guided by philosophical naturalism.

The proper definition of entropy comes from the work of Ludwig Boltzmann and is well known and well documented. However there is a rather large group of individuals who object to this, presumably on the grounds of their philosophy. From the Cambridge Encyclopedia of Technology 2010 we get (1) "Entropy: A measure of disorder", but according to others, like Emeritus Professor of Chemistry Frank Lambert, "entropy is not disorder", and there you have the problem.

You may see lots of attempts to show 'order' to be a subjective/analogous, purely human judgment which is not rigorous or measurable, and so not scientifically valid. Well, let me assure you the writers of the Cambridge Encyclopedia were well aware of that requirement. And from Boltzmann we know an ordered state is properly and most simply just an improbable state..

Equilibrium is the state of maximum entropy: the state where molecular behaviour has become evened out throughout a system. Every molecule in a system (defined by a boundary) has a momentum and a position, and the set of all possible values of these constitutes the microstates of the system. You must understand that just as it is possible (but very unlikely) that all the energy will be concentrated in one atom while all the others are stationary, it is just as unlikely that all the molecules will be packed into, say, one tenth of the volume, leaving a vacuum in the rest. Both these states, viewed as an instantaneous snapshot, are highly improbable microstates of the system.

Now it must be stated that all possible microstates have equal probability of occurrence, just as all hands of cards dealt from a random deck are equally probable. What makes such states highly improbable is that they are only one, or one of a small group, out of the total number of possible states. It is the specification of these groups, called macrostates, that determines the entropy, and why entropy is often referred to as "the probability of a state of matter existing". This also gives rise to the fact that a highly improbable state is also a low entropy state. The number of microstates differs between macrostates, and this is what creates a probability distribution of macrostates: the most likely macrostate simply has the greatest number of microstates and the least likely has the smallest. This all means that, of all the possible microstates a system may take, it will by simple probability tend to be found in the macrostate with the highest number of microstates.
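
A minimal sketch of that last point, using the standard textbook toy model of N particles each independently in the left or right half of a box (an illustration only, not a claim about any particular system):

    from math import comb

    N = 100  # particles, each independently in the left or right half

    # Number of microstates in the macrostate "k particles on the left".
    all_left = comb(N, 0)         # 1 microstate: everything concentrated
    even_split = comb(N, N // 2)  # ~1.01e29 microstates: evened out

    total = 2 ** N  # all microstates, each equally probable
    print(f"P(all left)    = {all_left / total:.3e}")    # ~7.9e-31
    print(f"P(50/50 split) = {even_split / total:.3e}")  # ~8.0e-2

The evened-out macrostate is overwhelmingly more probable simply because it contains vastly more microstates; that is the equilibrium described above.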

Rudolf Clausius derived an equation for the change in entropy due to heat transfer, which is not to be confused with Boltzmann's equation for the entropy of a system as described above. Clausius' equation is only a special application, derivable from the Boltzmann equation by assuming the system is in equilibrium. This turns out not to be too much of a restriction, and it allows the close estimation of entropy change in real systems, unlike the Boltzmann equation.
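
For reference, the standard form of Clausius' relation, for heat dQ transferred reversibly at absolute temperature T, is:

         dS = dQ/T

compared with Boltzmann's S = k ln W above.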

There is a terrible confusion between entropy and information which is most aptly resolved in an article from the Encyclopedia Britannica[1] as follows:

"The similarity of the theory of heat to the theory of information also is striking in many other ways. The second law of thermodynamics states that entropy always increases in any spontaneous change; in the limit, entropy remains constant if the change takes place reversibly; and it never decreases spontaneously. Similarly, information always decreases as the result of being communicated; in the limit it remains constant as the communication becomes perfect; ie when no randomness such as electrical noise is introduced in the act of communication; but information never increases as the result of communication. Thus entropy and information are strictly isomorphic quantities, though differing in sign, the first increasing and the second decreasing when randomness occurs."

Randomness obviously includes random mutation of information encoded in DNA during reproduction. The statement is thus most pertinent to the theory of evolution, as reproduction is just one form of communication.

Gyroman (talk) 00:10, 9 January 2017 (UTC) Gyroman (talk) 10:44, 16 August 2020 (UTC)

FAITH & RELIGION edit

Faith simply means what we exhibit whenever we act with incomplete knowledge. Since no one can claim complete knowledge about anything, we must all exercise this kind of faith in all we do; so even Richard Dawkins may be said to have faith in these terms! This 'rational' faith is therefore based on the concept of truth as expressed above: it is simply the acceptance of the minimum uncertainty which it is either not possible, uneconomic or unnecessary to try to eliminate. The constraint on rational faith, that it is employed whenever we have decided to act, ie when necessary, implies that it is essential to our livelihood and life concerns (even if it is only to relax in a deckchair and trust it will not collapse).
Religion on the other hand is largely cultural, based on beliefs (without necessarily concerning oneself about the truth or otherwise of those beliefs) and traditions. It does play its part in the fabric of most societies and is used by many individuals to fill a need for, let's say, telling their story about their experience of the deep mysteries of life, etc. No need to go on here. The main point of course is that religion as such is not science (as defined above), and religious followers, whatever their persuasion, have no mandate or right or even need to teach their beliefs as 'truth'. People of religious persuasion do have a right to hold their beliefs; however we all have an obligation to speak the truth, particularly to those who are dependent on us or on our position (be it parent, teacher or scientist). This is a moral imperative, or ethical issue, completely independent of our religious beliefs.

In reality we all need truth (above definition: verified knowledge), we all exercise faith (rational), but religion is (or should be) optional.
Gyroman (talk) 13:06, 1 June 2018 (UTC) Vh mby (talk) 01:09, 9 November 2010 (UTC) Gyroman (talk) 10:13, 12 February 2018 (UTC)

  1. ^ Encyclopedia Britannica, 1965, "Heat"