Wikipedia:Reference desk/Archives/Science/2010 February 5

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


February 5

How will Augmented Reality glasses work?

Glasses or Contact Lenses where images are electronically displayed on the surface: how will it be possible for the wearer to focus on what's being displayed when the image is less than an inch from the eye? I cannot make out the individual hairs and specks of dust on the surface of my eyeglasses: all I see is a blur. --※Cōdell 23:48, 4 February 2010 (UTC)[reply]

A head up display projects a picture onto the screen eyepiece focussed at infinity (collimated); I've no idea how they'd get it to work if the AR was just an OLED embedded in the glasses (I rather suspect you're right). The projection thing is okay if it's augmenting things that are at optical infinity (as is the case for pilots looking out of their aircraft), but if someone was proposing to superimpose AR objects on my (real) desk, that'd be an issue too. -- Finlay McWalterTalk 00:51, 5 February 2010 (UTC)[reply]
Ah, the AR article pointed me at virtual retinal display, where the image is lasered right into your eye. That article isn't super-clear about how it handles the current focal length of your eye. -- Finlay McWalterTalk 01:06, 5 February 2010 (UTC)[reply]
I'd just like to point out that I think that particular "concept" image is totally unencyclopedic. Basically it's just a (crude) Photoshop job that someone did on their blog. It has no bearing on how actual devices might look or work. It should not be on Wikipedia. There are, I am sure, plenty of alternatives out there for illustrating how this technology does work, or how actual engineers/scientists imagine it would work. (On top of that, the "concept" image is a horrible idea from both a design and practical standpoint—a huge, clunky phone type thing that sits over your face, requires both ears, and you can awkwardly dial from the reverse side of it by poking yourself in the eye... imagine actually trying to use that, under even ideal conditions. Now compare that with a Bluetooth earpiece+regular phone interface that you keep in your pocket when you aren't dialing.) --Mr.98 (talk) 01:17, 5 February 2010 (UTC)[reply]
It's also worth mentioning that, for either a collimated or a VRD-type AR system, other people looking at you wouldn't be able to see what you're seeing (which that mockup seems to show). -- Finlay McWalterTalk 01:24, 5 February 2010 (UTC)[reply]
Correct. That mockup is just Hollywood junk. I worked on VR helmets a long time ago. When I started, the top-of-the-line helmet weighed a lot and had screens set far away from the eyes. So, you had to practically bolt the helmet to the back of your head to keep it from sliding off. The reason for the design is that the human eye will not focus well on small detail that is closer than a foot or so from a person's face. A perfect VR helmet needs the image to be about an inch from the eye. So, I built a very small lens system to make the image appear to be two feet away and much larger than it really was. Going back to the mockup - the lady would just notice a red blur in front of her face. It wouldn't be anything useful. -- kainaw 02:11, 5 February 2010 (UTC)[reply]
And it also makes you look like you have a thousand-mile stare. ~AH1(TCU) 02:08, 5 February 2010 (UTC)[reply]
The professional ones that people actually USE for virtual reality work are generally huge - and would cost you about the same as a new Ferrari! Take a look at this beauty: http://www.link.com/img/AHMD_F16.jpg (which I used to work on). The large curved plastic gizmos are semi-silvered mirrors - in a flight simulator, they let you see your real hands and real instrument panels - but by draping the area around the cockpit with black velvet, you prevent other light sources from interfering with the graphics. (The graphics system uses the head tracker to estimate which parts of the virtual world it's generating would be occluded by the physical cockpit out here in the real world - and blacks those bits out so you can't "see through the airplane".) Those mirrors are that large so you get a wide field of view - they use exotic coatings to avoid reflections from other things around you. You'll notice that lens built into the side of the helmet - that's a holographic lens that collimates the light so it appears to be coming from infinitely far away - behind that is a video projector that pumps out a bright, better-than-HDTV image. The lenses have to be adjusted to the individual wearer. The biggest problem is keeping it lightweight and balanced so it doesn't hurt your neck. Because pilots each have their own custom-fitted helmet, this gizmo fits over the top of their regular flight helmet. There is a head-tracker built into the back of the helmet. It also comes in a version where the mirrors are fully silvered so you only see the virtual world. The resulting image quality is stunningly good. (...or you could buy a Ferrari...a nice shiny red one!) One interesting reason why the lenses have to be so far from your face is that you get really weird psychological issues when you can't see your own nose! SteveBaker (talk) 02:17, 5 February 2010 (UTC)[reply]

Good. So concept art like the image I found is not feasible then? So how would these AR contact lenses work (or not)? Virtual retinal display isn't used, I think. --※Cōdell 03:27, 5 February 2010 (UTC)[reply]

The other method for doing augmented reality is not to have a see-through screen, but an opaque helmet and a pair of cameras. I've been experimenting with this recently. This actually has the advantage of making lag less noticeable in some situations, because the real world (as seen through the cameras) will lag by the same amount as the simulated world. However, this approach has many other drawbacks. You need surprisingly high-quality cameras (expensive ones) to give you enough visual quality as you move your head around, even in a situation with well-controlled lighting. If you try to replace your eyes with off-the-shelf consumer web cams (even the slightly pricey HD ones), you'll just make yourself sick. Even with the nicest cameras we were able to reasonably get our hands on, it means you take a serious loss of visual acuity when you put the helmet on.
While this approach isn't nearly as pricey as the military rig that Steve links to above, it's not exactly cheap either. The only headset we've found under five grand that isn't laughably poor (optical) quality is the eMagin z800, which will 'only' set you back $1500. But it's still only 800x600x2 and I'm not at all impressed by the (build) quality of the units. If you don't think 800x600x2 is good enough resolution then you'll really need to open your wallet - assuming you keep tens of thousands of dollars in your wallet, that is. (Fun fact: 800x600 is not good enough resolution to comfortably read a book, and I don't think you'd pass a DMV eye exam either.) APL (talk) 06:28, 5 February 2010 (UTC)[reply]
Using cameras just to avoid lag is kinda silly - you're just adding more lag to the background. The trick for avoiding lag is to know precisely how much lag you have and to build fancy extrapolation algorithms that know about the dynamics of the human head - how fast it accelerates and decelerates. Then you can predict where your head will be when the graphics will eventually be displayed and calculate the appropriate view direction when you start rendering it. This worked great on the L3 system. The only time there would theoretically be noticeable latency would be when you accelerated your head really fast (e.g. because you needed to jerk around suddenly to see something) - but in those cases, your eyes tend to lead the head motion, and fast eye motion causes a 'saccade' - during which your brain shuts off the 'video feed' from your eyes until the scene settles down again - and by then, our extrapolation algorithm had caught up. Net result - no noticeable lag. SteveBaker (talk) 14:18, 5 February 2010 (UTC)[reply]
Oh, I agree with everything you just said. I was trying to put a positive spin on things. Actively compensating for lag (if it works) and using the lowest-latency equipment and software you can get your hands on will always be better than merely hiding lag.
The real advantage of an AR system with cameras is that you're more flexible with what you do with the real-world background. But this comes at a serious cost in visual quality. APL (talk) 15:57, 5 February 2010 (UTC)[reply]
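For anyone curious what the predictive extrapolation Steve describes might look like, here is a minimal sketch. It assumes a simple constant-acceleration model, a made-up 50 ms render latency, and invented function names; it only illustrates the idea, and is not the algorithm used on the L3 system or any real headset.

```python
# Minimal sketch of latency-compensating head-pose prediction (illustrative only;
# the constant-acceleration model and the numbers below are assumptions).

def predict_yaw(samples, latency):
    """Extrapolate head yaw 'latency' seconds ahead from timestamped samples.

    samples: list of (time, yaw) tuples, most recent last, at least three entries.
    """
    (t0, y0), (t1, y1), (t2, y2) = samples[-3:]
    v1 = (y1 - y0) / (t1 - t0)          # angular velocity over the older interval
    v2 = (y2 - y1) / (t2 - t1)          # angular velocity over the newer interval
    a = (v2 - v1) / ((t2 - t0) / 2.0)   # rough angular acceleration estimate
    # Assume roughly constant acceleration over the (short) latency window.
    return y2 + v2 * latency + 0.5 * a * latency ** 2

# Example: head sweeping right but decelerating, 50 ms total render latency.
history = [(0.000, 10.0), (0.010, 11.0), (0.020, 11.8)]
print(predict_yaw(history, 0.050))   # render using this yaw, not the stale 11.8
```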

Just why are these systems so expensive? I'd have thought I'd be using one on my PC by now? Trevor Loughlin (talk) 05:42, 5 February 2010 (UTC)[reply]

Small markets. The number of people who want to put on a helmet the size of a football helmet and wander around playing with simulated objects is low. On top of that, the technology intrinsically isn't cheap. (Putting a high-resolution display in an area the size of a postage stamp isn't easy.)
Worse, not only are the markets small, they're made up of people willing to shell out big bucks. The z800 I mention above was launched at $500. When they realized that their biggest customers were all defense contractors they promptly tripled the price!
So, you've got a difficult-to-manufacture piece of technology that is needed by only a small number of people, most of whom have very deep pockets. APL (talk) 06:41, 5 February 2010 (UTC)[reply]
Yep. And just take a look at the photo I posted a link to...it even LOOKS expensive. The projectors were custom-made, the lenses were super-fancy lightweight holographic lenses that cost a small fortune. The curved plastic mirrors have all kinds of exotic coatings on them (and incidentally, those coatings are destroyed and have to be replaced if someone touches them without gloves on). The low-latency head-tracker is pretty fancy technology. These things are inherently expensive - and are way too fragile for mass-market use. That means they only get short production runs (I doubt that L3 sells more than a couple of dozen a year) and without economies of scale, they aren't going to get cheap. The only thing that is getting better is that PCs are now fast enough to drive them. However, the cost and danger of flying a real F16 in these kinds of training scenarios dwarfs the price of the simulator - so even at these outrageous prices, it's a bargain for the military. SteveBaker (talk) 14:18, 5 February 2010 (UTC)[reply]

Actually, what that silly image shows is more of a HUD than actual Augmented Reality. HUDs are not ridiculously expensive, especially if you're willing to settle for a low-resolution "red eye" display. Heck, you could probably make a usable one out of a pocket TV and an R-Zone. APL (talk) 06:41, 5 February 2010 (UTC)[reply]

This article gives a critical view of the "LED contact lens" idea. The developer claims "If the pixel [the microLED] is close enough to the micro-lens, it will generate a virtual image that could be 30cm or more away from the surface. Our eyes can focus on this now." but quotes a critic: "There would have to be some projection technology for it to appear at a distance that you could then focus on. I can't really see them generating a projection through a contact lens." My own view (pun intended) is that such a lens would need to carry a dense array of pixels that can be modulated in both brightness and phase to give an illusion of a scene at a greater distance. We are unable to do that at present. Cuddlyable3 (talk) 18:39, 5 February 2010 (UTC)[reply]
Thank you, that was helpful. --✚Cōdell 23:45, 6 February 2010 (UTC)[reply]
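As a rough sanity check of the developer's claim quoted above, the ordinary thin-lens equation shows how a pixel sitting just inside the focal length of a micro-lens would produce a distant virtual image. The focal length and pixel spacing below are invented round numbers, not figures from any actual device.

```python
# Thin-lens sanity check of the "virtual image at 30 cm from a contact lens"
# claim. The focal length and pixel distance are assumed, illustrative values.

def image_distance(f_mm, object_mm):
    """Thin-lens equation 1/do + 1/di = 1/f; a negative di means a virtual image."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_mm)

f = 1.0            # micro-lens focal length in mm (assumed)
pixel = 0.9967     # micro-LED placed just inside the focal length, in mm (assumed)
di = image_distance(f, pixel)
print(f"virtual image at {abs(di):.0f} mm")   # ~300 mm, i.e. roughly 30 cm away
```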

Fingernails?

We all get little white spots on our fingernails occasionally. Is there any official term for them? Nyttend (talk) 00:29, 5 February 2010 (UTC)[reply]

Uncle Cecil wrote in 1990 that they're called "punctate leukonychia", which he described as "medicalese for white spots." Comet Tuttle (talk) 00:53, 5 February 2010 (UTC)[reply]
Don't forget that if they're unexplained you can tag them as idiopathic punctate leukonychia.... TenOfAllTrades(talk) 03:47, 5 February 2010 (UTC)[reply]
And if you've had them for a long time they're chronic idiopathic punctate leukonychia. Richard Avery (talk) 15:30, 5 February 2010 (UTC)[reply]
And if they aren't causing you any medical problems, then they are benign chronic idiopathic punctate leukonychia. --Jayron32 18:22, 5 February 2010 (UTC)[reply]
And depending on where they are - on fingers or toes, on one hand/foot or the other and at the base or tip of the nail, they could also be either Anterior, Posterior, Dorsal, Ventral, Lateral, Medial, Proximal or Distal...benign chronic idiopathic punctate leukonychia. SteveBaker (talk) 23:01, 5 February 2010 (UTC)[reply]

Mustard pain

Why does eating too much hot mustard, such as English Mustard, cause pain inside the nose? And why does this pain only last two or three seconds and not longer? 78.146.193.0 (talk) 02:10, 5 February 2010 (UTC)[reply]

Allyl isothiocyanate is a potent lachrymatory agent. As to why the action is only brief - I do not know, but it may well depend on dose and personal sensitivity. --Dr Dima (talk) 02:22, 5 February 2010 (UTC)[reply]

steam

PC game related, moved to the Computer Desk --220.101.28.25 (talk) 06:03, 5 February 2010 (UTC)

three low pressure systems near each other? how do they interact?

As of the time of writing, I'm referring to the weather.com map -- I don't know where else to get such a convenient summary on one map, and am afraid to save a copy of my own for fear of copyright issues. Firstly, I'm wondering if low pressure systems have a tendency to merge -- or maybe repel or attract each other depending on the circumstances (like gravitational bodies)?

Also, where the heck is all this moisture coming from? Isn't it the middle of winter? Isn't hurricane season over? Also, I know it turns to snow very sharply near the mountain regions. This interests me because Charlottesville usually doesn't get a lot of snow at all because of the same mountain range effect (right?), but now we're getting 28 inches, which must be shattering records left and right; certainly nothing like 50 inches in one season (ignoring smaller snowfalls, we had a 22-inch blizzard last Christmas). John Riemann Soong (talk) 03:41, 5 February 2010 (UTC)[reply]

Well, I think what you're looking for on how low pressure systems interact is Fujiwhara effect...as for moisture, it can come from the most unlikely sources...when summer thunderstorms form in Alberta and Saskatchewan, the moisture to feed those thunderstorms is being transported from the Gulf of Mexico, over 1,500 miles away...it just depends on the atmospheric setup as to where the moisture originates. Ks0stm (TCG) 03:59, 5 February 2010 (UTC)[reply]


Yes but how is the Gulf warm enough to support such a massive storm with so much energy? I mean, half the country is essentially going to get 2 feet of snow... that's a lot of water.... John Riemann Soong (talk) 16:02, 5 February 2010 (UTC)[reply]
It might be as little as 2" of water. That's still quite a lot - but not unprecedented. SteveBaker (talk) 03:15, 6 February 2010 (UTC)[reply]
I've been tracking the progression of this winter's weather, and I've saved images of sea surface temperatures from Weather Underground since early December. Don't be afraid to save the images on your computer, as the copyright only applies to using the images elsewhere. Take a look also at Winter storms of 2009–2010 and Winter of 2009-2010 in Europe. Many storms have drifted over the North Atlantic, in the absence of the North Atlantic Gyre, and underwent frontal cyclogenesis. There was an earlier discussion here about this. As more storms such as the North American Blizzard of 2010 head out into the open Atlantic ocean, this will serve to "nudge" the existing lows in the North Atlantic, spinning more of the Gulf Stream in different directions. By the way, the earlier storm you described was the 2009 Christmas winter storm. Some of these storms are so large that they're simultaneously drawing water from the Pacific and Atlantic oceans. Leaves me wondering what the 2010 Atlantic hurricane season will be like. ~AH1(TCU) 22:00, 7 February 2010 (UTC)[reply]

Evolution

  Resolved

What's that type of evolution called when two very similar looking species have evolved completely separately in different locations? —Preceding unsigned comment added by 82.43.89.14 (talk) 09:40, 5 February 2010 (UTC)[reply]

http://en.wikipedia.org/wiki/Convergent_evolution —Preceding unsigned comment added by 157.193.173.205 (talk) 09:53, 5 February 2010 (UTC)[reply]

Thank you!

Find the average velocity

Between two train stations a train travels the first 1/n-th of the distance with uniform acceleration, then with uniform speed v, and for the last 1/n-th of the distance with uniform deceleration. What is the average velocity in terms of v and n?

--220.253.218.157 (talk) 11:11, 5 February 2010 (UTC)[reply]

  Please do your own homework.
Welcome to Wikipedia. Your question appears to be a homework question. I apologize if this is a misinterpretation, but it is our aim here not to do people's homework for them, but to merely aid them in doing it themselves. Letting someone else do your homework does not help you learn nearly as much as doing it yourself. Please attempt to solve the problem or answer the question yourself first. If you need help with a specific part of your homework, feel free to tell us where you are stuck and ask for help. If you need help grasping the concept of a problem, by all means let us know. --Tagishsimon (talk) 11:13, 5 February 2010 (UTC)[reply]
OK, please help me. My understanding is that average velocity is displacement divided by time, but I don't see how to find either displacement or time from the question. Perhaps you could help me with that?--220.253.218.157 (talk) 11:25, 5 February 2010 (UTC)[reply]
Yes, average velocity is displacement/time. As the train is always moving in the same direction, the displacement is the distance travelled by the train - let's call this d. Let's divide the time up into three parts. There is the time t1 taken to travel a distance d/n while accelerating uniformly from rest to velocity v - note that the average velocity during this part of the journey is v/2. Then there is the time t2 taken to travel a distance (n-2)d/n at a constant velocity v. There is the time t3 taken to travel a distance d/n while decelerating uniformly from velocity v to rest - this is the same as t1. Work out t1, t2 and t3, add them together to find the total time for the journey t, then average velocity is d/t. Gandalf61 (talk) 11:44, 5 February 2010 (UTC)[reply]
so t1 and t3 are 2d/(nv). t2 is (n-2)d/(nv), so the total time is (n+2)d/(nv). d/t is therefore d / [(n+2)d/(nv)] = nvd/[(n+2)d] = nv/(n+2). Is this the correct answer?
Great answer, but where do you get that expression? It is valid, but it simplifies to (d/n)/(v/2), which intuitively is the displacement divided by the average velocity (assuming constant acceleration). Why the more complicated form? 58.147.58.179 (talk) 15:50, 5 February 2010 (UTC)[reply]
Having gotten the answer, you can now look for the intuitive meaning of your result. Had the train traveled at a constant velocity v, each of the n segments of the trip would have taken the same time period. The trip as presented in the problem takes the same time period for each of the n-2 segments in the middle, but each end segment takes twice as long (because of constant acceleration between 0 and v), so a total of 2 + (n-2) + 2 = n+2 time periods are required, as opposed to the constant velocity v trip of n of those same time periods. Thus the average velocity of the trip as stated is n/(n+2) as fast, or nv/(n+2).58.147.58.179 (talk) 16:05, 5 February 2010 (UTC)[reply]
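A quick numerical check of the result above, following the same breakdown of the trip into an accelerating 1/n segment, a cruise over (n-2)/n of the distance, and a mirror-image decelerating 1/n segment (the function name and test values below are arbitrary):

```python
# Numerical check that the average velocity works out to n*v/(n+2).

def average_velocity(v, n, d=1.0):
    t_accel = (d / n) / (v / 2.0)        # first 1/n of the distance, mean speed v/2
    t_cruise = ((n - 2) * d / n) / v     # middle (n-2)/n of the distance at speed v
    t_decel = t_accel                    # last 1/n of the distance, by symmetry
    return d / (t_accel + t_cruise + t_decel)

v, n = 20.0, 5
print(average_velocity(v, n))        # 14.2857...
print(n * v / (n + 2))               # 14.2857... -- matches n*v/(n+2)
```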

Potassium supplements

I notice at the drugstore that all potassium supplements (pills, capsules, etc.) have only 3% of the daily value of potassium. From this, I infer that taking all 100% of my daily value of potassium at once would be dangerous somehow.....? Can someone explain this to me? What is the point of these supplements if they provide so little of the nutrient? --The Fat Man Who Never Came Back (talk) 15:12, 5 February 2010 (UTC)[reply]

Er, sorry about that. I guess I should have read the article first. Still, 3% seems awfully low. Also, I'd like to know more about the injuries that large concentrations of potassium could cause.--The Fat Man Who Never Came Back (talk) 15:18, 5 February 2010 (UTC)[reply]
(EC) I suspect it's to prevent potassium toxicity. High serum potassium will cause problems with muscular contraction, particularly of the heart. It is important that the heart, being a rather important muscle, beats normally. -- Flyguy649 talk 15:20, 5 February 2010 (UTC)[reply]
Potassium in biology says "Although hyperkalemia is rare in healthy individuals, oral doses greater than 18 grams taken at one time in individuals not accustomed to high intakes can lead to hyperkalemia. All supplements sold in the U.S. contain no more than 99 mg of potassium; a healthy individual would need to consume more than 180 such pills to experience severe health risks". Gandalf61 (talk) 15:27, 5 February 2010 (UTC)[reply]

Potassium chloride is used for lethal injection. Potassium-sodium exchange is an important part of biological function. See Na+/K+-ATPase. At resting potential, potassium is "supposed" to stay within the cell and remain at low levels outside the cell. Sodium is supposed to remain at low levels inside the cell and remain at high levels outside the cell.

Well ... if you ingest lots of potassium (without giving it a chance to be taken into cells gradually while being exchanged for Na+), you'll suddenly cause lots of potassium ions to flood into your body's cells without Na+ having a chance to leave those cells, i.e. the membrane potential of the cell goes up. Effectively, your entire body depolarises.

Since sodium is supposed to stay at high levels outside the cell, ingesting lots of Na+ (Cl-) is less of a problem... the cause of death there is usually chronic, via kidney failure.

John Riemann Soong (talk) 22:17, 5 February 2010 (UTC)[reply]
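One way to see the depolarising effect described above is the Nernst equation for the K+ equilibrium potential, which moves toward zero (less negative) as extracellular potassium rises. The concentrations below are illustrative textbook-style round numbers, not measurements:

```python
import math

# Nernst potential for K+ at body temperature: E = (RT / zF) * ln([K]out / [K]in).
# Concentration values (mM) are illustrative only.

R, T, F, z = 8.314, 310.0, 96485.0, 1

def nernst_mV(c_out, c_in):
    return 1000.0 * (R * T) / (z * F) * math.log(c_out / c_in)

k_in = 140.0                       # intracellular K+ stays roughly fixed
print(nernst_mV(4.0, k_in))        # ~ -95 mV with normal extracellular K+
print(nernst_mV(8.0, k_in))        # ~ -76 mV when extracellular K+ doubles
```

Since the resting membrane potential sits close to the K+ equilibrium potential, raising extracellular K+ drags it toward less negative values, i.e. toward depolarisation.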

For that little potassium you might as well eat a few more bananas. 67.243.7.245 (talk) 23:30, 5 February 2010 (UTC)[reply]
Bananas make me fat.--The Fat Man Who Never Came Back (talk) 02:56, 6 February 2010 (UTC)[reply]
Really? Bananas are not really that high-caloric, compared to their potential to make you feel full. I generally have no reservations about allowing myself a banana when I'm hungry.
Of course if you eat them Elvis Presley-style, that's another matter entirely.... --Trovatore (talk) 03:03, 6 February 2010 (UTC)[reply]
If your adiposity is normal and we are supplying approximately your daily K needs by IV feeding, we would give you about 2 mEq per kg per day. The reason your pills have so little is that almost no one needs K supplementation, because it is ubiquitous in both animal and plant foods (unless you are living off minerals). The only people who need it are those who have unnatural losses, like those who take a diuretic. On the other hand, if your kidneys do not work, excess K cannot be excreted and will be among the first things that kill you if you do not have access to dialysis. Eat food and stay away from the supplement shelves. alteripse (talk) 08:42, 6 February 2010 (UTC)[reply]

Hawking radiation

Hawking radiation, according to the article, appears to be radiation which exits the event horizon and does not fall back in. Would not such a phenomenon require some form of replacement energy, perhaps from radiation which had already fallen in and been absorbed? 71.100.13.180 (talk) 15:42, 5 February 2010 (UTC)[reply]

Assuming that you mean that there must be an energy source for the radiation, then the answer is that it comes from the mass which the black hole loses. JamesBWatson (talk) 15:50, 5 February 2010 (UTC)[reply]
You mean rather than from the mass it gains? 71.100.13.180 (talk) 18:34, 5 February 2010 (UTC)[reply]
The mass of a black hole doesn't have to stay constant. It's continuously leaking away some of its mass through Hawking radiation. On the other hand, usually there's also new mass that's getting sucked into the black hole. If the amount of mass lost from Hawking radiation exceeds the amount being sucked up, then the black hole will get smaller and eventually disappear. If the amount of mass being sucked up is greater then the black hole will grow. The two effects don't have to balance out. Rckrone (talk) 21:29, 5 February 2010 (UTC)[reply]
Yes, but if you add water to a boiling pot of water, and the added water is not itself at a temperature high enough to boil, then the pot of water not only gains mass but is cooled to the point that less of the water in the pot is converted to water vapor, because of the reduced temperature of the water in the pot. I.e., more incoming mass of lower temperature --> less outgoing radiation; more incoming mass of equal or higher temperature --> more outgoing radiation. Right? 71.100.13.180 (talk) 23:18, 5 February 2010 (UTC)[reply]
No, not right. The intensity of the Hawking radiation depends only on the total mass of the black hole, not on the temperature of the mass from which it was formed. 68.56.16.165 (talk) 23:48, 5 February 2010 (UTC)[reply]
"is" or "was" formed? 71.100.160.128 (talk) 00:10, 6 February 2010 (UTC)[reply]
If more matter is added, the total mass will change and so will the radiation intensity (the black hole's radius, temperature and entropy will all change accordingly). 68.56.16.165 (talk) 00:44, 6 February 2010 (UTC)[reply]
Interestingly, the Hawking radiation goes down as the mass of the black hole goes up - or in other terms, it goes up as the mass of the black hole goes down. Unless something fancy quantum-mechanical happens that we don't yet understand, the last few kilos will go off like the biggest boom Dr. Strangelove or any Mad Scientist could wish for. On the other hand, for stellar-sized black holes (and larger) the incoming cosmic microwave background is larger than the outgoing Hawking radiation, so it will take a few billion years of dilution of the universe before "normal" black holes have a chance to shrink at all. --Stephan Schulz (talk) 08:39, 6 February 2010 (UTC)[reply]
Yes, black holes have negative heat capacity and warm up as they radiate energy away. Just keep in mind that negative heat capacity is the rule - not the exception - for gravitationally bound objects that have virialized, whenever non-gravitational interaction between their components can be neglected. For instance, interstellar clouds also warm up as they radiate energy away. Dauto (talk) 15:16, 6 February 2010 (UTC)[reply]
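To put rough numbers on the claims above, the standard Hawking temperature formula T = ħc³/(8πGMk_B) shows how cold a stellar-mass black hole is compared with the 2.7 K microwave background, and how hot a very small one would be. The masses chosen below are just examples:

```python
import math

# Hawking temperature T = hbar * c**3 / (8 * pi * G * M * k_B): it falls as the
# mass grows, so a stellar-mass hole is far colder than the 2.7 K microwave
# background and currently absorbs more energy than it radiates.

hbar, c, G, k_B = 1.0546e-34, 2.998e8, 6.674e-11, 1.381e-23
M_sun = 1.989e30

def hawking_temperature(mass_kg):
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

print(hawking_temperature(M_sun))    # ~6e-8 K, vastly colder than the CMB
print(hawking_temperature(1.0))      # ~1.2e23 K for a 1 kg hole (the final "boom")
```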

Supposing a radiating black hole was surrounded by a Dyson sphere to stop energy being sucked in, but with some means of extracting the energy coming out. Would this defeat the Second Law of Thermodynamics? Trevor Loughlin (talk) 15:14, 9 February 2010 (UTC)[reply]

The Dyson sphere itself also has a temperature and emits radiation.

If I step on a Mormon cricket....

Will it go to the Celestial Kingdom?--The Fat Man Who Never Came Back (talk) 15:46, 5 February 2010 (UTC)[reply]

That depends if the number of angels that can dance on the head of a pin is a prime number or not. Googlemeister (talk) 15:49, 5 February 2010 (UTC)[reply]
The Common Quaker remains silent on this matter.--Shantavira|feed me 17:26, 5 February 2010 (UTC)[reply]
Meanwhile, the Great Mormon hies to Kolob in a twinkling of an eye.--The Fat Man Who Never Came Back (talk) 17:40, 5 February 2010 (UTC)[reply]

Wikipedia has an article about the Mormon cricket which is neither a Mormon nor a cricket. In the unlikely event that you step on a Mormon Cricket match they would likely worship you as a pre-Columbian giant. Cuddlyable3 (talk) 17:38, 5 February 2010 (UTC)[reply]

QUESTION-can magnetron be used as an amplifier

This question was put up in class by my professor. From what I have read so far, and all the information I could gather, the magnetron is an oscillating device which produces electrons according to the resonant frequency. Is this the sole reason that it can't be used as an amplifier, or is there any other reason? —Preceding unsigned comment added by 61.1.100.54 (talk) 16:00, 5 February 2010 (UTC)[reply]

Reading the article on the magnetron might actually help. The question remains what sort of amplifier you mean exactly. But very generally speaking: yes, it can be used as an amplifier. NoisyJinx (talk) 16:23, 5 February 2010 (UTC)[reply]
Electrons are emitted from the heated cathode in the magnetron. They are supplied by the current through the magnetron (i.e. the magnetron does not manufacture electrons). The magnetron is usually used as an oscillator to convert dc input power into microwave radiation at the resonant frequency determined by the anode cavities. For a microwave amplifier use a different device such as a klystron. Cuddlyable3 (talk) 17:52, 5 February 2010 (UTC)[reply]

Bio-gas

How is biogas produced from the waste products of oil palm processing? —Preceding unsigned comment added by Liemenam (talkcontribs) 17:01, 5 February 2010 (UTC)[reply]

Our Biogas article discusses this, though it doesn't specifically talk about palm oil at all. Comet Tuttle (talk) 18:55, 5 February 2010 (UTC)[reply]

light

Could the Doppler shift in light be caused by velocity only, i.e. blue-shifted light travelling faster than 186,000 mps while red-shifted light travels slower? —Preceding unsigned comment added by 82.22.255.246 (talk) 17:45, 5 February 2010 (UTC)[reply]

Wikipedia has an article about Doppler shift. The speed of light in space does not change. Doppler shifting of light is due to its source moving relative to the viewer e.g. the light from a star moving away from Earth is red-shifted. Cuddlyable3 (talk) 17:58, 5 February 2010 (UTC)[reply]
Cuddlyable3 is correct: "The speed of light in space does not change". Only the frequency and wavelength of the light change. Only if light enters a denser medium, i.e. air/water/glass, will it slow down. 220.101.28.25 (talk) 06:27, 6 February 2010 (UTC)[reply]
Let us be clear that the slowing of light in transparent media is not accompanied by a colour (frequency) shift although a different phenomenon Dispersion (optics) can be observed. Cuddlyable3 (talk) 13:43, 6 February 2010 (UTC)[reply]
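For reference, the frequency change for a receding source is given by the relativistic Doppler formula f_obs = f_src * sqrt((1 - β)/(1 + β)) with β = v/c; the speed of the light itself never changes. A small sketch with an arbitrary example frequency and recession speed:

```python
import math

# Relativistic Doppler shift for a source receding at speed v:
#   f_observed = f_source * sqrt((1 - beta) / (1 + beta)),  beta = v / c.
# Only the observed frequency/wavelength shifts; the light's speed is unchanged.

c = 2.998e8  # m/s

def observed_frequency(f_source, v_recession):
    beta = v_recession / c
    return f_source * math.sqrt((1.0 - beta) / (1.0 + beta))

f_green = 5.45e14                              # Hz, green light (example value)
print(observed_frequency(f_green, 0.1 * c))    # ~4.93e14 Hz: red-shifted (receding)
print(observed_frequency(f_green, -0.1 * c))   # ~6.02e14 Hz: blue-shifted (approaching)
```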

Study which shows that some People will not change their minds

I read a Wikipedia article a while back which was about a study by two college professors who divided people into two groups. One group of people would not change their minds even when presented with evidence which showed that their belief was incorrect, and would not even consider the possibility that they might be wrong. The other group accepted the possibility that they might be wrong, even when in the end they turned out to be correct. If I remember correctly, the first group might even become hostile when their ideas were challenged. I'm looking for that article because it has details which I do not remember (like the two college professors' names). —Preceding unsigned comment added by Saxonwg (talkcontribs) 18:22, 5 February 2010 (UTC)[reply]

It seems a waste of "study" effort to confirm such a common observation. There are two kinds of people: those who divide people into two kinds and those who don't. Cuddlyable3 (talk) 18:43, 5 February 2010 (UTC)[reply]
That was unhelpful, Cuddlyable3. The original poster is asking for a specific article and you blew him off. Comet Tuttle (talk) 18:53, 5 February 2010 (UTC)[reply]
Confirmation bias? Comet Tuttle (talk) 18:53, 5 February 2010 (UTC)[reply]

Or potentially Primacy effect? 20:39, 5 February 2010 (UTC)

Bigots? 78.146.215.222 (talk) 23:44, 5 February 2010 (UTC)[reply]

Global warming denial? ;-) ~AH1(TCU) 21:48, 7 February 2010 (UTC)[reply]

Why does Pluto's status as a Dwarf Planet bother so many people?

The definition provided by the IAU is clear enough, and Pluto is strange enough (just look at the orbit) that it is obviously not like the other traditional planets. Does it matter? Pluto didn't go anywhere. We didn't know about it 100 years ago, and kids these days know the new classification. So it's just three generations that have this problem. Is it just that we don't like change? Aaronite (talk) 18:30, 5 February 2010 (UTC)[reply]

It's because we were taught that Pluto was a planet when we were children. To suddenly say it is not a planet... it's like saying Washington was not a President, based on a technicality! Pluto was always presented as the most obscure, the most foreign, the most futuristic. Pluto is also the dinkiest and loneliest of the planets—and who doesn't like an underdog? (And three generations of Pluto-lovers is enough, to paraphrase an infamous quote. Four generations ago—my grandmother's mother—women couldn't vote in the USA. It's not an inconsequential amount of time on a human scale.) --Mr.98 (talk) 18:37, 5 February 2010 (UTC)[reply]
Artistic comparison of Pluto, Eris, Makemake, Haumea, Gonggong (2007 OR10), Sedna, Quaoar, Orcus, 2002 MS4, and Salacia.
Pluto was thought to be the expected Planet X responsible for observed perturbation of the gas giants' orbits, see the article Planets beyond Neptune. Disqualification of Pluto means that Planet X must be sought all over again among the objects in the picture. Sheesh. Cuddlyable3 (talk) 19:00, 5 February 2010 (UTC)[reply]
I don't know - but if you followed the latest astronomical observations then it was clear since the 1990s that Pluto was just one member of some kind of outer asteroid belt (the Kuiper belt). And I think from current observational data (especially since the Voyager probes visited Uranus and Neptune) there doesn't seem to be any need for Planet X (does anyone know what the original mistake was that led to the assumption of Planet X?). Icek (talk) 19:12, 5 February 2010 (UTC)[reply]
Did you look at the article? Cuddlyable3 (talk) 20:01, 5 February 2010 (UTC)[reply]
I know about the 0.5% mass discrepancy, but why was there an error in the mass determination in the first place? If you derive the mass from Triton's distance and orbital period, then an error of about 1/600 in Triton's distance from Neptune (less than 600 km, or about 0.03 arcseconds) could cause that, but there must have been a systematic error if they were so sure that there was a discrepancy in the orbits. Icek (talk) 08:25, 6 February 2010 (UTC)[reply]
It is just that we don't like change. (why do you think all the signs along the highway near my house have miles-per-hour on them? It ain't for any logical reason) 128.223.131.109 (talk) 20:29, 5 February 2010 (UTC)[reply]
There are plenty of other cases where science has changed the definitions of things - for example: mushrooms are no longer plants, birds are now dinosaurs, glass was (for a while) a liquid - but now it's not again, viruses were once "lifeforms" - and now they aren't. I could go on for hours. Yet other illogical classifications still stand (why is Europe a continent?). Other terms change completely (a "computer" is no longer someone who computes, a "typewriter" is not someone who types - and a 'calorie' magically became a 'kilocalorie'). And throughout all of those changes - the general public either ignored the change in status - or went along with it without complaint. Given our usual flexibility - why people made such a fuss over Pluto is really hard to understand! SteveBaker (talk) 22:52, 5 February 2010 (UTC)[reply]
I think it has always retained the "Planet X" fascination—Pluto is sci-fi fodder at its best. Mushrooms are not. There is something wonderfully fundamental about what planets there are in the solar system—to have that yanked out from under us by a cabal of wrinkly astronomers... why, it's just not right! I will go to my grave insisting that not only is Pluto a planet, it is the most awesome planet. Dictionary definitions be damned. A better comparison with the redefinition of Pluto would be the argument over whether T-Rex was a fierce king of the jungle or a lowly scavenger. I mean, we all know T-Rex is bad-ass, right? He's no scavenger, whatever those irritating paleontologists say. --Mr.98 (talk) 23:40, 5 February 2010 (UTC)[reply]
In this case, there wasn't even an existing definition of "planet". They belatedly but very sensibly rectified that omission. That is, it was sensible to rectify the omission; whether the new definition itself is a sensible one - that's a different question, but it was done by what was effectively a scientific consensus, a concept not unfamiliar around here. It was decided to draw the line at a certain point, and Pluto fell outside the line, because to continue to have Pluto as a planet would have meant including Eris and some other bodies as planets, and that was undesirable for various reasons. It wasn't like "We all hate Pluto, so let's contrive a definition of planet that's sure to exclude it". -- Jack of Oz ... speak! ... 23:48, 5 February 2010 (UTC)[reply]

My apologies for editing anyone's contributions, but from this point onward in the discussion the posts were indented so many times that they eventually got shifted off the screen. I have only moved them leftwards.

If Pluto were a planet - then so should our moon be (the Moon is much bigger than Pluto and its path around the sun is much more normal than Pluto's). That makes the Earth/Moon system into a dual-planet and means that we have to find another name for our twin planet because you can hardly call something "Moon" when it's not a moon. The astronomical convention would have to be to call it Earth-b or something. If you think people were pissed at the astronomers for demoting Pluto...would you want to be the one telling them that we have to change the name of the Moon?!?! Just think of the trouble romantic song-writers would have finding a rhyme for that! SteveBaker (talk) 03:08, 6 February 2010 (UTC)[reply]
My view is, of course the Moon is a planet. Why change its name? It can be a planet, and still called the Moon.
This actually goes to the heart of why the IAU definition is a terrible one. Their definition is based, not so much on the intrinsic characteristics of the body, as on where it's located (that is, its orbital characteristics). But these are much less interesting than the intrinsic ones!
Just to take an example, as I understand it, if the Earth were in Pluto's orbit, it would not yet have "cleared its neighborhood" either. Should the Earth cease to be a planet, just because it were transported to Pluto's orbit? Or what about rogue planets, torn loose from the grip of their parent stars?
On the other side of the coin, it's only a historical accident that the gas giants were ever grouped with the rocky planets. They're clearly a different sort of thing; you can't land on them and establish a base! But you can on some of their moons.
By any rational standard, the Moon would be a planet, and so would Ceres and Ganymede and Europa and Titan, but Jupiter would not be. --Trovatore (talk) 03:18, 6 February 2010 (UTC)[reply]
I desire to be/with you under Earth-B/Just you and me/in the light of Earth-B/doing it like a flower and a bee - I see untold possibilities for poets! --Stephan Schulz (talk) 08:53, 6 February 2010 (UTC)[reply]
"By any rational standard" - well, one rational counter to that proposition is that planets orbit stars, not other planets. If a body orbits a planet, it might be a satellite, or an asteroid, but not itself a planet. -- Jack of Oz ... speak! ... 22:05, 6 February 2010 (UTC)[reply]
Why? As I say, that seems to be basing planethood not on any characteristics of the body, but simply on where it is and how it's moving. To me those seem irrelevant. Would a sufficiently large moon *become* a planet, if you set it in a different orbit? Or would Earth cease to be a planet, if you set it in motion around Jupiter, or if you ripped it free to wander through interstellar space? Is that really how you think of the word? For me it is certainly not. --Trovatore (talk) 22:28, 6 February 2010 (UTC)[reply]
Lots of astronomical terms depend on where an object is found. An asteroid becomes a meteor when it enters our atmosphere and becomes a meteorite when it hits the ground. SteveBaker (talk) 23:55, 6 February 2010 (UTC)[reply]
That's fine; I just wanted to disabuse you of the notion that anyone who sees things differently than the way you see them is not employing a "rational standard". -- Jack of Oz ... speak! ... 22:39, 6 February 2010 (UTC)[reply]
So, presumably, you've been considering Eris, and probably Ceres as well, planets, just like Pluto? APL (talk) 07:06, 8 February 2010 (UTC)[reply]
Besides - even if you decide that (say) Io is not a planet because it's orbiting a planet - you still have to leave open the possibility of binary planets (analogous to binary stars). If a body is large enough to be a planet but is not given that designation because it orbits another planet - what are you going to do when some exoplanet hunter finds two bodies of equal mass orbiting each other? Do you flip a coin and arbitrarily label one of them "moon" and the other one "planet"? That's a pretty stupid definition! So if you use mass or diameter or roundness or almost any other "obvious" criteria for saying "Planet" then you have to either exclude Pluto or include the Moon. As it happens, the IAU's definition covers that situation by declaring that if the point about which the two objects orbit lies within one of the two bodies then you have a moon - if it lies outside of both of them then you have a binary system. So in fact, the Moon doesn't count as a planet (not even a "dwarf planet") for that reason. The thing that excluded Pluto wasn't its size - it was that crazy orbit. My point is that if you want to come up with clean definitions for words used in a scientific context then you have to make tough "bright line" definitions that sometimes result in different meanings for words than in common English usage. People didn't stop using the word "Bird" and start to talk about "putting up a dinosaur feeder for the winter" when that determination was made. In common English usage, mushrooms are still plants - although they haven't been taxonomically a part of the plant kingdom since the 1930's. People persist in calling tomatoes "vegetables" when they are really "fruit", peanuts are called "nuts" when they are really "legumes" and spiders are still called "insects" by many people. So it really doesn't matter if people continue to talk about "The planet Pluto" in informal circumstances if it makes them happy. But in scientific usage - particularly in publications - it's essential that we use language precisely - and for that we need a solid definition, and the IAU did exactly that. Popularity or otherwise does not matter one iota in the achievement of this purpose. SteveBaker (talk) 23:50, 6 February 2010 (UTC)[reply]
I'm having trouble following whom you're addressing here, Steve — you seem to be agreeing with Jack and disagreeing with me, but I wasn't the one saying Io's not a planet because it orbits a planet. I was saying the opposite — maybe not about Io, it's a tad small — whoops, I take it back — Io's plenty big enough; I thought it was smaller than that. But I would certainly include Ganymede and Europa as planets. And the Moon, for sure.
As for the need for a uniform definition of the word "planet", I sharply disagree there. No one has presented a convincing argument as to why the lack of a definition was a problem, except for people who like nomenclatural neatness for its own sake. Usually you need a definition of foo when you start creating a theory of foos in general. I have seen no evidence that the IAU definition in any way enables the study of planets in general (indeed, they specifically limited it to our solar system, so it's virtually useless-by-design outside the particular list of eight). --Trovatore (talk) 00:09, 7 February 2010 (UTC)[reply]
It's not a matter of nomenclatural neatness for its own sake. Just what does anyone mean when they say "planet"? Or "broccoli", or "camel", for that matter. You can call the big oak tree in your front garden a "planet" if you like, and nobody's going to get too upset about it - unless you start teaching your theory to school children, say, or telling people on the WP ref desk that the Solar System includes your big oak tree. That's not gonna happen, obviously, but surely you can see that in scientific contexts there has to be agreement about the meanings of scientific terms, otherwise we're likely to descend into an abyss of chaos and despair (to be slightly melodramatic about it). There was an implicit understanding of "planet" for a long time, and that served its purpose, until it started becoming unclear which bodies were planets and which weren't. That's when it became necessary to settle the matter, definitionally. So we now have a formal definition of "planet". But, as I say, feel free to apply that term to whatever non-planets you like - just don't expect to do so in any sort of formal environment and get away with it, because you'll be told in no uncertain terms "Keep up with progress, Trovatore, and let go of old ideas, otherwise stay out of the kitchen". -- Jack of Oz ... speak! ... 03:24, 7 February 2010 (UTC)[reply]
The IAU has no authority (nor does anyone else) to establish the "formal" meaning of words. --Trovatore (talk) 03:34, 7 February 2010 (UTC)[reply]
That's only slightly true. Their only authority is measured by the degree to which their decision is respected. It's pretty clear (by the mere fact that people are complaining that "Pluto isn't a planet anymore") that the public accept that authority whether they like it or not. Everyone could just say "Silly IAU! Of course Pluto is a planet!" - and that would be that. But that's not what happened. Also, many prestigious scientific journals are going to start requiring "correct" use of the word in papers that they publish. This is not unusual - laws set by governments are only obeyed to the degree that the people accept them - or they have police and a judicial system to enforce them. If a particularly horrible law was passed, the police or the courts might simply refuse to enforce it. If the IAU made a particularly horrible decision, then the journals might not use their definition in their editorial policy. The IAU have the same authority to define "planet" as the government has to ban cellphone use while driving. More actually - they aren't even asking the general public to use the new definition - only scientists who are working within fields of study where IAU is considered to be the authority in these matters. SteveBaker (talk) 05:15, 7 February 2010 (UTC)[reply]
"Their only authority is measured by the degree to which their decision is respected." Exactly so. And I am arguing that we should not respect it. --Trovatore (talk) 05:46, 7 February 2010 (UTC)[reply]
I think that what everyone else is arguing is that that ship has sailed. The IAU has set themselves up as the official entity capable of changing this formal definition, and the scientific community has widely accepted it.
You can speak casually anyway you like, but you can't (correctly) claim to be speaking "formally" while using your own personal definition, however much more correct your definition is. That's what formally means : "Official" and "being in accord with established forms". (Unless, of course, you disagree with the dictionary's authority to provide an authoritative definition of the word "formal", in which case precise, logical communication with you is far more trouble than it's worth.) APL (talk) 07:06, 8 February 2010 (UTC)[reply]
See, you keep on changing your position, Trovatore. First, you say the decision was a bad one because it was based on the orbital characteristics of celestial bodies, rather than intrinsic ones. That's a substance argument. Then, you said the IAU had no authority to determine a formal definition. This is a principle argument, and it means that, even if you thought the definition was perfect in every way, you'd still reject it on the grounds that they had no authority to make such a ruling in the first place. Now, you've gone back to saying you don't respect their decision - which implies that you're no longer denying their right to make such a decision, just arguing with the decision they actually made. You're back to a substance argument again. You cannot have it both ways, but you're trying to. -- Jack of Oz ... speak! ... 05:59, 7 February 2010 (UTC)[reply]
Hmm? Why not? The IAU was wrong on both substance and principle, so I address both. --Trovatore (talk) 06:00, 7 February 2010 (UTC)[reply]
I just don't think you can do that. If you say they had no authority to impose a definition, then as far as you are concerned it should be completely and utterly irrelevant what the text of the definition is, because you reject it out of hand, sight unseen, on principle alone. Having completely rejected the definition on principle, you can't then readmit it only to criticise it on its substance - unless, of course, you're trying to have it both ways. -- Jack of Oz ... speak! ... 08:10, 7 February 2010 (UTC)[reply]
I disagree. They had no authority to impose a definition, and the one they chose was also bad. Both of these points are relevant to the discussion. --Trovatore (talk) 08:16, 7 February 2010 (UTC)[reply]
But if the definition they chose had been a "good" one (meaning, one you agreed with), then presumably you'd still be rejecting it on principle ...... or would you? If you hold to the principle, what possible difference could it make whether the definition was good, bad or indifferent? -- Jack of Oz ... speak! ... 08:22, 7 February 2010 (UTC)[reply]
If they had chosen a definition I liked, the principles would still apply, but it probably wouldn't bother me as much; I'm a human being, not some Randian superhero. That's descriptively. Prescriptively, should it make a difference? Of course — the damage is greater if it's a bad definition. --Trovatore (talk) 08:37, 7 February 2010 (UTC)[reply]
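Going back to the barycentre rule Steve describes above (a moon if the mutual centre of mass lies inside the larger body, a binary if it lies outside both), a quick back-of-the-envelope check shows why the Moon stays a satellite under that rule while Pluto-Charon looks more binary-like. The masses, separations and radii below are rounded reference values, not precise figures:

```python
# Where does the mutual centre of mass sit? Inside the primary -> a satellite;
# outside both bodies -> arguably a binary. Inputs are rounded reference values.

def barycentre_offset_km(m_primary, m_secondary, separation_km):
    """Distance of the two-body centre of mass from the primary's centre."""
    return separation_km * m_secondary / (m_primary + m_secondary)

# Earth-Moon: offset ~4,700 km, well inside Earth's ~6,371 km radius -> a moon.
print(barycentre_offset_km(5.97e24, 7.35e22, 384_400))

# Pluto-Charon: offset ~2,100 km, outside Pluto's ~1,188 km radius -> binary-like.
print(barycentre_offset_km(1.30e22, 1.59e21, 19_600))
```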

Wouldn't it be Goofy without Pluto in the Universe? Cuddlyable3 (talk) 01:18, 6 February 2010 (UTC)[reply]

The reason is that it might force George Lucas to release another round of Star Wars films with updated dialog: Tarkin will now say to Leia, "You're far too trusting. Dantooine is too small and too remote. Although it has sufficient mass for its self-gravity to form a hydrostatic equilibrium, Dantooine has failed to clear the neighborhood around its orbit, and won't make an effective demonstration." A Quest For Knowledge (talk) 01:20, 6 February 2010 (UTC)[reply]
Fortunately, Star Wars is set in a far distant galaxy a long time ago - hence either (a) the IAU had not made that determination at the time of those events - or (b) speed-of-light considerations would ensure that the IAU's decision cannot possibly have yet reached the Federation. SteveBaker (talk) 03:08, 6 February 2010 (UTC)[reply]
T h e s e a r e n o t t h e a n s w e r s y o u a r e l o o k i n g f o r . Cuddlyable3 (talk) 13:39, 6 February 2010 (UTC)[reply]
Federation? Surely you mean the Empire, or perhaps the Republic? APL (talk) 07:06, 8 February 2010 (UTC)[reply]
They mention a Trade Federation (the group with all those robots) in those crappy new Star Wars movies. Googlemeister (talk) 14:53, 8 February 2010 (UTC)[reply]

Menstruation

How would a hypothetical human female go about stopping her annoying periods indefinitely? —Preceding unsigned comment added by 82.43.89.14 (talk) 18:59, 5 February 2010 (UTC)[reply]

See the article Hysterectomy. Cuddlyable3 (talk) 19:02, 5 February 2010 (UTC)[reply]
See also Extended cycle combined hormonal contraceptive, and the hypothetical woman should talk to her doctor. -- Flyguy649 talk 19:17, 5 February 2010 (UTC)[reply]
She should consult her doctor who will be able to point her in the right direction - not some geek on the internet! --TammyMoet (talk) 20:20, 5 February 2010 (UTC)[reply]
Well, duh, she isn't going to get a hysterectomy or birth-control pills off of Wikipedia - but she can get all the information she wants here. Why are we assuming that the OP is a female? The OP clearly stated that the subject of discussion is a hypothetical human female, thus heading off the inevitable "Wikipedia is not the place to ask for medical advice" response. 128.223.131.109 (talk) 20:42, 5 February 2010 (UTC)[reply]
Merely appending those magical words is neither a necessary nor a sufficient condition to come to a conclusion about what is and what is not a medical question. SteveBaker (talk) 22:24, 5 February 2010 (UTC)[reply]
No, but Kainaw's Superimposition tells us that since we are not being asked to give a medical diagnosis, this isn't a prohibited medical advice question. Comet Tuttle (talk) 23:10, 5 February 2010 (UTC)[reply]

Someone suggested that menstruation besides being unwelcome is an exceptional or even "unnatural" state since it involves Endometrium (uterine lining) cells failing to perform the function for which they grew and that a more natural state for a female is to be continually pregnant throughout her fertile life. Could that be achieved? Cuddlyable3 (talk) 01:07, 6 February 2010 (UTC)[reply]

This rivals the famous feminist nonsense "the only difference between men and women is a uterus" assertion of the 1970s. You have documented a new low in ideologically twisted physiology. alteripse (talk) 08:34, 6 February 2010 (UTC)[reply]
It's actually a pretty well accepted fact that, in premodern times, most women spent most of their adult lives pregnant and/or lactating. The exceptions would be the celibate, for whatever reason. On top of that, it can't have been too unusual to have so little body fat that menstruation stopped. This doesn't mean that spending your adult life pregnant is a good thing (life expectancy of women was low due to the shocking mortality rate associated with giving birth), but it does suggest that monthly periods for 30 years may not be ideal or natural. Many young women spend years slightly anaemic thanks to the regular bleeding (if I weren't on this sodding iPhone I'd get you some references). In any case, there are loads of options if a hypothetical woman takes herself to her doctor. They range from a repackaged version of The Pill to more drastic surgical options, and would probably involve talking about what specifically is problematic about her periods. If the hypothetical woman has a male doctor and wishes to discuss it with a female doctor, she can usually just ask, and people understand why. 86.179.145.61 (talk) 13:20, 6 February 2010 (UTC)[reply]
All true (except your apparent misunderstanding or misuse of "natural"), but none of those facts makes menstruation an "exceptional" or "unusual" or "pathologic" or "unnatural" state for a woman of reproductive age. alteripse (talk) 13:55, 6 February 2010 (UTC)[reply]
By "natural" he/she just means the state that humans have evolved to cope with. (How else would you define it in this context?) If the typical state through most of human development was for women not to be menstruating as often, then it may be that humans are not well equipped for that. Rckrone (talk) 18:29, 6 February 2010 (UTC)[reply]
Sorry, if you are correct that that was the intended meaning of natural then your understanding is equally flawed. "Natural" does not mean "optimized for human health": that misunderstanding is hard to distinguish from intelligent design and Platonic idealism. Please review biology 101. alteripse (talk) 18:36, 6 February 2010 (UTC)[reply]
I never said that natural meant "optimized for human health." You have order of causation wrong. Also, the question in parenthesis was not meant to be rhetorical. How would you define "natural" in this context? Because I really don't see what sort of meaning you're trying to hint at that you found offensive. Rckrone (talk) 18:56, 6 February 2010 (UTC)[reply]
My objection is to (1) use of a fantasy idea of an "ideal human state of nature" as a support for a specific social policy preference, or (2) sloppy use of the usually-denotatively-meaningless term natural as if it were an objective attribute of an argument or product someone is advocating or selling. I think the anonymous IP poster 86... was misusing the term in both senses, and I read your comment as endorsing it and failing to understand my objection. Someone can (and usually has) argued that virtually anything a human can do or experience is ipso facto "natural". Use of the term natural to endorse one preferred condition, experience, or behavior over an alternative usually signals an intellectually or semantically sloppy argument or a poor understanding of human evolution. Is that clearer? If the shoe doesn't fit, then feel free not to wear it. alteripse (talk) 20:09, 6 February 2010 (UTC)[reply]
Maybe some people think about the concept of "natural" in terms of some nonsense about Platonic ideals or artificial flavoring, but that doesn't mean the word itself is tainted. It's still useful to think about the circumstances that a species has evolved to cope well with, contrasted with relatively new circumstances that it hasn't had time to adapt to. We have a perfectly good adjective to communicate that distinction concisely, e.g. a species' "natural habitat". Rckrone (talk) 23:54, 6 February 2010 (UTC)[reply]
I agree the word has a relatively precise and useful denotative meaning in some contexts. I think it a suboptimal word to use to designate an environment in which an organism is more adapted in comparison to another. Perhaps suboptimal enough to qualify as intellectually and semantically sloppy, but that is my opinion and you are free to be Humpty Dumpty and have it mean just what you choose it to mean. alteripse (talk) 01:42, 7 February 2010 (UTC)[reply]

Some (but not all) women stop menstruating while lactating. Lactation is far less drastic than a hysterectomy, and one does not even have to get pregnant to get started. So our hypothetical human female might want to try erotic lactation, maybe in alternating combination with the more boring alternative of milking herself by hand or with a breast pump. :-) —Preceding unsigned comment added by 89.8.105.190 (talk) 04:37, 6 February 2010 (UTC)[reply]

article on time deprivation

I am looking for an article on experiments where people are not given any cues as to what time it is, to see the effects on their internal clocks, but I have not had much luck looking around here for it. Googlemeister (talk) 20:29, 5 February 2010 (UTC)[reply]

I would start here: Circadian rhythm. There is a section on rhythm disruption. CoolMike (talk) 21:43, 5 February 2010 (UTC)[reply]
I also found Chronobiology CoolMike (talk) 22:18, 5 February 2010 (UTC)[reply]
The gambling industry in places such as Las Vegas has practical experience of eliminating time cues because "the standard casino is windowless so as to control a patron's concept of time"[1] [2]. Cuddlyable3 (talk) 00:47, 6 February 2010 (UTC)[reply]
Did they get that idea from the Soviet KGB? 24.23.197.43 (talk) 03:46, 7 February 2010 (UTC)[reply]
Yes it was a KGB Russian roulette secret. Cuddlyable3 (talk) 20:52, 7 February 2010 (UTC)[reply]
There's a wee bit of information in the last paragraph of Circadian rhythm sleep disorder#Normal circadian rhythms. This looks pretty good, and you can find more by Googling for circadian +"temporal isolation" and similar sets of terms. Deor (talk) 01:21, 6 February 2010 (UTC)[reply]
Just to add a bit of info, the technical term for a time cue is zeitgeber, so the thing to look for is information about zeitgeber deprivation. Looie496 (talk) 17:47, 6 February 2010 (UTC)[reply]
See sleep#timing and perception of time. ~AH1(TCU) 21:46, 7 February 2010 (UTC)[reply]