Wikipedia:Reference desk/Archives/Science/2009 February 26

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


February 26

Mousetraps

Supposing that I was a mouse, what would be the best way to go about disabling/disarming a spring-loaded mousetrap? --84.68.65.232 (talk) 00:46, 26 February 2009 (UTC)[reply]

Tell another mouse about it first. "Say, there's some nice cheese over there, why don't you go take a look at it..." Then after he/she bites the dust, you get the cheese they missed. (Sound fanciful? I've seen mice do this, stepping over dead, trapped mice to get some peanut butter.) --98.217.14.211 (talk) 01:00, 26 February 2009 (UTC)[reply]
If your name was Jerry, then your best bet would be to trick the cat (Tom) into setting it off (probably with his tail). If you are not Jerry but still anthropomorphised, then lob something heavy onto the plate to set it off. Alternatively, you could wedge the spring down with some handy implement, but this is fairly risky. If you are an ordinary mouse, then find your supper elsewhere: those things can be lethal and you wouldn't want to risk setting it off by mucking around too close. Gwinva (talk) 01:04, 26 February 2009 (UTC)[reply]
It certainly seems that the risk of trying to wedge the thing open would be more than you'd want to take - so we should probably concentrate on setting it off. That means dropping something onto it - probably from a safe distance. If there is a shelf or some furniture conveniently above the trap - then dropping a small pebble ought to be enough to set it off...failing that - I think you'd need something long enough to provide leverage from a safe distance - yet light enough for the mouse to carry. Mice can make nests and carry their young around - so they are certainly mechanically capable of doing that. SteveBaker (talk) 01:12, 26 February 2009 (UTC)[reply]
I agree that setting it off from a distance is the way to go. You're pretty small and frail, you're fast but not that fast, and that long tail of yours is a liability if you're near the trap. These traps look crude, but they're nothing to mess around with. Take a look at our "Mousetrap" article, if you have the stomach for it—it's bloody horrifying. Anyway, if Mr. Human has set the trap right, it's against the wall with the bail out. You have to disturb the bait pedal enough to trigger the trap, but there are two problems. One is that the bail is going to come around with surprising force, tending to impart violent motion to whatever you're using on the pedal. The other is that it's going to make a loud snap, loud enough to alert your enemy and bring him around. If he does come around, you'll lose the bait, and why fool with the trap at all if you're not going to get a treat for your trouble? My advice is to see if you can find a tennis ball or something similar on the floor, and roll that at the trap. With luck, it will interfere with the trap's mechanism enough to mute the snap somewhat. But back away fast once you've rolled the ball, because the trap could tiddly-wink it back at you. Bon appetit! --Milkbreath (talk) 01:32, 26 February 2009 (UTC)[reply]
OR: I once lived in an apartment building where the local mice disarmed spring traps by turning them upside down. Don't know the way they did it because no one ever caught one at it. They weren't smart enough to evade box traps the exterminator put out. 76.97.245.5 (talk) 17:58, 26 February 2009 (UTC)[reply]
The real trick is to just avoid the mousetrap. Spend the extra time to find the food that isn't in a spring-death-machine. The effort and risk required to set it off safely is far greater than the value of the food in a single trap. --140.247.11.54 (talk) 01:42, 26 February 2009 (UTC)[reply]
A smart mouse would take a twig (or anything else long and thin) in its mouth, press down on the bait with the other end of it, and then enjoy the food. To learn how to catch rats - the big cousins of mice - read Full Revelations of a Professional Rat-Catcher. It's available on Gutenberg.org. You'll have some good laughs. – GlowWorm. —Preceding unsigned comment added by 98.16.66.64 (talk) 02:59, 26 February 2009 (UTC)[reply]

Constituents of InkJet Printer Ink

What is the chemical dye/pigment that makes yellow inkjet-printer ink yellow? (And I guess that hearing about Cyan and Magenta would be interesting too). I'm thinking of the regular thermal inkjet Epson/Canon/HP types - not the fancy commercial grade gizmos. Our article Inkjet printer says it's a Volatile organic compound - but doesn't say what precisely. Ink cartridge and Inkjet refill kit were no help. I followed all of the links to various pigments in Yellow and none of them admit to being used in inkjets. ColorWen and Kolorjet Chemicals appear to be the manufacturers of the stuff - but their websites aren't much help ([1] and [2]). SteveBaker (talk) 01:02, 26 February 2009 (UTC)[reply]

Many dye compounds are proprietary and kept secret by the companies that produce them. I had a friend who worked for DuPont in their titanium dioxide/white pigment division, and he had to sign a non-disclosure agreement and a non-competition agreement where, if he left the company or was terminated, he could not work for a competitor in the same field for 10 years. There are probably some general ideas of things that make yellow colors, but each company probably either buys its dye from a manufacturer without knowing what it is chemically, or it makes its own ink and keeps the formulation a secret. --Jayron32.talk.contribs 02:08, 26 February 2009 (UTC)[reply]
Googling for yellow thermal ink got me lots of patents, like this one, which say things like "the coloring material comprises at least one selected from the group consisting of C.I. Direct Yellow 86, C.I. Acid Yellow 23, C.I. Direct Yellow 173, and C.I. Direct Yellow 59", which refer to dyes standardized in the Colour Index International, some of which are listed at list of dyes. --Sean 13:50, 26 February 2009 (UTC)[reply]

Particles or waves in radiation

In describing radiation one can start with nuclear particles escaping the sun's boundary, some of which find their way to Earth via the solar stream. Gauss saw these as static particles or unbound electrons when he devised his law of statics. The law of statics, when broadened to include radiators while maintaining equilibrium and then applying a time-varying field, becomes mathematically the same as Maxwell's laws. Thus it can be stated that an efficient radiator can be any shape, size, or elevation as long as it is in a state of equilibrium. When a radiator surface is covered with static particles and a time-varying current is applied, it is the "weak force" or Foucault current that applies levitation with spin to the particle such that the charge accelerates in a straight-line projection to a receiving resonant diamagnetic radiator to complete the communication sequence. The ejection of the particle follows the same sequence that metal salvage yards apply to sort various types of material from each other via levitation from a conveyor line. Thus the question comes about, with respect to radiation and light being purported as a subject of "waves", when the above points to it being a subject of "particles" (neutrinos?) being accelerated by the "weak force" per the standard model? Slimylimey (talk) 01:40, 26 February 2009 (UTC)[reply]

In your specific example, the solar wind is not considered radiation. However, there's wave–particle duality. Clarityfiend (talk) 02:01, 26 February 2009 (UTC)[reply]

The solar stream is the vehicle for particles from the sun to arrive on Earth (from Gauss). Radiation is created by the acceleration of a charge (from Maxwell, à la a time-varying field), which in this case is the unbound particle. There is nothing in the above that shows wave–particle duality behavior. Slimylimey (talk) 02:57, 26 February 2009 (UTC)[reply]

I guess I wasn't clear enough. Radiation can behave like waves in many situations and particles in others (e.g. the photoelectric effect). However, I didn't mean to say that there was much duality in the solar wind. Clarityfiend (talk) 04:45, 26 February 2009 (UTC)[reply]
It appears that spammers are in action. There is no duality present in my statement. I never said that static particles were charged. Einstein predicted that the weak force was involved in radiation as well as being part of another standard-model force, so who proved he was in error? How was the weak force identified? Please describe. The photoelectric effect is confirmation that particles, and not waves, are the source. I thought this was moderated to keep out the spammers! Slimylimey (talk) 14:06, 26 February 2009 (UTC)[reply]
Foucault currents have little to do with the weak force. And neutrinos aren't charged particles, so they can't be accelerated by an electric or magnetic field. And your question is remarkably similar to crackpot ramblings that appeared last year on Usenet's rec.radio.amateur.antenna, btw. --Pykk (talk) 07:48, 26 February 2009 (UTC)[reply]

water

Is it true that Water will keep you full for longer and stave off hunger pains? —Preceding unsigned comment added by Historyfan101 (talkcontribs) 03:30, 26 February 2009 (UTC)[reply]

Longer when compared to what? Dismas|(talk) 04:21, 26 February 2009 (UTC)[reply]
Longer than without it, presumably. I don't know but you don't want to drink too much; see water intoxication.--Shantavira|feed me 09:56, 26 February 2009 (UTC)[reply]
According to this article it is better to eat water rich food to stave off hunger pangs rather than simply drink water.--Lenticel (talk) 13:24, 26 February 2009 (UTC)[reply]
Personally, I prefer eating food to cure hunger... --Tango (talk) 16:14, 26 February 2009 (UTC)[reply]
Yes, buffalo is much more satisfying than water. --Scray (talk) 17:44, 28 February 2009 (UTC) [reply]

Largest open-shelf library

What is the largest open-shelf library?--Mr.K. (talk) 12:26, 26 February 2009 (UTC)[reply]

Sorry to see this question languishing here. Perhaps it is not optimally drawing on the knowledge base of the Science reference desk (I certainly don't have an answer). I assume you searched using Google or similar engine, which does not give an obvious answer. Humanities might be a better place to ask, though I am not sure. --Scray (talk) 11:18, 27 February 2009 (UTC)[reply]
Wikipedia says: Harvard. Rmhermen (talk) 23:13, 28 February 2009 (UTC)[reply]

Free university

Are there any universities out there that are free in both meanings of the word? Free to join like the Open University and fee-free like some European state universities?--Mr.K. (talk) 12:32, 26 February 2009 (UTC)[reply]

Does Wikiversity count?--Lenticel (talk) 13:28, 26 February 2009 (UTC)[reply]
If it is an accredited institution, yes. Is it one?--Mr.K. (talk) 13:44, 26 February 2009 (UTC)[reply]
No. APL (talk) 13:55, 26 February 2009 (UTC)[reply]
You may be under a misapprehension about 'free' universities. European state universities are rarely 'zero cost'. What happens, though, is that some countries fully fund university education; the university charges tuition fees, but the state picks up the tab. For example, this was true in the UK twenty years ago, when I was a student. But there were conditions: you had to pass the entrance requirements, you had to keep up a minimum standard, and you had to be a citizen of the UK. If you were a foreigner who went to the same university I did, you certainly had to pay fees.
On the other hand you may be interested in this: [3] train to be a medical doctor, entirely without tuition fees in Cuba. DJ Clayworth (talk) 21:39, 27 February 2009 (UTC)[reply]
Britain isn't Europe though! German universities certainly used to be free in the second meaning. If you meet the entrance criteria and get a place you can study there without fees. Most (all?) German universities now charge an "administration fee", but at ~EUR300/semester this is still well short of what you'd call a fee in the Anglo-Saxon world. Getting a place might be straightforward as well; at least 15 years ago, for unfancied subjects (such as mathematics), you could just turn up on matriculation day. 195.128.250.9 (talk) 23:41, 27 February 2009 (UTC)[reply]

derivation of Lorentz transformation

In the following wikibook about special relativity, [4] I am able to understand up to equation 5. But after that, Einstein wrote that Now t can be eliminated from equations (1) and (2) and combined with   and (4) to give in the case where x = 1 and t' = 0:

   (6)   .

But I don't understand how t can be eliminated. Even if I substitute t with  , it cancels out and it ends at  . Please explain the steps very elaborately.--harish (talk) 12:42, 26 February 2009 (UTC)[reply]

Maybe you just misread what they are suggesting. First, you use equations (1) and (2) to eliminate t (if you aren't sure of how to do this read simultaneous equation). Then you get an equation in terms of  ,  and  . Now we set   and use   to put the answer in the form they gave. Vespertine1215 (talk) 17:28, 26 February 2009 (UTC)[reply]
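(A sketch of the algebra, in case it helps, assuming the wikibook follows Einstein's usual derivation in which equations (1) and (2) have the form x' = ax - bct and ct' = act - bx, with the relative velocity of the frames given by v = bc/a; if the wikibook labels things differently, the steps are analogous. Setting t' = 0 in the second equation gives act = bx, i.e. t = bx/(ac). Substituting that into the first equation eliminates t: x' = ax - bc(bx/(ac)) = ax(1 - b^2/a^2) = ax(1 - v^2/c^2). Putting x = 1 then gives x' = a(1 - v^2/c^2), which is the kind of expression quoted as equation (6).)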

Milky Way part two

I finally had the chance to study the Milky Way near Sirius and Orion (new moon + not too cloudy sky). Anyway, I want to verify whether what I saw was it: I detected a faint band of light near Orion. Using toy binoculars, I found that it was made up of small stars that aren't visible to the naked eye. Orion also has several of these minute stars inside him.--Lenticel (talk) 14:20, 26 February 2009 (UTC)[reply]

A simple constellation map is the easiest way to verify this. The band of the Milky Way should pass above Betelgeuse, Orion's left shoulder. Note that you're going to see lots of faint stars in every direction, though, when you use binoculars. — Lomn 16:13, 26 February 2009 (UTC)[reply]
Orion is pretty big, could you be more precise? Orion (constellation) has a map showing where different things are in the constellation, you may be able to identify what you were looking at from that. --Tango (talk) 16:12, 26 February 2009 (UTC)[reply]
The most significant object in Orion is without a doubt Messier 42, the Orion Nebula; it is the single brightest diffuse nebula in the sky, visible even to the naked eye under dark skies. That might be what you saw. The Orion Nebula is at the middle star of Orion's "sword". As for looking at and identifying stuff in the sky, I always recommend this http://www.skymaps.com/downloads.html and this http://www.stellarium.org/. Enjoy. Vespine (talk) 22:09, 26 February 2009 (UTC)[reply]
The Orion nebula wouldn't look like it was made up of stars, though, would it? I believe it contains an open star cluster, but from my recollection of when I observed it through a small telescope (I never quite got the focus right, admittedly), it was more of a diffuse cloud than a collection of stars. --Tango (talk) 22:55, 26 February 2009 (UTC)[reply]
Well, the constellation of Orion and the nearby sky is a region pretty rich in stars. Assuming relatively poor observing conditions, a "faint band of light" could be the Hyades, the nearest open star cluster to the solar system, roughly at the location of Aldebaran (the reddish star opposite Sirius from Orion's belt) in the nearby constellation of Taurus. A little farther in that direction, you might have spotted the most striking open star cluster in the sky: the Pleiades, also in Taurus. Orion's head above, and Orion's sword just below, Orion's belt are other possibilities. Equendil Talk 00:58, 27 February 2009 (UTC)[reply]
Okay, I was able to confirm that what I saw was indeed the Milky Way (although much fainter than in the pictures) using the maps that you provided. I thought Orion's nebula was a star; no wonder it looked fuzzy when viewed more closely. Thanks guys. Now I'll be hunting meteor showers and comets!--Lenticel (talk) 15:43, 27 February 2009 (UTC)[reply]
Lenticel, what you have seen might be part of the Milky Way, but if you have really never seen it, do yourself the favour and try to get out on a moonless, clear night to somewhere well away from any light source. Not sure where in the world you live, but it must be possible with a bit of effort. And leave the binoculars at home. You can't miss it if the conditions are right. 195.128.250.9 (talk) 23:48, 27 February 2009 (UTC)[reply]

opened bottled water shelf life

I tried to search to determine if this question had been asked before, but the search bar turned up quite a few threads and my time is limited. How long is a bottle of water still drinkable after it has been opened and left sitting out at room temperature? I often take one to bed and leave half a bottle on the night stand. I always throw it away the next day and grab a fresh one. I am now wondering if the previously opened bottle would still be good. Thanks and cheers, 10draftsdeep (talk) 14:38, 26 February 2009 (UTC)[reply]

I don't think there is a precise answer. If you got some bacteria in it from your mouth the day before, then it may start to grow in the water. However, it may not find much food there. If you were eating while drinking the water, though, you may have gotten food particles in the water which the bacteria can snack on. This might make it bad the next day. If, on the other hand, you opened it and closed it right up without any bacteria or food getting in, it may be good for a very long time (until the chemicals leaching from the plastic ruin it).
The obvious solution is to dump the water from the bottle and refill with tap water, since that's likely what came in the bottle anyway. The only reason not to do this is if your tap water is really nasty. A bit of lemon helps the taste, in any case. StuRat (talk) 16:27, 26 February 2009 (UTC)[reply]
In the UK at least, mineral water often has a higher bacterial load than tap water in the first place. DuncanHill (talk) 16:55, 26 February 2009 (UTC)[reply]
If you left it for a couple of weeks it would probably evaporate, which would make drinking difficult. Until then, it should be fine. WilyD 17:04, 26 February 2009 (UTC)[reply]
Use a glass. Put the top back on tightly after you've poured your water. If you want to be extra cautious you can use a thermos bag and toss in some ice packs, or use a bucket with some ice cubes. If you finish it the next day it should be fine without, though. (Apart from the nasties already in the water that Duncan mentioned.[5]) If you don't live in a hotel or have tap-water issues like lead or silt, a filter on your tap might be better. Those can have issues, too, though. 76.97.245.5 (talk) 17:21, 26 February 2009 (UTC)[reply]
Buy one of these. Fill it from your kitchen sink. Drink from it as needed. It will be identical to the bottled water you buy in the store. Remember, bottled water is made from your municipal water supply, most likely at a local bottling plant. If you are buying bottled water in a local supermarket, the people who put the water in that bottle got it from the same place your kitchen sink does. In fact, the water from your kitchen sink is probably better than the bottled water, because compounds in the plastic readily leach into the water, especially as the water usually spends some time in hot trucks or on shelves in warehouses. Of course, if you obtain water from a well, as opposed to a municipal supply, you may be using water softeners that make your tap water unpalatable. In that case, bottled water may be a better option. But for most people, it's just a scam. --Jayron32.talk.contribs 03:06, 27 February 2009 (UTC)[reply]
We have Pur water filters on our kitchen taps - they make getting rid of stuff from tap water a breeze. But bottled water is a scam - and a disaster for the environment (plastic bottles don't recycle well - shipping water in trucks is a lot worse than along those handy pipes you have running to your house)...there is no excuse for drinking the stuff. SteveBaker (talk) 03:28, 27 February 2009 (UTC)[reply]

Thanks for all the great advice. I once owned a Brita filter pitcher and have been seriously considering a new one. I guess the answer (going back to my question) is that the water is not going to spoil horribly just because it sits unrefrigerated for a while once opened. Thanks, cheers and happy weekend to all. 10draftsdeep (talk) 13:40, 27 February 2009 (UTC)[reply]

television

I am struggling to get this answer. Who really invented television? —Preceding unsigned comment added by 59.92.247.138 (talk) 14:39, 26 February 2009 (UTC)[reply]

History of television seems to have a lot of information. Algebraist 14:54, 26 February 2009 (UTC)[reply]
You need to specify whether you mean the device or the application as a broadcast medium. Television is a very broad concept and thousands of scientists and inventors contributed to its current incarnations. Pick a suitable definition for your purposes, and you will be able to narrow down which accomplishment "counts" as the invention. Nimur (talk) 16:24, 26 February 2009 (UTC)[reply]
The modern television as we know it can probably be best assigned to the work of Philo Farnsworth. He certainly built heavily on the work of others, but the modern CRT-based television is pretty much identical to one he invented (well, 20th-century TV....) Interestingly, he was about 14 years old when he worked out the basics of it. --Jayron32.talk.contribs 19:43, 26 February 2009 (UTC)[reply]
Sure...but that's the cathode ray tube... Whenever I watch television, (which is rare these days) I do not use a CRT. Between the LCD display and DLP, I'm using 100% CRT-free televisions. In fact, most of the time, for my purposes, the entire broadcasting process does not use a conventional television camera (now replaced by semiconductor-based digital video cameras), conventional RF television modulation (now replaced by digital encoding), or audio processing; this is especially true when I watch "internet television" encoded digitally and broadcast over a packet-switched network. So, no part of my experience relies explicitly on the "television" as it was known fifty years ago. This brings me to my original point - "television" is more subtle to define than just a single device or invention. Nimur (talk) 19:50, 26 February 2009 (UTC)[reply]
I still have 5 CRT TVs and 4 CRT PC monitors in use in my house. Why ? Because I refuse to throw out anything that still works. Some of these are decades old. I have needed some amusing fixes for some, though, like a strategically placed magnet to prevent color distortion in one, adjusting the vertical height to avoid blurry corners in another, and cutting the speaker wires in one and attaching external speakers. Another just needs a good kick whenever it goes down to a single bright line on the screen. And, yes, if you can't tell, I'm partially Scottish (I like to say I have about a fifth of Scotch in me at any given time). StuRat (talk) 04:16, 27 February 2009 (UTC)[reply]
Phooey on LCD displays and CRTs. Just build an electronic scan converter and use a Nipkow disc which Paul Nipkow invented in 1883 to display the image. Edison (talk) 02:01, 27 February 2009 (UTC)[reply]
Reply to Nimur; the deal is, the changes since Farnsworth's original ideas (color TV, high-def technologies (plasma/LCD), digital broadcasts, etc.) represent rather incremental changes to the basic system. Even modern satellite television in 1080i high def on a 52-inch plasma TV runs on the same basic system that Farnsworth put together. It wasn't the CRT, it was the camera-broadcast-display conceptual system that he turned into a practical reality which resulted in television. Yes, these parts had been in place in bits and pieces already, and yes, as Edison notes, earlier television systems had been tried, but it was Farnsworth's system that stuck, and became the modern television system. --Jayron32.talk.contribs 03:00, 27 February 2009 (UTC)[reply]

Einstein's prediction regarding the "weak force"

(Moved from talk page. – 74  16:25, 26 February 2009 (UTC))[reply]

Einstein predicted that the solution to identifying the "weak force" lies in the subject of radiation, where it is part of, or connected to, another of the "standard forces". This points to the Foucault current, which is part of the standard electromagnetic force. Thus these very specific questions: who proved Einstein was in error, and what was eventually identified as the "weak force"? Please, no spamming, guessing or personal conjecture. Slimylimey (talk) 16:13, 26 February 2009 (UTC)[reply]

The weak nuclear force is a fundamental force, it hasn't been identified as being something else. You may be interested in reading electroweak force, which talks about how the weak force and electromagnetism become the same thing at high enough energy levels. --Tango (talk) 17:20, 26 February 2009 (UTC)[reply]

Thanks for responding but I wanted a specific answer. Gauss established the presence of particles (static); applying a time-varying field to the statics equation makes it equal Maxwell's laws à la radiation. Therefore the Foucault current should be identified as the weak force, or the magnetic field generated by it. It is now a tangible force, as Einstein predicted. Why does such a question produce non-specific answers? If the electromagnetic force can be defined as one of the forces, why cannot the reaction to same be identified as the weak force, without any accompanying mumbo jumbo per prior unproven theories of Feynman? To sum up, particles exist at rest on all diamagnetic materials including radiators. An applied time-varying field generates a swirling electrical field (eddy field) that dislodges the particles (accelerates them) by applying an accelerating force with spin for straight-line projection that exceeds gravity, another standard-model force. Slimylimey (talk) 18:51, 26 February 2009 (UTC)[reply]

What do you mean by "particles"? You seem to be using the word to mean something specific, rather than just any particle - are you talking about electrons, possibly? Also, be careful saying "radiation" without clarification - I think you mean electromagnetic radiation, but ionising radiation is also relevant to this discussion (the weak force is very significant in radioactivity). --Tango (talk) 19:46, 26 February 2009 (UTC)[reply]

Bad medical terms give me high blood pressure...

Can anyone explain the origin of the terms hypotension and hypertension ? I get the "hypo" and "hyper" parts (aside from the obvious stupidity in using nearly identical-sounding terms with opposite meanings in a life-critical situation). But, how exactly does "tension" mean "blood pressure" ? (OK, that was the Q, now for a little rant: The terms "low blood pressure" and "high blood pressure" appear to be far superior, both in clarity to all and the lack of potentially fatal medical mistakes. They are also the same 4 syllables as the "proper" medical terms.) StuRat (talk) 19:26, 26 February 2009 (UTC)[reply]

Well, the pressure of blood puts the blood vessels under tension, so I guess it makes a certain amount of sense. I agree that hypo- and hyper- are terrible prefixes. I think doctors speak Greek just to sound clever half the time - they're not the only discipline guilty of that, though! --Tango (talk) 19:41, 26 February 2009 (UTC)[reply]
I believe it's because 'tension' means strain, and both low and high blood pressures cause a strain on the body. I agree, however, that blood pressure is a much better term, and it's used almost solely for patients. However, in an educational context, the official term of hypo/hypertension is used much more often. —Cyclonenim (talk · contribs · email) 20:11, 26 February 2009 (UTC)[reply]
Just for interest's sake, the OED counts one of the meanings of 'tension' as precisely 'pressure', especially in biological or medical contexts. So, it doesn't really need to be interpreted literally in this context. Vespertine1215 (talk) 22:18, 26 February 2009 (UTC)[reply]
OK, this explains why hypertension might mean "high pressure", but there's still no reference to blood. So, how, other than memorizing that this is what hypertension refers to, would one know that ? It could just as well mean pressure in the bowels, for example. How'd we end up with such an imprecise terminology for these conditions ? StuRat (talk) 23:43, 26 February 2009 (UTC)[reply]
Also, medical terminology is purposely outdated. That keeps it from changing. A hundred years from now, hypertension should still be called hypertension. It wouldn't be good if your diagnosis was misunderstood in ten years because terminology had drifted with whatever cool rap song recently came out, and the new doctor had no clue what "blood pressure" was because he only learned about "blasizzle" in school. -- kainaw 23:21, 26 February 2009 (UTC)[reply]
It's also nice to have internally-consistent terminology. 'Hyper' always means 'high' or 'too much': hyperhidrosis - too much sweating; hypervolemia - too much fluid in the circulatory system; hyperthermia - high body temperature; hypertension - high blood pressure; hyperkalemia - elevated potassium. 'Hypo' always means 'too little' or 'low': hypohidrosis - too little sweating; hypovolemia - not enough blood volume; hypothermia - low body temperature; hypotension - low blood pressure; hypokalemia - low potassium levels. Everyone has a nice, consistent, easy-to-remember set of terms, and everyone uses the same ones. There are a number of other standard prefixes and suffixes, which allow for specifically and precisely describing medical conditions in a reasonably concise way that – and this is key – all medical practitioners will interpret in the same way. TenOfAllTrades(talk) 23:39, 26 February 2009 (UTC)[reply]
"High" and "low" also have precisely consistent meanings and yet don't sound alike. They are also easier to abbreviate, as in HDL and LDL. StuRat (talk) 23:48, 26 February 2009 (UTC)[reply]
"High" and "low" are neither Latin nor Greek. They aren't even French. --Carnildo (talk) 00:51, 27 February 2009 (UTC)[reply]
You could go for "Hi-" and "Lo-", then at least it isn't English! --Tango (talk) 01:06, 27 February 2009 (UTC)[reply]
Well, I think you are being prejudiced. We should use 上 and 下. They haven't changed for few thousand years. (In case you don't have Chinese characters installed, they are the Chinese characters for up and down used to point out the reason that a dead language was chosen). -- kainaw 01:49, 27 February 2009 (UTC)[reply]
I'd be happy to use any language that doesn't have a tendency to use nearly identical-sounding words for opposites. StuRat (talk) 03:25, 27 February 2009 (UTC)[reply]

Well, then you'd have to stop using English (at least), right? There are (obviously) dozens (hundreds, perhaps?) of words that, while not precisely opposites, sound alike but mean different things, or are spelled alike but mean something different. At any rate, it would seem to be moot, because in ordinary conversation, you probably already say "high blood pressure" or "high blood sugar". Healthcare professionals are most likely to use the "jargon", and they can tell the difference pretty easily. —Preceding unsigned comment added by Brewfangrb (talkcontribs) 08:52, 27 February 2009 (UTC)[reply]

We don't have to use English, or any one language, we should use whatever language is best in each case. As for it not being a problem for medical professionals, you're probably right in that it's only a problem for their dead patients and malpractice insurance providers. There are thousands of deaths each year due to medical mistakes, and I'd bet a substantial portion are due to mishearing or misreading medical terms. Can anyone find any statistics on this ? This is similar to the recent heparin overdose problem, where different dosages were stored in similarly labeled containers. And yes, "medical professionals should be able to tell the difference", but it's also inevitable that easily confused labels or terms will occasionally be confused, even by those who should know better. StuRat (talk) 15:34, 27 February 2009 (UTC)[reply]
It would be interesting to post this at the languages RD for response by someone who knew something about linguistics (Which I've done, BTW) . Seems to me that, spoken correctly, without panic, hyper- and hypo- sound rather different, since completely different mouth movements are required to make the vowel sounds. To my mind, stuff-ups occur from stuff like this because inexperienced people get flustered easily, talk too quickly & imprecisely, and don't check what they're doing (or about to do). One of the reasons doctors spend interminable years in training is to learn to deal calmly with situations most people freak out over. All that running around and yelling "STAT" a lot like they do on ER is not really what it's like. Mattopaedia Have a yarn 02:57, 28 February 2009 (UTC)[reply]
You've reposted this Q on the Language Desk ? Can you provide a link ? If we could guarantee that every medical professional always said and wrote everything clearly, you might have a point. However, doctors still can't write a prescription that patients can read, and less experienced medical professionals, like interns and nursing aides, are also involved. So, in the real world, we need words that are difficult to confuse. StuRat (talk) 14:22, 28 February 2009 (UTC)[reply]
I found it here: Wikipedia:Reference_desk/Language#Similar_sounding_terms.3F. They seemed to agree with me that the words look similar and sound similar in many accents. StuRat (talk) 15:47, 28 February 2009 (UTC)[reply]
So unless we listen carefully, whatever language we speak (except maybe for ancient Greek), there's enormous room for error, it seems. Handwriting is a separate issue, of course, and another of my professional bugbears. Penmanship is a dying art, and years of frantically scribbling notes in lectures and on wards produces the typical result. The advent of the electronic medical record may herald the end of this problem, though. Mattopaedia Have a yarn 02:48, 2 March 2009 (UTC)[reply]

Well, we seem to have beat the "hyper"/"hypo" part of my Q to death, but I'd still like to know why there's no mention of blood, in any form, in the words hyper- and hypotension. Any ideas ? StuRat (talk) 03:22, 2 March 2009 (UTC)[reply]

Because they mean "high pressure" and "low pressure", not "high blood" and "low blood". You can have hypertension because of hardened arteries, high stress, excessive heart rate, etc... all of which are cardiovascular, but not necessarily "blood". -- kainaw 05:15, 2 March 2009 (UTC)[reply]
Well, of course "pressure" should ALSO be in the name, and "tension" apparently does have that meaning, in a medical context. I don't believe the causes of the disorder are relevant, either, as disorders aren't generally named after their causes, but rather after their symptoms. So, let me ask the question so you can't possibly misinterpret it this time:

'Why don't we use the terms "hypohemotension" and "hyperhemotension" to denote low and high blood pressure, since "hypotension" and "hypertension" could refer to low and high pressure in any body system ?' StuRat (talk) 15:09, 2 March 2009 (UTC)[reply]

Because cardiovascular hypertension is by far the most common form of hypertension, it is normal to drop "cardiovascular". When discussing other forms of hypertension, the type of hypertension is included, such as ocular hypertension. Of course, if you are discussing ocular disorders, you will likely drop "ocular", as it will be implied. -- kainaw 15:29, 2 March 2009 (UTC)[reply]

"Treatment" for Aspergers Syndrome? edit

Ok, no medical advice sought, none given. Peachy, but are there treatments recognized for the above? Sensitivity training aside, I'd look it up myself IF I had time (STEVE BAKER HELP ME HERE>>>PLEASE!!!). I know that there is a syndrome for everything nowadays but it appears to simple little me to be the same as cynicism with a healthy dose of paranoia and compulsive behaviour thrown in? Please excuse my ignorance if it is otherwise. Furthermore, what if it's just a case of trying to care and then getting worn down? What if I just tried too hard to care too much and couldn't carry it off as I age? Messiah complex? I tried to save the world, the world wasn't interested! Furthermore AGAIN, what if I can express emotions, especially those that I don't feel (empathy especially)? Practice, practice, practice! But I did want to be the first Canadian Pope when I was young BUT I settled for being an environmentalist (both equally futile ambitions BTW). I care, for the whole planet, my fellow man and the cosmos as we know it. Ok, one last quick Q? Would this then not make me prone to addictions to escape what I see as my dark side, or is there a genetic connection there TOO (I'm Metis, I inherited the worst of both worlds and the good of ????).

Sorry but many of the above q's are rhetorical, but the opinion of others is always welcome, if not appreciated. Again sorry if I come across as arrogant. I'm not, IMHO. —Preceding unsigned comment added by 67.193.179.241 (talk) 22:25, 26 February 2009 (UTC)[reply]

I'm sorry, I don't really know what you're asking. That was a rather incoherent stream of consciousness... As for your main question, have you read Asperger's syndrome#Management? I don't think there is any treatment beyond managing the condition and learning how to cope with it. I'm sure Steve can give you a more authoritative answer once he gets here. --Tango (talk) 22:52, 26 February 2009 (UTC)[reply]


Regarding treatment, "Some researchers and people with AS have advocated a shift in attitudes toward the view that AS is a difference, rather than a disability that must be treated or cured." as our article says. So why have a label, then? One reason is to be able to identify individuals with needs that are not met by the training and facilities offered by the standard systems. Programs facilitating the "management" Tango mentioned might not be offered if there wasn't a label. Another is job security. A boss could send someone to a packed business meeting to present group results. Ordinarily, if someone said, "I can't deal with crowds," said boss would have the right to say "Get over it, or leave." An Aspie can't just get over it. If the boss knowingly put someone in a situation they couldn't handle, the employee could take measures like complaining to a level higher up or to HR, or sue if they were fired. As Steve pointed out in an earlier post (see the archives), the unique workings of a mind that fits under the "Asperger" umbrella can also offer a lot of benefits. Being left-handed used to be considered an aberration that you could "treat" people for. Thank goodness the people who consider homosexuals as just "choosing" to want to be different (and/or needing to be treated) are getting fewer and fewer. There are people who find it hard to cope with the fact that we humans come in many varieties, and those who don't fit the mold don't need to be beaten into shape. 76.97.245.5 (talk) 00:30, 27 February 2009 (UTC)[reply]

Sorry, but doesn't that ALLOW us, as ASPIES, the right if not the privilege of not "trying" to change? Sorry again, but labels seldom fit in the space allotted to humankind? Again my apologies if I seem condescending, I AM trying to understand!!!! I'm honestly trying just to understand/comprehend my place in this dark old world. Because I may be eloquent or verbose (or too drunk to type apparently!). Please don't take me too seriously, I don't. I'm not looking to be ENABLED, I'm just looking to "be myself". And no, thanks for your concern, BUT I AM NOT CONSIDERING SUICIDE!!!! Cheers for NOW, must sleep. Shiftwork sucks, as do 12 hr shifts!! See you all in the dreamworld but Tango, we'll see you sooner (in my nightmares...)! Sorry, my thanks for your prompt replies, I KNOW you are only trying to help. Myself? I try not to hurt the other animals, the earth, people/plants or fungi but I find myself "not up to the task" repeatedly. I can only hope that I haven't offended anyone's "common sensibilities" with my diatribe and drunken drivel AND in the END I've done more good than harm. Cheers Vin. Save page before I dribble on. —Preceding unsigned comment added by 67.193.179.241 (talk) 01:48, 27 February 2009 (UTC)[reply]

First and foremost - this is a 'spectrum' disorder. At one end normality - mild geekishness - then "Aspergers" - then more severe stuff through to profound, terrifying Autism. There are no definite limits...no solid definitions...so the labels don't help much. There is a point on that spectrum where the problem is extreme and any benefits are irrelevant. Having a cure available for those people would be a wonderful thing. Equally there is another point where a cure is not only unnecessary - but actually aggressively NOT wanted. That's because it's not a one-sided condition. There are benefits. And I'm also left-handed...which is a similar deal.
So where is that line? It's different for everyone. I for one don't want to be "cured" - since I've learned where my limitations are - and that I can do a reasonable (but imperfect) job of working around them - I'll take the benefits anytime. But I can't (and won't) speak for everyone - because we're all at different points on the line.
One of the defining features of Aspies is that we can't automatically figure out what other people are thinking...that's one of the things that makes big meetings so uncomfortable. Strangers are also a problem because you don't (yet) understand them - and they don't understand you. But we aspies are good with 'mechanical' things - things with rules - things that behave predictably. The trick is to understand the rules of the interpersonal-relationship game. So if you have to go to meetings - give yourself things to do - watch the other participants' body language - respond to it. Find the people you are agreeing with - now play this silly game where you fold your arms when they fold their arms...lean back when they lean back. Wait a few seconds after they change posture - then mirror it...look at the other people and their poses - did the guy just change his mind and switch 'body-language' teams? It seems stupid - but this is what normal people do - they don't KNOW they're doing it - but you can learn to fake it. You just can't do it unconsciously and automatically. A meeting is a process - treat it like that. You have tasks - taking notes - formulating replies - handling body language. This is a mechanical process, like programming a computer, and you can do it.
Your place (well, MY place - at least) is to do what you can to fit in - because that's what makes life smoother. Perhaps you shouldn't have to - but there is little point in trying to fight the system. So fake the 'fitting in' thing - then use your abilities to get what you want.
If you are on the other side of the line - where your downsides are too severe to allow you to function in society - my heart goes out to you. All I can say is: Get Help. There are people out there who can coach you in the interpersonal skills stuff that you lack. Unless you are falling severely into the Autism end of things, you CAN make your life work out better by using your ability to get obsessional and use that laser-like focus to get interested in figuring out what makes humans tick. The right trainer can tell it to you like it's a mechanical system...how often to make eye contact - how to mirror other people's body posture when you agree with them - and don't mirror when you don't - how to recognise the facial expressions and body postures that tell you that you're talking for too long on a subject that nobody else cares about.
It's just like you learned the latin names of ALL the dinosaurs when you were a kid...it's a pile of arbitrary stuff that you can get into. All of these things can be turned into formulaic rules - follow the rules and it's like magic. Sure you're only PRETENDING to do what the 'normals' do magically and without thinking...but that doesn't stop it from working. So revel in what you can do - use those special talents. Stop feeling sorry for yourself. Sure you'll screw up - lots of times. But you're lucky - you know! I didn't find out until I was already in my mid 40's for chrissakes! I look back on some of the TERRIBLE faux-pas I made in my teens, 20's and 30's...it makes me shudder! But now I know and understand the problem - I can use my skills to overcome the worst of the difficulties.
If our OP wants to email me personally (you'll find a way to email me on my Talk: page) I'll be happy to chat in a calmer, less public environment. SteveBaker (talk) 03:18, 27 February 2009 (UTC)[reply]

Validity of Wikipedia

So for my Inquiry class I am required to give an oral presentation on a topic of my choice (as long as it's science related). Pretty much everyone in my class bashes Wikipedia, saying how it's so inaccurate, blah blah blah, so I'm thinking about giving a presentation on the validity of Wiki (not as a substitute for scientific journals, obviously, just as an equivalent source to other encyclopedias). However, there's an obvious problem; most of the evidence for this (at least the stuff I can find) is on Wiki itself, which is not considered a valid source (there's been a lot of talk about paradoxes on the ref desk lately and clearly this is one of them, but I can't remember what kind). Is there substantial evidence supporting Wiki that is not on Wiki? There's also the problem of my prof and my peers tearing my argument to shreds immediately following the presentation, and that seems even more difficult to avoid. Any advice? -Pete5x5 (talk) 23:52, 26 February 2009 (UTC)[reply]

Wikipedia has an article on everything, including Reliability of Wikipedia. Particularly pay attention to the reference section - (theoretically present in every article) - in communities which do not respect Wikipedia outright, reliable third-party references will still carry weight in the discussion. Be sure to emphasize that Wikipedia is an encyclopedia and is not a general replacement for research journals, newspapers, and other types of references. The most general complaint I hear levied against Wikipedia is that the anonymous contributions result in non-trustworthy, non-expert opinions. Consider refuting that argument from a sociological standpoint. Compare various definitions of anonymity, and the construction of trusted networks (i.e. a "peer review" process is typically performed by people you do not know - why do you trust them?) This actually has quite a bit of subtlety, if followed to its logical end. ("Reputable journals are trustworthy because they are reviewed by reputable people..."). Such reasoning, I believe, inevitably devolves into an admission that anonymous sources must be trusted, to some extent; and that the majority of the burden lies with the consumer of information to really evaluate quality and credibility. As for the expertise of the anonymous sources... isn't it self-evident? Nimur (talk) 00:39, 27 February 2009 (UTC)[reply]
Take a look at ANY of these articles, and challenge your audience to find the flaws. They are scrupulously referenced (look for footnoted references in Britannica sometime, eh?) and well written. Ultimately, what makes Wikipedia work as a reliable source is the huge emphasis we place on verifiability, especially with specifically footnoted statements. Now, take the same audience, and ask them how they would grade a scholarly paper which was referenced to Encyclopedia Britannica and World Book Encyclopedia. They would laugh the student who wrote that out of their classes. And Britannica and World Book don't tell you their original sources, so you can't find the good stuff yourself. Wikipedia does. So which is a better research tool? Britannica or Wikipedia? See where I am going with this. People often cite some shitty unreferenced article about a soap opera character or a garage band or a train station and say "See, this is why Wikipedia sucks". But show them the text of a Featured Article, and don't tell them it's from Wikipedia, and they will likely say it's great writing and research. --Jayron32.talk.contribs 02:52, 27 February 2009 (UTC)[reply]
If you can get access to Nature online, this article is spectacular. Wikipedia compared favorably with Britannica online in 50 randomly selected articles (er....I think that was the gist of it. I actually don't have access to the article myself, and it's been some time since I read the original). --Shaggorama (talk) 04:20, 27 February 2009 (UTC)[reply]
This has details about the Nature findings. Clarityfiend (talk) 04:40, 27 February 2009 (UTC)[reply]
It's not that spectacular... there have been many concerns with whether it was a meaningful study. See Wikipedia:External_peer_review#Nature_.28December_2005.29. IMO it doesn't say much about how great Wikipedia is, it just points out that EB and other encyclopedias aren't exactly perfect either. --98.217.14.211 (talk) 14:31, 27 February 2009 (UTC)[reply]
Wikipedia isn't meant to be cited as a source, but if you really, really want to reference it as a source you should probably specify a time-stamped version of an article rather than the latest version. It may have a mistake that's been corrected later, but at least you'd be specifying what you saw. Dmcq (talk) 16:09, 27 February 2009 (UTC)[reply]
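(For illustration only: the "Permanent link" in an article's toolbox gives such a time-stamped version, in the standard MediaWiki form http://en.wikipedia.org/w/index.php?title=Mousetrap&oldid=123456789 where "oldid" is the revision number; the title and number shown here are hypothetical examples, not a real citation.)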
As I have experienced, Wikipedia is excellent for science, but bad at history and politics, with all those ninja-editors: "let's alter statistics in every article to make our country/religion/ideology look more favorable, and do it in a group so we can accuse our reverters of 3RR breaking. If someone argues against us, we label them as nazis, racists, etc."
Besides this, however, Wikipedia is very good and trustworthy in science-related articles, or, every type of article that is not heavily based on political interests. --131.188.3.21 (talk) 16:24, 27 February 2009 (UTC)[reply]
The trouble is that while FAs are excellent, there are only 2,400 of them...out of 2,800,000 articles - that's less than 0.1%. For the rest, readers need to exercise some care. Sure, there are statistical studies that have been done that show that we are at least as accurate (on a random sampling) as the other encyclopedias out there. But when you are standing in a large university library looking for a specific scientific fact - you DON'T start off by reading Encyclopedia Britannica. You go to the stacks - look in the correct Dewey-decimal section and start browsing books. Encyclopedias are for non-critical stuff...and for that, Wikipedia is head and shoulders above the rest - for breadth of coverage - for documentation of sources - and for traceability of authors.
Take some examples:
  • If you are watching a movie on TV and you wonder who some actor is - Wikipedia is a spectacularly good resource because even if there is an occasional error in who played who in what movie, it's hardly a critical thing.
  • If you are a Doctor looking up treatments for obscure diseases - then you'd better not use what you read here as the basis of a life-or-death decision. In that case you should use Wikipedia ONLY as a means to find links to 'reliable sources' that you can check.
It's still incredibly useful...but not as a primary source. Somewhere between those extremes is the place where things get tricky. Suppose you are a fact-checker for a newspaper - you might be happy to use Wikipedia to confirm things that you already pretty much believe to be true...and if you're really not certain - you can check the references. In the end, you simply have to use your best judgement. You can see with most articles whether they appear solidly written or not - you can check the sources - you can look back at the edit history and see how long the particular fact you are interested in has gone unchallenged - whether it's been the subject of repeated revisions or revert wars. Heck, you can even find out who precisely wrote every single word in every single article and you can go to that person's talk page and ask them how they know that...and you have a better than even chance of getting a good reply too! The breadth of coverage is now so great that there are frequently multiple articles written by different people with overlapping coverage - you can follow links and confirm that all of the versions of the same information agree. There is no other resource on the planet that gives you those options. So Wikipedia is DIFFERENT. You have to learn how to use it well...and (most importantly) when NOT to use it!
SteveBaker (talk) 16:56, 27 February 2009 (UTC)[reply]
As for profs and peers tearing your arguments apart, it is best to be prepared. While coming up with your talk, at the same time come up with rebuttals for all your main points by playing the devil's advocate. Then come up with good counter-arguments to those. By doing this you will know the strong and weak points of your argument beforehand and will be able to defend your viewpoint without having to come up with arguments on the spot. When it comes time to talk, if you can't keep it all in your head, bring a list of quotes with the sources from references. It may be troublesome if your peers use anecdotal evidence as counterarguments, because it is easy to cherry-pick examples where something has gone badly wrong (i.e. stumbling onto the Evolution page after someone replaced it with the first chapter of Genesis). I'm not sure what you can do about this other than point out that arguing from one or two examples isn't a solid argument.
One of what I think is Wikipedia's strongest points is its networking: the unparalleled-by-other-information-sources connection of related or pertinent topics to one another through Wikilinks, see also sections, categories, and lists. It makes searching for related topics much easier. Like Steve Baker said above, Wikipedia is transparent. You can see who worked on what and all past versions of the article. It is also in real time; many Wikipedia articles can and do keep up to date with news and findings. Sifaka talk 03:10, 28 February 2009 (UTC)[reply]

Those are all very good points. I hope I actually can do my presentation on this (is a study of evidence scientific enough? We'll find out!) because I'm convinced that I can sway a lot of people's opinions. I may put a presentation together anyways, just to shut people up. They'll tell me that Wikipedia's BS and I'll be like "please watch my 4 minute powerpoint and you may just change your mind.." -Pete5x5 (talk) 07:40, 28 February 2009 (UTC)[reply]

Tut tut, WP:NPOV :) Dmcq (talk) 18:39, 28 February 2009 (UTC)[reply]