Wikipedia:Reference desk/Archives/Computing/2010 May 25

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


May 25

No audio on some songs

All of my music was working fine yesterday, but now some songs have no audio. Was there some update to Windows Media Player recently that could've done something to them? What should I do? 174.52.133.133 (talk) 00:38, 25 May 2010 (UTC)[reply]

Setting up a file server with a RAID

My plan is to put a couple of 2TB drives into a Dell (I think it's a Dimension 5150 or some such) and use this as a media server. I want to RAID the drives together for safety's sake. I have a copy of Windows (I'm 99% sure it's XP) to put on the system. Will Windows be able to set up the RAID? Or will I have to get some third party software for this? I've only set up one RAID before and it was in a Mac (which is where most of my computer experience lies). Dismas|(talk) 01:34, 25 May 2010 (UTC)[reply]

Windows XP Pro appears to support only RAID 0 out of the box, which is not what you want; see this link and our Software RAID article section. Comet Tuttle (talk) 02:58, 25 May 2010 (UTC)[reply]
From that, it looks like I'll want to set up a hardware RAID then, using a RAID PCI card. Do you agree? Dismas|(talk) 03:50, 25 May 2010 (UTC)[reply]
If you are indeed running a 5150, it appears that the computer may already support RAID 1 in the BIOS, so you don't need to go to software RAID. See pages 27 through 30 in the manual, which I found here. By the way, Nil Einne and some others here on the Reference Desk have been saying "RAID is not a backup strategy" for a while; see this link, this RD post, and the bottom of this RD post, if you're interested in such advice. Personally I have never set up a RAID; for desktops, I use 2nd or 3rd hard disks and set up nightly or every-other-nightly backups. Comet Tuttle (talk) 18:32, 25 May 2010 (UTC)[reply]
I used to run a RAID system a while ago and concluded that it was infinitely better to have a decent backup strategy than to run RAID. It's not impossible that the RAID hardware will fail and you'll be unable to get another of the same spec - thus making both drives unreadable. The PC could overheat and burn, thus losing both disks. Someone could break in and steal it. Depending on how valuable your data is, portable hard drives and a remote fire safe (mine's in a shed in the garden) will give much better security. --Phil Holmes (talk) 16:16, 25 May 2010 (UTC)[reply]
CD-Rs and DVD-Rs aren't a great backup strategy either, btw. They have a shelf life of 5-30 years when stored properly (our article doesn't show the time frame, but it does talk about the life expectancy). --Wirbelwindヴィルヴェルヴィント (talk) 18:19, 28 May 2010 (UTC)[reply]

PCI IDE controller for main HDD?

I have an older, single-HDD Dell with failing IDE controllers on the mobo. Would it be possible to just get a PCI IDE controller and plug the drive into that? Would I be able to boot XP off it in the same way that a PCI video card works at POST, or would drivers from the OS have to be loaded before it started working? 128.151.32.169 (talk) 03:26, 25 May 2010 (UTC)[reply]

The model of card I have seen (this was several years ago and had a Silicon Image chipset that supposedly supported RAID) could be used for the boot hard drive. To get XP installed, you might need to copy some files from the driver CD (including one called "txtsetup.oem") onto a floppy disk (yes, it has to be a floppy disk, not a flash drive or CD). I am not sure about what you would have to do if XP is already installed on your hard drive. PleaseStand (talk) 04:10, 25 May 2010 (UTC)[reply]
Sweeeeet. Thank you. 128.151.32.169 (talk) 18:11, 25 May 2010 (UTC)[reply]

Do I need several versions of .NET installed on my computer?

I appear to have MS .NET Framework 1.1, 2.0, 3.0, and 3.5 all installed on my XP computer. Do I need all of them? 92.28.244.102 (talk) 11:30, 25 May 2010 (UTC)[reply]

Unfortunately I can't give you an exact answer, as it depends on what applications you're using and whether those applications require a particular version of the framework. Unless you're really short of disk space, I would just leave them; then if anything tries to use them, it'll load without a problem. Otherwise, you can always uninstall them and reinstall later if you find you need one. ZX81 talk 11:56, 25 May 2010 (UTC)[reply]
To make explicit a point that ZX81 is implying: none of those versions is a superset of the previous ones, so 2.0 does not include 1.1. If you have .NET 2.0 only, then you can't run an app requiring 1.1 until you download and install .NET 1.1 as well. Comet Tuttle (talk) 20:55, 25 May 2010 (UTC)[reply]
You won't be able to tell unless you know which programs depend on which version. I have had the distinct pleasure of using other computers and finding that various versions are not available. It is absolutely correct that having 3.0 is not inherently 'better' than 2.0. Freedomlinux (talk) 21:15, 29 May 2010 (UTC)[reply]
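If you want to see which versions are actually on a machine, each .NET Framework version installs into its own folder under %WINDIR%\Microsoft.NET\Framework (Framework64 on 64-bit systems), so listing that directory answers the question. A minimal sketch in Python, assuming a standard Windows install location:

import os

# each installed .NET Framework version gets its own "vX.Y" folder
framework_dir = os.path.expandvars(r"%WINDIR%\Microsoft.NET\Framework")
versions = sorted(d for d in os.listdir(framework_dir) if d.startswith("v"))
print("\n".join(versions))  # e.g. v1.1.4322, v2.0.50727, v3.0, v3.5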

Help with codes..

Hey! I'm a beginner in signals and systems. I'm trying to write some simple MATLAB code for uniform and non-uniform quantization. I've searched, but most of the code I find is quite complicated, using the Gaussian method, and doesn't give me any clue to work from for my own code. Can anyone provide a link to some simple code for sampling and quantization? Any help will be well appreciated. Thanks --221.120.250.76 (talk) 11:45, 25 May 2010 (UTC)[reply]

One way, for example for A-law coding, is to quantize uniformly first, then apply a function to the number to get a new value that is non-uniformly quantized. This could use a look-up array if it needs to be fast, or else a slower, more complex calculation. Graeme Bartlett (talk) 10:17, 26 May 2010 (UTC)[reply]
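A minimal sketch of the idea in Python/NumPy rather than MATLAB (it uses μ-law instead of A-law, and compresses before the uniform quantizer then expands afterwards - the standard companding arrangement - but the few lines translate almost directly to MATLAB):

import numpy as np

def uniform_quantize(x, n_bits):
    # midtread uniform quantizer for signals in [-1, 1]
    step = 2.0 / (2 ** n_bits)
    return np.clip(np.round(x / step) * step, -1.0, 1.0)

def mu_law_quantize(x, n_bits, mu=255.0):
    # compress, quantize uniformly, then expand back (companding)
    compressed = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    q = uniform_quantize(compressed, n_bits)
    return np.sign(q) * np.expm1(np.abs(q) * np.log1p(mu)) / mu

t = np.linspace(0.0, 1.0, 8000)
x = 0.1 * np.sin(2 * np.pi * 50 * t)             # a small-amplitude test tone
print(np.abs(x - uniform_quantize(x, 4)).max())  # uniform: coarse near zero
print(np.abs(x - mu_law_quantize(x, 4)).max())   # mu-law: finer near zero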

reviews of day-to-day mobile and semi-mobile living with a 13, 15, or 17-inch MacBook Pro?

Hi, I wonder if there are any real-life reviews of what it is like to work and play with a 13, 15, or 17-inch MacBook Pro in terms of weight and size. I am considering all three; for me it seems like bigger is better, but then who knows what it is like to actually use a big size like 17 inches in all kinds of situations - working outdoors, in cafes, lugging the thing around, and so on. Are there any reviews of what using each of the three is actually like? Thanks. 84.153.238.49 (talk) 12:19, 25 May 2010 (UTC)[reply]

Nearly all the laptop reviews I have ever seen make some comment about a machine's usability on the road and its weight. Here is my personal experience: I don't know about MacBooks in particular, but my previous employer issued me a laptop with a 15 inch screen. I took it with me on many business trips and took it home most nights in case I needed to provide out-of-hours support. Even though it came in at a relatively light 2.6 kg (5.7 lb), the bag and light-ish power block pushed the weight of the whole package closer to 4 kg (8.8 lb). Quite frankly, I got tired of lugging it round, to and from the car or to and from the airport. So, when it came time to buy my own laptop, I deliberately selected a lightweight laptop with a high-res 13 inch screen. I just couldn't imagine lugging round some 17 inch monster weighing in excess of 6.5 kg (14 lb) once in its bag. Unless you are heavily into gaming or video editing, I would choose weight over performance almost every time. Astronaut (talk) 17:45, 25 May 2010 (UTC)[reply]
Agree 100%. The extra space isn't worth it 99% of the time, and you pay for it dearly in terms of size and weight. If you do need large screen space, you are better off buying an external monitor that you keep at your primary work station. --Mr.98 (talk) 01:27, 26 May 2010 (UTC)[reply]
Here's a review that talks briefly about the size differences. Going from the 15 to 17 inch, you don't get any better performance, but rather a larger screen and more ports. I certainly wouldn't recommend that if you'll be hauling it around frequently. The reviewer appears to prefer the 15 inch, but he also doesn't have to lug it around a lot. Obviously, it depends on how much you travel with your laptop, and also what you use it for. I chose a lighter 13 inch laptop for school, and then it turns out that it just sits on my desk in the dorm room all day (I don't find it useful in class), so I could have gone for a bigger one. The screen does feel a bit small at times, though overall it works well. On the other hand, when I do travel with it, I am glad that I don't have a bigger one. If you go smaller than 13 inches, you can start to run into usability problems (cramped keyboard, small screen, etc.). I largely agree with Astronaut though: if there's not a reason to get a bigger one, don't. Buddy431 (talk) 18:04, 25 May 2010 (UTC)[reply]
I'm on my third 15 inch Mac laptop, so my preference is clear. 15 inch is OK for most forms of travel, but barely usable in coach when flying. The only other situation where I would prefer a smaller one is on a longer cycling trip - even with the very compact Tom Bihn bag, it fills nearly half a pannier bag. --Stephan Schulz (talk) 18:13, 25 May 2010 (UTC)[reply]
(edit conflict with above) Here's an overview comparison from Apple (doesn't talk about usability and travel, though). For me, the prices are striking. It costs $500 more for the 15 inch over the 13 inch, and another $500 from the 15 inch to the 17 inch. Depending on your budget, that could be something to consider (I'd never pay that much for a laptop with those specs anyway, but that's your business). Buddy431 (talk) 18:16, 25 May 2010 (UTC)[reply]


Buddy431 - It's what Microsoft calls the 'Apple Tax'.... :) Chevymontecarlo 18:50, 25 May 2010 (UTC)[reply]

To summarize: the specs tend not to change much, the price does, and portability goes down as weight and size go up. So, if you must have a larger screen, you go bigger and pay for it in cash, size, and weight. If you can manage the small(er) screen, get the 13; it's cheaper, lighter, and smaller. By the way, something no one has mentioned is battery life. I don't know, having only a Mac Pro but no MacBooks, but I assume that the smaller the screen, the longer the battery life - unless the bigger ones have larger batteries. As for my preference, I work on a 9 inch netbook! :) Mxvxnyxvxn (talk) 05:31, 28 May 2010 (UTC)[reply]

is it true that CPU performance isn't improving anymore?

Is it true that, discounting more and more cores, the raw performance of CPUs just isn't increasing by the amount it had been, say, between 1995 and 2005? Is there some kind of performance graph someone could show me with the very fastest/highest-performance general-purpose PC CPU core (like Intel or AMD) and how this "fastest" performance has changed over the past fifteen years? That way it would be easy to see, at a glance, if performance increases (the curve upward) really are tapering off... Thanks. 84.153.238.49 (talk) 14:27, 25 May 2010 (UTC)[reply]

I like this question! (disclaimer: my knowledge of this side of computing is a little weak, so the low-level stuff is kinda vague, and maybe even wrong) Process improvements in chip manufacturing have led to smaller and smaller transistors. This makes it possible to put more and more transistors on a chip, but doesn't have too significant an effect on how many times each transistor can accurately switch per second. So, in order to make processors faster, it becomes necessary to trade the extra space gained for time somehow.
If it's not possible to get an instruction done in less time, it becomes necessary to increase the number of instructions executed at a time, by breaking instructions into small chunks and executing chunks from multiple instructions at once. This is called instruction pipelining and it has been in use since the 1970s. As you might imagine, it's a complex process, since one instruction might depend on the result of another instruction, and both might be in the pipeline at the same time. With more space, you can be doing more things at once, but eventually you reach a limit. You can only break down instructions so much before the tiny chunks stop making sense to execute independently, and you can only have so many instructions in flight at a given time before inter-instruction data dependencies slow everything down by introducing pipeline bubbles all the time.
So the makers of CPUs have moved the problem up to the programmer's level by using the extra space to provide more cores. On a four-core machine, you can have four totally independent things happening at once. The catch is that they have to be independent. Now it's the programmer's (or programming language designer's or compiler writer's) responsibility to break up tasks so that four (or eight or sixty-four) tasks are available that don't depend on each other's output.
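To make that concrete, here is a minimal sketch in Python of splitting a job into four chunks that share no data - exactly the independence the programmer now has to guarantee:

from multiprocessing import Pool

def crunch(chunk):
    # stand-in for real work; no chunk depends on another chunk's output
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1000000))
    chunks = [data[i::4] for i in range(4)]    # four independent slices
    with Pool(processes=4) as pool:
        partial_sums = pool.map(crunch, chunks)
    print(sum(partial_sums))                   # combine the independent results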
See Moore's Law for more information on the blistering pace of increase in transistor density. Paul (Stansifer) 15:32, 25 May 2010 (UTC)[reply]
I found this old news item about a multigate device: [1]. Ten years later, are we using these vertical transistors or anything like them? The wiki article says "the industry predicts that planar transistors will reach feasible limits of miniaturization by 2010". Did that happen yet? 213.122.2.195 (talk) 17:02, 25 May 2010 (UTC)[reply]
It's instructive to look at the spectacular rate of improvement in graphics processing units (GPUs) compared to central processing units (CPUs). Because graphics are very parallelizable, it's easy to use increased silicon densities to add more processors and get an almost linear improvement in performance. You can do the same thing with CPUs - just increase the number of cores - but because the work of most software isn't parallelizable, you can't easily split it over multiple cores. While many important CPU-software tasks can be rearranged to run on two, three or four cores, there aren't many pieces of software that could take advantage of (say) 100 cores. So simply pushing the number of cores up doesn't really help. The alternative is to use that silicon area to build ever cleverer ways of executing code so that it runs faster - but those things are notoriously error-prone and horribly expensive to engineer. So yes, CPUs are hitting something of a bottleneck. The smart money (IMHO) is going into pushing more of the heavy workload onto the GPU - leaving the CPU to do things that are strictly impossible to parallelize. Things like nVidia's CUDA library are helping that process along immensely. SteveBaker (talk) 18:51, 25 May 2010 (UTC)[reply]
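As a toy illustration of why graphics parallelize so well, here is the per-pixel pattern in Python/NumPy: every output pixel depends only on its own input pixel, so the work splits across any number of processors with no coordination between them.

import numpy as np

img = np.random.rand(1080, 1920, 3)         # a synthetic HD frame
brightened = np.clip(img * 1.2, 0.0, 1.0)   # the same independent operation on every pixel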


Thanks, Steve, but my question (I'm the OP) is about whether the rate of performance increase has been tapering off in a single Intel or AMD core (the fastest available at any given time). Has it been? 82.113.121.113 (talk) 21:53, 25 May 2010 (UTC)[reply]
There's basically no performance left to gain in the single-core world (because parallelizing work at the instruction level is yielding greatly diminishing returns, as I described above). This is why it's barely possible to get single-core machines, despite the fact that, for most applications, the extra cores are useless. They're hoping that the software people start writing parallelizable code. Paul (Stansifer) 02:14, 26 May 2010 (UTC)[reply]
Addendum about the device-physics side - smaller transistors inherently can switch faster (this is just a fact of device physics that governs their behavior). So the goal of miniaturization was not only density, but also improving the clock speed of the digital circuitry. The big problem is that we are reaching two analog-circuit/device-physics roadblocks that make it hard to keep decreasing the size any further. First, we have devices so small (we're talking tens of silicon atoms across) that they barely function as transistors. Manufacturing these devices is an immense challenge - photolithography can't work for features smaller than the wavelength of light - and we're pushing way into the ultraviolet spectrum and beyond, with 32 nanometer VLSI nowadays. So, making devices any smaller will be a technology challenge. A secondary problem is power density. If we make devices smaller, and do the same amount of work, the concentration of heat per unit volume increases dramatically - you can find charts that predict energy densities inside transistor junctions that would approach nuclear-fusion energy densities if Moore's Law continued indefinitely. So, the devices cannot be packed in any more densely. Compounding this problem is that smaller devices are worse devices - they have poorer characteristics, worse thermal losses, and so on. Thus the problems with cooling the CPU interior emerge even sooner than we might predict by straightforward extrapolation of heat/thermal trends. See, for example, End of Moore's law: thermal (noise) death of integration in micro and nano electronics from Physics Letters; there are dozens of similar high-profile "End of Moore's Law" analyses published in major journals over the last decade or so. Nimur (talk) 15:35, 26 May 2010 (UTC)[reply]
Slight correction to a generally interesting contribution. The minimum size of features created by photolith is related to the wavelength of the light used in the exposure, but can be considerably smaller. The article you linked to says "Current state-of-the-art photolithography tools use deep ultraviolet (DUV) light with wavelengths of 248 and 193 nm, which allow minimum feature sizes down to 50 nm." --Phil Holmes (talk) 08:45, 27 May 2010 (UTC)[reply]
First off, the limit of how small we can make transistors before they stop being transistors is spot on. We're getting close to that limit, so unless someone figures out how to make smaller transistors, we'll have to stop looking to that area for improvements as well. One thing I don't think anyone has mentioned is that miniaturizing the transistors has a side effect that can increase performance as well: at gigahertz speeds, signal propagation time becomes significant, and it is reduced by the shorter paths that come with smaller transistors. Alone, it's not enough to keep up with Moore's Law by any means, but every percent helps nowadays, given all the technological diminishing returns in this area. --Wirbelwindヴィルヴェルヴィント (talk) 18:13, 28 May 2010 (UTC)[reply]

MonoDevelop vs. SharpDevelop

Hello. I want to start programming in C# and want an open-source IDE; my choice is either MonoDevelop or SharpDevelop. I looked at both of them and I can't tell the difference. Some say SharpDevelop is better, others say MonoDevelop.

Which one should I choose and what IDE would you recommend and why?

Thanks. —Preceding unsigned comment added by 92.243.235.218 (talk) 15:07, 25 May 2010 (UTC)[reply]

Android and Java

The Android (operating system) article says it "allows developers to write managed code in the Java language". The managed code article says that managed code "will only execute under the 'management' of a Common Language Runtime". Does Android therefore compile from Java into Common Intermediate Language and then into native code, or did the second article mislead me? Also: does the Android OS require (or virtually require, by reason of available libraries) that any program it runs be written in Java, or can you write in C? And what's the point of its ties to Java - why have a platform-independent language integrated with the OS? It seems like a contradiction. 213.122.2.195 (talk) 16:43, 25 May 2010 (UTC)[reply]

The managed code article has been significantly changed since that sentence was written. Here is an earlier version which does not specifically tie it to Microsoft .NET. --Spoon! (talk) 17:57, 25 May 2010 (UTC)[reply]
In this case a link to Sandbox (computer security) would have been more appropriate (although in practice apps have a more sophisticated security model, something akin to signed Java applets). Android's security model (described here) prevents defective or malicious apps from messing with each other or the system. Desktop/server OSes do this with a full memory-protected, user/supervisor-mode architecture, but that requires CPU features that the limited processors in handhelds frequently lack. In addition, writing in a platform-neutral language (and delivering bytecode to the platform) gives independence from the architecture - initially Android platforms are ARM, but there's no reason (from an app perspective) that it couldn't run on a PPC, MIPS, or even Intel Atom core. It also used to be the case that bytecode was a preferred distribution format for mobile apps, as bytecode tends to be more compact (an issue for very memory-limited mobile platforms) - but these days mobiles have enough RAM that this isn't a big deal any more. Android itself, and the platform's drivers and other ancillary code, is still written in C and/or C++. -- Finlay McWalterTalk 18:16, 25 May 2010 (UTC)[reply]
So... is the bytecode CIL? 213.122.2.195 (talk) 18:35, 25 May 2010 (UTC)[reply]
No. Android machines run the register based Dalvik VM, not the stack based Java Virtual Machine, nor the CLR CIL bytecode. In this post Mono Project leader Miguel de Icaza speculates about the possibility of someone implementing a CIL to Dalvik translator, but I'm not aware that anyone has actually done so. -- Finlay McWalterTalk 18:44, 25 May 2010 (UTC)[reply]
Incidentally, Dalvik's .dex file format is discussed here, and the bytecodes discussed here. -- Finlay McWalterTalk 18:49, 25 May 2010 (UTC)[reply]
Cool. Register based sounds good. I like the idea of Android a little bit more now. 213.122.2.195 (talk) 19:22, 25 May 2010 (UTC)[reply]
It hardly matters whether an abstract machine is register-based or stack-based. Bytecode stack operations don't compile to hardware stack operations. The bytecode stack is just a (potentially) more compact way of referring to recently written registers. -- BenRG (talk) 22:08, 25 May 2010 (UTC)[reply]
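For a flavor of the difference: CPython's own VM happens to be stack-based, and its dis module shows the pushes and pops for a simple addition, whereas a register machine like Dalvik would use a single instruction naming three registers (add-int v0, v1, v2). A quick look:

import dis

# stack machine: push b, push c, the add pops both and pushes the sum
dis.dis(lambda b, c: b + c)
# typical output: LOAD_FAST b, LOAD_FAST c, BINARY_OP (+), RETURN_VALUE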
Ah I might as well post a few thoughts in this thread. I just upgraded to an Xperia X10, foolishly not knowing that it wouldn't support .jar files. Anyone have any idea if it ever will, or will there be (better) support in Android 2.1 or 2.2 or later or never or we will have to rely on Dalvik ports? Anyone with battery issues with the X10? Bah sorry for so many questions but I'm a little peeved at 'spending' so much and having so many doubts about a mostly great phone. Why can't someone just create a perfect phone? I miss my P1i already! Sandman30s (talk) 23:07, 25 May 2010 (UTC)[reply]
When you say ".jar files", I'm guessing you mean J2ME MIDlets (which is the "normal" mobile java environment). It would seem not. Android (operating system) discusses some conversion tool called J2Android; I've no idea how, or whether, it works. -- Finlay McWalterTalk 23:22, 25 May 2010 (UTC)[reply]
Yes that link mentions 'It allows developers to write managed code in the Java language, controlling the device via Google-developed Java libraries' which was what I kind of researched before buying the phone. Knowing that, who would have guessed it wouldn't support J2ME (some silly copyright struggle between Google and Sun/Oracle)? I have a library that I've built up over the years and I know it's going to be a struggle to either port some of that or replace that with Android equivalents. The variety of phones and phone OSes on the market nowadays reminds me of the boom in home computers back in the early 80's... it will take a while but there will be a few winners and many losers. Sandman30s (talk) 09:17, 26 May 2010 (UTC)[reply]
Sadly, from mobiputing, 'Myriad isn’t targeting end users. Instead, J2Android is described as a utility that will let phone makers, wireless carriers, and mobile app stores expand the number of apps available for Android'. At least there's hope that someone else can do it. Sandman30s (talk) 09:37, 26 May 2010 (UTC)[reply]

Improved Formula Editor in Word 2010

The new formula editor in Word 2007 is great, although very buggy:

  • Often, Word crashes when you edit large formulae.
  • If you insert a formula in a numbered or unnumbered list containing soft line breaks (Shift+Return), you will no longer be able to save the document ("file format error").
  • In large formulae, on-screen rendering bugs are common.
  • Editing large formulae is very slow.

In addition, it is not possible to number equations. Prior to Microsoft Office Word 2007, one used the Microsoft Equation Editor 3.0 OLE object, and then it was easy to number equations: simply insert a centered tab stop at the center of the page, and a right-aligned tab stop at the right margin of the page. Then you insert the formula at the first tab stop, and the number at the second tab stop. However, this is not possible in Word 2007 (using the built-in formula editor), for unless the formula is entirely alone on its line, it will shrink, as if the formula were "inlined" in a line of text (if you know what I mean). Furthermore, using formulae in headings does not work very well.

Are these problems solved in Word 2010? --Andreas Rejbrand (talk) 17:08, 25 May 2010 (UTC)[reply]

Word 2010 is still in Beta mode at the moment, and only the 2007 version is currently available as a demo/trial version. You can find out more about Office 2010 here. The best way to see if the bugs have been fixed in Office 2010 is to wait until it comes out and try out the demo. Chevymontecarlo 18:48, 25 May 2010 (UTC)[reply]
Very late "beta mode" if so. It will be released in shops within a month. --Andreas Rejbrand (talk) 21:34, 25 May 2010 (UTC)[reply]
It RTMed over a month ago [2]. Those with MSDN access or other Microsoft connections can, I presume, already get it. If you don't have those, the beta version is probably the only legally available option, although if you haven't already got a key I presume you can't get one. And Microsoft is unlikely to be interested in bug reports for things they may have already fixed. In other words, I wouldn't consider it "beta mode". You may be able to find someone using the RTM version of Word 2010 to test this, but it seems so far no one has turned up. Nil Einne (talk) 23:22, 26 May 2010 (UTC)[reply]

Windows Server 2008

  Resolved

Can Windows Server 2008 be used as a normal operating system, for browsing the web, watching video, playing games, etc., or is it for server tasks only? 82.44.55.254 (talk) 18:50, 25 May 2010 (UTC)[reply]

Of course. Perhaps gaming support is limited, but in principle, yes: see here: [3]. --Andreas Rejbrand (talk) 19:55, 25 May 2010 (UTC)[reply]
I used Windows Server 2003 for a while once. Generally it worked fine; however, some poorly designed game installers just refused to install (some did so on XP x64 too, which was based on the 2003 code base). I can't remember, but I think compatibility mode didn't help for some reason. But once I did install (by modifying the installer, IIRC), the game itself worked fine. Some may warn you about an unsupported operating system. Probably due to Microsoft's push for security, stability, etc., and the resulting removal of unneeded components, you're more likely to have problems with 2008 vs Vista, or 2008 R2 vs 7, than with 2003 vs XP. But a quick search shows a bunch of discussions, e.g. [4] [5]; it seems crappy installers are still a problem, but, similar to my experience, many games will work. However, some DLLs may be missing, since they're unlikely to be used by servers, and you'll need to add these manually. Obviously any game which doesn't work on Vista is unlikely to work on Windows Server 2008. Nil Einne (talk) 17:45, 26 May 2010 (UTC)[reply]

Thanks! 82.44.55.254 (talk) 18:22, 26 May 2010 (UTC)[reply]

I have used Windows Server 2008 as a desktop, following tips available at [6]. There are cases where software objects to being installed on Windows Server, but most things go smoothly. The website also has information on enabling sound and disabling the pop-up box requiring you to describe the reasoning behind any shutdown. I don't use Windows anymore, but Server 2008 was a reasonable desktop. Freedomlinux (talk) 21:24, 29 May 2010 (UTC)[reply]

Setting a Variable in a Shell Script to the Contents of a File

Hello,

I'm trying to figure out how to set a variable in a Linux shell script to the contents of a file. Here is my example: I have a file called foo.txt and I want to send an email using the mail command:

# mail -s "$mail_subject" someone@someone.com

How do I set the variable $mail_subject to the contents of the file foo.txt?

Thank you for any help. —Preceding unsigned comment added by CGP (talkcontribs) 21:37, 25 May 2010 (UTC)[reply]

Using backticks: mail -s "`cat foo.txt`" someone@someone.com (or, equivalently, the $( ) form of command substitution: mail -s "$(cat foo.txt)" someone@someone.com) Unilynx (talk) 21:42, 25 May 2010 (UTC)[reply]
Getting a 'permission denied' error. CHMOD says everyone can read the file... --CGP (talk) 22:34, 25 May 2010 (UTC)[reply]
 
Two options that need no cat at all:
read mail_subject < foo.txt     # reads just the first line of the file
mail_subject=$(<foo.txt)        # bash/ksh: the whole file, no cat process
Graeme Bartlett (talk) 08:53, 26 May 2010 (UTC)[reply]