Wikipedia:Reference desk/Archives/Computing/2007 October 9

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


October 9

Nvidia GeForce 8300GS

Hi,

I've just ordered my parents a new PC with a "256MB NVIDIA GeForce 8300GS TurboCache graphics card". Question: does this card have its own memory, or is that 256MB coming from my RAM/PC somewhere else? Also, is it a decent card? Basically I'd be happy if it can play new games like Half-Life 2 Episode 2 at a reasonable level, but I have no idea if it will (though I'll probably just buy it and find out what kind of level I can play it at). Also, do video cards get used by programs like Adobe Lightroom? ny156uk 21:05, 9 October 2007 (UTC)

It has its own memory. And it's a pretty good card. :) --76.213.142.41 21:09, 9 October 2007 (UTC)

Your card is probably the lowest in the GeForce 8 Series. Get a GeForce 8500GT; for me it was around $110 AUD. Overclocked to a 600MHz core, it runs Half-Life 2 with no lag at 1680x1050, all effects high, 16xAF and 4xAA. Of course, Half-Life 2 Episode 2 demands more than Half-Life 2. According to some simple (and possibly incorrect) calculations, your card should be 4 to 8 times slower than mine. I would guess that Episode 2 would play alright at 1024x768, all effects high, 4xAF and possibly 4xAA. That's only my guess though. BTW, TurboCache means the card (mostly) uses system RAM, so I'm guessing your card only has maybe 64MB of onboard RAM. --wj32 talk | contribs 23:26, 9 October 2007 (UTC)
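
(For the curious: one way to sanity-check a guess like that is a quick back-of-envelope throughput ratio. The Python sketch below shows the idea; the shader counts and clocks are assumed figures for illustration only, not confirmed specs for either card.)

    # Back-of-envelope shading-throughput comparison between two cards.
    # All figures below are assumptions for illustration; check real specs.
    def shader_throughput(shader_units, core_mhz):
        # Crude proxy for shading power: shader units x core clock.
        return shader_units * core_mhz

    gs_8300 = shader_throughput(shader_units=8, core_mhz=459)   # assumed 8300GS specs
    gt_8500 = shader_throughput(shader_units=16, core_mhz=600)  # assumed 8500GT, overclocked

    print("8500GT / 8300GS ratio: %.1fx" % (gt_8500 / gs_8300))  # prints ~2.6x

    # A TurboCache card also borrows system RAM for its framebuffer, so the
    # real-world gap is wider than this raw shader ratio suggests.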

I would imagine this card is equivalent to a 128MB 6600GT or so. Don't expect to play on anything other than low. I am amazed at how many people buy budget graphics cards around $70 when they could pay $20-$30 more for a vastly superior one like an 8600 or even a 7-series card. -Wooty [Woot?] [Spam! Spam! Wonderful spam!] 02:58, 10 October 2007 (UTC)
Hey, don't insult the 6600GT; my 3-year-old system (Sempron 3100+, Nforce3 MB, 1.5GiB RAM, 6600GT AGP 128MiB) can play Team Fortress with everything on high. Can't try Episode 2 right now though since it's still decrypting... --antilivedT | C | G 07:35, 10 October 2007 (UTC)
I'm not insulting the 6600GT; it was one of the most underrated cards for quite a few years, easily the best buy for a while. However, I had a 6600GT 256MB with a P4 @ 2.4GHz and 2GB of the best damn DDR RAM you could buy and barely squeaked by (20-30fps) on "Medium" in Battlefield 2. I also dislike when people automatically assume the 8xxx series is going to be faster than previous generations, as the OP assumed, because often a 7xxx card at a similar or lower price, while lacking DX10 functionality, is vastly superior in benchmarks and everyday usage to anything the budget 8xxx cards throw down. Since the G92 is going to feature in a new 8xxx series card at or slightly below the 8800GTS 320MB's performance (which I own, and am very happy with!) somewhere in the middle of November, prices may fall drastically in a few months on 8800GTS and 8600GTS/GT cards as well. -Wooty [Woot?] [Spam! Spam! Wonderful spam!] 07:46, 10 October 2007 (UTC)
On this topic, how long do you think a 7900 GS (AGP) will last in terms of playing games at around 1024 or 1280 resolution? I've currently got it overclocked to a 600MHz GPU core and 700MHz memory, and it works fine (most settings turned on) at 1280 resolution for almost all games (currently playing Test Drive Unlimited and BioShock). Sandman30s 14:06, 10 October 2007 (UTC)
The 7900's a good card. I'd say you'll have DX10 games becoming the norm (and therefore the card just won't work period) before the 7900 has trouble playing most games. -Wooty [Woot?] [Spam! Spam! Wonderful spam!] 01:25, 11 October 2007 (UTC)
Might as well ask another question: are these 8-series cards being touted for DX10 compatibility? Even though they are not superior in benchmarking, they can at least run DX10 games. When will DX9 be dropped entirely by games? Sandman30s 14:06, 10 October 2007 (UTC)
I'd give it about a year. Most of the mid-range 8xxx cards probably won't be able to run DX10 games at anything other than "low", sadly. I didn't buy my 8800GTS for DX10, I bought it for performance. Until the 89xx or 9xxx series, you won't be seeing "run everything at max settings" in a DX10 game. -Wooty [Woot?] [Spam! Spam! Wonderful spam!] 01:25, 11 October 2007 (UTC)
Well, the card above (in my original question) is the 'standard' one with the Dell they're getting. Unfortunately they're not into games, but obviously I would be pleased if it is capable of playing new ones, even if just at an average standard. I've got no idea what's good/bad, so thanks for the input everyone. Also, does anybody know whether it helps programs like the aforementioned Lightroom run better? I know RAM/processor come into it, but does a better card boost performance in programs such as this, or is it purely for 3D work? ny156uk 16:22, 10 October 2007 (UTC)
I don't see any reason why you'd buy from an OEM; even if it's for your non-gamer parents, you can get a much better deal just by putting it together yourself. I recently put together a "wish list" on Newegg for a budget computer with an 8600GT, a Pentium D (these are overclockable to around 3.2GHz, allowing it to actually be superior to the Core 2 Duo), and two gigs of RAM for under $600. -Wooty [Woot?] [Spam! Spam! Wonderful spam!] 01:25, 11 October 2007 (UTC)
Actually, my Core 2 Duo (2.00GHz) is estimated to be equivalent to a 4.94GHz single core, so dual is significantly better. And I've read several benchmarks of the latest DX10 games, and the 8800GTX can run every one of them (so far) on maximum graphics at a resolution of 1600x1200 at over 25 FPS. Not bad, IMO (my integrated graphics can run Age of Empires III on minimum at a stunning 10 FPS!). · AndonicO Talk 13:19, 11 October 2007 (UTC)

I've never heard of the 8300, but I'm sure it can play HL2 on at least low-mid settings (HL2's graphics aren't as demanding as other recent games); you may want to upgrade/overclock if you want any higher visuals though. · AndonicO Talk 13:19, 11 October 2007 (UTC)

PS- Flattening a vector

In Adobe Photoshop CS, is there any way I can flatten a series of vector shapes on different layers so that they retain their vector properties, but no one can edit each individual component? For example: if I constructed a vector silhouette of a person, with the head on one layer, the left arm on another, etc., is there any way I can combine all of them into one? At the moment, if I click "Flatten Image", PS flattens all the layers but turns the result into a raster. Is there any way I can flatten but keep the infinitely scalable properties of a vector? Thanks. Acceptable 21:08, 9 October 2007 (UTC)

Unfortunately I can't find a simple way to do it. The easiest way I've found is:
  • Copy and paste each of the individual shapes onto one layer. They should retain their position on the canvas
  • Select the combined shape with the Move Tool
  • Go to Edit > Define Custom Shape... and name the shape
  • Select the Custom Shape Tool and select your shape in the shape picker
  • Draw the shape on the canvas
This was in CS2; apologies if it doesn't work in CS — Matt Eason (Talk • Contribs) 22:36, 9 October 2007 (UTC)
Why not put their layers into a folder and then link all objects in the folder? That way changing one of them changes them all, and you can just select the folder itself if you want to select all of them at once. --24.147.86.187 12:44, 10 October 2007 (UTC)
Well, I plan to send it to someone so that they can resize it to however large they want it for printing. But I don't want them to be able to change any of the parts easily. Acceptable 01:39, 11 October 2007 (UTC)
If you can export the vector to SVG format, you can import it into Inkscape and join all the like elements (those with the same colour and border properties) using the Path > Union command. [To make sure, I just checked the output of taking the union of a set of overlapping rectangles in Inkscape: it outputs a single complex polyline, which would be a royal pain to edit.] -- PrettyDirtyThing 11:57, 11 October 2007 (UTC)
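
(For anyone who'd rather script that union than click through Inkscape, below is a minimal sketch of the same operation in Python using the Shapely library. The two rectangles are hypothetical stand-ins for the silhouette's pieces, and it assumes your shapes can be expressed as polygons.)

    # Sketch of a path union -- the same operation as Inkscape's Path > Union.
    # Requires Shapely (pip install shapely); the coordinates are made up.
    from shapely.geometry import box
    from shapely.ops import unary_union

    torso = box(0, 0, 4, 6)  # hypothetical stand-in for one piece
    arm = box(3, 2, 8, 3)    # another piece, overlapping the first

    merged = unary_union([torso, arm])

    # The result is one polygon with a single outline -- the same kind of
    # complex, hard-to-edit polyline that Inkscape produces.
    print(list(merged.exterior.coords))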