Wikipedia:Reference desk/Archives/Computing/2014 July 6

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


July 6

What is the bit capacity of analog tape?

As I understand it, bits are for digital information. OsmanRF34 (talk) 21:33, 6 July 2014 (UTC)[reply]

Tape stores digital data in an analog format. See Magnetic tape data storage. --  Gadget850 talk 21:51, 6 July 2014 (UTC)[reply]
Not very helpful. OsmanRF34 (talk) 13:00, 7 July 2014 (UTC)[reply]
Magnetic tapes do not have a well-defined bit capacity. The capacity depends on the details of the method used to store data, and the capacity that can be obtained with a particular method depends on the detailed structure of the tape. Looie496 (talk) 03:05, 7 July 2014 (UTC)[reply]
Recent news in the analogue tape world. Vespine (talk)
But they don't have a well-defined bit capacity because they are analog? And there is no such thing as an "analog bit"? I'd check Units of information, but the article has no reference to an analog-bit type of unit. The question is more about measuring the amount of analog information than about tape. OsmanRF34 (talk) 13:00, 7 July 2014 (UTC)[reply]
[Image: the conceptual difference between an analog signal and a discrete signal.]
This question cannot really be answered without further assumptions, which is why you are getting some finicky non-answers. But maybe I can help you out with some ways to compare stored information, and some links. So, maybe you're thinking about audio cassettes, which commonly held 30 minutes of music per side, though some could hold up to 60 minutes per side. That audio is represented as an analog signal, and it is pretty much read directly by the player and sent to the speaker without much processing. In comparison, a compact disc holds about 80 minutes of audio, but since the audio is digitally encoded, how much of it fits into a given number of bits depends on the sample rate. The point is, if we only focus on playing and storing audio, some tapes hold about as much as a CD. Sample rate is a key concept when converting between analog and digital: increasing the sample rate makes the digital signal "closer" to the analog one, but it also takes up more space.
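To make that sample-rate trade-off concrete, here is a rough back-of-the-envelope sketch in Python; the 16-bit stereo parameters are assumptions chosen for illustration, not anything dictated by tape or by the question:

# Rough sizes for uncompressed digital audio at different sample rates.
# Assumed parameters (illustrative only): 16-bit samples, stereo, 60 minutes.
BIT_DEPTH = 16      # bits per sample
CHANNELS = 2        # stereo
MINUTES = 60

for sample_rate in (22_050, 44_100, 96_000):   # samples per second
    bits = sample_rate * BIT_DEPTH * CHANNELS * MINUTES * 60
    print(f"{sample_rate} Hz -> {bits / 8 / 1024**2:.0f} MiB for {MINUTES} minutes")

Doubling the sample rate doubles the storage needed, which is exactly the "closer but bigger" trade-off described above.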
There isn't really anything like an analog bit; "bit" literally means "binary digit". If we want to encode an analog signal in a digital medium, we might end up with something like this image from discrete-time signal. Think of the smooth curve as analogous to what is contained on a tape or in the groove of a vinyl record, and the red "step" line as analogous to what is contained on a CD (NB this is only an analogy). Basically, an analog signal is conceptually continuous, while a digital signal is conceptually discrete. So, if you fix a specific analog-to-digital and digital-to-analog conversion scheme and a tape specification, you can come up with a statement like "this tape can hold roughly X bits of data" -- but unless you specify all of those things, the question is ill-posed.
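Here is a small numpy sketch of that smooth-curve-versus-step-line picture; the 3 Hz sine and the 20 samples-per-second rate are arbitrary choices made only for illustration:

import numpy as np

# A toy "analog" signal (a smooth curve) and a sampled, stepped version of it,
# in the spirit of the discrete-time signal figure.
t = np.linspace(0.0, 1.0, 10_000)           # dense grid standing in for continuous time
analog = np.sin(2 * np.pi * 3 * t)          # smooth 3 Hz curve

sample_rate = 20                            # samples per second (arbitrary)
sample_times = np.arange(0.0, 1.0, 1.0 / sample_rate)
samples = np.sin(2 * np.pi * 3 * sample_times)

# Zero-order hold: keep each sample value until the next one -- the red "step" line.
stepped = samples[np.minimum((t * sample_rate).astype(int), len(samples) - 1)]

print("largest gap between the smooth curve and the stepped version:",
      np.max(np.abs(analog - stepped)))

Raising sample_rate shrinks that gap, at the cost of storing more samples.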
Now, if you're really interested in how information is measured in the analog domain, one way it can be quantified is by Bandwidth_(signal_processing) (note that although the term is commonly used for things like internet speed, it is inherently an analog concept), which measures how "wide" the signal is in the frequency domain. But beware of casually skimming topics in information theory: under Shannon's notion of information, a recorded one-hour sample of white noise contains more "information" than a one-hour recording of Principia Mathematica -- so keep in mind that information has many different definitions, and each is useful for different purposes.
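Shannon's per-symbol measure can be illustrated in a few lines of Python. This is only first-order entropy, and the repeated pangram is a crude stand-in for English text, so treat the exact numbers loosely:

import math
from collections import Counter

def entropy_bits_per_symbol(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# 8-bit white noise: all 256 byte values equally likely -> 8 bits per byte.
print("white noise:", entropy_bits_per_symbol([1 / 256] * 256), "bits/byte")

# A skewed source (a rough stand-in for English text, whose symbol
# frequencies are far from uniform) carries fewer bits per symbol.
text = "the quick brown fox jumps over the lazy dog " * 100
counts = Counter(text)
probs = [c / len(text) for c in counts.values()]
print("text-like source:", round(entropy_bits_per_symbol(probs), 2), "bits/character")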
Finally, if pressed to answer "how many bits are in this analog signal?", "infinite" is a fairly defensible answer, because we would (in general) need the entire infinite Fourier series to represent it faithfully in digital form... Does any of that help? SemanticMantis (talk) 15:50, 7 July 2014 (UTC)[reply]
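A minimal sketch of that Fourier-series point, using a square wave as the test signal (my choice, nothing specific to tape): any finite truncation of its series still misses it, even though the mean-square error keeps shrinking.

import numpy as np

# Truncated Fourier series of a square wave: (4/pi) * sum over odd n of sin(2*pi*n*t)/n.
# With finitely many terms the approximation never becomes exact (Gibbs ringing
# persists near the jumps), which is the sense in which infinitely many
# coefficients are needed for a faithful representation.
t = np.linspace(0, 1, 5000, endpoint=False)
square = np.sign(np.sin(2 * np.pi * t))

for n_terms in (5, 50, 500):
    n = np.arange(1, 2 * n_terms, 2)                    # odd harmonics 1, 3, 5, ...
    approx = (4 / np.pi) * np.sum(np.sin(2 * np.pi * np.outer(n, t)) / n[:, None], axis=0)
    err = square - approx
    print(n_terms, "terms -> RMS error", round(float(np.sqrt(np.mean(err ** 2))), 4),
          "| max error", round(float(np.max(np.abs(err))), 3))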
I disagree with "infinite". First, current technology limits how many bits can be encoded on a given analog tape and retrieved reliably. If you tried to go beyond that number, single-bit errors would start cropping up, and eventually the information would be so corrupt as to be useless. But even if we ignore technology limits, in a theoretical sense there would still be some limit, due to quantum randomness.
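For what it's worth, the textbook way to put a number on that noise limit is the Shannon–Hartley theorem, C = B·log2(1 + S/N) bits per second. A small sketch, with made-up, cassette-ish bandwidth and signal-to-noise figures chosen purely for illustration:

import math

def shannon_hartley_capacity(bandwidth_hz, snr_db):
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed, cassette-ish numbers: ~15 kHz of audio bandwidth, ~55 dB SNR.
c = shannon_hartley_capacity(15_000, 55)
print(f"about {c / 1000:.0f} kbit/s, or roughly {c * 3600 / 8 / 1e6:.0f} MB per hour-long side")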
It might help to use writing on paper as an analogy. You can encode digital info that way by using a pencil to make a dot to represent each bit (a dot would be a 1 and the absence of a dot would be a 0). How small and how close together you can make the dots then becomes the limit on the number of bits that can be stored that way. Theoretically, the limit would be one carbon atom (or the lack thereof) per bit. However, practical limits kick in far sooner, because single atoms of carbon are probably already present on the paper, and those you deposit could evaporate away. And devices which can deposit and detect a single carbon atom, while they may exist, are extremely expensive. So encoding that much digital data just isn't realistic. StuRat (talk) 16:24, 7 July 2014 (UTC)[reply]
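That analogy can be turned into a quick capacity estimate; the dot pitches below are arbitrary round numbers, picked only to show how the count scales with spacing:

# Capacity of a sheet of paper under the "one dot = one bit" scheme,
# for an A4 sheet and a few assumed dot pitches (centre-to-centre spacing).
SHEET_MM2 = 210 * 297                       # A4 area in square millimetres

for pitch_mm in (1.0, 0.1, 0.01):           # pencil dot, fine pen, optimistic
    bits = SHEET_MM2 / (pitch_mm ** 2)
    print(f"{pitch_mm} mm pitch -> about {bits:,.0f} bits ({bits / 8 / 1e6:.2f} MB)")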
I was not talking about encoding bits onto tape; I'm talking about representing an analog signal with bits. And I said "infinite" was a defensible answer, not the only correct answer. I don't care to discuss engineering issues here, but you've prompted me to defend the claim. Conceptually, analog signals of a given length form an infinite-dimensional space. If you want to think of sound waves in the air or grooves in vinyl as discrete signals due to physics at the molecular level, that's your prerogative -- but for most purposes and applications, scientists and engineers consider them to be analog and continuous. You can call it a Mathematical_model if you like, but it's a damn good one. For a specific example of an infinite-dimensional space of analog signals, L2 is a space of fairly "nice" signals that forms a Hilbert space. Even if you restrict to the "nicer" class of infinitely smooth analytic functions, you still have an infinite-dimensional space. This is in some ways similar to the question "How many positions are available on a slide trombone?" The answers "seven", "many", and "infinite" are all defensible. It all depends on who's asking and why. I gave the more formal analytic approach because it seemed closest to the OP's question about how information is quantified in the analog domain. SemanticMantis (talk) 17:02, 7 July 2014 (UTC)[reply]
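For anyone who wants to poke at the infinite-dimensional claim numerically: the functions sin(n*pi*t) are mutually orthogonal in L2 on [0, 1], so each new n contributes an independent direction, and n can be taken as large as you like. A quick numerical check (the choice of basis and of the grid are mine):

import numpy as np

t = np.linspace(0.0, 1.0, 100_001)

def inner(f, g):
    """Approximate the L2 inner product on [0, 1] by averaging over a fine grid."""
    return np.mean(f(t) * g(t))

basis = [lambda t, n=n: np.sin(n * np.pi * t) for n in range(1, 6)]

for i, f in enumerate(basis, start=1):
    row = [f"<sin({i}pi t), sin({j}pi t)> = {inner(f, g):+.3f}"
           for j, g in enumerate(basis, start=1)]
    print("  ".join(row))
# Diagonal entries come out near 0.5, off-diagonal entries near 0: five (or five
# hundred, or any number of) mutually orthogonal, hence linearly independent, signals.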
Gross bit rate describes some of the historic data rates, although not specifically for audio tape. StuRat (talk) 16:38, 7 July 2014 (UTC)[reply]
This may be of relevance: Nyquist rate, Bandlimiting, Nyquist–Shannon sampling theorem, [1], [2]. To be perfectly honest, though, I'm not the best at DSP stuff, so it would probably be better if someone else elaborated here for details. Phoenixia1177 (talk) 03:58, 8 July 2014 (UTC)[reply]
Ah, good! I forgot about the N-S sampling theorem. That is highly relevant here, but the math might be a bit advanced for some. Basically, the theorem says that sampling at a finite rate can give perfect fidelity in an analog-to-digital scheme, but only if the analog signal is band-limited. This fits in with my comment above that "we'd (in general) need the entire infinite Fourier series to faithfully represent [the analog signal]" -- if you know certain things about the signal, then you can sometimes achieve lossless A-to-D conversion. SemanticMantis (talk) 17:04, 8 July 2014 (UTC)[reply]
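A small sketch of that statement using Whittaker–Shannon (sinc) interpolation; the 3 Hz test tone and 10 Hz sample rate are arbitrary, the only requirement being that the sample rate exceeds twice the highest frequency present:

import numpy as np

B = 3.0                                  # highest frequency in the signal (Hz)
fs = 10.0                                # sample rate, comfortably above 2*B
n = np.arange(-200, 201)                 # finite stand-in for the infinite sample train
samples = np.cos(2 * np.pi * B * n / fs)

# Whittaker-Shannon reconstruction: x(t) = sum over n of x[n] * sinc(fs*t - n)
t = np.linspace(-1.0, 1.0, 2001)
reconstruction = samples @ np.sinc(fs * t[None, :] - n[:, None])
truth = np.cos(2 * np.pi * B * t)

print("max reconstruction error near the centre of the sampled span:",
      np.max(np.abs(reconstruction - truth)))

Because the cosine is band-limited to 3 Hz and the samples come faster than 6 per second, the error here is tiny and shrinks further as more samples are included.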