Talk:Wavelength-division multiplexing


Added some references, but I am very new to Wikipedia, and they are probably not in the correct format. If you are more experienced than I am, please correct them. Thanks.

Spelling Mistake

Can someone edit the first picture? It's written as wavelenght.

Infinera

I don't understand why Infinera is redirected to Wavelength-division multiplexing. FYI Infinera is a DWDM equipment manufacturer, growing pretty fast over the last few years. 13:02, 23 April 2009 (UTC) —Preceding unsigned comment added by Michilans (talkcontribs)

I don't understand this either. How can it be fixed? Opticalgirl (talk) 15:50, 4 September 2009 (UTC)
OK, I have worked out how to remove the redirect. Created a basic page, but it needs more info. Opticalgirl (talk) 16:45, 4 September 2009 (UTC)

CWDM - 8 or 16?

Part of the article says that CWDM is 8 wavelengths or less, and another part says it's up to 16.

There are a couple of iterations of the standard. The first set has 8 wavelengths and the second has 16, scattered over the 1310 and 1550 nm bands.
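
For reference, the current ITU-T G.694.2 coarse grid places nominal center wavelengths every 20 nm from 1271 to 1611 nm, i.e. 18 channels in total; I assume the 8- and 16-channel counts discussed above refer to subsets of that grid populated by particular equipment generations. A minimal sketch generating the grid (the split into the two windows is my own rough grouping, not part of the standard):

    # Nominal CWDM center wavelengths per the 18-channel ITU-T G.694.2 grid.
    # The grouping into a "1310 nm window" and a "1550 nm window" below is only
    # illustrative; vendors populate different subsets of the grid.

    def cwdm_grid(first_nm=1271, channels=18, spacing_nm=20):
        """Return the nominal CWDM center wavelengths in nm."""
        return [first_nm + i * spacing_nm for i in range(channels)]

    grid = cwdm_grid()
    near_1310 = [w for w in grid if w < 1400]
    near_1550 = [w for w in grid if w >= 1400]
    print(grid)
    print(len(near_1310), "channels in the 1310 nm region,",
          len(near_1550), "in the 1550 nm region")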

In the description of DWDM there is a small discrepancy in the lower wavelength limit of the C-band - 1525 nm vs 1530 nm. There is a big mismatch in the definition of the L band though - 1570 nm to 1610 nm in the article ... but it links to a page describing this as an atmospheric transparency window at 3.5 microns. Clearly somebody did not read what they linked to. — Preceding unsigned comment added by 194.193.198.123 (talk) 04:02, 21 January 2023 (UTC)


WDM does not allow for bidirectional transfers, light does

"This allows [...] to perform bidirectional communications over one strand of fibre."

Because of the physics of light at large, light does not interfere with itself when waves pass against each other, at any significant angle of separation and irrespective of frequency. It may be the case that current implementations use WDM for bidirectional transfers (I do not know), but it is by no means required. --Osndok

It's more a matter of practical implementation than basic physics. If you want to send light both ways over a fiber you have to be able to direct the incoming light to the detector and not to the transmitter going the other way. You can do that with a splitter which will send half of the light to the detector, and half to the transmitter. That will give you a 3 dB loss at that end. Since you would need to do this at both ends you will wind up with a 6 dB loss to make it bidirectional. Also, since half of your light still goes to the transmitter, and transmitters are very often sensitive to optical feedback, you may wind up with an unstable or noisy transmitter. If you use two wavelengths with a WDM your loss will be well under a dB, and you will pick up anywhere from 16 to >50 dB of isolation to keep interfering light away from your transmitter. (If you really wanted to use the same wavelength in both directions, wanted good isolation, lower insertion loss than a splitter, and didn't mind spending a few more bucks, you could use a circulator.) --216.70.247.242 23:47, 12 February 2007 (UTC) John N.
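
To put rough numbers on the comparison above, here is a minimal link-budget sketch. The per-component figures (3 dB per 50/50 splitter, 16 to >50 dB WDM isolation) are the illustrative values from this thread; the 0.8 dB per WDM filter is my own assumption, not a vendor specification:

    # Rough bidirectional link-budget sketch; illustrative figures only.

    def added_loss_db(per_end_db, ends=2):
        """Extra insertion loss from the directing components at both fiber ends."""
        return per_end_db * ends

    splitter_total = added_loss_db(3.0)  # 50/50 splitter: ~3 dB per end -> ~6 dB
    wdm_total = added_loss_db(0.8)       # assumed ~0.8 dB per WDM filter -> ~1.6 dB

    print(f"Splitter approach: ~{splitter_total:.1f} dB added loss, only ~3 dB isolation")
    print(f"WDM approach:      ~{wdm_total:.1f} dB added loss, 16 to >50 dB isolation")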

Maybe, then, it should read something like: "This allows bidirectional communications to be implemented more cost-effectively" ? --Osndok 22:14, 19 April 2007 (UTC)

It is more than just cost, so I would suggest "... enables practical implementation of bidirectional communication..." or some such. John N. 23:51, 15 June 2007 (UTC)

Since it's possible, and practical, to implement bidirectional transport without WDM (using isolators or a circulator, as mentioned), then it's not accurate to say that WDM 'allows' bidirectional communications. What advantage other than cost does WDM bring to bidirectional communication? Madgenberyl 14:26, 15 October 2007 (UTC)

The big advantages come with the ability to direct multiple wavelengths in your network. Using a circulator or a splitter only works when there is just one wavelength traveling in a particular direction. (The wavelengths traveling in each direction may be different, but there is only one wavelength in each direction.) If you have two or more wavelengths traveling in a given direction the splitter or circulator will route both of them to the same receiver. To separate them onto different receivers--a key idea behind using multiple wavelengths--you would have to insert a WDM component after the circulator, in which case you might as well leave out the splitter or circulator. Circulators generally get used when two networks or locations are being joined and they both use the same wavelength. In that case the circulator is typically used with a WDM to route just the conflicting channel.

A WDM can also allow the system designer to remove, or add, a wavelength at a point and let the other wavelengths pass through. Example: I have three locations A, B, C. They are connected by a single fiber that runs through B. I can transmit from A on CWDM channels 47, 49 (wavelengths 1470, 1490 nm). At B ch 47 is dropped, and ch 49 continues on to C. B never sees the stuff I send to C, and vice versa. I can also assign channels for B to C, B to A, C to A and have them all running back and forth over the same fiber. With circulators the best you could do is a noisy party line. 216.70.247.242 (talk) 00:39, 29 November 2007 (UTC)
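
To make the A-B-C example above concrete, here is a toy sketch of the add/drop logic at B. The node names and channel assignments are the ones used in the example; the code itself is only an illustration (a real OADM is a passive filter, not software):

    # Toy model of the add/drop node at B: drop CWDM ch 47 (1470 nm),
    # pass ch 49 (1490 nm) through towards C.

    def oadm(incoming, drop_channels):
        """Split a {channel: payload} dict into (dropped, passed_through)."""
        dropped = {ch: p for ch, p in incoming.items() if ch in drop_channels}
        passed = {ch: p for ch, p in incoming.items() if ch not in drop_channels}
        return dropped, passed

    from_a = {47: "traffic A->B", 49: "traffic A->C"}  # both on one fiber
    at_b, towards_c = oadm(from_a, drop_channels={47})
    print("Dropped at B:", at_b)         # B only ever sees ch 47
    print("Continues to C:", towards_c)  # ch 49 passes through untouched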

Agreed, but all of that has nothing to do with bidirectionality. I suggest using a different term. Madgenberyl (talk) 14:17, 11 January 2008 (UTC)

British vs. American English -- this article uses British English

See Wikipedia:Manual of Style#National varieties of English. This article was originally written using British English and therefore we should stick to British English for consistency's sake. --A. B. 05:45, 2 November 2006 (UTC)

Wouldn't it make more sense to stick with the spelling of the optical fiber page, the American one? —The preceding unsigned comment was added by 130.126.76.122 (talkcontribs) 15:33, 14 November 2006 (UTC)


There's some logic to that; however, you would be amazed at some of the bizarre, silly fights that have erupted over this topic. See (or skim) Talk:Under the Umbrella Tree#Canadian vs. American spelling and Talk:Under the Umbrella Tree#RfC for an example; as for my own true feelings, you can read my own comments at the end of the RfC section. (You might also enjoy some of the uniquely Canadian/American links I posted).
I made my 2 November comment above after a new, fiber-savvy editor, unfamiliar with the rules, conscientiously went through and fixed all the "misspellings" (which I then reverted). --A. B. 22:11, 14 November 2006 (UTC)

As WDM was invented by Professor Syed H. Murshid (and his team) at Florida Institute of Technology, the first WDM equipment was made in the United States, and the first fiber optic cable was made in the U.S. (at Corning), it seems to me that it should be in American English. Also, the article references "Fiber Optics" (not "Fibre Optics" as it is called in the U.K.), and the rules according to Wiki are that articles that start one way should continue one way; since the article it references is actually the main article (with WDM simply being a sub-reference), it should continue the language variant of that article. Supertheman (talk) 16:13, 10 September 2009 (UTC)

WDM, CWDM and DWDM

The page contains some factual errors. CWDM is the accepted term for applications with spacing > 100 GHz. It isn't clear whether the C refers to coarse or conventional, and the two are used interchangeably. A system with 100 GHz or denser spacing is DWDM and WDM is a blanket term that applies to both CWDM and DWDM. Madgenberyl 17:31, 18 October 2007 (UTC)

I think the article does a good job of noting that historically the term coarse meant no particular format, and that with the release of the 20 nm standard CWDM has come to mean that standard. Also, if DWDM is 100 GHz or denser, what is 200 GHz? If it's not DWDM then a whole lot of companies are going to have to rewrite a whole lot of data sheets. John N. (talk) 01:05, 29 November 2007 (UTC)
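
For what it's worth, the DWDM grid in ITU-T G.694.1 is defined in frequency around an anchor of 193.1 THz, and a 200 GHz plan is simply every other channel of the 100 GHz grid. A quick sketch of a few grid points (nominal values, with wavelength derived as lambda = c/f):

    # A few nominal DWDM grid points per ITU-T G.694.1: f = 193.1 THz + n * spacing.

    C_M_PER_S = 299_792_458.0  # speed of light

    def dwdm_points(spacing_ghz=100, n_values=range(-4, 5)):
        """Return (frequency in THz, wavelength in nm) for the given grid indices."""
        points = []
        for n in n_values:
            f_thz = 193.1 + n * spacing_ghz / 1000.0
            wavelength_nm = C_M_PER_S / (f_thz * 1e12) * 1e9
            points.append((round(f_thz, 3), round(wavelength_nm, 2)))
        return points

    print("100 GHz grid:", dwdm_points(100))
    print("200 GHz grid:", dwdm_points(200, range(-2, 3)))  # every other channel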

Yes, OK, so 200 GHz is also DWDM, but that's hardly the point of the objection. What's more important is that the article implies that only DWDM is within the 1550 nm range, which is not accurate, and it gives a confusing, and false, picture of what CWDM actually is. Historical context should be presented as such, not the state of affairs today. Madgenberyl (talk) 16:54, 11 December 2007 (UTC)

You may have a point about the handling of the different types of xWDM. I actually think the whole article should be scrapped and rewritten. Someone accessing an encyclopedia article about WDM will be better served being given an explanation that it's a mixing of wavelengths, a description of the types of WDM functions (MUX, DMUX, add/drop), an outline of the different categories (1310/1550, CWDM, DWDM), and possibly how the actual devices work (the physics of the devices). There is no need for coverage of transponders, levels of regeneration, ROADMs, SONET or anything at the network level. Graphics should be schematic presentations of wavelengths being combined into a single output, and pictures showing how small the actual components are, not an image of a rackmount chassis with no indication of which, if any, of the cards are actual WDM components. I can't tell if the article was written by someone who wanted to show off their extensive knowledge and therefore included way too much stuff that won't help (and may confuse) the neophytes, or whether the writer is one of those who know so much they have a hard time paring their presentations to be accessible to beginners. John N. (talk) 03:38, 14 December 2007 (UTC)

I don't think I'd go quite so far as to scrap it, but it definitely needs a serious rewrite. I think it was written by a transponder person, hence the somewhat off-topic slant. I agree with your comments about graphics, too. I can provide graphics and talk about the physics; do you want to handle the functions and categories? Madgenberyl (talk) 14:20, 11 January 2008 (UTC)

Costs? Timelines?

When these technologies were introduced, what did they cost? What do they cost now? How big are they? For example, the photo shows a huge, rack-mount device. What year was this from? What is it? By comparison, a recent press release from IBM: http://www.kurzweilai.net/Breakthrough%20Chip%20Technology%20Lights%20the%20Path%20to%20Exascale%20Computing

states that they can put a single transceiver on a 0.5mm^2 bit of silicon, and a WDM array on a 4x4 mm^2 chip can handle a 1 terabit rate. That's a heck of a lot smaller than a 10-inch-high rack-mount device! Dates & times are important ... 99.153.64.179 (talk) 16:45, 30 December 2010 (UTC)


Yes, the manufacturers like to see how much they can pack onto a chip. It gives them bragging rights. But even the tiny components wind up getting packaged in something more easily handled by technicians and assemblers. The actual size of the functional heart of the device will vary by technology used, the manufacturer's standard package, and possibly channel count. An arrayed waveguide or bulk grating would be roughly the same size regardless of channel count (within reason). These technologies are generally only used for high channel count applications. On the other hand if you are using thin film filters (the work horse of the industry) you need to string a bunch of them together, so your size tends to grow with channel count. Individual filters generally come in 60x3 mm tubes with fibers coming out the ends of the tubes. For higher channel counts the tube is tossed and the filters are strung together inside a cassette. Cassette size varies by manufacturer but 100x80x5 mm is typical. The filters themselves are about 1.5 mm square by 1 mm thick. To make them work, small GRIN lenses are bonded to them, with fibers bonded to the lenses. It's been too long since I popped open a cassette and played with the innards so I will say I recall the GRINs to be about 10 mm in length each, but don't hold me to that. When deployed in the field almost everything gets put into modules that plug into a chassis or mount in a rack.
You can pack a whole lot of channels into a small package, but when you want to use the channels in a network you need to have a receiver or transmitter for each channel to convert between optical and electrical. That's what you're seeing in the picture of the huge rack. I would rather the article stayed strictly in the optical realm and left network stuff for a network article. John N. (talk) 06:09, 20 July 2011 (UTC)
Yes, but this still misses the point: what year is this from? How much did it cost? 99.153.64.179 (talk) 19:09, 26 March 2012 (UTC)
It's also missing another point entirely: what you say is all focused on WAN, but IBM is developing for LAN. So, for example, the Blue Waters supercomputer had fiber optics embedded in the system planar, and this was done because the northbridge had optical connections. This was done in order to save the electrical costs of driving signal pins; optical is just lower-power. The fibers were not meant for WAN or campus-wide connections, but only for within the machine itself (i.e. only within the server room itself). It was used to implement an RDMA scheme for rapidly addressable non-local memory. Using some big-honking rackmount thingy would be totally insane for such a deployment; it completely misses the point of miniaturization. The point is shrinking this stuff to the chip and planar level, so that the server room can use less electricity, and thus save on ventilation and air conditioning. 99.153.64.179 (talk) 19:20, 26 March 2012 (UTC)

CWDM vs. GePON

It would be very interesting to know which one is better, CWDM or GEPON technology? Advantages and disadvantages... —Preceding unsigned comment added by 85.206.21.24 (talk) 10:29, 1 April 2011 (UTC)

CWDM is a set of wavelengths and passbands; use them how you wish. It is ultimately flexible and lower cost than DWDM, and it also allows the use of lower-cost sources (lasers). That flexibility can bite you when your network has to interface with another and there is a wavelength overlap or mismatch. That also happens within the same network with unplanned growth.

PONs (GPON, GEPON, RFoG) tend to be architectures that dictate the wavelengths to be used and how to use them. With them you know what the downstream, upstream, and control wavelengths should be. On the technology side, the big thing with PONs is the interest in higher levels of integration, but they still use the same underlying technologies. John N. (talk) 06:41, 20 July 2011 (UTC)
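
As a concrete illustration of the overlap point above: a common GPON wavelength plan puts upstream around 1310 nm, downstream around 1490 nm and an RF video overlay around 1550 nm, which lands inside the passbands of three CWDM channels. A quick check (the 6.5 nm half-width is my assumption of a typical CWDM filter passband, and the GPON values are nominal ones, not taken from this thread):

    # Check a CWDM channel plan against nominal GPON wavelengths for overlap.
    # The 6.5 nm half-width is an assumed typical CWDM filter passband.

    CWDM_CENTERS_NM = [1271 + 20 * i for i in range(18)]  # ITU-T G.694.2 grid
    GPON_PLAN_NM = {"upstream": 1310, "downstream": 1490, "video overlay": 1550}

    def conflicts(cwdm_centers, other_plan, half_width_nm=6.5):
        """List CWDM channels whose passband contains a wavelength used elsewhere."""
        hits = []
        for name, wl in other_plan.items():
            for center in cwdm_centers:
                if abs(center - wl) <= half_width_nm:
                    hits.append((name, wl, center))
        return hits

    for name, wl, center in conflicts(CWDM_CENTERS_NM, GPON_PLAN_NM):
        print(f"GPON {name} at {wl} nm sits inside the CWDM channel centered at {center} nm")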

Update needed

This article refers to etalon technology and fixed WDM only. It needs to be brought up to date and referenced to a ROADM page, if there is such a thing. Madgenberyl (talk) 06:53, 24 April 2011 (UTC)

I would strike the etalon reference. The workhorse of DWDM is the thin film filter. Before that the primary component of WDM (1310/1550) was the fused fiber coupler. The etalon reference only confuses. It sounds like he considers a thin film interference filter as an etalon. It is not. John N. (talk) 06:55, 20 July 2011 (UTC)

Wow, seven years later and the etalon reference is still in there. It did get moved and rephrased. But it is still WRONG. Thin film filters are not etalons. Etalons never played a big role in xWDM. The early 1310/1550 were all fused fiber (FBT). Thin film stacks took over when DWDM started growing and dominate the single and low channel count xWDM world. High channel counts were generally handled by arrayed waveguides and bulk gratings, though they have been supplanted for many applications. John N. (talk) 03:15, 18 May 2018 (UTC)

Cut the technical crap

I just wanna know what colors are used. Green and red would be tacky. — Preceding unsigned comment added by Mercster (talkcontribs) 07:08, 30 November 2019 (UTC)