Wikipedia talk:WikiProject Mathematics/Archive/2014/Jul


Jacobi method

I was wondering whether we have an article on the Jacobi iterative method? I noticed that a new variant of it, the scheduled relaxation Jacobi method, was recently published (doi:10.1016/j.jcp.2014.06.010). -- 65.94.171.126 (talk) 06:01, 2 July 2014 (UTC)

We do have an article on the Jacobi method, and a new section on the recent development has already been created. FireflySixtySeven (talk) 06:44, 2 July 2014 (UTC)
Thanks -- 65.94.171.126 (talk) 07:28, 2 July 2014 (UTC)
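For readers passing through the archive, here is a minimal sketch of the basic Jacobi iteration that the article covers, assuming numpy; as I understand it, the scheduled-relaxation variant in the cited paper accelerates exactly this update with a precomputed schedule of relaxation factors. The function name and test system below are mine, purely for illustration.

    import numpy as np

    def jacobi(A, b, iterations=100):
        """Iterate x <- D^{-1}(b - R x), where D is the diagonal of A and R the rest.
        Converges when A is, e.g., strictly diagonally dominant."""
        x = np.zeros_like(b, dtype=float)
        D = np.diag(A)              # diagonal entries of A, as a vector
        R = A - np.diagflat(D)      # off-diagonal part of A
        for _ in range(iterations):
            x = (b - R @ x) / D     # componentwise division by the diagonal
        return x

    A = np.array([[4.0, 1.0], [2.0, 5.0]])   # strictly diagonally dominant
    b = np.array([1.0, 2.0])
    assert np.allclose(A @ jacobi(A, b), b)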

Geometric Poisson distribution

Someone more knowledgeable about probability theory may want to check out the new Geometric Poisson distribution article. As I explained on the talk page, I suspect the current version of the article covers a non-notable distribution which shares a name with a vastly more notable one, also known as the Pólya–Aeppli distribution. We should rewrite the article so it covers the notable topic. Huon (talk) 17:21, 3 July 2014 (UTC)

AfC submission - 04/07

Draft:Deformation tensor. FoCuSandLeArN (talk) 13:46, 4 July 2014 (UTC)

Quotient space (2)

I have changed Quotient Space (with capital initials) from a redirect to Equivalence class to a redirect to Quotient space (lower-case "s") and changed the latter to a disambiguation page, so far with only two main links and a "see also" link. So:

  • The disambiguation page would probably benefit from more work; and
  • Lots of pages link to the new disambiguation page. Those need to be appropriately redirected.

Michael Hardy (talk) 18:38, 12 June 2014 (UTC)

It seems to me that Quotient space should be an article in its own right, not only a disambiguation page. The concept clearly applies (under the name "quotient") throughout abstract algebra and seems amenable to a general definition. It would naturally link to the articles for the concept as applied in subdisciplines, for example Quotient group, Quotient ring, Quotient vector space, Quotient semigroup, quotient set ... and the like. This is exemplified by the link to Quotient space (disambiguation) in Isomorphism, which refers to the more inclusive concept (but not to all the links on the current disambiguation page), where none of the individual articles will do as a target. Quotient space (disambiguation) would then become a disambiguation page rather than a redirect. —Quondum 19:24, 22 June 2014 (UTC)
On review, perhaps that article is Equivalence class, but with some reference to the structure-preserving concept; is that the right name? —Quondum 19:42, 22 June 2014 (UTC)
I added a section to equivalence class, trying to cover enough for a reader to understand both the generalized and the particular meanings and uses of the term. ᛭ LokiClock (talk) 05:56, 7 July 2014 (UTC)

CS crank

Just wanted to draw attention to this user, who has recently created two very cranky-looking articles (with main citations in MDPI, a predatory low-value publisher) and started adding links to them from articles on CS and fractals. --JBL (talk) 13:19, 7 July 2014 (UTC)

Now at Wikipedia:Articles for deletion/AjoChhand Machine. —David Eppstein (talk) 21:20, 7 July 2014 (UTC)
...and Wikipedia:Articles for deletion/Frequency fractal. —David Eppstein (talk) 02:21, 8 July 2014 (UTC)

Jacob Barnett is up for deletion

Please comment at Wikipedia:Articles for deletion/Jacob Barnett (2nd nomination). Sławomir Biały (talk) 12:15, 25 June 2014 (UTC)

For those following the story, an admin decided to keep. The decision has since been appealed. I will never be able to figure out why people think that content like this is worth having in WP. Rschwieb (talk) 13:28, 9 July 2014 (UTC)

AfC submission - 11/07

Draft:1/ ∞. FoCuSandLeArN (talk) 14:55, 11 July 2014 (UTC)

Classical group

I have rewritten the article completely. As it was before, it managed to miss the majority of the classical groups, and instead had bits and pieces on groups of Lie type that the authors apparently fancied more than the classical Lie groups. I have retained most of that material under Classical groups over general fields or algebras, but I think it should go somewhere else. Suggestions? I opened a thread at the article talk page. YohanN7 (talk) 18:30, 11 July 2014 (UTC)

Talk:List of mathematical symbols reorg

I have proposed a reorganization of List of mathematical symbols at Talk:List of mathematical symbols#Reorganize. As it is a major change, I'd like some consensus and, if possible, some help before proceeding.--agr (talk) 15:26, 14 July 2014 (UTC)

Do we have these articles?

Do we have Lie correspondence and Closed subgroup theorem under different names? If not, I might write stubs for them. YohanN7 (talk) 20:20, 14 July 2014 (UTC)

There are Representation_theory_of_the_Lorentz_group#The_Lie_correspondence and Cartan's theorem. I found both of those articles using Wikipedia's search box. RockMagnetist (talk) 20:56, 14 July 2014 (UTC)
Lie algebra#Abelian, nilpotent, and solvable mentions the Lie correspondence for the nilpotent and solvable cases but doesn't expound upon it. There are existing red links in various articles for Lie correspondence; I say go for the stub. "Closed subgroup theorem" is in a list of todos at Talk:List of theorems#TODO, so it seems like a reasonable candidate for an article as well. --Mark viking (talk) 21:11, 14 July 2014 (UTC)
Thanks for the info. I'll definitely do the closed subgroup theorem because it seems manageable enough. The Lie correspondence is quite an undertaking, and I'll let that be for a little while to come. It really surprises me that there is no such article, because the Lie correspondence is truly the fundamental result (or rather class of results) of Lie group theory. YohanN7 (talk) 22:46, 14 July 2014 (UTC)
As a placeholder, I have created Lie correspondence as a redirect. RockMagnetist (talk) 23:28, 14 July 2014 (UTC)
I don't think it's a good redirect. I simply started Lie group–Lie algebra correspondence, though it's far from complete. -- Taku (talk) 01:00, 15 July 2014 (UTC)
While writing the draft for Closed subgroup theorem (not submitted yet; you are welcome to comment on its talk page), I bumped into the following missing-article candidates: Left invariant (often mentioned in Wikipedia), Group topology (strangely enough not mentioned anywhere; is it unusual terminology?) and Exponential coordinates (coordinates associated with the group topology, rarely mentioned in Wikipedia). YohanN7 (talk) 21:11, 15 July 2014 (UTC)
We have Exponential map and Topological group. We have Invariant; perhaps a definition of left and right invariants could go there? Exponential map mentions, but does not define, left and right invariants of various objects. --Mark viking (talk) 23:34, 15 July 2014 (UTC)
It requires some thought. Left-invariant vector fields constitute the Lie algebra of a Lie group, so that should probably go into one of the major articles (Lie group?). I think exponential coordinates and the group topology could perhaps go in there as well, since they are natural constructs for a Lie group. YohanN7 (talk) 23:50, 15 July 2014 (UTC)
Draft now submitted: Closed subgroup theorem. I want it to go the official way to get some sort of sanity check. YohanN7 (talk) 23:50, 15 July 2014 (UTC)
Wow, nice work on short notice. It looks ready for mainspace to me. --Mark viking (talk) 00:16, 16 July 2014 (UTC)
Ty, blushing. You have probably not yet spotted all the errors bound to be there. YohanN7 (talk) 00:22, 16 July 2014 (UTC)
Shouldn't it be called Cartan's theorem? (There is no question that the theorem deserves an article.) -- Taku (talk) 01:41, 16 July 2014 (UTC)
How funny. It looks from Cartan's theorem as if the Lie correspondence is Cartan's theorem too, at least in its global form.
I can live with naming the article I wrote either "Closed subgroup theorem" or "Cartan's theorem"; either way is fine. But I really think Lie group–Lie algebra correspondence should be renamed to "Lie correspondence". It sounds better and seems to be a little more common, including in Wikipedia. YohanN7 (talk) 02:14, 16 July 2014 (UTC)
About that. "Lie correspondence" didn't seem descriptive enough to me. Considering that Wikipedia is a general encyclopedia, when there is no official or standard term, it's better to use a descriptive name. This is, for example, why Wikipedia includes the year in the name of an event article, e.g., 2014 Northern Iraq offensive, even though the term is not commonly used in the media. Also, "Lie correspondence" is a little ambiguous; there is Algebraic group–Lie algebra correspondence too. -- Taku (talk) 11:31, 16 July 2014 (UTC)
In that case, "Cartan's theorem" is decidedly inappropriate for the "Closed subgroup theorem", much more so than "Lie correspondence" for "Lie group–Lie algebra correspondence", which is actually used in the literature. "Lie correspondence" can hardly be confused with "Algebraic group–Lie algebra correspondence". YohanN7 (talk) 14:42, 16 July 2014 (UTC)
I don't think so, since Cartan's theorem is, in my impression, fairly standard usage, while the term "Lie correspondence" doesn't appear in any of the sources that are used in Lie group–Lie algebra correspondence. Of course, the perception depends on the references one uses. But let me try it this way too: "Lie correspondence" isn't precise enough; why does "Lie" here have to refer to Lie groups? Why not Lie algebras? Why not Lie groupoids? The term would work perfectly in books whose titles contain "Lie groups", but not quite in places like here. The same problem arises with "closed subgroup theorem": without context, it's not clear that it's a result in Lie theory, while "Cartan" conjures the feel of Lie :) -- Taku (talk) 16:02, 16 July 2014 (UTC)
I have exactly the opposite impression from the literature. I have never seen "Cartan's theorem" in the literature (and it is much more ambiguous than "Lie correspondence", according to Cartan's theorem, one form of which actually refers to a form of the Lie correspondence), but I have always seen "Lie correspondence". But let us not argue about this. The descriptive names (as they are now) are, in view of this discussion, clearly better in both cases. YohanN7 (talk) 16:46, 16 July 2014 (UTC)

Affine space, forgot the origin: gibberish or functor?

Talk:Affine space#"Forgotten which point is the origin": gibberish or functor? Please look. Recent edits are generally constructive, but made by a person closer to physics than math (or so I feel), with a somewhat different philosophy. As for me, the views of physicists are welcome, but our views should not be exterminated. Boris Tsirelson (talk) 10:46, 8 July 2014 (UTC)

Just a small comment: it's not good practice to confuse the zero vector with the origin, or is it? A polynomial ring (over a field) is a vector space, but you don't really call the zero polynomial the origin. The term "origin" suggests positional concepts, i.e., a choice of basis/coordinates. The definition of "affine space" that I like is that it is an element of a quotient space V/W. The problem with this definition is that it is not intrinsic, but it works perfectly otherwise. -- Taku (talk) 12:52, 8 July 2014 (UTC)
Tastes differ. In the theory of Banach spaces it is very common to call 0 the origin. If I think of the vector space of polynomials geometrically, then indeed I think of 0 as the origin, of the constant polynomials as a straight line through the origin, and so on. But maybe this is typical of a relatively small group of mathematicians? I do not know. Boris Tsirelson (talk) 13:42, 8 July 2014 (UTC)
Presumably in an encyclopedic article it would be good to have at least a brief discussion of all these different perspectives. --JBL (talk) 15:03, 8 July 2014 (UTC)
This seems very standard as an intuitive description of an affine space. The zero in a vector space is often called an origin, especially when one is concerned with the geometry of affine spaces. The article includes a quotation from Marcel Berger that uses the term "origin" in this context. It should be allowed to stay as it is. Sławomir Biały (talk) 15:36, 8 July 2014 (UTC)
Gibberish, but good gibberish. Let it in. YohanN7 (talk) 16:23, 8 July 2014 (UTC)
May I say that I have a first in mathematics from Cambridge University and a Ph.D. in mathematics from the University of London, so I think I am by training closer to mathematics than to physics. As has previously been remarked by others, the mathematical definition of a vector space does not mention an origin. It has an identity element, which certainly cannot be forgotten, and which cannot be identified with an origin. It is not appropriate for mathematical articles to be written by people abusing mathematical words. RQG (talk) 17:39, 8 July 2014 (UTC)
"[A vector space] has an identity element ... which cannot be identified with an origin." This statement is just false, both as a statement about formalisms and as a statement about common usage by [many] mathematicians. --JBL (talk) 17:52, 8 July 2014 (UTC)
Abuse of language has a long and distinguished tradition in mathematical discourse, being frequently employed even by such luminaries of the formalist perspective as Nicolas Bourbaki. The section that you continue to try to delete, without consensus, is clearly not meant to be a formal description anyway (even the title of the section is "Informal descriptions"). Regarding the use of the word "forget", the assignment of the affine space underlying a vector space is indeed an example of a forgetful functor. If one regards vector spaces geometrically, rather than algebraically, it is precisely the functor that "forgets" the origin. This perspective is well supported by high-quality references (and standard use in the mathematical community). Sławomir Biały (talk) 18:41, 8 July 2014 (UTC)
RQG has a point, although he presents it in a way which is not entirely helpful (appeals to personal authority tend not to go down well). The question is about language and the abuse thereof, creative and otherwise. One could unpick the various concepts as follows. A vector space over a field is an algebraic object with algebraic axioms. Being a group, it has a zero element. A linear space (a term which is not much in use, and here redirects to vector space) is a geometric object satisfying certain geometric axioms. As a geometric object it has a special point called O or the origin. The axioms of a vector space are sufficiently abstract that the concept is capable of modelling a linear space. An affine space (1) is an algebraic object which is a principal homogeneous space over a vector space, or a commutative heap on which a field acts. An affine space (2) is a geometric object which satisfies axioms (about parallel lines and parallelograms). An affine space (2) can be modelled by an affine space (1); the para-associative law is the existence of parallelograms. A linear space is an affine space (2) with a distinguished point called the origin. A vector space is an affine space (1) with a distinguished element called zero. The questions are how to present this on Wikipedia, and in particular whether the slogan in question is helpful or unhelpful for the reader who wants to understand and remember the material. Obvious but important mistake corrected. Deltahedron (talk) 17:25, 16 July 2014 (UTC)
Is not the idea of a principal homogeneous space as a space with a free and transitive group action already a geometrical one? One often describes a principal homogeneous space as a group without a fixed identity element, that is, a group in which we forget the identity. So I don't think the distinction you're making reflects actual usage. Sławomir Biały (talk) 17:58, 16 July 2014 (UTC)
I'm trying to unpack the various possible meanings using ad hoc terminology to make the discussion easier -- I'm not claiming these are universally accepted terms or that Wikipedia should use them: I'm asking how they should be presented. Deltahedron (talk) 18:44, 16 July 2014 (UTC)
One more source of ideas: nLab:affine space. Boris Tsirelson (talk) 19:12, 16 July 2014 (UTC)
I think this seems like a much more productive direction than continued quibbling over the informal description. There is good stuff in that article. Sławomir Biały (talk) 19:38, 16 July 2014 (UTC)

The nature of the misunderstanding?

Quote from "RQG":

"Take an affine space (A,V). Choose a point a as origin. Then you have a particular representation of A, together with a vector space V. You do not have V=(A,V), which is what you are saying, and which is patent nonsense. To get the vector space from an affine space you have to forget A, not choose an origin. The origin is a point in A. It is not in V."

End of quote from RQG

This says "Take an affine space (A,V)". I take that to mean A is a set and V is a vector space that acts transitively on that set in a way that satisfies certain desiderata. RQG seems to say that if you then delete A from this structure, you're left with V, so that an affine space is something more than a vector space: If you start with an affine space and discard part of the structure, you're left with a vector space. That is consistent with at least this much of the way I originally learned it: An affine space has an underlying set A and some vector space that acts on A in a certain way. But this notion that an affine space is (A,V) where A and V satisfy certain conditions and are related in certain ways is only one way of encoding the concept of affine space. There are others. One of those other goes like this:

  • A vector space involves an underlying set V whose members are called vectors, and a field F whose members are called scalars, and an operation of linear combination by which one takes scalars s1,...,sn and vectors v1,...,vn, one gets a vector s1v1 + ... + snvn, and this operation of linear combination satisfies certain algebraic laws.
  • An affine space involves an underlying set A whose members let us call "points", and and a field F whose members are called scalars, and an operation of affine combination by which one takes scalars s1,...,sn satisfying s1 + ... + sn = 1, and points p1,...,pn, and gets a point s1p1 + ... + snpn, and this operation of affine combination satisfies certain algebraic laws.

This is demonstrably equivalent to the "(A,V)" characterization of affine spaces. I leave the proof of equivalence to RQG as an exercise. And any undergraduate reading this may also find it useful to go through this exercise. By this second characterization of the concept of affine space, a vector space is an affine space with this bit of additional structure: One chooses some point which we will call 0 to serve as the origin or zero or whatever you want to call it, and and then one can define a linear combination s1p1 + ... + snpn in which s1 + ... + sn need not add up to 1 by saying that it is

s1p1 + ... + snpn + (1 − s1 − ... − sn)0.

Viewed in that way, a vector space is an affine space with some additional structure. And this way of viewing it is demonstrably equivalent to the "(A,V)" point of view.
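A minimal numerical sketch of the construction just described, with ℝ² (as plain tuples) standing in for the affine space and the origin fixed at (0, 0); the function names are mine, and the final assert is only a spot check of the equivalence, not a proof.

    def affine_combination(scalars, points):
        """s1*p1 + ... + sn*pn, defined only when the scalars sum to 1."""
        assert abs(sum(scalars) - 1.0) < 1e-9
        return tuple(sum(s * p[k] for s, p in zip(scalars, points))
                     for k in range(len(points[0])))

    def linear_combination(scalars, points, origin):
        """A general linear combination, recovered from affine combinations by
        throwing the 'missing' weight 1 - (s1 + ... + sn) onto the chosen origin."""
        extra = 1.0 - sum(scalars)
        return affine_combination(list(scalars) + [extra], list(points) + [origin])

    origin = (0.0, 0.0)
    p, q = (1.0, 2.0), (3.0, -1.0)
    # With the origin fixed, the recovered operation agrees with the ordinary 2p + 3q:
    assert linear_combination([2.0, 3.0], [p, q], origin) == (11.0, 1.0)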

I am pleased to see that user:John Baez has joined the discussion on the article's talk page. Michael Hardy (talk) 03:54, 17 July 2014 (UTC)

Yes. But, ridiculously, RQG thanks John for clarifying "that an affine space has more structure than a vector space"! I guess the problem is not understanding the difference between "principal base sets" and "auxiliary base sets" (in the Bourbaki terminology). For a vector space over a field, the field is auxiliary, not principal. Accordingly, no one introduces "the forgetful functor from the category of vector spaces to the category of fields". Likewise, for an affine space, its difference space is auxiliary, not principal. Indeed, "a continuously differentiable map from one finite-dimensional affine space to another" means a map from A₁ to A₂ (surely not from V₁ to V₂). Accordingly, I think "the forgetful functor from the category of affine spaces to the category of vector spaces" is not a good idea. Boris Tsirelson (talk) 05:50, 17 July 2014 (UTC)
Another similar case (for example): a topological space is usually defined as a pair (set, topology), the topology being the set of "open" subsets. The topology is in fact a partially ordered set (by inclusion), moreover a lattice. It is legitimate to consider the functor (set, topology) -> topology, from the category of topological spaces to the category of lattices. However, would anyone call this functor "forgetful"? Definitely, I would not.
Such functors may superficially seem to be forgetful in one definition (of the "from" category), but do not survive the transition to an equivalent definition. A topological space may be defined, equivalently, via neighborhoods. An affine space may be defined equivalently (see above) without mentioning a vector space. The functors are still well defined, but do not seem to be "forgetful" anymore, since really they are not.
The true meaning of a mathematical notion, on one hand, and its specific encoding (via sets), on the other hand, should not be confused. An equivalent definition leads to the same notion, but (often) another encoding. Boris Tsirelson (talk) 08:02, 17 July 2014 (UTC)
Isn't "The location of the gap in somebody's understanding?" quite a nasty subject title? There may be holes in your brain too. YohanN7 (talk) 15:50, 17 July 2014 (UTC)
I felt so too, and have retitled. --JBL (talk) 16:28, 17 July 2014 (UTC)
Problems at that article appear to continue. I begin to suspect definite WP:COMPETENCE issues after RQG's insistent misunderstanding of Baez's post on the matter. Sławomir Biały (talk) 18:21, 17 July 2014 (UTC)

Double factorial vs semifactorial

Zaslav (talk · contribs) has moved double factorial to semifactorial, claiming that this name is both more traditional and more correct, and has edited many other articles to implement the same change. Some of us disagree. Please join the discussion at Talk:whichever name it is. —David Eppstein (talk) 04:40, 17 July 2014 (UTC)

Wow. I'm all for WP:BOLD but this seems like it was a bad move. --Kinu t/c 04:50, 17 July 2014 (UTC)
Excellent idea, David Eppstein. I'm afraid I jumped into a confused area of terminology. For instance, the "double factorial" article gave three definitions of an even "double factorial", one of which is "semifactorial", another of which is called "odd factorial", and the third of which, in terms of gamma functions, is different from both. I am now thinking there should be separate articles. For instance, the "semifactorial" n(n-2)... is used in combinatorics, while the gamma-function-defined "double factorial" is not. On the other hand, it's possible the "double factorial" mentioned in some of the analytical articles is not the semifactorial (but it is, in articles on volumes of spheres). I am abandoning the edit of Semifactorial, formerly Double factorial, until this is cleared up. I can't clear it all up myself.
Note 1: I made a bad mistake by assuming this was a simple matter. I acknowledge it was my bad.
Note 2: Possibly Meserve was the first to use the !! notation but the semifactorial was certainly known long before that. What it was called, if anything, is not known to me. My teachers in the 1960s said the name was "semifactorial" (and I believed them implicitly), so I'm sure there is something more to the history than the article says.
Note 3: There should be a distinction between notation and concept. The function "semifactorial" ("double factorial" in the parity-product meaning) is not the same as the "double factorial" notation. Is the article to be about the function(s) or the notation? (Is the answer obvious?!) Zaslav (talk) 04:57, 17 July 2014 (UTC)
I know we cannot depend on Google searches, but Google finds no references to "semifactorial" outside of Wikipedia. "Semi-factorial" and "Semi factorial" are sometimes used. I think this needs to be reverted. Now. If Zaslav isn't willing to do it, then he needs to stop editing until this can be resolved. "Google scholar" searches are even more impressive (238 vs. 1880), but few of either are about this topic. — Arthur Rubin (talk) 06:11, 17 July 2014 (UTC)
The Estonian and Farsi Wikipedias use "double", while Swedish uses "semi". The more I look, the more confused I get. — Arthur Rubin (talk) 07:49, 17 July 2014 (UTC)
Thank you, Arthur Rubin, for your help. All in all, I made a stupid mistake. I will check to see if there are any further reverts needed.
There is still the issue of the three contradictory definitions of "double factorial" as well as the ambiguously related term "odd factorial". According to the WP article, the use of n!! for even natural numbers n is not well defined. There needs to be some work on that. Some of my edits were to put the "correct" definition in some articles. That still should be done. Zaslav (talk) 15:14, 17 July 2014 (UTC)
I see there are (possibly more than 2) definitions of double factorial/semifactorial, all of which coincide if the argument is an odd integer (positive or negative), and differ if the argument is a nonnegative even integer. As we (Wikipedia) seem to use it only for integer arguments, it may make a difference. — Arthur Rubin (talk) 18:27, 17 July 2014 (UTC)
"Semifactorial" would be unambiguous. It "clearly" means alternating factors and non-factors down to 1. (Cf. Michael Hardy's following comment.) "Odd factorial" would also be unambiguous; it "clearly" means only odd factors down to 1. (I personally never heard of it and have never seen any use of it.) "Double factorial" is the problem. This should be continued on the Talk:Double factorial page. Zaslav (talk) 16:40, 18 July 2014 (UTC)

"Semifactorial" seems like a good name for the concept, because you're only multiplying half the integers. I don't recall having heard it before. I've always thought the notation n!! is obnoxious because it looks like the factorial of the factorial, and that is not at all what is meant. Michael Hardy (talk) 23:54, 17 July 2014 (UTC)Reply

I agree it seems a good name. I've just never seen it used, and we need a related name for multifactorial; "semifactorial" as an instance of "multifactorial" sounds particularly weird. — Arthur Rubin (talk) 15:38, 18 July 2014 (UTC)
I agree with both of you (thanks, Michael Hardy). I will still favor "semifactorial" in my own writing. However, as an encyclopedia, WP ought to follow convention. I'm now convinced that "double factorial" is overwhelmingly the most used. And thanks again to Arthur Rubin for reverting for me. Zaslav (talk) 16:36, 18 July 2014 (UTC)

signal-flow graph could use a bit more math

It's actually written in a rather impenetrable engineering jargon/style as far as I'm concerned. It's basically just a bunch of examples, and it's more or less missing all its math content/background, which is a bit non-trivial. A search found that the recommended text (up to the 1980s or so) for the mathy part is Wai-Kai Chen (1971), Applied Graph Theory, Elsevier, ISBN 978-0-444-60193-3, chapters 3-4 in particular, but most of the book is basically about this topic. There are actually two kinds of signal-flow graphs, the Mason graph and the Coates graph, and they can be converted to each other easily, but no such info can be gleaned from Wikipedia, etc. 188.27.81.64 (talk) 04:45, 21 July 2014 (UTC)

Graph categories: some clean-up might be needed

There seem to be 3 categories that more or less overlap in their actual contents:

  1. Category:Graph theory objects
  2. Category:Application-specific graphs
  3. Category:Graph data structures

I'm guessing the first one is intended for "core" graph theory concepts. But some concepts, like interval (graph theory), are application-specific yet aren't exactly graphs themselves, so they don't neatly fit in the 2nd category. The third category seems to be the most problematic, as it seems to contain mostly items that should be in the 2nd one (app-specific graphs) or some variation thereof, i.e. graphs augmented with various other info. Most of the items in the 3rd category aren't actually ways to implement graphs as data structures; e.g. adjacency list seems okay in that category, but and-inverter graph seems to belong in the 2nd instead (the article doesn't even say how these might be implemented). 188.27.81.64 (talk) 13:14, 21 July 2014 (UTC)

Many of the things in the 3rd list are data structures that take the form of graphs. I agree that they should be separate from data structures for representing graphs. The category for data structures in the form of graphs should probably be a subcategory of application-specific graphs. —David Eppstein (talk) 16:12, 21 July 2014 (UTC)
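To illustrate the distinction being drawn here, a minimal sketch (names mine) of an adjacency list, i.e. a data structure for representing an arbitrary graph, as opposed to a domain-specific structure such as an and-inverter graph that merely happens to be graph-shaped.

    from collections import defaultdict

    def adjacency_list(edges):
        """Store an undirected graph, given as an edge list, as a map vertex -> neighbours."""
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        return adj

    adj = adjacency_list([("a", "b"), ("b", "c")])
    assert adj["b"] == {"a", "c"}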
That sounds good to me. Still, the smaller problem of categorizing app/domain-specific concepts (rather than whole graphs) remains. They seem to be all over the place. E.g. dominator (graph theory) is in the main category (Category:Graph theory) whereas interval (graph theory) is in category #1 ("objects") from the above list. 188.27.81.64 (talk) 19:09, 22 July 2014 (UTC)
Some of that is just articles not yet having been sorted into appropriate subcategories of the main graph theory category. If you think they should be in a specific subcategory (as seems likely in this case), please go ahead and make that change. —David Eppstein (talk) 21:03, 22 July 2014 (UTC)

Equivalent definitions of mathematical structures

Bothered by the recent trouble with affine spaces, I'd like to have an article about "Equivalent definitions of mathematical structures". To this end I ask myself two questions:

(a) What can we reasonably say in this direction?
(b) Which part of the said can we find in reliable sources?

Item (a).

First, some case study.

Topological space has at least 7 definitions. "The utility of the notion of a topology is shown by the fact that there are several equivalent definitions of this structure. Thus one chooses the axiomatisation suited for the application." (From "Topological space".)

Uniform space has at least 3 definitions.

Differentiable manifold has at least 4 definitions.

Algebraic space has at least 2 definitions.

Ordered field has at least 2 definitions.

Surely, in each case these definitions are equivalent. But what exactly does that mean? From the article "Topological space": "there are many other equivalent ways to define a topological space: in other words, the concepts of neighbourhood or of open respectively closed set can be reconstructed from other starting points and satisfy the correct axioms". But it refers to the main article "Characterizations of the category of topological spaces"; there the equivalence means isomorphism of categories. This does not make me happy. First, having the category of topological spaces up to isomorphism, I still do not know what a topology on a given set is. Second, yes, continuous maps are most natural as morphisms, but other possibilities exist (and are sometimes used), such as open maps or even Borel measurable maps.

According to the article Ordered field, the equivalence means that "there is a bijection between the field orderings of F and the positive cones of F". I dislike this formulation. As for me, "there is" means "exists", and "there exists a bijection" means only equal cardinalities. No, surely this is not what is meant! Rather, it is meant that the specific correspondence described there in the next lines is a bijection.

Now, some thoughts.

If I ask you "give me an example of a topology on the 2-element set {a,b}", you may give me the set "{{},{a},{a,b}}" of all open sets, or the function "a->{{a},{a,b}},b->{{a,b}}" that maps each point to the set of all its open neighborhoods, and so on. I'd say, this is similar to describing a vector in this or that coordinate system. But for a vector, our level is much higher! We have the general notion of a coordinate system, a general transformation formula for vector coordinates, and (if we are physicists) we can define a vector as something that transforms this way. For topologies, even this "physical" level is still in the sky! Do we imagine the class of all (rather than these 7) equivalent definitions of a topology? Can we define a topology as something that transforms as required from one definition to another? (Yes, we can do so for the 7 definitions, but I really mean all potentially possible definitions.)

Given a cardinality α, introduce the category S(α) of all sets of this cardinality, with bijections (not all maps!) as morphisms. Each definition of topology leads to a functor S(α)->S(β), (a set)->(the set of all topologies on this set). Two equivalent definitions lead to two naturally equivalent (in other words, naturally isomorphic) functors. It is tempting to consider the whole equivalence class of functors (similarly to the class of all coordinate systems). Pretty elegant, and general (applies to all structures, not just topologies). However, there is a problem.

Is it true that for every pair of naturally equivalent functors (of this kind) there exists only one natural equivalence between them?

Even simpler: what if there exists a nontrivial natural equivalence from one such functor to itself?

For topologies in general, I do not know. (Do you?) But for some mathematical structures the answer is discouraging: yes, there exists a nontrivial natural equivalence from one such functor to itself. For groups, it happens because of the opposite group. For topologies on two-element sets, it happens because of the possible swapping of "{{},{a},{a,b}}" and "{{},{b},{a,b}}" (exercise: check that continuous maps are insensitive to this swap).
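The exercise can be checked mechanically. Here is a small brute-force sketch, with the swap realized as passing from open sets to their complements (which exchanges the two Sierpiński topologies and fixes the other two); it verifies that a map is continuous for a pair of topologies exactly when it is continuous for the swapped pair. Again only an illustration, on the two-element set.

    from itertools import combinations, product

    X = frozenset({"a", "b"})
    subsets = [frozenset(c) for r in range(3) for c in combinations(sorted(X), r)]
    topologies = [frozenset(t) for t in (
        {frozenset(), X},                        # indiscrete
        {frozenset(), frozenset({"a"}), X},      # Sierpinski, {a} open
        {frozenset(), frozenset({"b"}), X},      # Sierpinski, {b} open
        set(subsets),                            # discrete
    )]

    def swap(t):
        """Replace each open set by its complement."""
        return frozenset(X - U for U in t)

    def continuous(f, t_dom, t_cod):
        """f is continuous iff the preimage of every open set is open."""
        return all(frozenset(x for x in X if f[x] in V) in t_dom for V in t_cod)

    maps = [dict(zip(sorted(X), values)) for values in product(sorted(X), repeat=2)]
    assert all(continuous(f, s, t) == continuous(f, swap(s), swap(t))
               for f in maps for s in topologies for t in topologies)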

Thus, it seems, we are able to list a finite number of definitions (for a given mathematical structure) and write down a consistent (that is, commutative) system of natural equivalences between them. But we are unable to do more.

Do you agree? Boris Tsirelson (talk) 18:29, 20 July 2014 (UTC)

There is something like this already at cryptomorphism. The article is very specific to matroid theory, though, and I don't know if that specific word has been used in a lot of other contexts. —David Eppstein (talk) 18:51, 20 July 2014 (UTC)
The question about nontrivial natural equivalences reminds me of ∞-categories. I believe that the disciples of homotopy type theory would tell you that if two definitions give you equivalent ∞-categories, then the two definitions are indistinguishable, and therefore you shouldn't worry about the difference. I think, however, that that will not satisfy you. Your question seems to ask for a Bourbaki-style answer in terms of sets with additional structure.
Personally, I have long thought that for formal purposes one should treat different definitions as defining different kinds of structure (whether you're working in terms of structured sets, categories, ∞-categories or whatever) and then prove that those structures are equivalent or isomorphic as appropriate. (For example, the category "topological spaces in terms of open sets" is isomorphic (not merely equivalent) to the category "topological spaces in terms of closed sets".) So long as you are clear on exactly what kinds of properties you are hoping the definitions to express, this has always seemed sufficient to me. Ozob (talk) 21:06, 20 July 2014 (UTC)
I think that seeing the equivalence between defining topological spaces in terms of open sets or in terms of closed sets as categorical is somewhat weaker than what's going on in the mind of the mathematician: they are attempts to define the "same thing". Deltahedron (talk) 21:41, 20 July 2014 (UTC)
But I like categories. Ozob (talk) 03:24, 21 July 2014 (UTC)
Categories are very interesting, and I use this occasion to learn more about them. However, I could not like a definition of topology that cannot answer questions like "is this set open in this topology?" I want to use the Baire category theorem (when applicable). I want to consider probability measures on topological spaces, and see that (under appropriate conditions on the topology) each measure sits on some sigma-compact set. Etc, etc. Can I do it in the category language? Boris Tsirelson (talk) 05:35, 21 July 2014 (UTC)
You might want to follow the example I set in finite set#Necessary and sufficient conditions for finiteness. One definition is taken as the official definition and the equivalent definitions are rolled into a theorem stating that they are all the same thing. JRSpriggs (talk) 07:07, 21 July 2014 (UTC)
This is exactly what I'd like to avoid. In the vector metaphor, it means: "officially" define a finite-dimensional vector space as Rⁿ, and then discuss arbitrary coordinate systems in addition to the "official" one. Boris Tsirelson (talk) 07:31, 21 July 2014 (UTC)
I think it's helpful here to distinguish the defining properties of a mathematical object (e.g. a vector space is a group with a scalar multiplication operation over a field (etc.), its dimension if finite is the maximum number of independent vectors, etc.) from the models of those objects (Rⁿ). (Coming from a CS point of view, I think of these as analogous to abstract data types vs their implementations.) I wouldn't use Rⁿ as the definition of a finite-dimensional vector space; it (together with the right tuple of operations) is an example of a finite-dimensional vector space. (For instance, I think of k-tuples of members of GF(2), with pointwise addition, as having a different data type than subsets of the numbers 0, 1, ..., k − 1, with symmetric difference of sets, and that in turn is different from k-bit binary numbers, with bitwise exclusive or, but all three give naturally isomorphic finite-dimensional vector spaces.) You could define a finite-dimensional real vector space as a vector space isomorphic to Rⁿ, but you'd need to have already defined vector spaces and isomorphisms for that to even make sense, and it would be an ugly definition to choose as the primary one. —David Eppstein (talk) 07:44, 21 July 2014 (UTC)
Sure. But here I do not really discuss definitions of a vector space. Rather, I use this as a metaphor; see my first message, the paragraph "Now, some thoughts". And I would be happy to treat topologies as an abstract data type, and the 7 definitions as 7 implementations (equally "official", or equally "unofficial"). Can I? Boris Tsirelson (talk) 08:28, 21 July 2014 (UTC)

Really, now I feel that "abstract data type" (thanks to David) is the most apt term.

After more thinking I see how naive my original idea was, that functoriality itself can dictate a single bijection between, say, topologies as sets of open sets and topologies as families of neighborhood filters. (Initially I wrote that I did not know whether this fails... now I see it surely fails.)

A notion of a mathematical structure arises from our intuition; and these bijections between different "implementations" are dictated by our intuition (rather than a formal requirement). Therefore the number of "implementations" must be finite (since our intuition cannot do more) (but of course some parameters running over infinite sets could appear).

Now the question is, to what extent is it (not) Original Research? Boris Tsirelson (talk) 08:58, 21 July 2014 (UTC)

Something that would be indirectly relevant: "From Set Theory to Type Theory" by Mike Shulman; "Why do categorical foundationalists want to escape set theory?" (Mathoverflow); Homotopy type theory; Sear (redlink to "SEAR (mathematics)"); Univalence axiom. Boris Tsirelson (talk) 13:22, 24 July 2014 (UTC)

'Mathematicians are of course used to identifying isomorphic structures in practice, but they generally do so by "abuse of notation", or some other informal device, knowing that the objects involved are not "really" identical. But in this new foundational scheme, such structures can be formally identified...' (Subsection "Univalent foundations" of Introduction in book:Homotopy Type Theory: Univalent Foundations of Mathematics. The Univalent Foundations Program. Institute for Advanced Study). Boris Tsirelson (talk) 16:59, 24 July 2014 (UTC)

I agree that type theory presents an intriguing alternative to set theory and category theory as a foundational system. In Shulman's piece there was a good discussion of equality and isomorphism. Getting back to your original proposal, perhaps the isomorphism article would be a good location to expand upon equivalent definitions of mathematical structures. While there are surely foundational and metamathematical sources that discuss structural equivalence in these broad terms, the idea of isomorphism occurs in all three systems of set, category and type theory and could be a good launching point for readers. There is already some material on the notions of isomorphism and equality there. --Mark viking (talk) 18:19, 24 July 2014 (UTC)
Maybe. But, as noted by David (above), it is rather a cryptomorphism. Indeed, it is unusual to consider isomorphisms between different (differently described) structures. Moreover, in order to introduce such a notion, a specific transition between such structures must be chosen. Boris Tsirelson (talk) 19:25, 24 July 2014 (UTC)

Well, I did: Equivalent definitions of mathematical structures; please look. Improvements are welcome, of course. Boris Tsirelson (talk) 19:14, 28 July 2014 (UTC)

Chi-squared divergence

Dear mathematicians: Some time ago I asked about this draft article, but received no reply. I am assuming that it is not a notable topic and should be deleted. —Anne Delong (talk) 19:24, 27 July 2014 (UTC)

Maybe not. Google gives me 5270 results for "Chi-squared divergence" (with the quotation marks). Surely, the author did very little. I am not enthusiastic about collaborating on that. But the topic could have some notability. Boris Tsirelson (talk) 20:14, 27 July 2014 (UTC)
"Chi-squared divergence" gets 58 hits in GScholar, not a lot, and none of the most cited papers are about this divergence in particular. It is a minor variant in a family of distribution metrics and divergences. I think notability is marginal and may not survive an AfD. The divergence is verifiable, however, and a redirect to f-divergence#Instances of f-divergences, where it is mentioned, seems warranted if this is not kept. --Mark viking (talk) 20:49, 27 July 2014 (UTC)Reply
Ah, yes, a good redirect. Boris Tsirelson (talk) 21:10, 27 July 2014 (UTC)
It's a redirect at Chi-squared divergence now, but the text is available in the history if anyone wants to add some of it to the main article. —Anne Delong (talk) 01:55, 30 July 2014 (UTC)

"Constant curvature" needs some tender loving care edit

The article Constant curvature appears to be in a rather poor state for an old and important article. I only noticed it because someone changed a link at Hyperbolic space to point to it. JRSpriggs (talk) 02:37, 30 July 2014 (UTC)Reply

Soon-to-be-deleted article

For some amusement, before it's gone, check out Is theta a scalar quantity or vector quantity. —David Eppstein (talk) 07:02, 23 July 2014 (UTC)

Well, at least that one didn't last 8 years. 188.27.81.64 (talk) 10:01, 23 July 2014 (UTC)
Well, it actually might last 8 more years. Someone "saved" it by pasting a bunch of references to papers about graphs being used for something else, papers that happen to have a guy named Muller among the authors. JMP EAX (talk) 19:25, 25 July 2014 (UTC)

Before reading that, I had not suspected that no scalars are negative. Michael Hardy (talk) 19:12, 25 July 2014 (UTC)

Grafting (ordered tree) is another pearl. Lesson learned: I'm not mentioning it on Jimbo's page. This article is already in "saved" format, meaning it consists of a bunch of non-sequiturs with some ref tags. JMP EAX (talk) 19:25, 25 July 2014 (UTC)

(ec) Current convention is not to delete articles if the underlying topic is notable. Is there a single thing represented by "Grafting (ordered tree)" that could be used to cut an unsatisfactory article back to a stub that could then be grown out again? Deltahedron (talk) 19:31, 25 July 2014 (UTC)
In the time it took you to write that, you could have done the trivial work of clicking the article's history and seeing for yourself that the answer is no. If you really think that the editor who made this edit could possibly write something intelligent in a math/CS article, then you probably drank too much wikicoolaid about the encyclopedia that anyone can (but should?) edit and so forth. JMP EAX (talk) 20:23, 25 July 2014 (UTC)
It is more helpful to comment on the content, not on the contributor. If there is no prospect of writing even a short stub, then that is an argument for deletion. Deltahedron (talk) 20:46, 25 July 2014 (UTC)

Oh, and rooted binary tree is actually citing its source correctly, except the source doesn't make (much) sense. JMP EAX (talk) 19:27, 25 July 2014 (UTC)

Well, someone figured that one out. Vertices ≠ nodes for some. JMP EAX (talk) 19:29, 25 July 2014 (UTC)
Insert obligatory quote from Knuth here: "The material that follows comes mostly from a larger area of mathematics known as the theory of graphs. Unfortunately, there will probably never be a standard terminology in this field, and so the author has followed the usual practice of contemporary books on graph theory, namely to use words that are similar but not identical to the terms used in any other books on graph theory." JMP EAX (talk) 19:33, 25 July 2014 (UTC)
Rooted binary tree is clearly a notable concept. Reliable sources would be easy to find. Deltahedron (talk) 19:36, 25 July 2014 (UTC)
And different from binary tree how? JMP EAX (talk) 20:24, 25 July 2014 (UTC)
In being rooted, I would imagine. Propose a merge and redirect if you think the difference is insufficient. Deltahedron (talk) 20:29, 25 July 2014 (UTC)
May I suggest that you spend the next 500 years of your life saving this edit? JMP EAX (talk) 20:35, 25 July 2014 (UTC)
Why? Do you think it's important? Deltahedron (talk) 20:42, 25 July 2014 (UTC)
Your sarcasm is unwarranted. Deltahedron is a well-known contributor here, and his comments are calm and in line with Wikipedia policy. We all share your frustration with material that shouldn't be on Wikipedia but is. Ozob (talk) 23:44, 25 July 2014 (UTC)
Thanks for that. As far as unwanted content is concerned, I simply want us to distinguish between useful/useless topics for articles and useful/useless article content. Deltahedron (talk) 08:30, 26 July 2014 (UTC)
What can I say. Do I see double today [1]? Same content, "different" user. Allow me to be frustrated just a little bit. 188.27.81.64 (talk) 14:39, 26 July 2014 (UTC)
Just to spell this out for the benefit of passing readers: the articles Grafting (algorithm) and Grafting (ordered tree) appear to be substantially identical, despite having been created by apparently different users, User:Mmmzeta0 and User:Exe89. The articles have been proposed for deletion by User:JMP EAX and 188.27.81.64 (talk · contribs · WHOIS). Deltahedron (talk) 16:33, 26 July 2014 (UTC)

n.b. I have amused myself by completely rewriting Grafting (algorithm). The article is no more and no less than Left-child right-sibling binary tree. A redirect there might be marginally helpful, but I'm not going to argue too strenuously for it. Lesser Cartographies (talk) 03:26, 30 July 2014 (UTC)
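For readers who don't know the target article: a minimal sketch of the left-child right-sibling encoding (names mine, not the deleted article's content). Each node of an ordered tree stores only its leftmost child and its next sibling, and the result is exactly a binary tree.

    class Node:
        def __init__(self, label):
            self.label = label
            self.child = None     # leftmost child in the ordered tree ("left" in the binary tree)
            self.sibling = None   # next sibling to the right ("right" in the binary tree)

    def add_child(parent, label):
        """Append a new rightmost child under `parent` and return it."""
        node = Node(label)
        if parent.child is None:
            parent.child = node
        else:
            cur = parent.child
            while cur.sibling is not None:
                cur = cur.sibling
            cur.sibling = node
        return node

    # The ordered tree r(a, b(c), d) becomes: r.child = a, a.sibling = b, b.sibling = d, b.child = c
    r = Node("r")
    a, b, d = (add_child(r, x) for x in "abd")
    c = add_child(b, "c")
    assert r.child is a and a.sibling is b and b.sibling is d and b.child is c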