Talk:Object-oriented programming/Archive 2
This is an archive of past discussions about Object-oriented programming. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2 | Archive 3 |
need to add an example of multiple inheritance
In the bulleted paragraph describing multiple inheritance, the statement "When an object or class inherits its traits from more than one ancestor class, it's called multiple inheritance." seems to say that 'Lassie is a Collie, a Collie is a Dog' is an example of multiple inheritance, when it is not. An example of multiple inheritance could be added, perhaps where a Collie and a Chihuahua breed; the offspring would inherit from both the Collie and the Chihuahua classes. This example would illustrate the complexities of multiple inheritance, because the offspring would be able to tremble(), but confusion arises as to which bark() to use, either the high-pitched Chihuahua.bark() or the default Dog.bark(). Spemble 16:08, 3 January 2007 (UTC)
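To illustrate the ambiguity described above, here is a minimal sketch in Java (hypothetical names; since Java classes cannot extend two parents, Java 8 default methods stand in for multiple inheritance):

interface Collie {
    default String bark() { return "deep woof"; }
}
interface Chihuahua {
    default String bark() { return "high-pitched yip"; }
    default void tremble() { /* only Chihuahuas tremble */ }
}
// This does not compile until bark() is overridden: the compiler reports that
// Offspring inherits unrelated defaults for bark() from Collie and Chihuahua.
class Offspring implements Collie, Chihuahua {
    @Override
    public String bark() {
        return Chihuahua.super.bark();   // the programmer must resolve the conflict explicitly
    }
    // tremble() is inherited unambiguously from Chihuahua
}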
- I think the Chimaera example is quite clear: a Chimaera isa cat and a dog. For another example:
- aircraftCarrier isa ship.
- ship isa vehicle.
- twoWheeler isa vehicle
- and
- nuclearReactor isa primeMover
- pedal isa primeMover
- then you can have a couple of new classes using multiple inheritance:
- nuclearAircraftCarrier isa nuclearReactor-powered-ship
- bicycle isa pedal-driven-twoWheeler
(These are not good examples of multiple inheritance. A bicycle is not a kind of pedal. Bicycle would be better modelled as having an attribute driven-by (perhaps inherited from Vehicle) which would have an object of class Pedal as its value. The NuclearAircraftCarrier is slightly better, but here again I would say that the NuclearReactor is a component of, rather than a superclass of, a NuclearAircraftCarrier.
I suggest that a better example is HouseBoat which would naturally inherit from both House and Boat.) —Preceding unsigned comment added by 195.137.21.118 (talk) 08:19, 18 October 2007 (UTC)
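A minimal Java sketch of the composition alternative suggested above (hypothetical names): the vehicle has a prime mover rather than being one, so single inheritance suffices.

interface PrimeMover {
    void deliverPower();
}
class Pedal implements PrimeMover {
    public void deliverPower() { /* the rider pushes the pedals */ }
}
class NuclearReactor implements PrimeMover {
    public void deliverPower() { /* steam drives the turbines */ }
}
abstract class Vehicle {
    private final PrimeMover drivenBy;                 // the "driven-by" attribute
    protected Vehicle(PrimeMover drivenBy) { this.drivenBy = drivenBy; }
    public void move() { drivenBy.deliverPower(); }
}
class Bicycle extends Vehicle {
    public Bicycle() { super(new Pedal()); }
}
class NuclearAircraftCarrier extends Vehicle {
    public NuclearAircraftCarrier() { super(new NuclearReactor()); }
}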
- There are also some more classes you can have, but knowledge tells you that they cannot exist, such as a pedal-driven-aircraftCarrier and a nuclearReactor-powered-twoWheeler, though both might be fun... Dpser 16:44, 26 January 2007 (UTC)
Criticism Section Removal
These references are of no use whatsoever. There is no real programmer who disputes the value of object-oriented programming. I outline my individual reasons for each of the points below, and why they do not apply.
IMHO we should add an "advantages" section instead of removing the "criticism" section.
Do you think the article became more neutral after the criticism section was eliminated? We can discuss improvements to the criticism section, but it must be present; otherwise the article will lose neutrality.
Please discuss/remove the sections if you agree...
A study by Potok et al. [1] has shown no significant difference in productivity between OOP and procedural approaches.
- 1. Potok is not a programmer, he is a "computer science researcher".
- I don't think this is a reason to ignore his opinion.
- 2. This article is misleading. First off, much of the evidence for this was done on programs 5000-25000 lines long. I work for a software company with a single executable that is well over 250,000 lines long. At this level, it is very very very useful. Second, all this article says is 'our results indicate that in a commercial environment there may not be a consistent statistically significant difference in the productivity of object-oriented and procedural software development'.
- I have also worked at a software company. I consider that the team is important, project management is important, requirements are important and architecture is important, not OOP or any such technology.
- 3. What does "productivity" mean?
- Hmm... implemented features per time unit?
- 4. '...may not be significant..'? This isn't a criticism.
- It means "gives no advantages". Is that not criticism?
- 5. You can't stick code into a calculator and get a "productivity" rating, thus a "statistically significant difference" also has no meaning, because quality is not quantitative, yet the article contains numbers and formulas all throughout. This article was obviously written by someone who has not worked on a large-scale application.
- As far as I remember, the article discusses a "commercial environment", not "calculator" program development.
- The study appears objective and valid on first reading. However, the summary above misrepresents the conclusions of the study. The study does not claim "no significant difference in productivity" but states "no evidence" of improved productivity in certain circumstances. The study accepts that productivity gains have been demonstrated under different circumstances. The study admits the limits of its own methodology using LOC. My feeling is the comment should be enhanced to properly reflect the contents of the study. MartinSpamer 15:16, 8 August 2007 (UTC)
Richard Mansfield wrote a controversial, widely discussed critique of OOP, asserting that "Even after years of OOP, many—perhaps most—people still don't get it. One has to suspect that we're dealing with the emperor's new clothes when OOP apologists keep making the same excuses over and over..." [2].
- If this is true, then there are many very bad programmers. OOP is critical in the development of large applications. It still says nothing of whether OOP is good or bad, only that there are many bad programmers, and judging by the quality of many programs, this is no surprise.
- That is your opinion. Many different opinions exist. For example, I think SOA is critical in the development of large applications. Should we mention only your opinion and not the rest?
- This is a subjective opinion piece by a journalist/author, not a computer scientist, and is nearly worthless considering the site's own review system rates it at a lowly 2.4 out of 5.
Christopher J. Date stated that critical comparison of OOP to other technologies, relational in particular, is difficult because of lack of an agreed-upon and rigorous definition of OOP.[7]
- What does a definition of OOP have to do with it? Objects are primarily structures and classes. There's your definition. Once again, this article only states that he was unable to make a comparison, which says nothing good or bad about OOP.
- The absence of a formal description reduces OOP to the level of cookbooks or astrology. The opinion of C. J. Date must be mentioned.
- Christopher J. Date's expertise is relational theory, not OO. This is rather like asking Isaac Newton to comment on general relativity.
- I don't think that OOP proponents are comparable to Albert Einstein. C. J. Date does research on object-relational mapping issues. IMHO, he is competent enough in OOP.
Alexander Stepanov suggested that OOP provides a mathematically-limited viewpoint and called it, "almost as much of a hoax as Artificial Intelligence" (possibly referring to the Artificial Intelligence projects and marketing of the 1980s that are sometimes viewed as overzealous in retrospect) [3].
- Mathematically limited viewpoint? What does that mean?
- Second, "I find OOP philosophically unsound. It claims that everything is an object. Even if it is true it is not very interesting - saying that everything is an object is saying nothing at all. I find OOP methodologically wrong.". Philosophically unsound? Everything is an object? Methodologically wrong? Keep in mind his article is in Q&A format, and he makes no explanations of these abstract answers.
- Lastly, Alexander Stepanov currently programs in C++ (which is object oriented) and uses objects in his code. [[1]]
- This is not a reason to omit his opinion. By the way, the STL is part of C++, and the STL is not object-oriented.
Edsger W. Dijkstra wrote: ... what society overwhelmingly asks for is snake oil. Of course, the snake oil has the most impressive names —otherwise you would be selling nothing— like "Structured Analysis and Design", "Software Engineering", "Maturity Models", "Management Information Systems", "Integrated Project Support Environments" "Object Orientation" and "Business Process Re-engineering" (the latter three being known as IPSE, OO and BPR, respectively)." — EWD 1175: The strengths of the academic enterprise
- Edsger worked on computers in the 1970s. He died before OOP even became mainstream, and was not a computer programmer working on large projects, but rather a computer scientist who rarely left the lab. He was a theoretical physics major, and "Dijkstra was also noted for owning only one computer (late in life) and rarely actually using them." Need I say more?
- This is not a reason to omit his opinion. Edsger W. Dijkstra is one of the acknowledged authorities of our industry.
- And finally, while I agree there are many snake oil salesmen in the computer science arena, classes are not seriously disputed. "Maturity Models" and "Application Lifecycle Management" may be hoaxes (I don't know enough about them to say), but objects in general are a necessity. If there are criticisms of these specific topics, they should be put in the respective topics, not in the main OOP article. Java, C++, C#, Pascal, Visual Basic, and even JavaScript are all object oriented.
- EWD says "Object Orientation", not "classes", not "Visual Basic".
Using the EWD comment in this way is misleading; the previous paragraph(s) of EWD1175 talk about the difficulty of deciding whether something is worthy of study or is snake oil. Many other EWDs show his expertise in using objects for abstractions. —Preceding unsigned comment added by 194.66.238.27 (talk) 18:01, 4 March 2008 (UTC)
Will removing criticism of OOP imply that it is the only way to program in all circumstances, and will eventually supersede all other methods? Mrrealtime 23:29, 13 March 2008 (UTC)
Removal?
Who keeps removing the link to the Richard Mansfield article? If it does not "qualify", please state why.
And I removed the opening "real world" claim for reasons given later in the wiki article (3.3). —The preceding unsigned comment was added by 66.120.226.1 (talk • contribs).
- I removed the entire external links section, not just this one particular link, because none of it met WP:EL. Wikipedia is a collection of articles, not a link farm.
- The link is consistent with "Sites which fail to meet criteria for reliable sources
- yet still contain information about the subject of the article from knowledgeable sources."
- I don't consider Mr. Mansfield unreliable, I do consider his article to be knowledgeable.
Other people may have removed just this specific link in the past, and I can't speak for them; I can only talk about why I removed the external links. We should add verifiable, cited content to this article rather than a bunch of links to various people's opinions. The "further reading" section is in my opinion listcruft and probably needs to go, too. The books/articles which are citations should be moved to references, and those which are not citations should be removed from the article.
- Now, about this link in particular. The Mansfield article does not meet any of the "What should be linked" criteria in WP:EL. It doesn't appear to me to be a reliable source. It's one guy's opinion, without a single citation nor even a real-world example. He states that OOP "isn't doing what it promises" without saying who, exactly, is doing the promising or what they are, precisely, and then compares it to Marxism! This is not research, it's a screed, and while he's more than welcome to post it on devx (more power to him, frankly, for speaking his mind), it doesn't meet Wikipedia guidelines for what should be cited or linked and doesn't belong here. --Craig Stuntz 21:02, 10 January 2007 (UTC)
- The link is consistent with "Is it proper in the context of the article" and "Links to be considered".
- I, personally, think it's unfair to the reader for you to remove it.
- There are many people on the internet who are engaged in research and want to compare the
- advantages and disadvantages of certain topics.
- You ought to let educated readers decided what is valid or not for their research.
How is this different from Bertrand Meyer's opinions and rants in his often-cited book? There is very little proof of OOP being better anyhow. Thus, if we only stick to science, then this entry would be real short. OOP is largely a psychological phenomenon, not a scientific nor mathematical one. Mansfield is the author of several technology books, I would note (but not directly about OOP). And, devX is published material. —The preceding unsigned comment was added by 68.183.137.111 (talk • contribs).
- Well, Meyer doesn't run up against Godwin's Law, for starters. Seriously, are you really not able to see the difference between Meyer's book and the devx article? I find that difficult to believe, especially as I know you've read both. The point is not that Mansfield is somehow less of a reliable person than Meyer, it's that this specific piece by Mansfield doesn't even attempt to justify its assertions. Yes, Mansfield is capable of writing a cite-able source; he just didn't do it in this specific case. Oh, and the article does not claim that OOP is generically "better" (or worse, for that matter), nor should it. Assertions that a particular development methodology is "better" or "worse" are suspect by their very nature and certainly out of place in an encyclopedia. --Craig Stuntz 14:08, 11 January 2007 (UTC)
- No, I don't see much difference. Meyer does use shape and animal examples with code, but those don't extrapolate to real-world problems. And he has several flaws in his reasoning, but we can't consider those because they are "uncited". Wikipedia is growing stale by limiting opinions and links. —The preceding unsigned comment was added by 68.183.137.111 (talk) 04:15, 14 January 2007 (UTC).
The authors discuss inheritance with regard to an example of a circle which is a subtype of an ellipse. Other than redefining a data type to represent an object with two foci (which are the same if it is a circle), they find very little utility in inheritance beyond what can be provided via a database view, which uses the well-known normalization transform of a supertype-subtype relationship. IMHO anything beyond this "inheritance" is pure garbage, and cannot be supported by rigorous theory.
For example, read http://www.amazon.com/Core-J2EE-Patterns-Practices-Strategies/dp/0130648841/ref=cm_cr-mr-title/103-9197091-7991048. There are many "Patterns", such as the Business Delegate, but no rigorous, mathematical formulation of why they exist or are helpful compared with other alternatives. -Anon.
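For readers unfamiliar with the circle/ellipse case mentioned above, a minimal Java sketch (hypothetical names, one of several possible formulations):

class Ellipse {
    protected double a, b;                                  // semi-axes
    public Ellipse(double a, double b) { this.a = a; this.b = b; }
    public void stretch(double newA, double newB) { a = newA; b = newB; }
}
class Circle extends Ellipse {
    public Circle(double radius) { super(radius, radius); }
    @Override
    public void stretch(double newA, double newB) {
        // Every option here is unattractive: throw (breaking substitutability),
        // silently ignore an argument, or allow a "circle" whose axes differ.
        throw new UnsupportedOperationException("a circle has a single radius");
    }
}

This is the tension the Circle-ellipse problem article mentioned below discusses.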
- Gentlemen. This is not the place to debate the merits of OOP; this is the place to discuss the article. People's personal opinions of inheritance, etc. (and I agree that inheritance is quite frequently abused) aren't relevant here. There are many notable critiques of OOP in various contexts (Chris Date is generally a reliable source, though he avoids comment on OO outside the scope of databases) without having to cite unreliable sources like tablizer, this Mansfield article, etc. Go find them. :) If you want, write an article about the Circle-ellipse problem; it's certainly deserving and lots has been written about it. --EngineerScotty 19:04, 19 January 2007 (UTC)
- The circle-ellipse problem is not very practical in my opinion. Too many OOP books use shape, animal, and device-driver examples. They may be okay for studying philosophical concepts or basic training, but they are difficult to apply or compare to more real-world problems. At least "tablizer" uses banking and power utility examples. --Anonymous.
- Non sequitur for several reasons: 1) The above comment concerning tablizer.com was not made because I like or dislike the page in question; it was made simply because tablizer.com doesn't meet Wikipedia's requirements for a reliable source. It's self-published and not peer-reviewed, it's ignored by the literature, and tablizer isn't recognized as an expert in the field. (For that matter, tablizer doesn't even like his real name being printed.) 2) The circle/ellipse example is entirely appropriate, as it illustrates a basic issue. That said, the OO literature is full of examples of complex applications being implemented beyond shapes and critters and what tablizer (and you--assuming you aren't him--if you are User:tablizer you should log in rather than pretending to be an anonymous third party) refers to as "device drivers". At any rate, I'll work on the circle-ellipse page; there's a good start there. --EngineerScotty 18:27, 22 January 2007 (UTC)
- Tablizer is a notable source on the "OOP criticism" topic. Try googling "OOP criticism" or "Object oriented programming criticism". User:Bibikoff
- Coming up at the top of a Google search does not make you a notable source. Nor does it make you meet WP:EL. Please respond to the wealth of reasons listed on this page why it's neither notable nor WP:V verifiable rather than just keep plugging it back into the page. You might also want to read WP:SOCK, which is Wikipedia official policy. Thank you. --Craig Stuntz 13:06, 10 August 2007 (UTC)
- 1) About WP:SOCK. Personal insults do not make your position more convincing. I'm new to the English Wikipedia; I work on the Russian one. You can [verify this]. Can you present any proof?
- 2) About WP:EL. It states "...What should be linked... 3. Sites that contain neutral and accurate material that cannot be integrated into the Wikipedia article due to copyright issues, amount of detail (such as ... online textbooks) or other reasons...". I think B.Jacobs's site falls under this criterion. It's a good starting point for people interested in OOP's disadvantages.
- 3) About WP:V and WP:N. This is a debatable question. But you "forget" about WP:NPOV. It says that "All Wikipedia articles and other encyclopedic content must be written from a neutral point of view (NPOV), representing fairly and without bias all significant views..." and "...Detailed articles might also contain the mutual evaluations of each viewpoint, but studiously refrain from asserting which is better...". You keep deleting information about OOP's deficiencies. Will the article become better for it? Decide for yourself.
- 4) Why did you delete the link to the Mansfield article? The community didn't reach a consensus on this concern.
- 5) Craig, maybe together we can find an appropriate way to present information about OOP criticism resources. Waiting for your reply. Bibikoff
- (1) No offense was intended, certainly. The simple fact is that in the first edit you (apparently) ever made to EN wikipedia you marched into the center of a long-running discussion and apparently didn't consider what was written here before doing so. How do you think that looks? The point isn't to accuse you of being a sockpuppet, it's to point out that your actions make you look like one.
- (2) Same as (3); see below. Sources for articles on fact need to be verifiable.
- (3) WP:V is absolutely non-negotiable. It's official Wikipedia policy, and there aren't exceptions. See, especially, "Sources." I won't bother to quote the entire page to you; you can read this yourself, but do note the bits about self-published sites and publications "with a poor reputation for fact-checking or with no editorial oversight." (Which describes most of the technology press, sadly, with webzines like devx only marginally worse.) We need data on this subject, not opinion pieces.
- (4) See 3; it's not verifiable.
- (5) This is the most important point, actually. Why fight to have lousy sources restored when energies could be better expended at finding good sources to represent the POVs you'd like to include?
- --Craig Stuntz 17:26, 10 August 2007 (UTC)
- I would like to see citations for "examples of complex applications being implemented beyond shapes and critters" (and device drivers). For example, comparative business applications. I don't dispute that such examples run, but I dispute that there is any literature that clearly shows how they are objectively better than procedural/relational equivalents. The scientific case for OOP for business applications is nearly non-existent. --Tablizer 04:45, 29 January 2007 (UTC)
- PJTraill 19:16, 20 January 2007 (UTC) I have written Circle-ellipse problem.
- They are certainly not! Object-oriented programming is about programming and not run-time efficiency. The end-product can very well be exactly the same. Or in some cases it may be even better if you use procedural programming and relational knowledge. It is not about the result, it is about the methodology! If you understand your world better using relational knowledge that's all right. Other people identify objects within their representation of the world, that's also not bad either. For example I don't have to worry about whether Mrs. Simpson has a beard. For a relational database this is of course an issue. And it may be right after all, but for me it is just a matter of taste. However, I think it is clear that the article is about object-oriented programming and disputes about whether the end product is better or worse are wrong, because this is not the issue.Dpser 12:41, 29 January 2007 (UTC)
- Could someone please clarify where run-time speed was brought up as an issue? There seems to be a speed-related response, but no original matching statement.--66.120.226.1 20:57, 1 February 2007 (UTC)
- Discussing whether OOP is "better" (or "worse") than other methodologies completely misses the point. The article makes no claim that OOP is "better" or "worse," nor should it. The article is intended to describe OOP, not advocate an opinion about it. --Craig Stuntz 14:53, 29 January 2007 (UTC)
- Agreed with Craig. To address top's point--there isn't literature which "clearly shows" that other paradigms (relational, functional, procedural, what have you) are "objectively better"--and as you've pointed out long and loud at other forums, such comparative studies are highly difficult. Lots of people who can be considered experts--Chris Date, Peter Norvig, Bertrand Meyer, to name three--have weighed in on the debate; feel free to add cites from these on the subject. But as Craig points out, this is not an OO or relational advocacy forum. We're not here to decide which is better. --EngineerScotty 16:35, 29 January 2007 (UTC)
- I disagree with your and Craig's point of view. Most knowledgeable experts will list advantages and disadvantages of a given process or paradigm. I think listing the known disadvantages of OOP is beneficial to the community. --24.8.150.120 07:51, 15 August 2007 (UTC)
- The question isn't "should it be done?" The question is "Can it be done verifiably?" Asserting that it should be done doesn't answer the question on the table. IMHO any verifiable information is welcome here, but I've tried to find verifiable information on the pros and cons of OOP (see elsewhere on this page) and haven't come up with much. --Craig Stuntz 13:08, 15 August 2007 (UTC)
- If one is to follow edit according to verifiability, one would have to remove all the positive and negative assertions that haven't been cited yet.--24.8.150.120 19:57, 17 August 2007 (UTC)
- Or cite them; sure. Feel free to contest anything you don't think is true. --Craig Stuntz 13:25, 21 August 2007 (UTC)
Timeline
- Object-oriented programming developed as the dominant programming methodology during the mid-1980s,[citation needed] largely due to the influence of C++, an extension of the C programming language.
No way. I remember my first job in 1986, and everyone was using C. Mid 1990's, maybe. And as far as I know, it was "largely due" to Java. I know this because I happen to be a Java programmer (persons with no sense of irony, please do not reply to that).
Paul Murray 05:52, 31 January 2007 (UTC)
- I think 90s is more accurate, and I think C++ and Java would be more accurate. But I'm reluctant to change it from one person's recollection to my own without a good source. --Craig Stuntz 14:24, 31 January 2007 (UTC)
OOP was not dominant in the mid-1980s due to C++ or anything else. COBOL was still dominant in the '80s - university courses notwithstanding. And despite C++ being an improvement over C, it wasn't much used for OOP (again, outside of university courses). OOP didn't gain any real traction until Borland Object Pascal and Delphi put it in the hands of business programmers so they could study it up close. But OOP couldn't begin to claim domination until Java became the lingua franca for the new class of business systems (client-server/web-based) starting to be developed in the late '90s. Complete legitimacy in the business world was achieved only when SAP added OOP to ABAP in the early 2000s. -- HKL47 04:06, 15 March 2007 (UTC)
Another issue is that its origin had nothing to do with the "software crisis". Instead it was born in physical simulation systems, not as a response to what Bertrand Meyer calls the "software crisis". Meyer tends to spin the history to fit his view. But I've yet to find a citeable source for the simulation-based origin. —Preceding unsigned comment added by 208.127.151.158 (talk) 06:27, 5 November 2007 (UTC)
Argumentation unclear
"Pure" object-oriented languages, on the other hand, lacked features that many programmers had come to depend upon. To bridge this gap, many attempts have been made to create new languages based on object-oriented methods but allowing some procedural features in "safe" ways. Bertrand Meyer's Eiffel was an early and moderately successful language with those goals.
I do not understand what this means: which constructs were added in Eiffel (so much that it can be seen as a goal of the language) to support procedural features that many programmers had come to depend upon? --Schoelle 23:37, 23 March 2007 (UTC)
Argumentation unclear, part two
Under "Polymorphism " is the following: "it's very useful, once it improves code readability, to enable implicit conversions to the correct handling method when apply add() method to integers, like in add(1,2), or to strings like in add("foo","bar") since the definitions of these signatures are available."
First of all, the sentence should be capitalized, but beyond that, I cannot make sense of it. The sentence appears to be headed off in several different directions without enough transition to ascertain where it is headed. Actually, the more I look at it, the more it seems to resolve itself simply by removing the independent clause, "once it improves code readability." That's a clumsy phrase to begin with. And unless someone else can figure out what really is intended, I say, let's get rid of that one clause--and capitalize the first word.
--C-U RPCV 05:09, 12 April 2007 (UTC)
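For what it's worth, the behaviour that sentence seems to be describing is ordinary ad hoc polymorphism (overload resolution); a minimal Java sketch, assuming that is the intended meaning:

class Adder {
    static int add(int x, int y) { return x + y; }            // add(1, 2)         -> 3
    static String add(String x, String y) { return x + y; }   // add("foo", "bar") -> "foobar"
}
// The compiler dispatches each call to the method whose signature matches the arguments.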
I find the whole Polymorphism section extremely badly-worded and difficult to understand. Will someone with a better understanding of the subject than me please rewrite it.
Why does the link "Checking type instead of interface" Anti-pattern redirect to this page? I'm not sure what it should link to, but this page isn't about that anti-pattern but OO in general, right? 209.129.94.61 17:33, 8 May 2007 (UTC)
The Polymorphism section is unreadable garble. Most of the sentences are grammatically incorrect, fragments, redundant, or confusing.
Rather than adding to the whining, I'll take some initiative here. I hope an expert will check the correctness of my revisions after I make them. Fisherm77 14:16, 15 June 2007 (UTC)
Clarification/contrast of Microsoft .NET (CLR) vs Sun Java (JVM) - primarily their respective intent/effect.
Note: This is User:Greg tresters edit meant for the talk page
In article:
In the past decade Java has emerged in wide use partially because of its similarity to C and to C++, but perhaps more importantly because of its implementation using a virtual machine that is intended to run code unchanged on many different platforms. This last feature has made it very attractive to larger development shops with heterogeneous environments. Microsoft's .NET initiative has a similar objective and includes/supports several new languages, or variants of older ones.
User:Greg tresters comment:
This is inaccurate - Microsoft's .NET Framework (see Common Language Runtime) is not to be confused, in intent or implementation, with Java's concept of a JVM. The CLR requires the resultant bytecode to be run on the Windows platform; the JVM allows the resultant bytecode to be run on any platform for which the JVM has been developed. Think of .NET as "language agnostic, platform affinitive", and Java as "platform agnostic, language affinitive". --Zven 08:48, 18 May 2007 (UTC)
- I agree with Greg's assessment. Microsoft have never stated that the goal of .NET was platform independence. Having multiple languages compiling to the same bytecode, allowing the same runtime to be used for those languages IS a stated goal, however. --Surturz 02:55, 21 May 2007 (UTC)
Delphi
Why is it that Delphi is always bypassed as an OOP language? This seems to be very common, especially since a lot of OOP concepts ended up in .NET.
- I use Delphi and like it a lot. I think most people in the OOP community do not consider Delphi a pure OOP language because it allows the programmer to use and write plain procedures and functions; Delphi allows you to write procedural programs.--24.8.150.120 07:55, 15 August 2007 (UTC)
- Also, not everything in Delphi, like a string or an integer, is an object. --75.71.70.81 09:32, 27 August 2007 (UTC)
- To be precise, not everything in Delphi for Win32 is a TObject subclass. In Delphi for .NET, every type is a subclass of System.Object. "Plain procedures" in Delphi for .NET are methods of the implicit Unit class. --Craig Stuntz 02:28, 29 August 2007 (UTC)
- The article is not meant to be a comprehensive list of all OOP languages. If everyone listed their favorite language, it will be an excessive list for its purpose. --66.120.226.86 (talk) 20:45, 18 August 2008 (UTC)
Association - Aggregation and Composition
I think these topics lack examples.
// A simple example of association: a List object holds a reference to a material object.
class material
{
    private String name;

    public material() {
    }

    public void setNM(String X) {
        name = X;
    }

    public String getNM() {
        return name;
    }
}

class List
{
    // The material is created elsewhere and handed in via setMat(),
    // so it can exist independently of the List that refers to it.
    private material m01;

    public List() {
    }

    public void setMat(material X) {
        m01 = X;
    }

    public material getMat() {
        return m01;
    }
}
Would you like to help me?
regards
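The snippet above shows a plain association (a List object merely refers to a material object). A minimal sketch of how aggregation and composition are usually distinguished, with hypothetical names:

class Engine { }
class Car {
    // Composition: the Car creates and owns its Engine, so their lifetimes coincide.
    private final Engine engine = new Engine();
}
class Player { }
class Team {
    // Aggregation: Players are created elsewhere, passed in, and can outlive the Team.
    private final java.util.List<Player> players;
    Team(java.util.List<Player> players) { this.players = players; }
}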
This may be useful for the article
On who coined that term and where ( http://www.purl.org/stefan_ram/pub/doc_kay_oop_en )... --85.139.120.189 04:26, 26 August 2007 (UTC)
Unlike something, but... how exactly?
In the current version of the page we can read at a certain point that "[...] Oberon, included a distinctive approach to object orientation, classes, and such. The approach is unlike Smalltalk, and very unlike C++." I personally believe that this hint should either be removed or complemented even with the briefest positive description of what the approach is like: just telling what it is not like doesn't help much. --Blazar.writeto() 20:35, 23 October 2007 (UTC)
- You have a point there; that section should be clarified or removed. IIRC, Wirth introduced the concept of "type extensions" in 1988. In contrast to other languages, type extensions are very "low level": all you are allowed to do is to extend RECORDs (structs) at the end by extra fields (see [2]). Pointers to such records are then polymorphic, following the implied "single inheritance" hierarchy. Also, some later dialect of Oberon (I think it was Oberon.NET [3]) introduced "Facets", which are very similar to Java interfaces (just subtyping, no inheritance). I am not sure which mechanism the original author referred to. --Schoelle 07:49, 25 October 2007 (UTC)
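A rough Java analogue of the type-extension mechanism described above (hypothetical names; this is only an approximation, not Oberon syntax):

class Figure {
    double x, y;                        // the base record's fields
}
class CircleFigure extends Figure {
    double radius;                      // the extension appends extra fields at the end
}
// A reference of the base type may point to the extension,
// giving polymorphism along the single-inheritance chain:
// Figure f = new CircleFigure();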
Changes by 59.183.251.132
I do not mind simplifying the list of fundamental concepts, but the changes by 59.183.251.132 seem to oversimplify ("A method is the ability of an object."). Also, they do not cite a source (looks like a case of WP:NOR), so I undid the changes. --Schoelle (talk) 07:28, 23 November 2007 (UTC)
Real World examples
We've all read the "bicycle is an example of a vehicle" and "cat is an example of a mammal" examples, but in day-to-day programming these are meaningless and only add confusion. In practice, a base class contains a subset of properties usable by all child classes, to which they add or override specific methods and attributes to do useful work. IMHO a functional, rather than a "set theory", approach is more helpful. —Preceding unsigned comment added by 65.43.175.214 (talk) 03:47, 26 November 2007 (UTC)
Criticism again
I wish I had a penny for every WP article with a poor criticism section... Here's the thing. These are terrible criticisms. Date is a non-criticism, so I have no idea what it is doing there; although Stallman is clearly a critic of OOP in that quote, it is not clear what his criticism is. Dijkstra is better, but it really doesn't help me to know that he called OOP "snake oil." My friend called OOP "a bunch of silliness" but that isn't going to get listed on WP. Why? It's not notability; it's because it's an empty and uninformative criticism.
Stepanov and Potok get bonus points for actually having a substantive complaint that gives me information about possible failures of OOP that I didn't know before I read the section. The other lines are pretty useless, the possible notability of their sources aside. Not everything Stallman says is worth writing down.
Thoughts? Agreement? Disagreement? I'll let this idle around for a while before I make a change. 66.216.172.3 (talk) 20:46, 6 February 2008 (UTC)
- You shouldn't make a change because those are interesting quotes from interesting people. —Preceding unsigned comment added by 71.237.89.49 (talk) 08:53, 22 February 2008 (UTC)
The referenced article by Dijkstra is about universities and society. While it is critical of OOP in passing, it is not a criticism of it. An actual criticism by him would be interesting. The "Snake Oil" reference is merely tabloid sensationalism. Gerardw (talk) 16:50, 27 November 2008 (UTC)
Origins in artificial intelligence?
I have heard a few times from separate sources that OOP has its origins in AI. Can anyone confirm or discredit this rumour? Pgr94 (talk) 13:13, 28 February 2008 (UTC)
- Just came across: "Mark Stefik and Peter Friedland (Molgen, the first object-oriented representation of knowledge - 1978)" Brief history of artificial intelligence @ aaai.org Pgr94 (talk) 16:10, 26 March 2008 (UTC)
Object system
Object system redirects to this article. What does it mean? --Abdull (talk) 20:38, 21 March 2008 (UTC)
New language A#
I suppressed the following:
- Another new language, named A# is being developed by students. It is stated that this language will revolutionize Object-Oriented Programming as we know it. The language is still very much in the works with no information available on the mother-developer of the language itself.
because it is unsupported, but perhaps someone can back it up with some meat? A Sharp (.NET) talks of a port of Ada to .NET (hardly revolutionary), or is something else meant? Google suggests not! PJTraill (talk) 00:40, 20 October 2008 (UTC)
"Misleading sentence"
"In the 1980s, there were a few attempts to design processor architectures which included hardware support for objects in memory but these were not successful. Examples include the Intel iAPX 432 and the Linn Smart Rekursiv."
"Not successful" is misleading here, the fact that the iAPX 432 was slow or Rekursiv not successful commercially doesn't mean that they not were succeseful implemting OO in the MMU. —Preceding unsigned comment added by 200.127.136.93 (talk) 02:17, 6 January 2009 (UTC)
Object vs Instance
These are the exact same thing. That should be explicit. —Preceding unsigned comment added by 81.252.207.82 (talk) 13:43, 20 March 2009 (UTC)
- No. They are the same only in languages that support classes. P99am (talk) 11:57, 24 April 2009 (UTC)
"Static programming languages"?
The criticism section implies that OO is a feature of "static programming languages" as opposed to Lisp.
- Object orientation requires no specific language feature other than the ability to encapsulate or 'black box' state within a container and expose behavior to external containers.
- I can't find the term "static programming language" in the literature. Is this phrase supposed to refer to "statically-typed programming languages" (e.g. C++), as opposed to "dynamically-typed programming languages" (e.g. Lisp)?
- If so, why is OOP exclusively associated with statically-typed languages, when there are so many OO or multiparadigm dynamically-typed languages (e.g. Objective C, Python, Ruby, etc.)?
I am not knowledgeable enough about Lisp (or OO design, for that matter) to rewrite this section. Could someone clarify, or at least add some references? --Otterfan (talk) 23:36, 16 March 2009 (UTC)
Wait, I get it--"static programming languages" isn't a commonly-used term, but it appears that "dynamic programming languages" is. Still, there seem to be a lot of object oriented dynamic programming languages. This still needs clarification, or at least some references. --68.239.60.5 (talk) 04:11, 19 March 2009 (UTC)
- I'm no expert on language definitions, but I understood the term 'static language' to be as above, i.e. not a dynamic language (such as Lisp) (as an aside, e.g. perhaps compiling to contiguous machine code without any list structure connecting 'bytecode', as might be found in a dynamic language).
- I definitely didn't think it referred to static/dynamic typing.
- If that's the issue, maybe a note-type reference would be a good idea, clarifying what is meant by 'static programming language'?
- Or was the issue whether or not the statement(s) is (are) true? 83.100.250.79 (talk) 22:38, 26 June 2009 (UTC)
Pointers are routines?
How so? Is this about pointers to functions? If so it needs some rewriting. Pcap ping 19:23, 26 August 2009 (UTC)
Section History
Hi. When I was reading about the two commercial products, it struck me that there was only one example given. I think that is not in line with the objectivity of Wikipedia. Moreover, this example - which I do think is a beautiful characteristic of the VB.NET framework - can be put into better perspective relative to the principles of object-oriented programming, which is the subject of this article. This cross-language inheritance is an interesting way to abstract code from implementation using the Strategy pattern. Java has a similar feature, the virtual machine, which abstracts the code from the implementation as well, but using the Adapter pattern. So I thought I could enrich the example provided and extend it with another example from the commercial world, thereby raising the objectivity of this article, using just three sentences. I do hope one approves. Loekbergman (talk) 06:23, 27 July 2010 (UTC)
Section "criticism" needs expansion
Would very much like to see the section "criticism" expanded to at least a couple of paragraphs. I am not at all competent to this myself. -- 201.37.230.43 (talk) 14:09, 21 February 2009 (UTC)
- I agree with you that it needs expansion. Instead of destructive criticisms, rants or laconic criticisms that do not help the reader to assess the benefits and limitations of OO technology, what I would like in that section is published, referenced examples of specific, well-delimited problems where a "pure" OO solution would be inherently more complex, harder to understand or harder to evolve than a solution to the same specific problem using another technology (say, DSLs, functional programming or even cell-oriented programming as in Microsoft Excel).
- Now this would be a nice contribution to improve the article so that it provides a better overview of the characteristics, advantages and limitations of OOP. --Antonielly (talk) 16:56, 21 February 2009 (UTC)
The current criticism section reads in a petty and personal way. It doesn't read like an encyclopedia should at all. --M2tM (talk) —Preceding unsigned comment added by 207.47.201.6 (talk) 00:51, 4 August 2009 (UTC)
- It seems to me that anyone who dares to criticise OOP (or wonder what all the fuss is about) is seen as unknowledgeable, reactionary, petty or similar. If you actually care to look at the experience and stature of these critics, you will see that this is not the case. As a programmer myself, with more than 40 years of experience (and many successful products), I actually believe that conventional programming is far more efficient and faster, and does not involve largely undefinable terminology and "get-arounds"/"fudges". It is suspicious, to say the least, that nobody has yet provided a recognizable generic definition of OOP or provided proven benefits. I have spoken with several well-respected programmers who, like me, have yet to accept any of the so-called "benefits" of OOP. Sadly it's another example of "The Emperor's New Clothes" that has simply got out of hand in a massive way. — Preceding unsigned comment added by 81.157.169.126 (talk • contribs) 11:19, 7 August 2009 (UTC)
- In the same way that you normalize relational databases, you can normalize objects to move their state closer and closer to the behavior that utilizes it. This is the benefit of true object-oriented programming when used correctly. You get logical containers or 'tools' that are specific behaviors married to their state. This allows stability through immutability. Other programmers using these stateful/encapsulated objects can trust that they will behave the same no matter who else is utilizing that object within the program. In contrast, in procedural or data-driven programming, as I've often seen used in VB6 or COBOL programs, the driver program maintains the state and governs mutability, and the classes merely have behavior. You pass state into the behavior and you get back a modified state, or a result or condition indicating whether or not your behavior succeeded, all of which your driver program must maintain. The benefit of procedural programming is one of 'visibility' into state management and less 'delegation', as you can read the driver program as one document of events happening. My only citation is my experience of 13 years in software development. The enlightened software developer knows when to employ OOP or procedural or SOA or any new paradigm as it comes about. Only zealots use one and speak against the others. —Preceding unsigned comment added by 69.76.159.241 (talk) 20:25, 21 March 2010 (UTC)
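A minimal sketch of the contrast being described, with hypothetical names: state encapsulated with its behavior versus state owned by a "driver" and passed into free-standing routines.

class Account {
    private long balanceCents;                          // state lives with the behavior that governs it
    public void deposit(long cents) { balanceCents += cents; }
    public long balance() { return balanceCents; }
}
class AccountOps {
    // Procedural style: the caller owns the balance and must store the result itself.
    static long deposit(long balanceCents, long cents) {
        return balanceCents + cents;
    }
}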
- "It is suspicious, to say the least, that nobody has yet provided a recognizable generic definition of OOP or provided proven benefits."
- The same can be said about cults. Are you equally suspicious about the effectiveness of cults? Kidding aside, this is called a continuum; in the real world (OOD), continua exist. — Preceding unsigned comment added by 72.187.199.192 (talk) 22:44, 20 November 2010 (UTC)
Using OOP to Simulate the Real World
incomprehensible
You are effectively talking about creating the Matrix for real here. Despite the number of code flaws that have yet to be sorted out in object-oriented programming languages, this is simply not humanly possible, and completely impractical. It would require every supercomputer that has ever been built and decommissioned to be rebuilt, all the supercomputers currently commissioned, and every desktop computer in the USSR, USA, UK, Australia, Canada and India to be set up in a cluster configuration. That would mean every laptop and desktop computer left in the world would be devoted to running the world, if it were even possible. Since all the remaining computers would have variable computational speeds and variable RAM and HD capacity, the total number of TFLOPS achieved would be unknown, as would the total RAM and HD capacity once everything was networked together. Plus, all the computers that were left would have to be divided in bulk among the different countries running the Matrix. I'm not convinced you can run a country on, say, 750 commercial laptops, 125 commercial desktops and 125 commercial high-spec desktops. Then there is always the matter of diplomacy: would other countries be willing to help the US create such an extraordinarily complex simulation, and would people be willing to donate their computers to run the world? This whole thing could break down into World War 3. And that is supposed to be the Eugenics Wars in 2026, according to Dr Leonard McCoy, Chief Medical Officer aboard the starship Enterprise NCC-1701; as predicted by Dr McCoy, signs that the world would escalate into full-blown war began to emerge in the mid-1990s, and since then nuclear development has progressed tenfold, as has genetic development. World War 3 could easily be triggered by the conflict that would arise from forcing the issue of creating the Matrix, and it is not so much the nuclear threat we should be concerned about as the genetic threat. The Eugenics Wars will be an attempt to control the human genome through genetic manipulation, with the goal of creating better soldiers and better humans. The failed experiments of these tests will be human beings with horrible disfigurements, severe learning difficulties, cognitive impairments, mental impairments and physical disabilities, and the governments will call these the lucky ones, because most of the rest will die. You can see the kind of difficulties people will be forced to live with in Star Trek: The Next Generation episode 1.1, "Encounter at Farpoint", in the scenes where Q puts Picard on trial in that courtroom; you can also see how, within the next decade or so, the military will be controlling their soldiers with drugs. If the Matrix were simulated on all the supercomputers in the world, and they actually managed to pull it off and get it up and running, then due to the complexity of the simulation you wouldn't be able to turn it off.
Think about it: the current global population of the world is estimated at 6,889,300,000 according to the United States Census Bureau, so you would need 6,889,300,000 individual and unique physical models, with 6,889,300,000 unique perceptual subroutines, 6,889,300,000 personality profiles, 6,889,300,000 psychological profiles, 6,889,300,000 adaptive subroutines, 6,889,300,000 algorithms allowing them to reproduce, 6,889,300,000 subroutines allowing them to socialise and form relationships, 6,889,300,000 speech-recognition parameters, 6,889,300,000 tactile subroutines, 6,889,300,000 analytical subroutines and 6,889,300,000 AI models, and that's just for the population of the world. Next you need things like meteorological models with a static AI based on the historical weather patterns of the Earth (i.e. based on the simulation's current geographical conditions, the static meteorological AI picks out the most appropriate weather conditions to simulate), and geographic models to simulate things like tectonic plate shifts so it could reproduce an earthquake, stress in a support beam of a building, a flood or a volcano; this would also be a static AI model. You would also need a gravity subroutine, a day-night subroutine and a day-month-year model, which would incorporate the day-night and gravity subroutines and be linked to the moon-orbit and sun-orbit models, which I haven't mentioned yet. To simulate the world you would have to simulate not only the world but also the orbit of the Moon around the Earth and of the Earth around the Sun, as well as the Moon's spin, the Earth's spin and the Sun's spin. It is the only way to get an accurate time in a simulated world; otherwise, every time the real world has a leap year or daylight saving time, the configured time in the simulated world will continue as configured and will not stay accurate. Other things you would need are, for example, an algorithm to determine whether wind pressure on a dead leaf moves it as much and in the same direction as it does a living leaf while the dead or living leaf is still attached to the tree; this would patch into the meteorological model to exchange I/O instructions with the algorithm. Another example of something you might need in a simulation like the Matrix is physics: a physics model would be an important part of any object in the simulated environment that relies on physics in the real world, and thus the physics model would also draw on the meteorology model. You would also need 6,889,300,000 physics models for the population. With so much data being input, it is clear that you would need a huge data buffer as well as RAM, because I haven't even begun on the algorithms, models, classes, subroutines, etc. that would be needed for buildings, roads, parts, decision making, cars, jobs, schooling, growing up, death, religion, friends and emotion, and I have mentioned only a handful of the things I can think of, which would fit on the tip of a pin compared to what would be needed to recreate the Matrix. So clearly OOP is not a suitable language to simulate the entire world; and if it were used and the simulation succeeded, then because every model with its own AI type would require its own shutdown subroutine, shutting it down would be impossible: even without the population you would have several thousand shutdowns to complete, and with the population you would have to shut down 6,889,300,000 times, which is just completely impractical.
So the Matrix, my friends, is best left on the tape until after the year 2263, when Zefram Cochrane develops his warp ship and we make first contact with the Vulcans. I don't know about you, but I intend to be there. Hi, I have Asperger's syndrome, pathological demand avoidance syndrome, severe dyscalculia, mild dyslexia, dyslexic dysgraphia, reactive depression, traits of OCD, epilepsy and mild developmental delay (my mental age is close to 19 instead of 23). Thanks for reading my article. Good morning. —Preceding unsigned comment added by 82.10.123.229 (talk) 10:40, 22 December 2010 (UTC)
Overview
Hope I'm not stepping on anyone's toes, but it seemed that a very simple-to-understand intro at the start of the Overview was in order so I added in several very simple paragraphs there. Unsure as to how much material to cover there; any suggestions? Warraqeen (talk) 18:44, 12 December 2010 (UTC)
Also, I've moved the pre-existing paragraph in the Overview which starts "A large number of software engineers agree..." into the Criticisms section. I'm not really sure it's necessary at all, but I thought it was safer to move it than delete it. Obviously a Criticisms section is important (OOP has many critics and many alternatives), but the paragraph I'm referring to seems to me not to say much more than 'some are for it; some are against it' which didn't need stating in an overview, in my mind. Warraqeen (talk) 10:42, 30 December 2010 (UTC)
"Design Patterns"
Should we be listing the 23 patterns here, given that there's a Dedicated Wikipedia article on the book? Gerardw (talk) 12:22, 19 December 2008 (UTC)
- I've read the entire "Design Patterns" section twice now and I still don't really understand what it's about and what it has to do with design patterns. Given that most (all?) design patterns have relevance outside of OOP, I think it's probably best just to have a link to the Design patterns page unless the content is truly OOP-specific. --Kragen2uk (talk) 23:08, 12 January 2011 (UTC)
Technical Template
I see that somebody has added a technical template to the article, but I have not seen anybody offer an explanation in the talk section for why it is there. OOP is a fairly complicated subject which I don't really believe anybody would be trying to learn specific details of without enough background in programming to be able to understand or quickly research the technical terms in the article. Looking at what the template says should be done, it says to simplify things as much as possible without losing accuracy, and that simple information as a general overview should be toward the beginning of the article. There's a nice concise definition at the beginning and then an expansive and fairly non-technical overview written almost wholly in common vocabulary. Unless we want to rewrite the entire article as lame analogies, I'm failing to see any way to put it in layman's terms more so than it already is. Going into specifics beyond the overview doesn't seem to be possible in less technical terms than are already used, and the subjects which might require further reading provide links to the appropriate articles. So does anybody care to explain how this article is too technical? 98.27.162.44 (talk) 02:23, 13 December 2010 (UTC)
- Agreed! I am unable to imagine how to "improve this article to make it understandable to non-experts", so I think this template should be removed!--Tim32 (talk) 21:34, 13 March 2011 (UTC)
Abstraction is also achieved through Composition
The Abstraction section says "Abstraction is also achieved through Composition". Could we get a source? All references on the internet seem to come from Wikipedia.
- Try searching for "inheritance vs composition". 69.111.194.167 (talk) 02:01, 26 April 2011 (UTC)
Formal def issues
Here's why I tagged those bullets:
- coalgebraic datatypes -- these are infinite data types like streams; not clear how they apply here. Is this a confusion with System F-related stuff?
- See Poll: Subtyping and inheritance for Categorical datatypes (http://www.cs.ru.nl/E.Poll/papers/kyoto97.pdf). esap (talk) 15:39, 5 June 2011 (UTC)
- recursion -- seems too simplistic; is this a confusion with open recursion?
"Comarea"?
A few days ago 86.142.127.235 changed a sentence to read "With designs of this sort, it is common for some of the program's data to be accessible from any part of the program (sometimes grouped into what is often known as a "Comarea").". The parenthetical section referring to "Comarea" is, IMO, nonsensical, as this is not a common usage at all. In fact, the only time such a thing comes up in a Google search (e.g. for "program +comarea" or "programming +comarea") is a small number of references to IBM and HP mainframes, almost entirely when dealing with CICS. 86.142.127.235 has also recently modified some CICS-related articles, so I'm assuming this editor spends their time engrossed in this niche community, rather than the computer science or programming community at large. I am reverting this part of the edit, unless anyone has any objections. -- Menacer (talk) —Preceding undated comment added 00:08, 2 August 2011 (UTC).
- Now restored with some modifications. Google search is not definitive, and many terms were in use 30 years before PCs were in existence. IBM 360 Assembler (1960s) had a "COM" area addressable via a system parameter. CICS was (is?) the most common transaction processor worldwide. Programs in commercial use were (are?) significant in terms of global trade/importance and are hardly "niche". "Engrossed" is a massive exaggeration for editing just two articles. ken (talk) 14:56, 14 August 2011 (UTC)
Copyvio
This edit reeks of a copy-paste. It is the user's only contribution. It's clear from copying any part of the edit into google that the exact same text exists on multiple separate websites. This text has since been edited with wiki markup and integrated into the opening section of the article, but the point remains that the text itself is duplicated elsewhere.
I do not know exactly what to do about this, as my knowledge of copyright violation on Wikipedia is limited. I did not use the copyvio template because, to my knowledge, that involves blanking an entire page. It is unlikely that the text on the other websites was copied from Wikipedia, as the original contribution is significantly different from the prevalent style of Wikipedia.
At the very least, the section needs to be edited to reword everything and to conform to current style guidelines. The simplest solution is to delete the text, but I'd rather have an editor with a more thorough understanding of copyvio policies take a look at it. 67.193.178.107 (talk) 07:45, 24 January 2012 (UTC)
- Agreed, gone. Sorry about that, I should have caught it when it was added. Andy Dingley (talk) 10:29, 24 January 2012 (UTC)
Why is there no "benefits" section?
The article gives a long list of arguments not to use OO programming techniques. But OO must have benefits beyond being perhaps "fashionable" or "modern".
FWIW I think that OO programming allows more elegant programs that are (potentially) easier to debug - at the expense of a considerably longer learning curve. Learning an "ordinary" computer language typically is a matter of weeks, while writing truly OO programs (not just programs that dutifully apply OO constructs) requires the experience of months or even years.
I leave it to the (real) experts to describe the (claimed) benefits of OO programming in a more precise (but hopefully still concise) way. Rbakels (talk) 07:28, 23 April 2012 (UTC)
- OK. The fact is that OOP is very elegant for GUI programming.--Tim32 (talk) 14:35, 24 April 2012 (UTC)
Restating the Definition of OOP
Hey, I'm a student at NJIT. I am doing this for my Technical Communications class. I was reading the definition presented at the top of the article and I feel that it lacks some clarity. The definition is stated well enough for people with some background in programming and the object-oriented paradigm, but since Wikipedia is supposed to be a place where the general population can come to learn the basics about a topic, I feel that the definition should be rewritten to facilitate this. Another problem with this definition is that it has not been cited with a source (scholarly or otherwise). I think adding a source would enhance this page as well.
My proposed definition is as follows. Changes I have made to the definition based on my gathered information are displayed in bold.
Object-oriented programming (OOP) is a programming paradigm that represents concepts as "objects" that have data fields (attributes that describe the object) and associated procedures known as methods. Objects, which are instances of classes, are used to interact with one another to design applications and computer programs.
I took out the information regarding techniques associated with OOP from the definition because I do not believe it enhances the definition. That sort of information can be put in other areas of the article so that the reader does not become overwhelmed with information in the introduction. The reader can build his/her knowledge base up by reading the article, so that when these topics appear he/she can have a better grasp of what they actually are.
The source I found that corroborates this information is: Kindler, E., & Krivy, I. (2011). Object-oriented simulation of systems with sophisticated control. International Journal of General Systems, 40(3), 313–343. doi:10.1080/03081079.2010.539975
If anyone has any thoughts or opinions on my proposed changes please feel free to comment. — Preceding unsigned comment added by IKP2-NJITWILL (talk • contribs) 00:18, 22 November 2012 (UTC)
- I disagree with the change. The object-oriented paradigm doesn't represent only concepts; I would prefer more general terms like "real-world things" or "subject domains". I would also avoid mentioning terms like "data field", "methods", and "classes", because these are how the paradigm is implemented in concrete languages, not the definition of the paradigm itself. The paradigm aims to represent objects of the real world, encapsulating their attributes and behavior, and having them interact with each other.
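To make the terms in the proposed definition concrete for non-programmers, a minimal sketch (Account, owner and deposit are my own illustrative names, not taken from any source):

```python
class Account:                    # a class: a blueprint for objects
    def __init__(self, owner):
        self.owner = owner        # a data field (attribute) describing the object
        self.balance = 0          # another data field

    def deposit(self, amount):    # a method: a procedure associated with the object
        self.balance += amount

savings = Account("Alice")        # an object: an instance of the class Account
savings.deposit(100)              # programs are built from such objects interacting
print(savings.owner, savings.balance)
```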
Gang of Four design patterns - Main article: Design pattern (computer science)
The "Main Article" is about more than the gang of four - it lists many more patterns than seen here — Preceding unsigned comment added by 68.183.23.147 (talk) 22:31, 16 January 2013 (UTC)
History does not include mention of CLU or Barbara Liskov
Why is there no mention of the CLU language or Barbara Liskov in the history section? Her work has been recognized as important to the development of object-oriented programming. Dllahr (talk) 23:38, 29 January 2013 (UTC)
References to NOOP...(self promotion?)
Hi. I've been reading a few articles on OOP, coupling, cohesion, etc., and all over the place I find references to a recent book (NOOP, by AbdelGawad). Furthermore, these references in the text claim that "the author of XXX proved YYY", leaving very little room for doubt and suggesting that whoever added that to the Wikipedia article read and confirmed that those proofs are correct, or that they are widely acknowledged as correct.
I see that all modifications to these references (on several pages) were made on the same day, within a few minutes, from the same IP. As far as I've been able to check, it's an IP in Egypt, where the author of the very same book is from (haven't checked the city, haven't been able to link the IP to a university or institution).
I think this is reason to suggest that this might be a case of self-promotion or modification by a person involved with the author (a co-worker, student, etc.), which would make the reference and information probably biased and, if I remember correctly, against Wikipedia rules.
Could somebody else take a look and confirm whether these suspicions may be correct?
--2001:610:1908:1200:6C87:7C1:6CA2:BEF5 (talk) 15:21, 30 July 2013 (UTC)
- I took a look, and it seems like a pretty clear-cut case of citation spamming to me. I removed the ones I could find - if you see more, please remove them as well. Thanks! - MrOllie (talk) 15:40, 30 July 2013 (UTC)
Examples
Okay, this one isn't a biggie, but does anyone think naming Objective-C as the first example, without even mentioning C++, is a little... weird? I mean... come on. — Preceding unsigned comment added by 31.25.23.102 (talk) 11:31, 15 August 2013 (UTC)
"Decoupling"
The term "decoupling" is vague, and has a buzzword feel to it. Attempts to objectively measure it have been problematic or depends on many unproven assumptions. I'd suggest not mentioning it. --66.120.226.84 (talk) 18:30, 24 October 2008 (UTC)
- I don't think the term is vague at all, though I think the article currently does a terrible job of explaining what it is (namely, elimination of inter-object dependencies by use of encapsulation/information hiding/interface rigor). Nevertheless, neither "decoupling" nor "instance" is among the "quarks" listed in Armstrong's original (2003) paper submitted for publication, which is available free online at [[4]] (retrieved as reference number ITRI-WP034-0303). The version published in 2006 is available from the ACM at [[5]], but you have to be a member (I'm not). Unless the published version added "instance" and "decoupling", they should be deleted from the article. -- Unconventional (talk) 19:10, 14 November 2008 (UTC)
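For concreteness, a minimal sketch of that reading of decoupling (Storage and Report are hypothetical names of my own): the Report class depends only on an abstract interface, so concrete storage classes can change without any edit to Report.

```python
from abc import ABC, abstractmethod

class Storage(ABC):                  # the interface Report is coupled to -- and nothing else
    @abstractmethod
    def save(self, text): ...

class FileStorage(Storage):
    def save(self, text):
        print("pretend we wrote to disk:", text)

class Report:
    def __init__(self, storage: Storage):
        self._storage = storage      # decoupled: no concrete class is named here

    def publish(self):
        self._storage.save("report body")

Report(FileStorage()).publish()      # a DatabaseStorage could be swapped in without touching Report
```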
- I suggest either finding a good clean formal definition along with a wiki article, or not mentioning it at all. Doing it half-way is confusing to the reader, who will already be burdened with lingo. --63.192.29.10 (talk) 16:01, 10 May 2010 (UTC)
I'm reading and I don't understand it. The following sentence can be read in two different ways: "Decoupling allows for the separation of object interactions from classes and inheritance into distinct layers of abstraction." I'd be grateful if someone who understands it could rewrite it. —Preceding unsigned comment added by 141.108.15.99 (talk) 10:36, 15 February 2011 (UTC)
Up until this section, the article was making some sense to me. I agree that some change is required, but have no idea what change. Jonathan G. G. Lewis 10:42, 5 November 2013 (UTC) — Preceding unsigned comment added by Jonazo (talk • contribs)
Which was the first OOP language?
Right now this article makes two contradictory claims:
- Used for simulating system behavior in the late 1960s, SIMULA was the first object-oriented language. In the 1970s, Xerox's Smalltalk was the first object-oriented programming language
There can only be one first one. --SingpolymaT E 21:00, 11 December 2012 (UTC)
I agree this section is crying out for some informed editing. The comments about the development of OOP languages belong in the History section, and should be reconciled with the existing content already there, and the duplicated content in other sections should be eliminated. Jonathan G. G. Lewis 10:57, 5 November 2013 (UTC) — Preceding unsigned comment added by Jonazo (talk • contribs)
- Actually, I don't think this is a contradiction. The text says Simula was the first OO language and Smalltalk was the first OO programming language. Simula was not a general-purpose programming language; it was a language specifically designed for developing simulations. Simula definitely came before Smalltalk though. I've heard Alan Kay say that, and since he was one of the creators of Smalltalk... MadScientistX11 (talk) 00:41, 10 December 2013 (UTC)
LOOPS?
In the "History" section there is a hyper link for "LOOPS (programing language)". This page does not exist, it should be "LOOP (programing language)". The language itself is called LOOP so the word and the hyperlink should both be changed
- I just fixed this. Made the red link point to the actual article. From the article I'm not convinced that the link makes sense anyway; LOOP seems like a pure functional language, not an OO language, and the two are quite different. But I'm not going to address that for now; at least the red link is gone and the link points to the intended subject. RedDog (talk) 16:11, 11 November 2013 (UTC)
- I think this is supposed to be a link to an article on LOOPS, an OO extension to Lisp that was part of the Xerox toolkit, I think on their Lisp machines. It looks like there was an article on LOOPS at one point, or perhaps there was always just a red link; anyway there is no article now, although I think one would be merited. It definitely shouldn't be to LOOP, which is a functional, not OO, programming language. MadScientistX11 (talk) 01:10, 10 December 2013 (UTC)
- I changed this and replaced "LOOP" with "LOOPS" but with no (red) link. MadScientistX11 (talk) 14:36, 10 December 2013 (UTC)
Two main branches of OOP?
Apparently there's an Alan C. Kay concept of OOP and a Barbara Liskov concept of OOP, and they don't mix on a fundamental level.
Kay OOP is about isolated machines sending messages to each other (Smalltalk). Liskov OOP is data-oriented encapsulation (C++).
Liskov said she was not aware of Smalltalk until 1975, and that development of CLU had started at about the same time as Smalltalk.
Alan C. Kay coined the term object-oriented programming (OOP) and he "did not have C++ in mind." C++ is Liskov OOP, not Kay OOP.
Sources: Barbara Liskov, Keynote, OOPSLA'09, 24th ACM SIGPLAN http://www.youtube.com/watch?v=qAKrMdUycb8
Alan C. Kay http://en.wikipedia.org/wiki/Alan_Kay
Liskov OOP vs Kay OOP https://news.ycombinator.com/item?id=2336444
Moryton (talk) 06:56, 24 July 2013 (UTC)
- There aren't two branches of OOP; there are lots of them. Or at least there were. In the early days you had people from AI, software engineering, formal methods, etc. I think these "X vs. Y" fights (e.g. Booch vs. Rumbaugh, C++ vs. CLOS, ...) were always mostly silly. They were part ego and part hype from vendors who made outrageous claims, because that, unfortunately, is what you often need to do to sell some new software. I looked at that last link you provided and it seemed mostly to be people saying "Kay said X" on a discussion board, which isn't a valid reference IMO. I've heard Alan Kay talk about C++; his main issue with it (which a lot of us had) is that it's an OO layer on top of a low-level procedural language, and it still left a lot of the gunk like pointers available for people to use and cause problems. Anyway, I would strongly urge us to stay away from these religious X vs. Y arguments; they are for the most part of historical interest at this point anyway. I suggest we focus on concrete issues: benefits, costs, etc. MadScientistX11 (talk) 14:45, 10 December 2013 (UTC)
"Criticism"?
The so-called criticism section seems weird to me, trying to concoct a conflict pro or con object orientation. The real questions instead seem to be how much? and when?, and any answer or opinion must explain in what programming language, with which object implementation, and with what OOA methodology. For example, the article by Luca Cardelli pinpoints some good and some bad qualities of OO languages and ideas about how to circumvent the bad sides. Potok, Vouk and Rindos instead point out that the reuse of code provided by OO languages is discouraged by organisational structures within large corporations.
I think the section is valid and contains usable material providing reflection over advantages and disadvantages of object orientation, but not specifically "criticism". At least some of the sources provided produce an analysis of the productivity improvements, or lack thereof. This is more like neutral "evaluation", and so the section shouldn't describe the authors as having "criticized OOP". Rursus dixit. (mbork3!) 20:12, 2 April 2011 (UTC)
- "Criticism" refers to both positive and negative feedback. The current title is too wordy. I will use a title that makes the full meaning clear.
- Fennasnogothrim (talk) 08:08, 30 April 2011 (UTC)
- I agree. The criticism section made sense ten or even five years ago, but now it should be titled "The academics who don't know much about developing software in the real world and made fools of themselves" section. This debate is over, guys: OO won; it's the default for new systems in just about every industry and most kinds of applications. And except for Stallman, most of the people quoted don't actually know much about real software development. I think that section should be removed or greatly reduced. MadScientistX11 (talk) 00:57, 10 December 2013 (UTC)
Actually the problem is the term "object-oriented" by itself: functions in JavaScript and even structs in C are without doubt objects. The only thing they don't have is inheritance, which is a great thing, but only if it's used correctly (and that needs a lot of programming experience!). So we should slowly forget about this term "object-oriented" for languages with inheritance and see what's real. Even more, "object-oriented" programmers often put controller code into the objects, which is a violation of MVC... so I fully understand the critics in this article (but only after more than a decade of "OO" programming). 178.197.236.172 (talk) 15:33, 24 July 2013 (UTC) "You wanted a banana but what you got was a gorilla holding the banana and the entire jungle"... lol, that just reminds me of today's Java frameworks. 178.197.236.172 (talk) 15:43, 24 July 2013 (UTC)
- Structs in C and JavaScript are absolutely, unequivocally NOT object-oriented. Saying it's "object-oriented but without inheritance" is like saying my girlfriend is a beautiful woman except she has a penis (that was hypothetical btw). There actually is a word for languages like JavaScript (Ada as well, although they may have added true OO to Ada by now; at least the version I was familiar with a few years ago didn't have it): those are called object-based. You can define objects, which are essentially abstract data types, but they don't have inheritance or polymorphism. Structs in C, though, don't even qualify as object-based; I've never heard anyone say that. Of course you can define various processes and utilities that build some OO capabilities on top of structs in C or in any language, but by that metric assembler and COBOL and any language is OO; it's all a Turing machine after all. MadScientistX11 (talk) 14:55, 10 December 2013 (UTC)
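To put the object-based vs. object-oriented distinction in code (a rough sketch of my own, not from the article): the first class is just an abstract data type, while the second pair adds the inheritance and run-time polymorphism that the "object-oriented" label is usually reserved for.

```python
# Object-based: an abstract data type -- encapsulated state plus operations,
# but no inheritance and no polymorphic dispatch.
class Counter:
    def __init__(self):
        self._n = 0
    def bump(self):
        self._n += 1
    def value(self):
        return self._n

# Object-oriented adds inheritance and polymorphism: describe() works for any
# Shape subclass without knowing which one it was handed.
class Shape:
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

def describe(shape):
    print("area is", shape.area())   # dispatched to the concrete subclass at run time

describe(Square(3))                  # area is 9
```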
Section "criticism" needs contraction
As of 2010, I think we have a little perspective about which criticisms of OOP proved to be valid or at least insightful; keep those and drop the rest. Otherwise what we have here is a List of failed predictions regarding object-oriented programming. patsw (talk) 18:04, 20 July 2010 (UTC)
- Agree absolutely MadScientistX11 (talk) 15:00, 10 December 2013 (UTC)
Well, I use OOP a lot; I find it a very useful technique. But I am also aware that it has a fad-like nature, in the sense that it has been very much over-hyped (cough cough, Ruby). I find the anti-OOP quotes to be refreshing, illuminating, and humorous. It makes for a great counter-balance to a subject that is often lopsided. My vote is to keep the quotes. I'm a programmer who has been at it for about 30 years. 67.40.8.215 (talk) 07:40, 9 December 2010 (UTC)
- Wikipedia is an encyclopedia, not a blog. The purpose of the articles is to inform, not to amuse or to include things that people find "refreshing". I agree with Patsw: those quotes are out of date and in many cases now just flat out wrong. One of the references talks about how OO can't support strong typing, which is obviously false. MadScientistX11 (talk) 15:00, 10 December 2013 (UTC)
Removed external link to a Richard Mansfield article
I have removed an external link to a purported whitepaper by Richard Mansfield titled "Has OOP Failed?". It seems to be little more than a subjective rant; the author refers to computer science as "computer 'science'," claims that most programmers prefer non-object-oriented languages without citing a single source, etc. dpol (talk) 22:31, 14 March 2010 (UTC)
Mr. Mansfield has been in the industry quite a while and has been involved in many publications. This qualifies him as an experienced industry observer. There is not a lot of direct evidence on the benefits of OOP either way, so if you turn up the scrutiny knob, the entire article may end up disappearing. --63.192.29.10 (talk) 16:07, 10 May 2010 (UTC)
With regard to the question of whether OOP has failed or succeeded, I think someone should investigate whether vb.net (a pure OOP language) is more successful today than VB6 (not a pure OOP language) was in its day. Also, it might be worth clarifying the definition of success and failure. —Preceding unsigned comment added by 2.96.55.14 (talk) 18:30, 13 March 2011 (UTC)
I agree, Mr. Mansfield's criticisms aren't very constructive or insightful, and I'm hardly a fan of OOP. The link to his article in the "Criticism" section should be removed. The other OOP criticisms are well made, but the one by Mr. Mansfield is very weak and poorly defended. I also don't get why he thinks OOP is a darling of academia. Maybe in 1980 it was, but it certainly isn't right now. A far more useful critique of OOP comes from Oleg Kiselyov. (see http://okmij.org/ftp/Computation/Subtyping/) — Preceding unsigned comment added by 76.90.217.240 (talk) 22:37, 20 October 2011 (UTC)
- I just looked and the quote is back. I agree that it adds no value and I'm going to remove it. MadScientistX11 (talk) 15:06, 10 December 2013 (UTC)
What's Missing from This Article IMO
I used to be a consultant specializing in helping large IT organizations start using OO methods and tools. For what it's worth, I had a way of introducing the ideas of OO that usually went over very well. This is one of those things you hear someone else say and then pick up and make your own; in my case I got this from an ex-boss who was a lead scientist at Bell Labs and MCC before being my boss. What he, and later I, would always say to the programmers who had been doing it for 20 years and didn't want to change was the following: "Look, there is a lot of BS and hype around OO. The vendors and gurus will pretend it's the mythical 'silver bullet' that all of us who have read Brooks know really doesn't exist. So, no, it's not magic and it's not going to suddenly revolutionize everything and make software trivial to develop and maintain. In fact what it REALLY is, is just an extension of the good design principles you guys have been using all along. If you look at the progress of IT, it starts with spaghetti code, then moves to structured code, then to abstract data types with structured code. OO just takes that idea to the next step. An object is essentially just an abstract data type, and the methods are just the functions that would be defined on an ADT. All the rest are bells and whistles to make things a bit more manageable and to incorporate good ideas and best practices we've learned over the years, but the fundamental idea is an EVOLUTION, and in fact a natural evolution from the good practices you've been using, not some drastic revolution." Curious what others think. I guess I'm slightly guilty of starting a general discussion, but from my read of the article maybe a little abstract discussion is worthwhile. I think there is a lot of good stuff in the current article, but it doesn't flow well and needs serious work in some places. I might try doing some of that at some point; I'm working on other things that I think will generate less controversy (and are easier to fix) right now. So I was wondering what people thought of the above idea. MadScientistX11 (talk) 14:33, 10 December 2013 (UTC)
- I rewrote the Overview section with this viewpoint in mind, using several classic software engineering and OO references: Booch, Meyer, Brooks, etc. --MadScientistX11 (talk) 16:12, 11 December 2013 (UTC)
new major section needed
"Design Patterns" should not include "Object-orientation and databases" and the following sub-sectinos — Preceding unsigned comment added by 68.183.23.147 (talk) 20:12, 16 January 2013 (UTC)
- There are patterns for integrating OO and RDBs. So I'm not sure what your point is. --MadScientistX11 (talk) 16:15, 11 December 2013 (UTC)
Fundamentals of object-oriented programming and its basic properties
Write the fundamentals of object-oriented programming and describe its basic properties? — Preceding unsigned comment added by 1.38.23.41 (talk) 08:31, 21 February 2014 (UTC)
"MIT trying to steal OO ??"
I think most programmers agree that SIMULA, from Norway, is the first OO language / model. Indeed the MIT examples show OO constructs, and I would argue that almost ANY language by the '80s was converging on this.
So the MIT paragraph at the beginning is not relevant and misleading. I vote to remove it. — Preceding unsigned comment added by 24.17.241.173 (talk) 18:46, 22 March 2014 (UTC)
Stepanov interview
From http://www.stlport.org/resources/StepanovUSA.html (for possible use in article):
- Yes. STL is not object oriented. I think that object orientedness is almost as much of a hoax as Artificial Intelligence. I have yet to see an interesting piece of code that comes from these OO people. In a sense, I am unfair to AI: I learned a lot of stuff from the MIT AI Lab crowd, they have done some really fundamental work: Bill Gosper's Hakmem is one of the best things for a programmer to read. AI might not have had a serious foundation, but it produced Gosper and Stallman (Emacs), Moses (Macsyma) and Sussman (Scheme, together with Guy Steele). I find OOP technically unsound. It attempts to decompose the world in terms of interfaces that vary on a single type. To deal with the real problems you need multisorted algebras - families of interfaces that span multiple types. I find OOP philosophically unsound. It claims that everything is an object. Even if it is true it is not very interesting - saying that everything is an object is saying nothing at all. I find OOP methodologically wrong. It starts with classes. It is as if mathematicians would start with axioms. You do not start with axioms - you start with proofs. Only when you have found a bunch of related proofs, can you come up with axioms. You end with axioms. The same thing is true in programming: you have to start with interesting algorithms. Only when you understand them well, can you come up with an interface that will let them work.
70.36.142.114 (talk) 00:49, 21 April 2014 (UTC)
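One possible reading of the "interfaces that vary on a single type" complaint, sketched in code (my own illustration, not Stepanov's): the OO version pins the operation onto one class, while the generic version is a free algorithm that works across whatever types supply the needed operations.

```python
# OO style: inner_product is part of the interface of one type (Vector),
# even though the computation really spans two sequences of numbers.
class Vector:
    def __init__(self, xs):
        self.xs = xs
    def inner_product(self, other):
        return sum(a * b for a, b in zip(self.xs, other.xs))

# Generic (STL-ish) style: a free algorithm over any two iterables whose
# elements support * and + -- lists, tuples, generators, and so on.
def inner_product(xs, ys):
    return sum(a * b for a, b in zip(xs, ys))

print(Vector([1, 2]).inner_product(Vector([3, 4])))   # 11
print(inner_product([1, 2], (3, 4)))                  # 11
```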
- Not commenting on your opinion of OOP, but you are categorically wrong about mathematics and axioms. You start with axioms. If you don't, you don't have any mathematics. Proofs do not precede axioms and axioms are not produced by proofs.
Early Object Oriented Language Disagreement
I bring this up because of the reference to LISP atoms in the history section and features listed in Fundamental features and concepts.
Some object-oriented languages simply relied on self-typed data; that is, the implementation of the objects was part of the language. A variable usually was simply a pointer to the object, or contained both a type and a pointer to the object. LISP 2 is an example: a variable could contain any object type, and an object could be tested as to its data type. In the early object-oriented programming languages the class-and-methods description does not apply. Object types were not programmable but were built-in types of the language; they were simply dynamic objects, created and destroyed. Dynamic memory management, garbage collection, etc. was usually automatic.
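What that looks like in a present-day dynamically typed language (an illustrative sketch only; the original point was about LISP 2, not Python): the variable is just an untyped container, and the type travels with the object it currently holds.

```python
x = 42               # the variable references an integer object
print(type(x))       # <class 'int'> -- the object carries its own type
x = "forty-two"      # the same variable can now reference a string object
print(type(x))       # <class 'str'>
# Objects are created dynamically and reclaimed automatically by garbage collection.
```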
Fundamental concepts
The fundamental concept of the object-oriented paradigm is the object, not the class. Defining objects as the implementation/exemplar/instance of a class is not correct; that is a purely technical definition in some languages (those supporting classes). Object-oriented programming is not the same as class-oriented programming.
I agree that classes are not fundamental to object-oriented programming. A well-known example is JavaScript, which does not have classes. Also, the meta-analysis on Wikipedia is basically inclusive, so it includes various concepts that are disputed. There exists (I have read it) a meta-analysis research paper which compares a bunch of OOP systems and concludes that the only concept that is universal and fundamental to all OOP systems is that of objects having identity (as opposed to being values). I wanted to add this with the reference, as I believe it has great practical and theoretical significance to the topic, yet this idea isn't mentioned in the article. Unfortunately I can't find the paper. Does anyone know of this paper? IIRC it was an IBM research paper. There are various other papers that mention it as a core concept, but this paper was the strongest. — Preceding unsigned comment added by 124.120.204.38 (talk) 15:43, 29 January 2013 (UTC)
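The "objects having identity (as opposed to being values)" point is easy to demonstrate with a trivial sketch (my own example):

```python
a = [1, 2, 3]
b = [1, 2, 3]
print(a == b)   # True: the two objects are equal as values
print(a is b)   # False: they are distinct objects, each with its own identity
a.append(4)
print(b)        # [1, 2, 3] -- changing one object does not change the other
```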
Section Fundamental concepts turns everything upside down.--P99am (talk) 11:57, 24 April 2009 (UTC)
I agree, except: the only concept that is universal and fundamental to all OOP systems is that of objects having identity (as opposed to being just values). Object types may be part of the language definition; just as in non-object-oriented languages, those types are not extensible. Variables are not typed; instead they are object containers.