Talk:Corecursion

Earlier discussion at recursion page

Add section title. —Nils von Barth (nbarth) (talk) 17:56, 25 July 2012 (UTC)

Here is a discussion (which was started at Talk:Recursion) about how the recursion page at that time failed to properly account for corecursion. --Malcohol 15:08, 6 June 2006 (UTC)

Duality?

I don't understand the definition "corecursion is a type of operation that is dual to recursion", because I don't know which duality it refers to. I know lots of dualities (e.g. min/max in lattices, point/line in projective geometry). Is the particular duality referred to here something to do with reversing arrows in category theory? If so, perhaps the article should say something about it, or at least make reference to a page that does. Tom Duff 19:48, 15 January 2007 (UTC)

Also, what does bisimulation (one of the references) have to do with corecursion? Tom Duff 19:51, 15 January 2007 (UTC)

I added a reference to David Turner's "Total Functional Programming" paper, which might help: a significant part of it is the introduction and discussion of codata, corecursion, and coinduction as first-class duals to data, recursion, and induction. --Piet Delport 15:27, 26 January 2007 (UTC)

Code Errors

The given Python code from Turner's book is invalid, since it references an "add" variable which has not been defined. I don't have a copy of the book, but it would be helpful if someone could work out what it's meant to refer to and add (hah!) that in. —Preceding unsigned comment added by 72.248.210.37 (talk) 14:38, 13 May 2010 (UTC)

  • Huh? This article links to an article by Turner, which you can download and read at no charge, not a book. And the Python code is from Raymond Hettinger's recipe on ActiveState. Obscuranym (talk) —Preceding undated comment added 12:12, 16 May 2010 (UTC).
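
For the record, the undefined name is presumably operator.add; a self-contained Python 3 rendering of that style of self-referential, corecursive Fibonacci generator (a sketch in the spirit of the recipe, not necessarily its exact code) would be:

    from itertools import tee, chain, islice
    from operator import add  # the missing "add": plain integer addition

    def fibonacci():
        # Self-referential stream: the inner generator consumes "output",
        # a name that is only bound further down (Python resolves it lazily,
        # at the first next() call, so this is legal).
        def deferred_output():
            for i in output:
                yield i
        result, c1, c2 = tee(deferred_output(), 3)
        # Sum each element with its successor ...
        paired = map(add, c1, islice(c2, 1, None))
        # ... and prepend the two seed values 0 and 1.
        output = chain([0, 1], paired)
        return result

    print(list(islice(fibonacci(), 10)))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]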

Language

What language was this written in originally before it got run through Google's translator? (Don't bother telling me the person who wrote it knew English because I'm not having it!) —Preceding unsigned comment added by 221.232.82.218 (talk) 01:28, 27 July 2010 (UTC)

If you have some constructive criticism, I'd love to hear it. Obscuranym (talk) 05:54, 28 July 2010 (UTC)

Lazy or non-strict?

The article says:

Even if the result is finite, this example depends on lazy evaluation due to the use of self-referential data structures.

I think non-strict evaluation (not necessarily lazy) is enough; other non-strict evaluation strategies, such as lenient evaluation, might suffice. See http://www.haskell.org/haskellwiki/Lazy_vs._non-strict . 187.46.235.5 (talk) 22:40, 8 October 2011 (UTC)

You're right (although outside of the Haskell community "lazy" is often used synonymously with "non-strict", i.e. without the sharing requirement). —Ruud 11:35, 10 October 2011 (UTC)

Major expansion: non-technical introduction, imperative examples (Python)

I’ve just completed a significant expansion (more than doubling) of the article, as of this edit. This consists essentially of:

  • Rewriting and expanding the non-technical lede (introduction),
  • Giving two detailed examples, contrasting with recursion: factorial and tree traversal, implemented imperatively in Python,

(together with various cleanup and formatting, and a brief note on history, AFAICT).

Corecursion seems a very interesting topic, and perhaps a useful fundamental viewpoint, which appears to be only recently appreciated and not yet widely known (certainly not outside the functional programming community). I am no expert on this subject – I just reworked the existing article (discussion and Haskell examples), expanding the wording and giving explicit examples, and looked at the references. The non-technical discussion is perhaps a bit glib (it very sharply delimits the recursion/corecursion duality), but hopefully accurate and readable. The key points – that corecursion is useful for infinite data, and works forwards (synthetically), in contrast to regular recursion, which works backwards (analytically) – seem easy enough to state clearly in non-technical terms.

As for the examples:

  • Factorial is very simple – it's the basic recursion example, and simpler than Fibonacci – so it seems a good base case from the recursion point of view.
  • Tree traversal seems a great contrasting example – depth-first traversal is the fundamental non-trivial example of recursion (and very well-known), while breadth-first traversal is a basic motivating example of corecursion, which should thus have similar prominence.

As this topic is potentially very abstract, and functional programming is often unfamiliar, concrete examples with imperative code (the Python is virtually pseudo-code, with the only confusing point being “yield”) should clarify it.
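
For readers skimming this page, the contrast in question looks roughly like the following (a minimal sketch along the lines of the article's examples, not its exact code; Node, depth_first, and breadth_first are illustrative names):

    from collections import deque

    class Node:
        def __init__(self, value, children=()):
            self.value = value
            self.children = list(children)

    def depth_first(node):
        # Recursive: analyses its input, decomposing the tree into subtrees.
        yield node.value
        for child in node.children:
            yield from depth_first(child)

    def breadth_first(root):
        # Corecursive: synthesises its output forwards from a seed (the root),
        # maintaining a queue of nodes still to be expanded.
        queue = deque([root])
        while queue:
            node = queue.popleft()
            yield node.value
            queue.extend(node.children)

    tree = Node(1, [Node(2, [Node(4), Node(5)]), Node(3)])
    print(list(depth_first(tree)))    # [1, 2, 4, 5, 3]
    print(list(breadth_first(tree)))  # [1, 2, 3, 4, 5]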

Hope this helps, and please improve as you see fit!

—Nils von Barth (nbarth) (talk) 17:50, 25 July 2012 (UTC)
In my opinion the difference between recursion and corecursion is best demonstrated in languages that make an explicit distinction between them (e.g. Coq or Agda). In Haskell the distinction is vague (because, ignoring strictness annotations, all data is codata), and imperative languages are perhaps even worse, because recursion is less "natural" in those languages and they don't have algebraic data types. In Coq a "recursive function" is one that operates by structural recursion over data (i.e. its domain is data), while a "corecursive function" is one that has codata as its result/range and operates by guarded recursion. Perhaps a strict functional language such as ML or Scheme would work better, as all their data is data, and codata has to be built with explicit delays and forces. —Ruud 19:22, 25 July 2012 (UTC)
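
To illustrate that last point, the delay/force pattern of a strict language can be mimicked as follows (a toy sketch in Python, for continuity with the article's examples; ones and take are made-up names, with a lambda as the explicit delay and a call as the force):

    # A stream is a pair (head, thunk); the thunk delays the tail.
    def ones():
        # Guarded corecursion: the recursive call is hidden behind a lambda,
        # so it is only made when the tail is explicitly forced.
        return (1, lambda: ones())

    def take(n, stream):
        out = []
        while n > 0:
            head, tail_thunk = stream
            out.append(head)
            stream = tail_thunk()  # force the delayed tail
            n -= 1
        return out

    print(take(5, ones()))  # [1, 1, 1, 1, 1]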

Related concepts?

I’d be interested in related concepts, for context and to help people more familiar with other areas. Notably, in set theory/logic terms, recursion/corecursion sound similar to the recursive set/recursively enumerable set distinction, in that the former is “defined by input” while the latter is “defined by output” – corecursion and recursive enumerability both sound basically like “what you can get as output”.
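
As a concrete, if informal, rendering of that analogy (a speculative sketch; is_square and squares are made-up names):

    import math
    from itertools import count, islice

    def is_square(x):
        # "Defined by input": a total membership test (recursive-set style).
        return x >= 0 and math.isqrt(x) ** 2 == x

    def squares():
        # "Defined by output": enumerate the members (recursively enumerable
        # style) – corecursive in flavour, as the set is whatever comes out.
        for n in count(0):
            yield n * n

    print(is_square(16), is_square(15))  # True False
    print(list(islice(squares(), 5)))    # [0, 1, 4, 9, 16]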

Corecursion also feels very similar to tail recursion, or perhaps better, a generalization of it – more precisely, tail recursion feels like doing recursion by means of a backwards corecursion; the accumulator variable you often find in tail recursions is the give-away (the accumulator traces out the corecursive sequence). For example, computing the factorial via tail recursion actually consists of computing the falling factorial corecursively, and then outputting the end of this sequence (which is finite if you start at a natural number). Indeed, user Will Ness at StackOverflow claims that corecursion is basically (exactly?) tail recursion modulo cons (e.g., this answer, July 2012). As intriguing as this may seem, I didn’t find any substantiation of it in a quick literature/Google search, so this may be completely mistaken, and in any case it is unsourced speculation.
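
Concretely, the observation about the accumulator might be rendered like this (an unsourced sketch of the above speculation, nothing more; factorial_tail and falling_factorials are made-up names):

    def factorial_tail(n, acc=1):
        # Tail recursion: the accumulator carries a partial (falling) product
        # forwards through the calls.
        return acc if n == 0 else factorial_tail(n - 1, acc * n)

    def falling_factorials(n):
        # Corecursive view: the stream of accumulator values that
        # factorial_tail(n) passes along, ending at n! when n reaches 0.
        acc = 1
        while n > 0:
            yield acc
            acc *= n
            n -= 1
        yield acc

    print(factorial_tail(5))            # 120
    print(list(falling_factorials(5)))  # [1, 5, 20, 60, 120, 120]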

If anyone could find discussions of corecursion that are not just in abstract functional programming terms, that would be much appreciated!

—Nils von Barth (nbarth) (talk) 18:30, 25 July 2012 (UTC)
Some related concepts:
  • Codata (or coinductive types) are defined by coinduction, that is, as the greatest fixpoint of a set of equations. These types can model infinite structures (e.g. potentially infinite lists or streams). In strict languages you have to use explicit lazy evaluation to work with these types, e.g. explicit delaying and forcing, or generators.
  • Regular data is defined inductively, that is, as the least fixpoint of a set of equations. These types can only model finite structures (e.g. finite lists).
I don't immediately see the connection with tail recursion modulo cons. Perhaps it's related to guarded recursion? —Ruud 19:32, 25 July 2012 (UTC)
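
For what it's worth, guardedness can at least be mimicked with the generators mentioned above (a toy sketch; nats is a made-up name):

    from itertools import islice

    def nats(n=0):
        # Each step is "guarded": it yields one constructor's worth of output
        # (the head) before corecursing on the tail, so consuming any finite
        # prefix of this infinite stream terminates.
        yield n
        yield from nats(n + 1)

    print(list(islice(nats(), 5)))  # [0, 1, 2, 3, 4]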