On Wed, Oct 3, 2012 at 10:15 AM, Matthieu Moy
<Matthieu.Moy at grenoble-inp.fr> wrote:
> Preston Briggs <preston.briggs at gmail.com> writes:
>> Think about costs asymptotically; that's what matters. Calls and
>> returns require constant time, just like addition and multiplication.
>
> Constant time, but not necessarily constant memory.
>
> Deep recursion will blow up your stack (AKA "segmentation fault" :-( )
> if the compiler could not optimize it (tail recursion optimization).

Only if the recursion is very deep. In practice, a recursive descent
parser isn't going to run out of stack space, nor will a quicksort or
binary-tree walker.

You're distracting this man from his job of learning how to program.

Yes, it's a mistake to have a set of mutually recursive routines that
chew up all your stack space; but it's also a mistake to rely on a
compiler optimization to avoid such problems. Write an explicit loop
if it's a possibility.

Preston
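P.S. For illustration, here is a minimal sketch of what "write an
explicit loop" can look like. The example is mine, not from anyone's
code in this thread; it just counts the nodes of a linked list:

    #include <stddef.h>

    struct node { struct node *next; };

    /* Tail-recursive version: whether this runs in constant stack
       space depends on the compiler performing tail-call optimization,
       which C does not guarantee. */
    static size_t length_rec(const struct node *n, size_t acc) {
        if (n == NULL)
            return acc;
        return length_rec(n->next, acc + 1);   /* tail call */
    }

    /* Explicit loop: constant stack space, no optimizer required. */
    static size_t length_loop(const struct node *n) {
        size_t acc = 0;
        for (; n != NULL; n = n->next)
            acc++;
        return acc;
    }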
Preston Briggs <preston.briggs at gmail.com> writes:

> On Wed, Oct 3, 2012 at 10:15 AM, Matthieu Moy
> <Matthieu.Moy at grenoble-inp.fr> wrote:
>> Preston Briggs <preston.briggs at gmail.com> writes:
>>> Think about costs asymptotically; that's what matters. Calls and
>>> returns require constant time, just like addition and multiplication.
>>
>> Constant time, but not necessarily constant memory.
>>
>> Deep recursion will blow up your stack (AKA "segmentation fault" :-( )
>> if the compiler could not optimize it (tail recursion optimization).
>
> Only if the recursion is very deep.

Not so much. Stack size is usually a few megabytes, much smaller than
the heap size.

> In practice, a recursive descent parser isn't going to run out of
> stack space, nor will a quicksort or binary-tree walker.

Yes, because you're taking examples where the recursion depth is
log(n) or close to it. "There are cases where recursion is OK" is
true, but it doesn't imply "you shouldn't worry about your stack
size".

> Yes, it's a mistake to have a set of mutually recursive routines that
> chew up all your stack space; but it's also a mistake to rely on a
> compiler optimization to avoid such problems.

I agree with this, but it is not what the message I replied to said.
Comparing calls and returns to addition and multiplication in terms
of performance is still wrong.

(BTW, it actually depends on whether the language guarantees this
optimization. IIRC, C doesn't, so you can't rely on it, but Caml
does, so tail recursion is OK even for large datasets in Caml.)

-- 
Matthieu Moy
http://www-verimag.imag.fr/~moy/
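P.S. A quick way to see the stack limit in action. This is a toy demo
of my own, not anything from Journeyer's code, and whether (and when)
it crashes depends on your platform's default stack size and on the
optimization level:

    #include <stdio.h>

    /* Linear recursion depth: each call adds a stack frame, and the
       addition happens after the recursive call returns, so this is
       not a tail call. With a typical stack of a few megabytes it
       segfaults long before n reaches sizes that a heap-allocated
       structure would handle easily. */
    static unsigned long sum_to(unsigned long n) {
        if (n == 0)
            return 0;
        return n + sum_to(n - 1);
    }

    int main(void) {
        printf("%lu\n", sum_to(100000000UL));  /* likely blows the stack */
        return 0;
    }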
> Only if the recursion is very deep. In practice, a recursive descent
> parser isn't going to run out of stack space, nor will a quicksort or
> binary-tree walker.

The recursive-descent parser case has happened in practice:
http://my.opera.com/hallvors/blog/2012/07/17/twitter-crashes-itself-with-commas?1

Also, I've seen some recursion-related PRs in Clang, although I think
they are usually related to templates rather than parsing.

Also, it's easy to blow the stack in quicksort if you don't tail-call
into the recursive invocation of the _larger_ subrange. Recursively
calling into the smaller subrange guarantees that its size is less
than half of the current range, whereas a non-tail call into the
larger subrange can require linear stack space if the partition isn't
good.

> You're distracting this man from his job of learning how to program.

Indeed. Journeyer, you should ignore this stuff we are saying; these
are all exceptions: you'll fix them if they arise, but you generally
shouldn't design with them in mind. If you want to learn more about
tail calls, I recommend this paper by Guy Lewis Steele, Jr.:
http://repository.readscheme.org/ftp/papers/ai-lab-pubs/AIM-443.pdf
It's a pretty old paper, but it is exactly about your original
question. By the end, you'll probably know more about tail calls than
you'll ever need to know :)

-- Sean Silva
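P.S. In case it helps, here is a sketch (mine, not from any particular
codebase) of the trick above: recurse only into the smaller subrange
and loop over the larger one, which bounds the recursion depth by
O(log n) no matter how unbalanced the partitions are:

    #include <stddef.h>

    static void swap_int(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    /* Lomuto partition over v[lo..hi] (inclusive); returns the final
       position of the pivot. */
    static size_t partition(int *v, size_t lo, size_t hi) {
        int pivot = v[hi];
        size_t i = lo;
        for (size_t j = lo; j < hi; j++)
            if (v[j] < pivot)
                swap_int(&v[i++], &v[j]);
        swap_int(&v[i], &v[hi]);
        return i;
    }

    /* Call as quicksort(v, 0, n - 1) for n > 0. */
    static void quicksort(int *v, size_t lo, size_t hi) {
        while (lo < hi) {
            size_t p = partition(v, lo, hi);
            if (p - lo < hi - p) {            /* left side is smaller */
                if (p > lo)
                    quicksort(v, lo, p - 1);  /* recurse into the smaller side */
                lo = p + 1;                   /* keep looping on the right */
            } else {                          /* right side is smaller */
                quicksort(v, p + 1, hi);      /* recurse into the smaller side */
                hi = p - 1;                   /* p > lo holds in this branch,
                                                 so p - 1 cannot underflow */
            }
        }
    }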
Hello Sean Silva, Matthieu Moy, Preston Briggs, Kent Williams, and
everyone interested in this,

Thank you for the useful information you provided. Now it's time for
me to study it. The material you pointed me to is quite a lot for me
to digest, so I will need some time to understand it. In short, I
understood the points, but I still need to read the PDF and the
examples you provided.

Thank you for the kind replies.

Sincerely,
Journeyer J. Joh

-- 
----------------------------------------
Journeyer J. Joh
o o s a p r o g r a m m e r a t g m a i l d o t c o m
----------------------------------------