May 23, 2019 4 minutes

Recursion is an algorithm design paradigm that:

  • is based on the “divide and conquer” approach;
  • solves complex problems by breaking the original problem into smaller subproblems;
  • is implemented by calling the same method or routine on smaller subsets of the problem and then combining the results of those individual subsets.

To understand recursion, one must first understand recursion. - Stephen Hawking

Every recursive algorithm consists of two sections:

  1. Base Case: The condition that terminates the recursive calls. This case returns a value without making any further recursive calls; in other words, the base case handles an input that can be evaluated without recursive calculation. Reaching the base case is what eventually breaks the cycle and prevents infinite recursion.
  2. Recursive Case: Also known as the reduction step, this step maps the current input to one or more smaller inputs that eventually converge to the base case.


Consider the following recursive method to calculate the factorial of a number:

public int calculateFactorial(int number) { // assumes number >= 1
    return number == 1 ? number // base case
        : number * calculateFactorial(number - 1); // recursive case
}

The above method checks for the base case (number == 1) on every call:

  1. If number == 1 is true, the value 1 is returned.
  2. Otherwise, the current value is multiplied by the result of the same method called with number - 1.

As you can see, on every call the input number is decreased by 1, which eventually satisfies the base case (i.e. number == 1), after which no more recursive calls are made.
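To make the unwinding concrete, here is a minimal, self-contained sketch (the class name FactorialDemo and the main method are just scaffolding for illustration) that expands calculateFactorial(4) step by step:

```java
public class FactorialDemo {

    // same logic as the method above, reproduced so this sketch compiles on its own
    static int calculateFactorial(int number) {
        return number == 1 ? number : number * calculateFactorial(number - 1);
    }

    public static void main(String[] args) {
        // calculateFactorial(4)
        // = 4 * calculateFactorial(3)
        // = 4 * (3 * calculateFactorial(2))
        // = 4 * (3 * (2 * calculateFactorial(1)))  // base case hit: returns 1
        // = 4 * 3 * 2 * 1
        System.out.println(calculateFactorial(4)); // prints 24
    }
}
```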

Let’s consider another example: calculating the $n^{th}$ Fibonacci number, which involves making two recursive calls per invocation:

public int calculate(int n) {
    return (n == 1) ? 1 : (n < 1) ? 0 // base cases
        : calculate(n - 1) + calculate(n - 2); // recursive case
}

As is clear from the above example, the base case covers the value 1 and anything less than 1. For every other input, we make two recursive calls with inputs n-1 and n-2 respectively. The results of the two calls are summed and returned to the parent call.
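As a quick sanity check, here is a minimal, self-contained sketch (the class name FibDemo is just illustrative scaffolding) that runs the method above for the first few values of n:

```java
public class FibDemo {

    // same logic as the calculate method above
    static int calculate(int n) {
        return (n == 1) ? 1 : (n < 1) ? 0 : calculate(n - 1) + calculate(n - 2);
    }

    public static void main(String[] args) {
        // n:      1  2  3  4  5  6
        // fib(n): 1  1  2  3  5  8
        for (int n = 1; n <= 6; n++) {
            System.out.print(calculate(n) + " "); // prints 1 1 2 3 5 8
        }
    }
}
```

Note that each call spawns two more calls, so the total number of calls grows exponentially with n; this naive version is fine for small inputs only.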

Head Recursion vs Tail Recursion

  • Head Recursion: A recursive routine is said to be head recursive if it invokes itself before completing the rest of its work, so the result of the recursive call is (generally) used in further computation by the parent call.

  • Tail Recursion: A tail recursive function has the recursive call as the last statement in the routine, so the returned result is not used further inside the routine itself.

The tail-recursive version of a recursive routine can be more efficient than the corresponding head-recursive version because the compiler can optimize the tail call away.

The idea behind tail call elimination is simple: since the recursive call is the last statement, there is nothing left to do in the current frame, so there is no need to keep the current stack frame around. The compiler can achieve this (among other ways) by replacing the call with a simple goto back to the start of the routine with updated inputs.

//head recursive factorial calculation
private int factorialHeadRecursive(int N) {
    if (N == 0) {
        return 1;
    }
    return N * factorialHeadRecursive(N - 1); // the result is multiplied with the current N
}

//tail recursive factorial calculation
private int factorialTailRecursive(int N, int previousResult) {
    if (N == 0) {
        return previousResult;
    }
    return factorialTailRecursive(N - 1, N * previousResult); // the result is not used inside the parent frame
}

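If tail call elimination were applied, the tail-recursive version above would effectively be reduced to a loop. The sketch below (the class and method names are hypothetical, chosen for illustration) shows that transformation done by hand:

```java
public class TcoSketch {

    // what tail call elimination effectively turns factorialTailRecursive into:
    // the "call" becomes a jump back to the top with updated inputs,
    // so no new stack frame is ever created
    static int factorialAsLoop(int N, int previousResult) {
        while (N != 0) {
            previousResult = N * previousResult;
            N = N - 1;
        }
        return previousResult;
    }

    public static void main(String[] args) {
        // seed the accumulator with 1, just as factorialTailRecursive would be called
        System.out.println(factorialAsLoop(5, 1)); // prints 120
    }
}
```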
MergeSort vs QuickSort

Even with a worst-case time complexity of $O(n^2)$, quicksort is often preferred over merge sort: its final recursive call can be made tail recursive, and it sorts in place, which makes it a good candidate for cache locality on supported systems.

One thing to note here is that Java does NOT support tail call optimization (TCO). Watch this video on YouTube where Brian Goetz explains why Java does not support it yet (due to certain design choices made for security-sensitive methods). Support is in the pipeline, but not available just yet.


Some common questions around recursion:

Q. Recursion over Iteration?
  • Recursive algorithms are generally cleaner and more concise than their iterative counterparts.
  • Certain problems like tree traversal and the Tower of Hanoi are inherently recursive and are better suited to recursive solutions than to iterative ones.
Q. Iteration over Recursion?
  • Recursive algorithms carry additional space complexity in the form of the function call stack.
  • Even if the recursive code is more concise and easier to read and maintain, it incurs extra execution cost for maintaining call-stack frames and return overhead.
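The call-stack cost is easy to demonstrate: a deep enough recursion exhausts the JVM's stack, while an equivalent loop would not. A minimal sketch, assuming a default JVM stack size (the depth 10_000_000 is an arbitrary large value; the exact limit depends on the -Xss setting):

```java
public class StackDemo {

    // recursive countdown: each call adds one stack frame
    static int countDown(int n) {
        return n == 0 ? 0 : countDown(n - 1);
    }

    public static void main(String[] args) {
        try {
            countDown(10_000_000); // far beyond a typical default stack depth
            System.out.println("completed");
        } catch (StackOverflowError e) {
            // typically thrown here with default JVM settings
            System.out.println("StackOverflowError");
        }
    }
}
```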

Please refer to this link for various examples solved using recursion.
