The Recursion Tree Method solves recurrence relations by expanding them into a tree, where each node represents the cost incurred at a given level of recursion. This visual representation makes the total work easier to reason about. Recursion, vital in computer science and mathematics, allows a function to call itself. Recursion trees depict the step-by-step execution of recursive functions, aiding comprehension in problem-solving.

## Types of Recursion

Broadly speaking, there are two types of recursion:

- Linear Recursion
- Tree Recursion

### Linear Recursion

- A `linear recursive` function is a function that makes only a single call to itself each time the function runs. The factorial function is a good example of **linear recursion**. *A linearly recursive function takes linear time to complete its execution, which is why it is called linear recursion.*
- Consider the pseudo-code written below,

```
function doSomething(n) {
    // base case to stop recursion
    if n is 0:
        return
    // here are some instructions
    // recursive step
    doSomething(n-1);
}
```
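As a concrete sketch of linear recursion, here is the factorial function from the example above written in Python. It makes exactly one recursive call per invocation, so it runs in linear time:

```python
def factorial(n):
    # base case to stop recursion
    if n == 0:
        return 1
    # recursive step: a single call to itself with a smaller value
    return n * factorial(n - 1)

print(factorial(5))  # 120
```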

- If we take a look at this function `doSomething(n)`, it takes a parameter `n`, does some computation, and in the end calls the **same function with smaller values**.
- Suppose `T(n)` is the time required to perform all of the computation when the function `doSomething()` is called with a parameter value `n`. We can write a recurrence relation for this as well: `T(n) = T(n-1) + K`. Here `K` is some constant. The constant `K` is added because some time is also needed in the function to allocate or deallocate memory for a variable, or to do some mathematical work. This time is very **small and negligible**, so we use `K` to denote it here.
- The time complexity of this recursive program can be easily determined, as the function `doSomething()` is called `n` times in the **worst case**. More formally, the time complexity of the function is `O(N)`.

### Tree Recursion

- `Tree Recursion` is just a phrase to describe making a recursive call **more than once** in your recursive case. The **Fibonacci function** is a good example of tree recursion. *The time complexity of a tree recursive function is not linear; it runs in exponential time.*
- Consider the pseudo-code written below,

```
function doSomething(n) {
    // base case to stop recursion
    if n is less than 2:
        return n;
    // here are some instructions
    // recursive step
    return doSomething(n-1) + doSomething(n-2);
}
```
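As a concrete sketch of tree recursion, here is the Fibonacci function from the example above written in Python. Each call branches into two further calls, so the calls fan out like a tree:

```python
def fib(n):
    # base case to stop recursion
    if n < 2:
        return n
    # recursive step: two calls, so the call structure branches like a tree
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```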

- Here the function `doSomething(n)` is almost similar to the previous one; the only difference is that **one more call** is being made to the same function with a smaller value of `n`.
- Let's write the recurrence relation for this function as well: `T(n) = T(n-1) + T(n-2) + K`. Again, `K` is **some constant** here.
- These types of recursion are called tree recursion, where more than one call is made to the same function with smaller values. Now the interesting part, *what is the time complexity of this function?*
- Take the recursion tree below for the same function as a hint and take a guess.
- You may realize that it is hard to determine the time complexity by directly looking at a recursive function, *especially when it is a tree recursion*. There are a lot of ways to determine the time complexity of such functions; one of them is the `Recursion Tree Method`. Let's understand it more deeply.

## What is the Recursion Tree Method?

The `Recursion Tree Method` is used to solve recurrence relations like $T(N) = T(N/2) + N$, or the two we discussed above in the types of recursion section. Generally, these recurrence relations follow the **divide and conquer** approach to solve a problem.

- When a problem is divided into smaller subproblems, some time is also needed to **combine the solutions** to those subproblems to form the answer to the original problem.
- **For example**, the recurrence relation for **Merge sort** is $T(N) = 2 * T(N/2) + O(N)$. The additional $O(N)$ is the time required to combine the solutions of the two subproblems of size $N/2$, which is true at the implementation level as well: in the **merge step** of Merge sort, we merge two sorted arrays into a new sorted array in linear time.
- **For example**, the recurrence relation for Binary search is $T(N) = T(N/2) + 1$. We know that in Binary search the search space is reduced to half in each iteration, and as soon as we find the result we return from the function. This comparison is a constant-time operation, which is why $+1$ is added in the recurrence relation.
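That linear-time combine step of Merge sort can be sketched in Python. This is a minimal illustration of merging two already-sorted lists, not a full Merge sort implementation:

```python
def merge(left, right):
    # linear-time merge of two sorted lists: the O(N) combine step
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # one of the two lists may have leftover elements
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge([1, 3, 5], [2, 4, 6]))  # [1, 2, 3, 4, 5, 6]
```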

- Consider the recurrence relation $T(n) = 2T(n/2) + Kn$. Here `Kn` is the time needed to combine the solutions of subproblems of size $n/2$.

- Let’s draw the **recursion tree** for the recurrence relation stated above,

By analyzing the above **recursion tree** we can make a few deductions:

- The value of the node at each level is nothing but the **size of the problem at that level**. At `level 0` the problem size is $n$, at `level 1` the problem sizes are $n/2$ and $n/2$, and so on.
- The **height** of this recursion tree is equal to the **number of levels in the tree**. In general, the height of the tree is $log(n)$, where `n` is the problem size. This is because, as discussed above, these recurrence relations follow the divide and conquer approach, and we need only $log(n)$ steps to go from problem size `n` down to problem size `1`.
- **For example**, consider the value $N = 16$. How many steps are needed to reach $N = 1$ if we are allowed to divide $N$ by `2` at each step? The answer is **4 steps**, which is the value of $log_2(16)$, because we are dividing by 2 at each step:
    - $log_2(16)$
    - $= log_2(2^4)$
    - $= 4 \cdot log_2(2)$
    - $= 4$, since $log_a(a) = 1$
- We consider the second term of the recurrence as the cost at the root node of each subtree, level by level.
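The halving count above can be checked with a tiny sketch that repeatedly divides $16$ by $2$ and compares the step count against $log_2(16)$:

```python
import math

def steps_to_one(n):
    # count how many times n can be halved before reaching 1
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(steps_to_one(16))    # 4
print(int(math.log2(16)))  # 4
```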

Although this method uses ‘tree’ in its name, you will be able to understand this method even with little or no knowledge of trees.

## Steps to Solve Recurrence Relations Using Recursion Tree

In the `recursion tree` method, the time required to solve a subproblem is referred to as the **cost of the subproblem**. So wherever you see the word **"cost"** associated with the recursion tree, it means nothing but the **time required to solve a subproblem**.

To solve a recurrence relation using the recursion tree method, a few steps must be followed. They are,

- Draw the **recursion tree** for the given recurrence relation.
- Calculate the **height of the recursion tree** formed.
- Calculate the **cost** (*time required to solve all the subproblems at a level*) at each level.
- Calculate the total number of nodes at each level in the recursion tree.
- Sum up the **cost of all the levels** in the recursion tree.

*Let’s understand all of these steps with a few examples*.

### Example

- Consider the recurrence relation,

`T(n) = 2T(n/2) + K`

#### Solution

The given **recurrence relation** shows the following properties:

- A problem of size `n` is divided into **two sub-problems**, each of size `n/2`. The cost of combining the solutions to these sub-problems is `K`.
- Each problem of size `n/2` is in turn divided into **two sub-problems**, each of size `n/4`, and so on.
- At the last level, the sub-problem size is reduced to `1`. In other words, **we finally hit the base case.**

*Let’s follow the steps to solve this recurrence relation*,

### Step. 1 Draw the Recursion Tree

### Step. 2 Calculate the Height of the Tree

- We know that when we repeatedly divide a number by $2$, it is eventually **reduced to** $1$. The same holds for the problem size $n$: suppose that after $k$ divisions by 2, $n$ becomes equal to 1, which implies **$n / 2^k = 1$**.
- Here $n / 2^k$ is the **problem size at the last level**, and it is always **equal to** $1$.
- Now we can easily calculate the value of $k$ from the above expression by taking $log()$ on both sides. Below is a clearer derivation:
    - $n = 2^k$
    - $log(n) = log(2^k)$
    - $log(n) = k \cdot log(2)$
    - $k = log(n) / log(2)$
    - $k = log_2(n)$

So the **height of the tree** is $log_2(n)$.

### Step. 3 Calculate the cost at each level

- Cost at `Level-0 = K`, as **two sub-problems** are merged once.
- Cost at `Level-1 = K + K = 2*K`, as **two sub-problems are merged two times**.
- Cost at `Level-2 = K + K + K + K = 4*K`, as **two sub-problems are merged four times**, and so on....

### Step. 4 Calculate the number of nodes at each level

Let’s first determine the **number of nodes in the last level**. From the recursion tree, we can deduce the following:

- $Level-0$ has $1 (2^0)$ node
- $Level-1$ has $2 (2^1)$ nodes
- $Level-2$ has $4 (2^2)$ nodes
- $Level-3$ has $8 (2^3)$ nodes

So the level $log(n)$ **should have** $2^{log_2(n)}$ nodes, i.e. **n nodes**.

### Step. 5 Sum up the cost of all the levels

- The **total cost** can be written as:
    - Total Cost = Cost of all levels except the last level + Cost of the last level
    - Total Cost = Cost for **level-0** + Cost for **level-1** + Cost for **level-2** + .... + Cost for **level-(log(n) - 1)** + Cost of the last level
- The cost of the last level is **calculated separately** because it is the base case, and **no merging is done at the last level**, so the cost to solve a single problem at this level is **some constant value**. Let's take it as $O(1)$.
- Let's put the values into the formula:

- If you take a closer look at the above expression, it forms a **geometric progression** ($a, ar, ar^2, ar^3, \ldots$). The sum of the first $m$ terms of a GP is given by $S_m = a(r^m - 1) / (r - 1)$, where $a$ is the **first term** and `r` is the **common ratio**.
- In the GP formed above, $a = 1$ and $r = 2$; after solving this we get,

Thus, `T(n) = O(n)`
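We can sanity-check this result with a sketch that evaluates the recurrence directly, assuming the base case costs 1 unit and $K = 1$:

```python
def total_cost(n, K=1):
    # total work done by T(n) = 2T(n/2) + K,
    # assuming the base case (n == 1) costs 1 unit
    if n == 1:
        return 1
    return 2 * total_cost(n // 2, K) + K

# for n a power of 2 this evaluates to 2n - 1, i.e. O(n)
print(total_cost(16))  # 31
```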

That’s how we solve any **recurrence relation**, step by step, using the **recursion tree method**.


## Conclusion

- `Recursion` is a method of solving a large problem where the solution depends on the **solutions of smaller instances** of the same problem.
- Broadly speaking, there are **two types** of recursion: **Linear Recursion** and **Tree Recursion**.
- There are a lot of methods to solve **recurrence relations**; the **Recursion Tree Method** is one of them.
- Generally, these recurrence relations are just a **mathematical way** to define a recursive problem. A few steps should be followed to solve a recurrence relation using the recursion tree method.