The Fibonacci Affair: Time Complexity for Three Approaches

B-letter
3 min read · Dec 25, 2020

Analyzing the time complexity of three different Fibonacci algorithms.

The following is one of the quintessential exercises offered to beginner students: compute the first n numbers of the Fibonacci sequence, in which each number is the sum of the two preceding numbers.

First 10 Fibonacci numbers:

1, 1, 2, 3, 5, 8, 13, 21, 34, 55.

Now, were you to implement an algorithm to compute Fibonacci numbers, you would have plenty of possible approaches. The “naive” one is a simple recursive algorithm:
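A minimal C++ sketch of this naive recursive version (matching the behavior described below, with f(0) = 0 and f(1) = 1) could look like:

```cpp
// Naive recursive Fibonacci: a direct translation of the definition.
long long f(int n) {
    if (n == 0) return 0;          // base case: f(0) = 0
    if (n == 1) return 1;          // base case: f(1) = 1
    return f(n - 1) + f(n - 2);    // each call spawns two more calls
}
```

Calling f(10) with this sketch returns 55, the tenth Fibonacci number.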

Now, what is the time complexity of this algorithm? To answer that, we must first understand what the program does: given an arbitrary n, it checks whether n = 0, returning 0, or n = 1, returning 1 (the first Fibonacci number). Otherwise, it returns f(n-1) + f(n-2), which is exactly the definition of a Fibonacci number: the sum of the two previous numbers in the sequence. This means the function f keeps calling itself until n = 0 or n = 1. E.g., for n = 4:

f(4) will return f(3) + f(2)

f(3) will return f(2) + f(1)

f(2) will return f(1) + f(0)

f(1) will return 1

f(0) will return 0. Then:

f(2) = f(1) + f(0) = 1,

f(3) = f(2) + f(1) = 1 + 1 = 2,

f(4) = f(3) + f(2) = 2 + 1 = 3, which is indeed the fourth number in the Fibonacci sequence.

Let’s get back to the time complexity of this algorithm. As one can easily see, unless n = 0 or n = 1, the function f calls itself twice every time. A visual approach is:
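Sketched as a text diagram, the recursion tree for f(4) looks like this:

```
                 f(4)
               /      \
           f(3)        f(2)
          /    \      /    \
       f(2)   f(1)  f(1)  f(0)
      /    \
   f(1)   f(0)
```

Each internal node branches into two more calls, which is where the exponential growth comes from.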

Therefore, it is reasonable to expect an exponential runtime for this algorithm. An upper bound for the runtime is O(2ⁿ); a tighter bound is O(φⁿ), where φ ≈ 1.618 is the golden ratio. A more precise estimate of the runtime is discussed in the following article by Emily Marshall: https://www.baeldung.com/cs/fibonacci-computational-complexity

Now, if you look at the expansion of f(4) above, you’ll notice that many computations are done more than once, f(2), for instance. What if we could develop an algorithm that remembers computations already done, reducing our recursion tree?

Such an approach is known as memoization, since we register computations previously done, avoiding repetition.

Although we still use recursion, this time around, if a certain Fibonacci number has already been calculated, the function directly returns that value instead of making more recursive calls.
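A minimal C++ sketch of this memoized version, using the cache vector lst mentioned below (its fixed size of 50 is an assumption for illustration):

```cpp
#include <vector>

// Cache of already-computed Fibonacci numbers; -1 means "not computed yet".
// The size 50 is an arbitrary cap chosen for this sketch.
std::vector<long long> lst(50, -1);

long long f(int n) {
    if (n == 0) return 0;
    if (n == 1) return 1;
    if (lst[n] != -1) return lst[n];   // already cached: no further recursion
    lst[n] = f(n - 1) + f(n - 2);      // compute once, remember the result
    return lst[n];
}
```

With the cache in place, even f(40) returns instantly, whereas the naive version would make over a hundred million calls.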

For this particular algorithm, the time complexity is O(n). That is because each Fibonacci number is now computed only once; later requests for it are answered from the cache. Take f(4) as an example. It first computes f(3), which computes f(2) along the way, storing their values in the vector lst. When f(4) then calls f(2), the value is already computed and is returned immediately.

There’s yet another possibility, which completely ditches recursion. That’s the bottom-up approach. Let’s take a look:

This time around, we didn’t have to make any recursive calls at all: the function f receives a vector (we’re using C++ STL’s vector class) initialized with 0 at index 0 and 1 at index 1. For any integer n, all that is left to do is fill the remaining positions, each one being the sum of the two previous entries, until we reach the desired position n.
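A minimal C++ sketch of this bottom-up version, assuming the caller passes in a vector already initialized with {0, 1}:

```cpp
#include <vector>

// Bottom-up Fibonacci: v arrives with v[0] = 0 and v[1] = 1,
// and we fill the remaining positions iteratively. No recursion involved.
long long f(int n, std::vector<long long>& v) {
    v.resize(n + 1);
    for (int i = 2; i <= n; ++i)
        v[i] = v[i - 1] + v[i - 2];  // each entry is the sum of the two before it
    return v[n];
}
```

For example, with std::vector<long long> v = {0, 1}, the call f(10, v) returns 55 and leaves the first eleven Fibonacci numbers stored in v.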

Again, the time complexity is O(n). This is easy to see, as the function simply iterates once through the vector, computing each entry from the two before it.

I hope this quick article motivated you to further study algorithm complexity and runtimes, as well as the fabulous world of dynamic programming.

