Who’s Got the Time for Time Complexity?

Aimee
4 min read · Oct 19, 2020

Why might you want to use a forEach callback method over a traditional for loop? What are the advantages of using reduce to find a sum versus declaring a variable and adding each number to it? When I began my coding journey, I wondered about these questions but hadn’t given them much thought. Both approaches achieve the same result in the end. What was frustrating was the answer I’d often hear back: “Well, doing X makes your code run a little faster than Y.” That’s when I learned about time complexity. Time complexity is a term in computer science theory that “quantifies the amount of time taken by an algorithm to run as a function of the length of the input”. For those of us who didn’t come from a computer science background, the way I understand it is essentially how much time it takes for your code or for an algorithm to run.
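To make the opening question concrete, here’s a quick sketch (variable names are mine) of both approaches to summing an array. Both run in linear time, so for a sum like this the difference is mostly one of style and readability:

```javascript
const numbers = [1, 2, 3, 4, 5];

// Approach 1: declare a variable and add each number to it
let total = 0;
numbers.forEach((num) => {
  total += num;
});

// Approach 2: use reduce to accumulate the sum
const reducedTotal = numbers.reduce((sum, num) => sum + num, 0);

console.log(total, reducedTotal); // both are 15
```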

Time complexity is especially important when you begin to work with large amounts of data and code. For computers, “large amounts” of said data and code would be in the hundreds of thousands. If we think of companies such as Twitter or YouTube, who have 330 million and 2 billion users respectively, it wouldn’t be time efficient nor inexpensive to have code that iterates through all that data.

To begin, there are three types of time complexity notations: O Notation (Big O), Omega Notation (Big Omega), and Theta Notation (Big Theta).

Omega Notation (Ω) is the term used to describe the best-case run time for an algorithm — a lower bound on how long it takes.

Theta Notation (Θ) is the term used to describe a tight bound — it applies when the best-case and worst-case run times grow at the same rate.

Big O Notation describes the worst-case run time of an algorithm — an upper bound, given the input. Big O is the most common and (in a sense) misunderstood notation. When someone in the tech field — a prospective employer, for example — asks about the Big O of your code, they often just mean “how does the run time grow?” without distinguishing the cases. It’s safe to say Big O Notation has come to stand for all three.
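A minimal linear-search sketch (the function name is mine) helps make the three notations concrete: in the best case the target is the very first element, so the lower bound is Ω(1); in the worst case it’s last or missing, so the upper bound is O(N); and because those cases differ, there is no single Θ for this algorithm over arbitrary inputs:

```javascript
// Linear search: scan the array until we find the target
const linearSearch = (arr, target) => {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i; // best case: found at index 0 → Ω(1)
  }
  return -1; // worst case: checked every element → O(N)
};

linearSearch([5, 8, 2, 9], 8); // → 1
```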

There are subsets of Big O Notation, with these four being the most common:

[Graph comparing common Big O run times — credits to FreeCodeCamp]

O(1) — This is fixed run time. O(1) is considered the “ideal” run time in software engineering and computer science. The following is an example of fixed run time:

let testArray = [1, 2, 3]

const printPlease = (testArray) => {
  return testArray[0]
}

printPlease(testArray);

O(N) — This is linear time. As the input grows, so does runtime. One common example is a for loop. As the range of your input grows, so does “N”.

let sampleArray = ["apple", "biscuits", "corn"]

const example = (sampleArray) => {
  for (let i = 0; i < sampleArray.length; i++){
    console.log(`I'm going to a picnic and bringing ${sampleArray[i]}.`)
  }
}

example(sampleArray);

O(N^2) — This is quadratic time. One common example is a nested for loop: the outer loop runs N times, and for each of those iterations the inner loop runs N times, giving N × N steps in total.

let quadraticArray = [[1, 2, 3], [3, 4, 5], [4, 6, 9]]

const quadraticTimeExample = (quadraticArray) => {
  let sum = 0
  for (let i = 0; i < quadraticArray.length; i++){
    for (let j = 0; j < quadraticArray[i].length; j++){
      sum += quadraticArray[i][j]
    }
  }
  return sum
}

quadraticTimeExample(quadraticArray);

O(2^N) — Although this is not displayed in the graph above, O(2^N) is exponential run time. The runtime starts off small and grows exponentially larger, roughly doubling with each addition to the input. A common example is finding a Fibonacci number using naive recursion. Note: this is one of the worst runtimes.

const fibonacci = (num) => {
  if (num < 1) {
    return 0
  } else if (num < 2) {
    return 1
  } else {
    return fibonacci(num - 1) + fibonacci(num - 2)
  }
}

fibonacci(10);
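As an aside (my addition, not part of the original example): caching already-computed results — memoization — is a common way to bring this recursive Fibonacci down from exponential to linear time, since each value then gets computed only once:

```javascript
// Memoized Fibonacci: each num is computed once, then cached → O(N)
const fibMemo = (num, cache = {}) => {
  if (num < 1) return 0;
  if (num < 2) return 1;
  if (cache[num] !== undefined) return cache[num];
  cache[num] = fibMemo(num - 1, cache) + fibMemo(num - 2, cache);
  return cache[num];
};

fibMemo(10); // → 55
```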

I hope this post helped simplify time complexity, and in doing so, gave a better perspective on how much impact individual code and work product has on the larger picture. There are many more subsets of Big O Notation, including logarithmic and polynomial (three or more nested for loops). If you’re interested in learning more about time complexity, or in digging deeper into space complexity (which is how much working space an algorithm takes), I encourage you to check out the resources below.
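Since logarithmic time gets a mention above, here’s one quick taste (a sketch; the function name is mine): binary search halves the remaining input on every step, which is what gives it an O(log N) run time on a sorted array:

```javascript
// Binary search on a sorted array: halves the search space each step → O(log N)
const binarySearch = (sortedArr, target) => {
  let low = 0;
  let high = sortedArr.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sortedArr[mid] === target) return mid;
    if (sortedArr[mid] < target) low = mid + 1; // discard the lower half
    else high = mid - 1;                        // discard the upper half
  }
  return -1; // not found
};

binarySearch([1, 3, 5, 7, 9, 11], 7); // → 3
```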
