Mini-Max Sum
Remove the constant! You should focus on rate of growth rather than constant-factor operations. :) O(n) = O(2n) = O(3n), etc., when reasoning about the time complexity of your function.
Like rzhang said: how fast does the time it takes to complete the operation grow relative to the number of entries in the dataset?
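As a minimal illustration (my own Python sketch, not part of the problem): finding the minimum in one pass and finding both the min and max in two separate passes are both O(n), because the 2 in O(2n) is dropped.

```python
def find_min(a):
    """One pass: ~n steps -> O(n)."""
    m = a[0]
    for x in a:
        if x < m:
            m = x
    return m

def find_min_and_max(a):
    """Two passes: ~2n steps, but O(2n) = O(n); the constant 2 is
    dropped because it doesn't change how the cost grows with n."""
    lo = hi = a[0]
    for x in a:          # pass 1: every element once
        if x < lo:
            lo = x
    for x in a:          # pass 2: every element again
        if x > hi:
            hi = x
    return lo, hi
```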
If you have something that grows at a rate of O(n^2) (e.g. a nested for loop), you have to examine every element for every element of the array. (Google bubble sort: its worst case is O(n^2), and in most situations we only care about the worst case.)
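Here's what that nested-loop growth looks like in a quick bubble sort sketch (my own Python, just for illustration): the inner loop re-scans the array for every iteration of the outer loop, giving roughly n*n comparisons in the worst case.

```python
def bubble_sort(a):
    """Nested loops: the inner loop runs for every iteration of the
    outer loop -> O(n^2) comparisons in the worst case.
    Sorts in place, so only O(1) extra space."""
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):   # pass over the unsorted prefix
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:              # no swaps: already sorted
            break
    return a
```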
If you have something that grows at a rate of O(log n) (e.g. a binary search tree, B-tree, red-black tree, etc.; I know, I'm getting a little into data structures in the algorithms section), you can reach what you're looking for, add what you're adding, and remove what you're removing without having to look at every single element. (Google heap sort: its worst case is O(n log n).)
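Trees aside, binary search on a sorted list is probably the simplest way to see O(log n) growth; here's a quick sketch (my own Python, not any specific library's API). Each comparison halves the remaining search space, so you never look at every element.

```python
def binary_search(sorted_list, target):
    """Each comparison halves the search space, so there are at most
    about log2(n) + 1 iterations -> O(log n) time."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1             # discard the left half
        else:
            hi = mid - 1             # discard the right half
    return -1                        # not found
```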
You should also consider the space complexity (how much memory) of your algorithm. Bubble sort may have a worst-case time complexity of O(n^2), but its space complexity is O(1), which is the best you can get. Radix sort has a worst-case time complexity of O(nk), but its space complexity is higher at O(n+k).
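For a concrete look at where that O(n+k) space comes from, here's a rough LSD radix sort sketch (my own Python, base 10, non-negative integers only): the output buffer is the O(n) part and the per-digit count array is the O(k) part.

```python
def radix_sort(a):
    """LSD radix sort for non-negative integers.
    Time: O(d * (n + k)) for d digit passes; space: O(n + k) for the
    output buffer (n) plus the count array (k = base = 10 here)."""
    if not a:
        return a
    base = 10
    exp = 1
    out = a[:]                        # O(n) auxiliary output buffer
    while max(a) // exp > 0:          # one counting-sort pass per digit
        count = [0] * base            # O(k) count array
        for x in a:
            count[(x // exp) % base] += 1
        for d in range(1, base):      # prefix sums -> final positions
            count[d] += count[d - 1]
        for x in reversed(a):         # reversed keeps the sort stable
            digit = (x // exp) % base
            count[digit] -= 1
            out[count[digit]] = x
        a, out = out, a               # reuse the buffers between passes
        exp *= base
    return a
```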