Nah. Look up Big O amortized.
The short answer is that Big O generally describes the worst case, but when that worst case happens very rarely and the algorithm is expected to be faster in practice, the amortized bound is the one we go with.
A good example (from Cracking the Coding Interview) is dynamic arrays. Say we initialize a dynamic array with size 1. Assigning a value to it is O(1). However, when the array is full but we still want to add something to it, it needs to be resized: a new array is created with double the capacity and the original array's contents are copied over. When that happens, the time for that single insertion becomes O(n). As we continue to add elements, resizing becomes less and less frequent, so adding an element is O(n) in the worst case but O(1) amortized.
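The doubling scheme described above can be sketched like this (a minimal illustrative Python class, not the actual implementation behind any real language's array type):

```python
class DynamicArray:
    """Toy dynamic array: appends are O(1) amortized, O(n) worst case."""

    def __init__(self):
        self._capacity = 1
        self._size = 0
        self._data = [None] * self._capacity  # fixed-size backing store

    def append(self, value):
        if self._size == self._capacity:
            # Rare O(n) step: double capacity and copy everything over.
            self._resize(2 * self._capacity)
        self._data[self._size] = value  # common O(1) step
        self._size += 1

    def _resize(self, new_capacity):
        new_data = [None] * new_capacity
        for i in range(self._size):  # copy existing elements
            new_data[i] = self._data[i]
        self._data = new_data
        self._capacity = new_capacity

    def __len__(self):
        return self._size

    def __getitem__(self, i):
        if not 0 <= i < self._size:
            raise IndexError(i)
        return self._data[i]
```

Appending n elements triggers resizes only at sizes 1, 2, 4, 8, ..., so the total copy work is at most 2n, which is why the per-append cost averages out to O(1).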
Equalize the Array