Divisible Sum Pairs
The array allocation complexity has to do with the internals of the language/compiler. For a language like C#, that allocation operation has to walk the allocated chunk of memory and forcibly set each element of the array to 0. Other languages or runtimes might do this lazily by just marking the start and end points of the allocated memory; that bookkeeping is constant time and you still get random access, but you cannot guarantee that all the elements are 0 (they are instead garbage values left over from whatever was in those memory locations previously). You can find more details in a computer architecture or compilers textbook.
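To make that concrete, here is a minimal sketch of the counting-array approach to this problem, written in Java as an assumption (the discussion mentions C#, but Java gives the same guarantee: a freshly allocated int[] is zero-filled, and that zeroing is itself O(k) work done by the runtime at allocation time).

```java
public class DivisiblePairs {
    // Counting-array approach: correctness depends on remainderCount
    // starting at all zeros, which the runtime pays O(k) to guarantee.
    static int divisibleSumPairs(int k, int[] ar) {
        int[] remainderCount = new int[k];  // zero-initialized by the runtime
        int pairs = 0;
        for (int value : ar) {
            int r = value % k;
            // Pair the current element with every earlier element whose
            // remainder complements r to a multiple of k.
            pairs += remainderCount[(k - r) % k];
            remainderCount[r]++;
        }
        return pairs;
    }

    public static void main(String[] args) {
        // Sample from the problem statement: k = 3, expected answer 5
        System.out.println(divisibleSumPairs(3, new int[]{1, 3, 2, 6, 1, 2}));
    }
}
```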
Most hash table designs trade off constant worst-case lookup against constant worst-case insertion. Some designs achieve one of the two in constant worst-case time while the other operation is slower; I wrote "insertion/lookup" to cover both cases. These designs tend to have a logarithmic worst-case complexity for at least one operation, but they also have an EXPECTED (not worst-case) amortized constant runtime (over uniformly random access sequences).
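For comparison, here is the same counting idea with a hash map instead of a pre-zeroed array, again sketched in Java as an assumption. Each get/put is expected amortized O(1), not worst-case O(1): a resize or a heavily loaded bucket can make an individual call slower.

```java
import java.util.HashMap;
import java.util.Map;

public class DivisiblePairsHashed {
    // Hash-map variant: no O(k) allocation/zeroing, but each map operation
    // is only expected amortized constant time rather than worst-case constant.
    static int divisibleSumPairs(int k, int[] ar) {
        Map<Integer, Integer> remainderCount = new HashMap<>();
        int pairs = 0;
        for (int value : ar) {
            int r = value % k;
            int complement = (k - r) % k;
            pairs += remainderCount.getOrDefault(complement, 0);
            remainderCount.merge(r, 1, Integer::sum);
        }
        return pairs;
    }

    public static void main(String[] args) {
        System.out.println(divisibleSumPairs(3, new int[]{1, 3, 2, 6, 1, 2})); // 5
    }
}
```

Java's HashMap is a concrete instance of the logarithmic-worst-case point: since Java 8 it converts long collision chains into balanced trees, so a pathological bucket degrades to O(log n) per lookup while typical operations stay expected amortized O(1).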