Firstly, d is absolutely capped at 20 in this challenge, so solutions that might scale better to large values are not relevant here. They might be more efficient in other situations; they are not more efficient here.
Secondly, your solution's space usage doesn't grow with larger d because it already holds the entire data set in memory at all times. Its auxiliary space is small only because its main storage is the full input of n integers, which can be 10,000 elements, whereas the auxiliary space of the queue solution is d elements (never more than 20), each a tuple of two integers: the number itself and the count of numbers seen so far in that particular interval sequence. So the only situation in which your solution doesn't take up more space is when d >= n/2 and there are consistently no gaps in the sequence of integers. If d were likely to be more than half the size of the total data set, you might have an argument. As it is, you do not.
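To make the comparison concrete, here is a minimal sketch of the queue solution described above: a window of (value, chain_length) tuples, never holding more than about d entries. This is my own illustrative code, not anything posted in this thread, and it assumes the input values arrive in strictly increasing order (as this challenge's sorted, gap-free-or-not input allows); the function name is made up for the example.

```python
from collections import deque

def beautiful_triplets_streaming(values, d):
    """Count triplets (x, x + d, x + 2d) in a strictly increasing
    stream of integers, using only O(d) auxiliary space.

    Each window entry is a (value, chain_length) tuple, where
    chain_length counts how many terms of the arithmetic sequence
    with common difference d end at `value`.
    """
    window = deque()  # holds only values within d of the current one
    triplets = 0
    for v in values:
        # Drop entries too old to be the immediate predecessor of v.
        while window and window[0][0] < v - d:
            window.popleft()
        # Extend the chain ending at v - d, if one is in the window.
        chain = 1
        for value, length in window:
            if value == v - d:
                chain = length + 1
                break
        # A chain of length >= 3 ending at v completes exactly one
        # new triplet (v - 2d, v - d, v).
        if chain >= 3:
            triplets += 1
        window.append((v, chain))
    return triplets
```

Since the window only ever spans an interval of width d over strictly increasing integers, it holds at most d + 1 tuples regardless of n, which is the space claim being made above.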
If variables are unbounded (again, not relevant to this solution), we also have to consider unbounded n - an unbounded number of integers, all of which your solution requires to be held in memory. (Please stop saying "that could be fixed": a) that's been answered, and the fixes wouldn't actually be fixes, and b) that's a different, hypothetical algorithm which you haven't written.) If d rises beyond a certain large size (relative to the size of the data set and the computer's memory), then the queue solution becomes impractical too. Your solution is only practical when n is known to be trivial and the slower speed (three times the number of iterations per item in the data set) is judged not a concern.
Your algorithm is naive. There are more naive algorithms displayed on this page, to be fair, but you came into this particular thread, which is about an efficient algorithm, to claim you had something better. You don't; your solution does more work and requires more space. If you can do the work to show how a variation might require less space, you might have something. Go do that work and bring it back to present. That would be productive. This isn't.
Beautiful Triplets