Test passing #39
Conversation
✨💫 Nice job, Veronica. I left some comments on your implementation below.
A few things about the comprehension questions. Heaps aren't entirely unordered; we can say they are weakly, or partially, ordered due to the relationship between a node and its two children (not left-right less-more, just strictly less or more than the children, depending on whether this is a MinHeap or MaxHeap). And while we could use nodes to build a heap, we benefit from the convenience of index calculations and from the data being located close together when using an array. The reason adding and removing are O(log n) has to do with us needing to swap the new node (or the node we moved to the top in a remove) potentially all the way to the top or bottom of the heap. Because a heap is balanced, the height is at most log n, and therefore we swap up to O(log n) times.
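To make the index math concrete, here is a minimal sketch (assuming the heap's data lives in a plain Python list with the root at index 0; the variable names are illustrative, not taken from the submission):

# For a node stored at position i in the backing list (root at index 0):
parent = (i - 1) // 2
left_child = i * 2 + 1
right_child = i * 2 + 2

Because every level fills left to right, neighbouring nodes sit in neighbouring slots of the list, which is exactly the "data located close together" benefit mentioned above.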
🟢
Time Complexity: ? (log n)
Space Complexity: (O 1)
👀 In the worst case, the new value we're inserting is the new root of the heap, meaning it would need to move up the full height of the heap (which is log n levels deep), leading to O(log n) time complexity. Your implementation of the heap_up helper is recursive, meaning that for each recursive call (up to log n of them) there is stack space being consumed, so the space complexity would also be O(log n). If heap_up were implemented iteratively, this could be reduced to O(1) space complexity.
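For illustration, an iterative heap_up might look roughly like the sketch below. It assumes a MinHeap whose self.store holds nodes with a key attribute and a swap(index_1, index_2) helper, based on the code quoted elsewhere in this review; the exact names in the submission may differ.

def heap_up(self, index):
    # Loop instead of recursing, so no extra stack frames are consumed (O(1) space).
    while index > 0:
        parent = (index - 1) // 2
        if self.store[index].key < self.store[parent].key:
            self.swap(index, parent)
            index = parent
        else:
            break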
Space Complexity: (O 1)
"""
pass
if value == None:
👀 Prefer using is to compare to None.
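For example, the quoted check would become:

if value is None:

Using is for None comparisons is idiomatic because None is a singleton, so an identity check reads more clearly and avoids surprises from classes that override ==.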
Time Complexity: (O n)
Space Complexity: (O N)
👀 Similar to add, in the worst case the value that got swapped to the top will need to move all the way back down to a leaf, meaning it would need to move down the full height of the heap (which is log n levels deep), leading to O(log n) time complexity. Your implementation of the heap_down helper is recursive, meaning that for each recursive call (up to log n of them) there is stack space being consumed, so the space complexity would also be O(log n). If heap_down were implemented iteratively, this could be reduced to O(1) space complexity.
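As a sketch of the pattern being described (self.store, self.swap, heap_down, and a node value field are assumed from the surrounding review, not quoted verbatim from this PR), a remove could look like:

def remove(self):
    # Swap the root with the last node, pop it off, then sift the new root back down.
    if len(self.store) == 0:
        return None
    self.swap(0, len(self.store) - 1)
    removed = self.store.pop()
    self.heap_down(0)  # worst case: the new root sinks all the way to a leaf, O(log n) swaps
    return removed.value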
Space Complexity: (O N)
"""
pass
if len(self.store) == 0:
👀 We have a helper method that can check for the heap being empty.
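For example, since heap_sort later calls heap.empty(), the same helper could be used here:

if self.empty():
    return None  # returning None for an empty heap is illustrative; match whatever the original branch did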
Time complexity: (O 1)
Space complexity: (O 1)
✨
self.swap(parent_node, index)
self.heap_up(parent_node)

def heap_down(self, index):
✨ Nice set of conditionals to narrow in on where to swap, and then only perform the swap and continue the heapification if the key of the candidate requires it.
Though not prompted, like heap_up, heap_down is also O(log n) in both time and space complexity. The worst case for re-heapifying is when the new root needs to move back down to a leaf, so the stack growth will be the height of the heap, which is log n. If we implemented this instead with an iterative approach, the space complexity would instead be O(1).
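A hedged sketch of that iterative approach, reusing the same child-index calculations as the quoted code (self.store, key, and swap are assumed names, as above):

def heap_down(self, index):
    # Keep sifting down until neither child has a smaller key (MinHeap assumed).
    while True:
        left_c = index * 2 + 1
        right_c = index * 2 + 2
        smallest = index
        if left_c < len(self.store) and self.store[left_c].key < self.store[smallest].key:
            smallest = left_c
        if right_c < len(self.store) and self.store[right_c].key < self.store[smallest].key:
            smallest = right_c
        if smallest == index:
            break
        self.swap(index, smallest)
        index = smallest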
the heap property is reestablished.
"""
pass
current_size = self.store
👀 To me, current_size sounds more like a number than the list of data comprising the heap. When making an alias to an internal field (which allows us to avoid writing self so frequently), it's pretty common to use the same name as the aliased field for clarity.
store = self.store
left_c = index * 2 + 1
right_c = index * 2 + 2

if left_c < len(self.store):
👀 You made your alias to self.store, so be sure to use it here for consistency. The biggest concern around consistency is that when we do things inconsistently, it makes the person reading the code stop and wonder whether the inconsistency is meaningful or just an oversight.
if left_c < len(current_size):
Time Complexity: (log n)
Space Complexity: (O n)
👀 Since sorting using a heap reduces to building up a heap of n items one-by-one (each taking O(log n)), then pulling them back out again (again taking O(log n) for each of n items), we end up with a time complexity of O(2n log n) → O(n log n). For the space, we should be aware of the O(log n) space consumed during each add and remove, but they aren't cumulative (each is consumed only during the individual call to add or remove). However, the internal store for the MinHeap does grow with the size of the input list, so the maximum space would be O(n + log n) → O(n), since n is a larger term than log n.
Note that a fully in-place solution (O(1) space complexity) would require both avoiding the recursive calls and working directly with the originally provided list (no internal store).
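For reference, a fully in-place heapsort (independent of this MinHeap class) typically looks something like the sketch below: build a max-heap directly in the input list with an iterative sift-down, then repeatedly swap the max to the end. This is illustrative only, not part of the submission.

def heapsort_in_place(items):
    def sift_down(i, end):
        # Iteratively sift items[i] down within items[:end] to restore the max-heap property.
        while True:
            left, right = i * 2 + 1, i * 2 + 2
            largest = i
            if left < end and items[left] > items[largest]:
                largest = left
            if right < end and items[right] > items[largest]:
                largest = right
            if largest == i:
                return
            items[i], items[largest] = items[largest], items[i]
            i = largest

    n = len(items)
    # Build a max-heap bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly move the current max to the end of the unsorted region.
    for end in range(n - 1, 0, -1):
        items[0], items[end] = items[end], items[0]
        sift_down(0, end)
    return items

A max-heap is used here so that each pass can place the largest remaining element at the end of the list, producing ascending order with O(1) extra space.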
i = 0

while not heap.empty():
    list[i] = heap.remove()
    i += 1
Note that since this isn't a fully in-place solution (the MinHeap has an O(n) internal store), we don't necessarily need to modify the passed-in list. The tests are written to check the return value, so we could unpack the heap into a new result list to avoid mutating the input.
result = []
while not heap.empty():
    result.append(heap.remove())
return result
Heaps Practice
Congratulations! You're submitting your assignment!
Comprehension Questions
Were the heap_up & heap_down methods useful? Why?