Complexity of different operations on different data structures according to the Big-O notation

Submitted by 落爺英雄遲暮 on 2021-02-07 17:11:00

Question


I was reading about big-O notation in Java programming. I found the following table, which shows the big-O complexities of common operations on different data structures.

http://bigocheatsheet.com/

My questions are:

  1. If I want to delete an item from an array, is it O(n^2) (search and then delete)?
  2. If I want to delete an item from a stack, is it O(n)?
  3. Which one is more efficient, a singly linked list or a doubly linked list?
  4. In what cases is the insert operation O(1) or O(n) in a hash table?
  5. If I want to delete an item from a binary search tree, is it O(log(n)*log(n)), while insert is just O(log(n))?

Thanks.


Answer 1:


Let me answer your questions:

  1. Nope. It's O(n) + O(n) = O(n).
  2. Nope, it's O(1), but you only have direct access to the topmost element. Deleting any other element takes O(n).
  3. A doubly linked list will be no worse, and can sometimes be faster, but it needs more memory for the additional references.
  4. The best case is when no other element hashes to the same bucket yet; the worst case is when all inserted elements collide. For example, if the table maps hashes to buckets by taking them modulo 3 and your hash function always returns some multiple of 3, every element ends up in the same bucket and insertion degrades to O(n). A short Java sketch of this degenerate case follows below.
  5. Nope, according to your table, it's worst case O(n).

I'll explain more in a minute.
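To make point 4 concrete, here is a minimal Java sketch (the BadKey class, the constant hash value, and the map size are invented for illustration). Because every key reports the same hash code, a chaining hash table puts all entries into one bucket, and insertion and lookup degrade from expected O(1) to O(n):

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical key whose hashCode() always returns the same value.
    // Every instance collides, so a chaining hash table ends up with a
    // single long bucket and operations degrade from O(1) to O(n).
    final class BadKey {
        private final int id;

        BadKey(int id) { this.id = id; }

        @Override
        public int hashCode() { return 42; }   // constant hash: worst case

        @Override
        public boolean equals(Object o) {
            return (o instanceof BadKey) && ((BadKey) o).id == this.id;
        }
    }

    public class WorstCaseHashDemo {
        public static void main(String[] args) {
            Map<BadKey, String> map = new HashMap<>();
            for (int i = 0; i < 10_000; i++) {
                map.put(new BadKey(i), "value-" + i);   // each put walks the one long bucket
            }
            // With a well-distributed hashCode() this lookup is expected O(1);
            // with the constant hash above it has to compare against many keys.
            System.out.println(map.get(new BadKey(9_999)));
        }
    }

(As a side note, since Java 8 HashMap converts very long buckets into balanced trees, which softens this worst case somewhat, but the basic point about collisions stands.)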




Answer 2:


  1. If you want to delete an item from an array, first you have to search for it (takes O(n)) and then you have to shift the remaining items to fill the gap (takes O(n)). So, the effective time complexity is O(n), not O(n^2); a small array sketch follows this list.
  2. If you want to delete an item from a stack, you can only delete the topmost element (a property of the stack data structure). So, it can be done in O(1).
  3. Which type of linked list is more efficient depends on your requirements. For instance, if you want to save memory, you might avoid doubly-linked lists because of the overhead of maintaining an extra pointer per node. But if you want to be able to traverse your list in both directions, you have to use a doubly-linked list. Implementing a doubly-linked list is a bit more work, as you have to perform more pointer operations, but many operations (like deleting the last element) become easier.
  4. We prefer hash tables because we can achieve expected O(1) insertion and search time, whereas most other data structures only guarantee O(n) (a bound that, as an upper bound, also covers O(log n), O(1), etc.). The worst case is different: suppose we use separate chaining where each chain is a sorted linked list. Then, for each insertion, we need to search the linked list to find the right position to insert (just like Insertion Sort), which takes O(n) in the worst case.
  5. If you have to delete an element from a BST, first you have to search for it (O(log n) on average) and then you have to replace the deleted node with its inorder successor or predecessor (another O(log n)). The two steps add up, so the effective average-case deletion time is O(log n), not O(log n * log n); a compact BST sketch follows this list.
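To make point 1 of this answer concrete, here is a small Java sketch (the class and method names are invented for the example): a linear search followed by a left shift of the tail, two O(n) passes that together are still O(n):

    import java.util.Arrays;

    public class ArrayDeleteDemo {

        // Deletes the first occurrence of 'value' from 'a' and returns the new length.
        // The search is O(n) and shifting the tail left is O(n); the total is O(n), not O(n^2).
        static int deleteFirst(int[] a, int length, int value) {
            int i = 0;
            while (i < length && a[i] != value) {   // O(n) search
                i++;
            }
            if (i == length) {
                return length;                       // value not found, nothing to do
            }
            for (int j = i; j < length - 1; j++) {   // O(n) shift to close the gap
                a[j] = a[j + 1];
            }
            return length - 1;
        }

        public static void main(String[] args) {
            int[] a = {5, 3, 8, 1, 9};
            int n = deleteFirst(a, a.length, 8);
            System.out.println(Arrays.toString(Arrays.copyOf(a, n)));  // prints [5, 3, 1, 9]
        }
    }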
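And a compact sketch of point 5 (again with invented names, using a plain unbalanced BST): deleting a node with two children first descends to the node and then descends again to its inorder successor; the two O(h) walks add rather than multiply, so deletion stays O(h), i.e. O(log n) on a balanced tree:

    // Minimal unbalanced BST with delete, showing that the search for the node
    // and the search for its inorder successor are two O(h) walks that add up
    // rather than multiply: deletion is O(h) overall.
    public class BstDeleteDemo {

        static final class Node {
            int key;
            Node left, right;
            Node(int key) { this.key = key; }
        }

        static Node insert(Node root, int key) {
            if (root == null) return new Node(key);
            if (key < root.key) root.left = insert(root.left, key);
            else if (key > root.key) root.right = insert(root.right, key);
            return root;                      // duplicates are ignored
        }

        static Node delete(Node root, int key) {
            if (root == null) return null;
            if (key < root.key) {             // first O(h) descent: find the node
                root.left = delete(root.left, key);
            } else if (key > root.key) {
                root.right = delete(root.right, key);
            } else {
                if (root.left == null) return root.right;   // zero or one child
                if (root.right == null) return root.left;
                Node succ = root.right;                      // second O(h) descent:
                while (succ.left != null) succ = succ.left;  // find the inorder successor
                root.key = succ.key;                         // copy it up
                root.right = delete(root.right, succ.key);   // then remove the successor
            }
            return root;
        }

        public static void main(String[] args) {
            Node root = null;
            for (int k : new int[]{8, 3, 10, 1, 6, 14}) root = insert(root, k);
            root = delete(root, 3);            // delete a node with two children
            System.out.println(root.left.key); // prints 6: the successor took its place
        }
    }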



Answer 3:


  1. When you do one operation and then the other, you don't multiply their complexities; the larger one wins. In this case, O(n) + O(n) = O(n).
  2. Does a stack have a "delete arbitrary item" API? Exposing the underlying data structure is a different matter, but a stack usually has a single way to "pop" elements (see the short Deque sketch below).
  3. What's the use case?
  4. Look up what happens during hash collisions.
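For point 2, a short sketch using java.util.ArrayDeque as a stack: the stack-style removal is pop() on the top element, which is O(1), while removing an arbitrary element falls back to the Collection-level remove(Object), which scans the deque in O(n):

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class StackRemovalDemo {
        public static void main(String[] args) {
            Deque<Integer> stack = new ArrayDeque<>();
            for (int i = 1; i <= 5; i++) {
                stack.push(i);                     // stack is now 5 4 3 2 1 (top first)
            }

            int top = stack.pop();                 // O(1): only the top is directly accessible
            System.out.println("popped " + top);   // popped 5

            // There is no "delete the k-th element" stack operation; remove(Object)
            // is a Collection method that linearly scans the deque, i.e. O(n).
            stack.remove(Integer.valueOf(2));
            System.out.println(stack);             // [4, 3, 1]
        }
    }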


Source: https://stackoverflow.com/questions/40718116/complexity-of-different-operations-on-different-data-structures-according-to-the
