What is the time complexity of indexing, inserting and removing from common data structures?

Submitted by 喜夏-厌秋 on 2019-12-20 07:58:39

Question


There is no summary available of the big-O notation for operations on the most common data structures, including arrays, linked lists, hash tables, etc.


Answer 1:


Information on this topic is now available on Wikipedia at: Search data structure

+----------------------+----------+------------+----------+--------------+
|                      |  Insert  |   Delete   |  Search  | Space Usage  |
+----------------------+----------+------------+----------+--------------+
| Unsorted array       | O(1)     | O(1)       | O(n)     | O(n)         |
| Value-indexed array  | O(1)     | O(1)       | O(1)     | O(n)         |
| Sorted array         | O(n)     | O(n)       | O(log n) | O(n)         |
| Unsorted linked list | O(1)*    | O(1)*      | O(n)     | O(n)         |
| Sorted linked list   | O(n)*    | O(1)*      | O(n)     | O(n)         |
| Balanced binary tree | O(log n) | O(log n)   | O(log n) | O(n)         |
| Heap                 | O(log n) | O(log n)** | O(n)     | O(n)         |
| Hash table           | O(1)     | O(1)       | O(1)     | O(n)         |
+----------------------+----------+------------+----------+--------------+

 * The cost to add or delete an element at a known location in the list 
   (i.e. if you have an iterator to the location) is O(1). If you don't 
   know the location, then you need to traverse the list to the location
   of deletion/insertion, which takes O(n) time. 

** The deletion cost is O(log n) for the minimum or maximum, O(n) for an
   arbitrary element.
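
To make that footnote concrete, here is a small sketch (my own illustration, not part of the original answer) using Java's java.util.LinkedList: once a ListIterator is positioned at the target location, the insertion itself is O(1), while remove(Object) has to traverse from the head and is O(n).

    import java.util.LinkedList;
    import java.util.ListIterator;

    // Illustrative sketch; the class name and values are made up for the example.
    public class LinkedListCost {
        public static void main(String[] args) {
            LinkedList<Integer> list = new LinkedList<>();
            for (int i = 0; i < 5; i++) list.add(i);   // list: 0 1 2 3 4

            // Finding the position is the O(n) part: walk until next() returns >= 3.
            ListIterator<Integer> it = list.listIterator();
            while (it.hasNext() && it.next() < 3) { /* advance */ }

            // With the iterator already at the location, the insert itself is O(1):
            // only a couple of node links change.
            it.add(99);                                // list: 0 1 2 3 99 4

            // Without a known location, removal is O(n): remove(Object)
            // scans from the head to find the element first.
            list.remove(Integer.valueOf(99));

            System.out.println(list);                  // [0, 1, 2, 3, 4]
        }
    }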



Answer 2:


I guess I will start you off with the time complexity of a linked list:

  • Indexing - O(n)
  • Inserting / Deleting at end - O(1) or O(n)
  • Inserting / Deleting in middle - O(1) with an iterator, O(n) without

The time complexity of inserting at the end depends on whether you have a reference to the last node. If you do, it is O(1); otherwise you have to search through the linked list to reach the end, and the time complexity jumps to O(n).
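
A minimal sketch of that point (the class and method names here are hypothetical, chosen only for illustration): a singly linked list that keeps a tail reference can append in O(1), while one that only keeps the head has to walk the whole list first, which is O(n).

    // Hypothetical singly linked list, written to show the tail-pointer point.
    class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    class SinglyLinkedList {
        Node head;
        Node tail;  // keeping this reference is what makes appends O(1)

        // O(1): we already hold a reference to the last node.
        void appendWithTail(int value) {
            Node node = new Node(value);
            if (head == null) { head = tail = node; return; }
            tail.next = node;
            tail = node;
        }

        // O(n): without a tail reference we must walk to the end first.
        void appendWithoutTail(int value) {
            Node node = new Node(value);
            if (head == null) { head = node; return; }
            Node cur = head;
            while (cur.next != null) cur = cur.next;  // linear traversal
            cur.next = node;
        }
    }

In practice a list would commit to one of these strategies; both methods appear here only so the two costs can be compared side by side.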




Answer 3:


Keep in mind that unless you're writing your own data structure (e.g. a linked list in C), the complexity can depend dramatically on the implementation of the data structures in your language/framework of choice. As an example, take a look at the benchmarks of Apple's CFArray over at Ridiculous Fish. In this case, the data type, a CFArray from Apple's CoreFoundation framework, actually changes its underlying data structure depending on how many objects are in the array - switching from linear time to constant time at around 30,000 objects.

This is actually one of the beautiful things about object-oriented programming - you don't need to know how it works, just that it works, and the 'how it works' can change depending on requirements.




Answer 4:


Red-Black trees:

  • Insert - O(log n)
  • Retrieve - O(log n)
  • Delete - O(log n)
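
If Java happens to be your language, java.util.TreeMap is a red-black-tree-based map, so it gives exactly these bounds. A short sketch (my own example, with made-up keys and values):

    import java.util.TreeMap;

    // Illustrative sketch; the key and value are arbitrary.
    public class RedBlackTreeDemo {
        public static void main(String[] args) {
            TreeMap<Integer, String> map = new TreeMap<>();  // red-black tree inside

            map.put(42, "answer");       // insert   - O(log n)
            String v = map.get(42);      // retrieve - O(log n)
            map.remove(42);              // delete   - O(log n)

            System.out.println(v + ", size=" + map.size()); // answer, size=0
        }
    }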



Answer 5:


Nothing is as useful as this chart: Common Data Structure Operations




Answer 6:


Amortized Big-O for hashtables:

  • Insert - O(1)
  • Retrieve - O(1)
  • Delete - O(1)

Note that there is a constant factor for the hashing algorithm, and the amortization means that actual measured performance may vary dramatically.
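
As a rough sketch of those amortized costs (assuming Java's HashMap; the keys and element count below are made up for illustration): individual puts are O(1) on average, but an occasional put triggers a resize that rehashes every entry, which is exactly why the figures are amortized.

    import java.util.HashMap;

    // Illustrative sketch; the keys and the element count are arbitrary.
    public class HashTableDemo {
        public static void main(String[] args) {
            HashMap<String, Integer> table = new HashMap<>();

            // Insert - amortized O(1); an occasional put resizes and rehashes.
            for (int i = 0; i < 100_000; i++) {
                table.put("key" + i, i);
            }

            // Retrieve - O(1) on average: one hash plus a short bucket scan.
            Integer hit = table.get("key123");

            // Delete - O(1) on average, for the same reason.
            table.remove("key123");

            System.out.println(hit + ", size=" + table.size()); // 123, size=99999
        }
    }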



Source: https://stackoverflow.com/questions/122799/what-is-the-time-complexity-of-indexing-inserting-and-removing-from-common-data
