complexity-theory

Java: what's the big-O time of declaring an array of size n?

Submitted by 。_饼干妹妹 on 2019-11-26 09:02:11

Question: What is the running time of declaring an array of size n in Java? I suppose this would depend on whether the memory is zeroed out on garbage collection (in which case it could be O(1)) or on initialization (in which case it would have to be O(n)).

Answer 1: It's O(n). Consider this simple program:

```java
public class ArrayTest {
    public static void main(String[] args) {
        int[] var = new int[5];
    }
}
```

The bytecode generated is:

```
Compiled from "ArrayTest.java"
public class ArrayTest extends java.lang.Object{
```

…
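The answer's point is that Java zero-initializes every element, so allocation must touch all n cells. A rough analog in Python (not from the original answer): `bytearray(n)` also produces n zeroed entries, so its construction cost scales with n, which a wall-clock comparison makes visible.

```python
import timeit

# Allocating a zero-filled buffer must touch all n cells, so cost grows ~linearly.
def alloc(n):
    return bytearray(n)  # n zero bytes, loosely analogous to Java's `new int[n]`

small = alloc(1_000)
large = alloc(1_000_000)
assert len(small) == 1_000 and not any(small)        # every byte is zero
assert len(large) == 1_000_000 and not any(large)

# Exact ratios vary by machine, but the large allocation costs visibly more.
t_small = timeit.timeit(lambda: alloc(1_000), number=1000)
t_large = timeit.timeit(lambda: alloc(1_000_000), number=1000)
print(f"n=1e3: {t_small:.4f}s   n=1e6: {t_large:.4f}s")
```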

multiset, map and hash map complexity

Submitted by 你。 on 2019-11-26 08:45:40

Question: I would like to know the complexity, in big-O notation, of the STL multiset, map, and hash map classes when: inserting entries, accessing entries, retrieving entries, and comparing entries.

Answer 1: map, set, multimap, and multiset are implemented using a red-black tree, a type of balanced binary search tree. They have the following asymptotic run times:

- Insertion: O(log n)
- Lookup: O(log n)
- Deletion: O(log n)

hash_map, hash_set, hash_multimap, and hash_multiset are implemented using hash tables. …
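Python's standard library has no red-black tree, but a sorted list with the `bisect` module gives the same O(log n) lookup as std::set, which makes the logarithmic behavior easy to demonstrate (this sketch is an analog, not the STL; note that list insertion itself is O(n) due to element shifting):

```python
import bisect

data = []
for value in [41, 7, 19, 3, 28]:
    bisect.insort(data, value)          # keeps the list sorted

assert data == [3, 7, 19, 28, 41]

def contains(sorted_list, x):
    i = bisect.bisect_left(sorted_list, x)   # O(log n) binary search
    return i < len(sorted_list) and sorted_list[i] == x

assert contains(data, 19)
assert not contains(data, 20)
```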

Time complexity of Python set operations?

Submitted by 假如想象 on 2019-11-26 08:11:55

Question: What is the time complexity of each of Python's set operations, in big-O notation? I am using Python's set type for an operation on a large number of items, and I want to know how each operation's performance will be affected by the size of the set. For example, add and the test for membership:

```python
myset = set()
myset.add('foo')
'foo' in myset
```

Googling around hasn't turned up any resources, but it seems reasonable that the time complexity for Python's set implementation would have been …
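For reference (standard CPython behavior, not quoted from the excerpt above): sets are hash-based, so `add` and membership tests are average O(1), while union and intersection cost roughly the combined sizes of the operands:

```python
# add: average O(1); `in`: average O(1); union: O(len(s) + len(t)).
myset = set()
myset.add('foo')
myset.add('bar')

assert 'foo' in myset          # average O(1), independent of len(myset)
assert 'baz' not in myset
assert myset | {'baz'} == {'foo', 'bar', 'baz'}   # union copies both operands
```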

Is there anything that guarantees constant time for accessing a property of an object in JavaScript?

Submitted by 风流意气都作罢 on 2019-11-26 07:49:52

Question: This is in regard to a debate I had with an interviewer when I was interviewing at Amazon. Let's say I create an object:

```javascript
var Obj = {};
Obj['SomeProperty'] = function () {
    console.log("Accessed some property");
};
Obj[69] = true;
```

Is there anything in JavaScript guaranteeing that when I subsequently access those two properties, Obj['SomeProperty'] and Obj[69], the respective values are looked up in O(1) time? I know …
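The ECMAScript specification does not mandate any lookup complexity; engines typically back plain objects with hash maps or hidden classes that deliver average O(1) in practice. Python's dict makes that hash-map behavior explicit, so a rough analog of the question's object (an illustration, not JavaScript semantics) looks like this:

```python
# String and integer keys both hash to a bucket in average O(1).
obj = {}
obj['SomeProperty'] = lambda: print("Accessed some property")
obj[69] = True

assert callable(obj['SomeProperty'])
assert obj[69] is True
```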

Time complexity of accessing a Python dict

Submitted by 和自甴很熟 on 2019-11-26 07:36:53

Question: I am writing a simple Python program that seems to suffer from linear access to dictionaries: its run time grows exponentially even though the algorithm is quadratic. I use a dictionary to memoize values, and that seems to be the bottleneck. The values I'm hashing are tuples of points. Each point is (x, y) with 0 <= x, y <= 50. Each key in the dictionary is a tuple of 2-5 points: ((x1,y1),(x2,y2),(x3,y3),(x4,y4)). The keys are read many times more often than they are written. Am I correct that …
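A minimal sketch of the memoization pattern the question describes (the `score` function here is a hypothetical stand-in for the real computation): hashing a tuple key costs time proportional to its length, and with 2-5 points per key that is a small constant, so each cache lookup is average O(1).

```python
cache = {}

def score(points):                 # stand-in for the real memoized function
    if points not in cache:        # average O(1) hash lookup on the tuple key
        cache[points] = sum(x + y for x, y in points)
    return cache[points]

key = ((1, 2), (3, 4), (5, 6))
assert score(key) == 21
assert score(key) == 21            # second call hits the cache
assert len(cache) == 1             # only one entry was computed
```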

Python dictionary keys: “in” complexity

Submitted by 馋奶兔 on 2019-11-26 07:25:17

Question: A quick question, mainly to satisfy my curiosity on the topic. I am writing some large Python programs with an SQLite database backend and will be dealing with a large number of records in the future, so I need to optimize as much as I can. For a few functions, I am searching through keys in a dictionary. I have been using the "in" keyword for prototyping and was planning on going back and optimizing those searches later, as I know the "in" keyword is generally O(n) (as this just translates …
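The premise in the question only holds for lists: `key in some_list` scans linearly, but `key in some_dict` is an average O(1) hash lookup. A small sketch of the contrast:

```python
# `in` on a dict probes one hash bucket on average; `in` on a list walks items.
d = {f"key{i}": i for i in range(100_000)}
lst = list(d)                      # same 100,000 keys, as a list

assert "key99999" in d             # hash lookup: average O(1)
assert "key99999" in lst           # list scan: up to 100,000 comparisons
assert "missing" not in d
```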

What's the Time Complexity of Average Regex algorithms?

Submitted by 六月ゝ 毕业季﹏ on 2019-11-26 07:24:14

Question: I'm not new to using regular expressions, and I understand the basic theory they're based on: finite state machines. I'm not so good at algorithmic analysis, though, and don't understand how a regex compares to, say, a basic linear search. I'm asking because on the surface it seems like a linear array search (if the regex is simple). Where could I go to learn more about implementing a regex engine?

Answer 1: This is one of the most popular outlines: Regular Expression Matching Can Be Simple And …
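The questioner's intuition is right for simple patterns: a literal regex search behaves like a linear substring scan. The catch, which the cited article explores, is that backtracking engines (including Python's `re`) can go exponential on pathological patterns, while Thompson-NFA engines stay linear. A small illustration of both points:

```python
import re

# A literal pattern finds the same index as a plain substring search.
text = "the quick brown fox"
m = re.search("brown", text)
assert m is not None
assert m.start() == text.find("brown") == 10

# Catastrophic backtracking: (a+)+b tries exponentially many splits before
# failing. n is kept small here so the demonstration stays fast.
assert re.match(r"(a+)+b", "a" * 15) is None
```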

What would cause an algorithm to have O(log log n) complexity?

Submitted by 柔情痞子 on 2019-11-26 06:54:47

Question: This earlier question addresses some of the factors that might cause an algorithm to have O(log n) complexity. What would cause an algorithm to have time complexity O(log log n)?

Answer 1: O(log log n) terms can show up in a variety of different places, but there are typically two main routes that arrive at this runtime.

Shrinking by a square root: as mentioned in the answer to the linked question, a common way for an algorithm to have time complexity O(log n) is for that algorithm to work by …
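The "shrinking by a square root" route can be made concrete: if each round reduces the input from n to sqrt(n), the exponent of n halves every iteration, so the loop runs O(log log n) times. A small sketch (illustrative, not from the original answer):

```python
import math

def rounds_until_small(n):
    count = 0
    while n > 2:
        n = math.isqrt(n)      # n -> sqrt(n): the exponent of n halves
        count += 1
    return count

# 2^32 -> 2^16 -> 2^8 -> 2^4 -> 2^2 -> 2: five rounds, i.e. log2(log2(n)).
assert rounds_until_small(2 ** 32) == 5
assert rounds_until_small(2 ** 32) == round(math.log2(math.log2(2 ** 32)))
```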

What is the best way to get the minimum or maximum value from an Array of numbers?

Submitted by 回眸只為那壹抹淺笑 on 2019-11-26 06:32:09

Question: Let's say I have an array of numbers: [2,3,3,4,2,2,5,6,7,2]. What is the best way to find the minimum or maximum value in that array? Right now, to get the maximum, I am looping through the array and resetting a variable to each value that is greater than the existing maximum:

```actionscript
var myArray:Array /* of Number */ = [2,3,3,4,2,2,5,6,7,2];
var maxValue:Number = 0;
for each (var num:Number in myArray) {
    if (num > maxValue)
        maxValue = num;
}
```

This just doesn't seem like the best-performing way to do …
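A single O(n) pass is already optimal for an unsorted array, so the loop above is fine asymptotically. Two refinements worth noting, sketched here in Python: seed the running extreme from the first element rather than 0 (initializing to 0 silently fails on all-negative inputs), and both extremes can be found in the same pass.

```python
nums = [2, 3, 3, 4, 2, 2, 5, 6, 7, 2]
assert max(nums) == 7              # the built-ins do a single linear scan
assert min(nums) == 2

def min_max(values):               # one pass, both extremes
    lo = hi = values[0]            # seed from the data, not from 0
    for v in values[1:]:
        if v < lo:
            lo = v
        elif v > hi:
            hi = v
    return lo, hi

assert min_max(nums) == (2, 7)
assert min_max([-5, -2, -9]) == (-9, -2)   # correct for all-negative input
```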

What's the fastest algorithm for sorting a linked list?

Submitted by ☆樱花仙子☆ on 2019-11-26 05:55:08

Question: I'm curious whether O(n log n) is the best a linked list can do.

Answer 1: It is reasonable to expect that you cannot do any better than O(N log N) in running time. However, the interesting part is to investigate whether you can sort it in place, stably, what its worst-case behavior is, and so on. Simon Tatham, of PuTTY fame, explains how to sort a linked list with merge sort. He concludes with the following comments: Like any self-respecting sort algorithm, this has running time O(N log N). Because this is …
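A minimal recursive merge sort on a singly linked list, in the spirit of the algorithm the answer cites (this sketch is not Tatham's code): split at the midpoint with a slow/fast pointer, sort each half, then merge stably.

```python
class Node:
    def __init__(self, value, nxt=None):
        self.value, self.next = value, nxt

def merge_sort(head):
    if head is None or head.next is None:
        return head
    # Split: `fast` advances two steps per `slow` step to find the middle.
    slow, fast = head, head.next
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
    mid, slow.next = slow.next, None
    left, right = merge_sort(head), merge_sort(mid)
    # Merge the sorted halves (stable: ties keep the left element first).
    dummy = tail = Node(None)
    while left and right:
        if left.value <= right.value:
            tail.next, left = left, left.next
        else:
            tail.next, right = right, right.next
        tail = tail.next
    tail.next = left or right
    return dummy.next

def from_list(xs):                 # helpers for building/reading a list
    head = None
    for x in reversed(xs):
        head = Node(x, head)
    return head

def to_list(head):
    out = []
    while head:
        out.append(head.value)
        head = head.next
    return out

sorted_head = merge_sort(from_list([3, 1, 4, 1, 5, 9, 2, 6]))
assert to_list(sorted_head) == [1, 1, 2, 3, 4, 5, 6, 9]
```

Unlike merge sort on arrays, this version needs no O(n) auxiliary buffer: relinking nodes does the merge in place, which is one reason merge sort suits linked lists so well.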