[SciPy-user] How to start with SciPy and NumPy
David Cournapeau
cournape@gmail....
Mon Jan 26 10:05:50 CST 2009
On Tue, Jan 27, 2009 at 12:32 AM, alex <argriffi@ncsu.edu> wrote:
> David Cournapeau wrote:
>> On Mon, Jan 26, 2009 at 11:52 PM, Sturla Molden <sturla@molden.no> wrote:
>>
>>>> Sturla Molden wrote:
>>>>
>>>> This is independent of the list implementation, isn't it? I am quite
>>>> curious to understand how you could get O(1) complexity for a "growable"
>>>> container: if you don't know in advance the number of items, and you add
>>>> O(N) items, how come you can get O(1) complexity?
>>>>
>>> Each time the array is re-sized, you add in some extra empty slots. Make
>>> sure the number of extra slots is proportional to the size of the array.
>>>
>>
>> So this is the well-known method of over-allocating when you need to
>> grow, right? This is not constant time, and depends on the number of
>> items you are adding, I think (sublinearly, but still).
>>
>>
> I think you guys are talking about different N. If N is the number of
> items already in the list, then adding a single item to the list could
> be O(N) if you use arrays to represent lists and you do not
> over-allocate when you need to grow. By over-allocating when you need
> to grow, you can get amortized O(1) for the operation of adding a single
> element (not N elements) to the list.
I meant that adding N items to a list requires O(log(N)) mallocs when
using over-allocation (doubling the size at every reallocation). I am not
sure I understand how the number of items already in the list would
influence the complexity of growing, at least when complexity =
counting the number of mallocs.
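For concreteness, here is a minimal Python sketch of the doubling strategy
being discussed (a toy illustration, not Python's actual list
implementation). It counts reallocations ("mallocs") and element copies,
showing both claims at once: appending N items triggers only about
log2(N) reallocations, while the total copying work stays under 2N, i.e.
amortized O(1) per append.

```python
class GrowList:
    """Toy growable array that doubles its capacity when full.

    Hypothetical illustration of the over-allocation strategy; it
    tracks how many allocations and element copies N appends cost.
    """

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.mallocs = 1   # count the initial allocation
        self.copies = 0    # elements moved while re-sizing

    def append(self, item):
        if self.size == self.capacity:
            # Full: allocate double the capacity and copy everything over.
            self.capacity *= 2
            new_data = [None] * self.capacity
            new_data[:self.size] = self.data
            self.data = new_data
            self.mallocs += 1
            self.copies += self.size
        self.data[self.size] = item
        self.size += 1


N = 10_000
lst = GrowList()
for i in range(N):
    lst.append(i)

print(lst.mallocs)     # only ~log2(N) reallocations (15 for N = 10000)
print(lst.copies / N)  # total copies per append stays below 2
```

Note that the geometric growth is essential: growing by a fixed number of
extra slots instead would make the total copying cost O(N^2) for N appends.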
David