The implementation of delete and delete[] is composed of two phases:
- recursive calls to the destructors (if any)
- memory deallocation for the deleted object
Setting aside the chain of destructor calls, whose complexity is essentially under your control, we are left with the question of how the memory is freed.
The second phase is not covered by the C++ specification, so each compiler suite/OS is free to adopt its own strategy.
A common allocation/deallocation strategy is to request a whole memory page from the OS when needed; each new/new[] then returns a chunk of the appropriate size, whose length and attributes are stored inside the page as a header/footer. The corresponding delete/delete[] can then be as simple as marking that same chunk as "free", which is clearly O(1).
If the complexity of memory deallocation is O(1), then the complexity of a delete is essentially governed by the calls to destructors. The default implementation does (almost) nothing and is O(1) for a single call, giving an overall O(n), where n is the total number of destructor calls (e.g. if the object being destructed has two fields with destructors, then n = 1 (object) + 2 (its fields) = 3).
Putting all the pieces together: you can arbitrarily increase the complexity by performing operations in the destructor (which you can write yourself), but you cannot "perform better"¹ than O(n) (with n defined as in the previous paragraph). The formally correct way to state this is: "the complexity of delete is Ω(n)".
¹ allow me to be a bit informal on this point