This gives the expected result:

from numpy import random

x = random.rand(1) + random.rand(1)*1j
print x.dtype            # complex128
print x, x.real, x.imag  # the array, its real part, and its imaginary part

and this works.
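If you want a whole array of random complex values rather than a single element, the same pattern extends directly. A minimal sketch (the shape and the name z are just for illustration, not from the original answer):

from numpy import random

# 1-D array of 3 random complex numbers in the unit square
z = random.rand(3) + random.rand(3) * 1j
print(z.dtype)   # complex128
print(z.real)    # real parts, same shape as z
print(z.imag)    # imaginary parts, same shape as z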
Actually, none of the proposed solutions worked in my case (Python 2.7.6, NumPy 1.8.2).
But I found that changing the dtype from Python's built-in complex to numpy.complex_ helps:
>>> import numpy as np
>>> x = 1 + 2 * 1j
>>> C = np.zeros((2,2),dtype=np.complex_)
>>> C
array([[ 0.+0.j,  0.+0.j],
       [ 0.+0.j,  0.+0.j]])
>>> C[0,0] = 1+1j + x
>>> C
array([[ 2.+3.j,  0.+0.j],
       [ 0.+0.j,  0.+0.j]])
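For context, here is a minimal sketch (not part of the original answer) of why the dtype matters: an array with NumPy's default float64 dtype cannot hold an imaginary part, so the same assignment fails there, while the complex_ array stores the value intact.

import numpy as np

x = 1 + 2 * 1j

# Default dtype is float64, which cannot store an imaginary part,
# so assigning a complex value raises a TypeError.
F = np.zeros((2, 2))
try:
    F[0, 0] = 1 + 1j + x
except TypeError as err:
    print(err)  # e.g. "can't convert complex to float"

# With an explicit complex dtype (np.complex_ is an alias for
# np.complex128) the assignment keeps both parts.
C = np.zeros((2, 2), dtype=np.complex_)
C[0, 0] = 1 + 1j + x
print(C[0, 0])  # (2+3j)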