This gives the expected result:
from numpy import random

x = random.rand(1) + random.rand(1)*1j  # complex128: real part + imaginary part
print(x.dtype)
print(x, x.real, x.imag)
and this works as expected (the dtype comes out as complex128).
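Note that rand(1) returns a one-element array rather than a Python scalar, which is what forces the indexing below; a quick check (just a sketch restating the snippet above):

from numpy import random

x = random.rand(1) + random.rand(1)*1j
print(x.shape)  # (1,) -- a one-element complex array, not a scalar
print(x[0])     # indexing into x pulls out a single complex value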
To insert the complex x (or x plus something) into C, you apparently need to treat it as the one-element array it is, so either index into x or assign it to a slice of C:
>>> C
array([[ 0.+0.j,  0.+0.j],
       [ 0.+0.j,  0.+0.j]])
>>> C[0, 0:1] = x
>>> C
array([[ 0.47229555+0.7957525j,  0.00000000+0.j       ],
       [ 0.00000000+0.j       ,  0.00000000+0.j       ]])
>>> C[1, 1] = x[0] + 1+1j
>>> C
array([[ 0.47229555+0.7957525j,  0.00000000+0.j       ],
       [ 0.00000000+0.j       ,  1.47229555+1.7957525j]])
It looks like NumPy isn't handling this case correctly. Consider submitting a bug report.
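In the meantime, if you would rather assign to a single element without slicing, you can pull the scalar out of x first. Here is a minimal sketch, assuming C is the 2x2 complex zero array from the session above and using NumPy's ndarray.item():

import numpy as np
from numpy import random

C = np.zeros((2, 2), dtype=complex)
x = random.rand(1) + random.rand(1)*1j

# item() returns the lone value as a plain Python complex,
# so ordinary element assignment works without slicing.
C[0, 0] = x.item()
C[1, 1] = x.item() + 1 + 1j
print(C)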