I am running a test but I want to run two functions at the same time. I have a camera and I am telling it to move via suds; I am then logging into the camera via SSH to check the
In CPython, only one thread can execute Python bytecode at any given time, because of the Global Interpreter Lock. This has been answered extensively here. One solution would be to use two separate processes. The above answer provides some tips.
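For example, a minimal sketch of the two-process approach; move_camera and check_speed are hypothetical stand-ins for your actual suds call and SSH check:

from multiprocessing import Process

def move_camera():
    pass    # hypothetical stand-in for the suds call that moves the camera

def check_speed():
    pass    # hypothetical stand-in for the SSH check

if __name__ == '__main__':
    mover = Process(target=move_camera)
    mover.start()      # the movement runs in a child process
    check_speed()      # the check runs in the parent at the same time
    mover.join()       # wait for the movement to finish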
Import the threading module and run SudsMove() like so:

threading.Thread(target=SudsMove).start()
That will create and start a background thread which does the movement.
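If the test must not finish before the movement does, keep a reference to the thread and join it; a sketch, where SudsMove stands for your existing move function:

import threading

mover = threading.Thread(target=SudsMove)
mover.start()
# ... check the camera over SSH here while it moves ...
mover.join()    # block until the movement call has returned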
ANSWER TO EDITED QUESTION:
As far as I understand this, TestAbsoluteMove.Ssh(self) polls the speed once and stores the result in self.Value?! And you're testing the expected end tilt/rotation/position with self.assertEqual(self.Value, '3500')?!
If that's correct, you should wait for the camera to start its movement. You could probably poll the speed at a regular interval:
import time

# Move camera in background thread
threading.Thread(target=SudsMove).start()

# What does this do?
self.command = './ptzpanposition -c 0 -u degx10'

# Poll the current speed in an interval of 250 ms
measuredSpeedsList = []
for i in xrange(20):
    # Assuming that this call will put the result in self.Value
    TestAbsoluteMove.Ssh(self)
    measuredSpeedsList.append(self.Value)
    time.sleep(0.25)

print "Measured movement speeds:", measuredSpeedsList
The movement speed will be the biggest value in measuredSpeedsList (i.e. max(measuredSpeedsList)). Hope that makes sense...
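If self.Value comes back as a string, as the assertEqual against '3500' suggests, convert before taking the maximum; a sketch, assuming the expected peak speed is 3500:

peakSpeed = max(int(v) for v in measuredSpeedsList)
self.assertEqual(peakSpeed, 3500)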
If you can get your code to run under Jython or IronPython, then you can run several threads simultaneously; they don't have that goofy "Global Interpreter Lock" thing of CPython.
If you want to use the common Python implementation (CPython), you can certainly use the multiprocessing module, which works wonders (child processes can be killed with Process.terminate(), and on Unix forked children even inherit objects that cannot be pickled), offers an interface similar to that of threading, and does not suffer from the Global Interpreter Lock.
The downside is that subprocesses must be spawned, which takes more time than creating threads; this should only be a problem if you have many, many short tasks. Also, since data is passed between processes via serialization, large data takes a long time to transfer and ends up with a large memory footprint (as it is duplicated in each process). In situations where each task takes a "long" time and the data in and out of each task is not too large, the multiprocessing module should be great.
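A minimal sketch of that pattern with multiprocessing.Pool; the work function and its inputs are made up for illustration, and note that both the argument chunks and the results are pickled across the process boundary:

from multiprocessing import Pool

def work(chunk):
    # a CPU-bound task; the argument is pickled to the worker,
    # and the return value is pickled back to the parent
    return sum(x * x for x in chunk)

if __name__ == '__main__':
    chunks = [range(i * 100000, (i + 1) * 100000) for i in xrange(4)]
    pool = Pool(processes=4)        # one worker per task here
    print pool.map(work, chunks)    # runs the four tasks in parallel
    pool.close()
    pool.join()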