I am working on a project which needs a bit of automation and web scraping, for which I am using Selenium and BeautifulSoup (Python 2.7).
If you are using the script to automatically submit forms (simply put, doing GET and POST requests), I would recommend you look at requests. You can easily capture POST requests from your browser (the Network tab of the developer tools in both Firefox and Chrome) and replay them. Something like:
import requests
from bs4 import BeautifulSoup

session = requests.Session()  # a Session keeps cookies across requests
response = session.get('https://stackoverflow.com/')
soup = BeautifulSoup(response.text, 'html.parser')
and even POST data like:
postdata = {'username': 'John', 'password': password}
response = session.post('https://example.com', data=postdata, allow_redirects=True)
It can easily be threaded and is many times faster than using Selenium; the only problem is that there is no JavaScript or form support, so you have to handle forms the old-fashioned way (see the sketch below).
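By "the old-fashioned way" I mean parsing the form out of the HTML yourself and copying any hidden fields (CSRF tokens and the like) into your POST data. A minimal sketch, assuming a hypothetical login page at https://example.com/login with hidden inputs:

import requests
from bs4 import BeautifulSoup

session = requests.Session()

# Fetch the page that contains the form (hypothetical URL).
login_page = session.get('https://example.com/login')
soup = BeautifulSoup(login_page.text, 'html.parser')

# Copy every hidden input so the server gets the values it expects.
form = soup.find('form')
postdata = {field['name']: field.get('value', '')
            for field in form.find_all('input', type='hidden')}

postdata['username'] = 'John'
postdata['password'] = 'secret'

response = session.post('https://example.com/login', data=postdata,
                        allow_redirects=True)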
EDIT: Also take a look at ThreadPoolExecutor (on Python 2.7 it is available through the futures backport on PyPI).
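A minimal sketch of fetching several pages in parallel with ThreadPoolExecutor; the URLs are just placeholders, and on Python 2.7 this assumes the futures backport is installed:

import requests
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Each task does a blocking GET; the pool runs several of them at once.
    return requests.get(url).text

# Placeholder URLs; replace with whatever pages you need to scrape.
urls = ['https://stackoverflow.com/questions?page=%d' % n for n in range(1, 5)]

with ThreadPoolExecutor(max_workers=4) as pool:
    pages = list(pool.map(fetch, urls))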