First of all, I'm sorry if this is too silly a question... this is the first time I'm trying to use any of the technologies involved in this script (Python, the Drive API,
I filed an issue on the google-api-python-client project, and according to Joe Gregorio from Google, the problem is in the backend:
"This is an issue with the backend and not with the API or with your code. As you deduced, if the upload goes too long the access_token expires and at that point the resumable upload can't be continued. There is work on progress to fix this issue right now, I will update this bug once the issue is fixed on the server side."
I assume the problem is that after the 1-2 hour limit your access token expires, cutting off your connection with the remote server. Look at your host's API manual: it should have something about 'refresh tokens', which get you another access token (note that some hosts only allow one refresh token per session). If an unlimited number is allowed, you can use a combination of a timer and AJAX to keep asking for fresh access tokens before the current one expires.
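In Google's case that exchange is the standard OAuth 2.0 refresh-token grant. A minimal sketch using the `requests` library, assuming you kept the `client_id`, `client_secret`, and `refresh_token` from your initial authorization flow:

```python
import requests

def refresh_access_token(client_id, client_secret, refresh_token):
    # Standard OAuth 2.0 refresh-token grant against Google's token endpoint.
    resp = requests.post('https://accounts.google.com/o/oauth2/token', data={
        'client_id': client_id,
        'client_secret': client_secret,
        'refresh_token': refresh_token,
        'grant_type': 'refresh_token',
    })
    resp.raise_for_status()
    # Google access tokens are typically valid for about an hour.
    return resp.json()['access_token']
```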
If not, then you would have to make an AJAX request for another authorization token and exchange that for another access token every hour. That sounds like a tedious process, but I think it is the only way if your token keeps expiring.
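Since this script runs server-side rather than in a browser, the Python equivalent is to refresh directly instead of via AJAX. A rough sketch, assuming oauth2client-style `credentials` from the initial flow and the google-api-python-client (`backup.bin` is a placeholder file name); note that per the quoted bug, the backend may still reject the resumed upload even with a fresh token until the server-side fix lands:

```python
import httplib2
from apiclient.discovery import build
from apiclient.http import MediaFileUpload
from apiclient.errors import HttpError

def upload_with_token_refresh(credentials, filename='backup.bin'):
    http = credentials.authorize(httplib2.Http())
    service = build('drive', 'v2', http=http)

    media = MediaFileUpload(filename, resumable=True)
    request = service.files().insert(body={'title': filename}, media_body=media)

    response = None
    while response is None:
        try:
            # next_chunk() resumes the upload from wherever it left off.
            status, response = request.next_chunk()
            if status:
                print('Uploaded %.2f%%' % (status.progress() * 100))
        except HttpError as e:
            if e.resp.status == 401:
                # Access token expired mid-upload: refresh and retry the chunk.
                credentials.refresh(httplib2.Http())
            else:
                raise
    return response
```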
Also, just on another note, have you tried other methods of uploading? You said the above script ran for 1-2 hours and only uploaded 1.44% of the file; at that rate (100 / 1.44 ≈ 69 times the elapsed time) the full upload would take 100+ hours, which is way too long for only 3 GB.
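If you stay with resumable uploads, one concrete knob worth trying is a larger chunk size, so each request carries more data and there is less per-chunk overhead (just a suggestion, not your exact setup; `backup.bin` is again a placeholder):

```python
from apiclient.http import MediaFileUpload

# chunksize must be a multiple of 256 KB, or -1 to send the whole
# file in a single request.
media = MediaFileUpload('backup.bin',
                        chunksize=16 * 1024 * 1024,  # 16 MB per chunk
                        resumable=True)
```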