Python3 CGI HTTPS server fails on Unix

Submitted by 拈花ヽ惹草 on 2019-12-01 10:35:18

Question


This Python 3 CGI HTTPS server used to work a few weeks (or months) ago, but it no longer works under Linux (Ubuntu). I tried Ubuntu 10.04 and Ubuntu 14.04, and the behavior is the same.

Now when I try to access any CGI script I am getting:

Secure Connection Failed

An error occurred during a connection to 127.0.0.1:4443. SSL received a record that exceeded the maximum permissible length. (Error code: ssl_error_rx_record_too_long) 

Below is the code for the server:

import http.server
import ssl
import os

server_address = ('', 4443)             # listen on all interfaces, port 4443
cert = os.path.abspath('./server.pem')  # certificate + private key in one PEM file

handler = http.server.CGIHTTPRequestHandler
handler.cgi_directories = ['/cgi-bin']

httpd = http.server.HTTPServer(server_address, handler)
# Wrap the listening socket so every connection is served over TLS.
httpd.socket = ssl.wrap_socket(httpd.socket, server_side=True,
                               certfile=cert)

print("Server started...")
httpd.serve_forever()

The server logs the following error:

File "/usr/lib/python3.4/ssl.py", line 618, in read
v = self._sslobj.read(len, buffer)
ssl.SSLError: [SSL: SSLV3_ALERT_UNEXPECTED_MESSAGE] sslv3 alert unexpected message (_ssl.c:1767)

This works if I disable SSL, and it works fine on Windows with SSL. Tested with Python 3.4. The strange thing is that this worked a few months back. Can anyone get this script (or any Python 3 CGI HTTPS server) to run on an updated Linux system?


Answer 1:


I found the answer at:
http://www.castro.aus.net/~maurice/OddsAndEnds/blog/files/d2baf24c48b972f18836cac7a27734e2-35.html

The solution is to add:

http.server.CGIHTTPRequestHandler.have_fork = False  # force the use of a subprocess

before starting the server.

This is required for the Mac and Unix implementations because, for efficiency, they use fork() to start the process that executes the CGI script rather than creating a subprocess as other implementations (e.g. Windows) do. With an unwrapped socket the fork works fine and the output is sent to the socket correctly; when the socket is SSL-wrapped, however, things go terribly wrong.

Forcing the Unix and Mac implementations to use a subprocess keeps the SSL socket working: the Python server itself reads the CGI script's output and relays it to the client, encrypting it over the SSL connection as it goes.
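For reference, here is the question's server with that one line added (a minimal sketch, assuming the same port and server.pem path as above):

import http.server
import ssl
import os

server_address = ('', 4443)
cert = os.path.abspath('./server.pem')

handler = http.server.CGIHTTPRequestHandler
handler.cgi_directories = ['/cgi-bin']
# Run CGI scripts via subprocess instead of os.fork(), so their output is
# relayed through the SSL-wrapped socket instead of the raw descriptor.
handler.have_fork = False

httpd = http.server.HTTPServer(server_address, handler)
httpd.socket = ssl.wrap_socket(httpd.socket, server_side=True,
                               certfile=cert)

print("Server started...")
httpd.serve_forever()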

I still have no clue why this used to work!




Answer 2:


Although the OP already found the solution, here are a few more details on why it behaves that way:

  • Plain sockets live entirely in the kernel, but SSL-wrapped sockets add an extra user-space layer on top.
  • http.server forks (on platforms that support fork, i.e. not on Windows) and remaps the socket's file descriptor to stdin/stdout before finally executing the CGI program. The executed program therefore works on the plain (kernel-only, no SSL) file descriptors.
  • All writes of the CGI program thus go directly to the kernel socket as plain, unencrypted data (see the sketch after this list).
  • The peer chokes on this plain data because it expects SSL records. The exact error depends on the data it receives, e.g. ssl_error_rx_record_too_long or "wrong version number".
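The effect is easy to see outside of http.server. Below is a small client-side sketch (an illustration only, not the http.server internals; example.com is just a placeholder host): writes through the SSLSocket object are encrypted in user space, while a write to the underlying file descriptor would go straight to the kernel socket unencrypted, which is what the forked CGI child ends up doing after the descriptor remapping.

import os
import socket
import ssl

# The TLS layer lives in user space, inside the SSLSocket object.
ctx = ssl.create_default_context()
raw = socket.create_connection(('example.com', 443))
tls = ctx.wrap_socket(raw, server_hostname='example.com')

# Goes through the SSLSocket: encrypted into TLS records before reaching the kernel.
tls.sendall(b'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n')

# Writing to the raw file descriptor would bypass the SSLSocket entirely:
# the peer receives plain bytes where it expects a TLS record and aborts.
# os.write(tls.fileno(), b'plain, unencrypted bytes')

tls.close()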


Source: https://stackoverflow.com/questions/27303343/python3-cgi-https-server-fails-on-unix
