Question
I'm following this tutorial and adjusting the Celery background-task code to fit my project.
In my case I am operating in a Docker environment, and I have a secured site (i.e. https://localhost).
I adjusted the code for a secure connection as follows:
import ssl

from celery import Celery

key_file = '/etc/nginx/ssl/localhost.key'
cert_file = '/etc/nginx/ssl/localhost.crt'
ca_file = '/etc/nginx/ssl/localhost.ca.crt'

app.config['CELERY_BROKER_URL'] = 'rediss://redis:6380/0'
app.config['CELERY_RESULT_BACKEND'] = 'rediss://redis:6380/0'

def make_celery(app):
    """Setup celery."""
    celery = Celery(app.import_name,
                    backend=app.config['CELERY_RESULT_BACKEND'],
                    broker=app.config['CELERY_BROKER_URL'],
                    # TLS options for the broker connection
                    broker_use_ssl={
                        'ssl_keyfile': key_file,
                        'ssl_certfile': cert_file,
                        'ssl_ca_certs': ca_file,
                        'ssl_cert_reqs': ssl.CERT_REQUIRED
                    },
                    # TLS options for the result backend connection
                    redis_backend_use_ssl={
                        'ssl_keyfile': key_file,
                        'ssl_certfile': cert_file,
                        'ssl_ca_certs': ca_file,
                        'ssl_cert_reqs': ssl.CERT_REQUIRED
                    })
    return celery
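Before wiring this into Celery, it can help to confirm that the Redis broker actually accepts a TLS connection with these certificates. The following is only a debugging sketch using redis-py, not part of the original code; it assumes the same host redis, port 6380, and file paths as above:

import ssl

import redis  # redis-py, pulled in by Celery's Redis transport

# Hypothetical standalone check mirroring the broker_use_ssl settings above.
r = redis.Redis(
    host='redis',
    port=6380,
    ssl=True,
    ssl_keyfile='/etc/nginx/ssl/localhost.key',
    ssl_certfile='/etc/nginx/ssl/localhost.crt',
    ssl_ca_certs='/etc/nginx/ssl/localhost.ca.crt',
    ssl_cert_reqs=ssl.CERT_REQUIRED,
)
print(r.ping())  # True if the TLS handshake and connection succeed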
My docker-compose file looks like this:
version: '3'

services:

  web:
    restart: always
    build:
      context: ./web
      dockerfile: Dockerfile
    expose:
      - "8000"
    volumes:
      - /home/avner/avner/constructionOverlay/code/meshlabjs/branches/meshlabjs_avnerV1/webServer/web:/home/flask/app/web
      - data2:/home/flask/app/web/project/avner/img
    command: /usr/local/bin/gunicorn -w 2 -t 3600 -b :8000 project:app
    depends_on:
      - postgres
    stdin_open: true
    tty: true

  nginx:
    restart: always
    build:
      context: ./nginx
      dockerfile: Dockerfile
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - /home/avner/avner/constructionOverlay/code/meshlabjs/branches/meshlabjs_avnerV1/webServer/web:/home/flask/app/web
      - data2:/home/flask/app/web/project/avner/img
    depends_on:
      - web

  postgres:
    restart: always
    build:
      context: ./postgresql
      dockerfile: Dockerfile
    volumes:
      - data1:/var/lib/postgresql/data
    expose:
      - "5432"

  redis:
    container_name: redis
    hostname: redis
    image: "redis:alpine"
    command: --port 6380
    restart: always
    expose:
      - '6380'
    ports:
      - "6380:6380"
    volumes:
      - /home/avner/avner/constructionOverlay/code/meshlabjs/branches/meshlabjs_avnerV1/webServer/redis/redis.conf:/usr/local/etc/redis/redis.conf

  celery:
    build:
      context: ./web
    command: watchmedo auto-restart --directory=./ --pattern=*.py --recursive -- celery worker -A project.celery --loglevel=info
    volumes:
      - /home/avner/avner/constructionOverlay/code/meshlabjs/branches/meshlabjs_avnerV1/webServer/web:/home/flask/app/web
      - /home/avner/avner/constructionOverlay/code/meshlabjs/branches/meshlabjs_avnerV1/webServer/nginx/ssl:/etc/nginx/ssl
      - data2:/home/flask/app/web/project/avner/img
    depends_on:
      - redis

volumes:
  data1:
  data2:
The files key_file, cert_file, and ca_file are generated with a self-signed CA, using the following steps, similar to the steps here:
- create a self-signed CA
- sign a certificate for my localhost, outside of Docker
- mount the files into the Docker containers (a quick sanity check is sketched after this list)
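To rule out problems with the generated files themselves (wrong paths, or a key that does not match the certificate), a quick check is to load them into an SSL context from inside the container. This is only a debugging sketch, not part of the original setup, using the same paths as above:

import ssl

# Hypothetical sanity check: raises FileNotFoundError if a file is not mounted,
# and SSLError if the key does not match the certificate.
ctx = ssl.create_default_context(cafile='/etc/nginx/ssl/localhost.ca.crt')
ctx.load_cert_chain(certfile='/etc/nginx/ssl/localhost.crt',
                    keyfile='/etc/nginx/ssl/localhost.key')
print('SSL key, certificate and CA loaded OK')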
With this setup in place, I don't see connection errors. The containers start OK, and the log file is quiet!
But at run time, when calling a Celery task, I get an error in my web container.
From my JavaScript I make a POST call:
let queryUrl = 'http://localhost/api/v1_2/create_zip_file3';
let fetchData = {
    method: 'POST',
};
let response = await fetch(queryUrl, fetchData);
Here is the code in the Flask web container:
@sites_api_blueprint.route('/api/v1_2/create_zip_file', methods=['POST'])
def create_zip_file():
    print('BEG create_zip_file')
    task = create_zip_file_task.delay(current_user_id=current_user.id)
    return jsonify({}), 202, {'Location': url_for('create_zip_file_taskstatus', task_id=task.id)}

@celery.task(bind=True)
def create_zip_file_task(self, current_user_id):
    print('BEG create_zip_file_task')
    # do some stuff
    # ...

# taskstatus -> create_zip_file_taskstatus
@app.route('/status/<task_id>')
def create_zip_file_taskstatus(task_id):
    # return the progress status
    # ...
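The body of the status endpoint is elided above; a typical polling implementation (this is only a sketch, not the original code) reads the task state from Celery's AsyncResult, roughly like this:

from flask import jsonify

def example_taskstatus(task_id):
    # Hypothetical version of the elided body: report the Celery task state.
    task = create_zip_file_task.AsyncResult(task_id)
    response = {'state': task.state}
    if task.state == 'FAILURE':
        response['error'] = str(task.info)
    elif isinstance(task.info, dict):
        response.update(task.info)  # progress metadata set via self.update_state()
    return jsonify(response)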
The error in the web container:
docker logs -f webserver_web_1
...
BEG create_zip_file
172.20.0.5 - - [14/Feb/2020 05:48:32] "POST /api/v1_2/create_zip_file HTTP/1.0" 500 -
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/redis/connection.py", line 492, in connect
    sock = self._connect()
  File "/usr/local/lib/python3.7/site-packages/redis/connection.py", line 742, in _connect
    keyfile=self.keyfile)
FileNotFoundError: [Errno 2] No such file or directory
What am I doing wrong?
Thanks, Avner
Answer 1:
The problem with the secure connection was that the SSL keys were not available to the web container.
I fixed it by changing docker-compose.yml
cat docker-compose.yml
...
services:
  web:
    volumes:
      - /home/webServer/nginx/ssl:/etc/nginx/ssl
...
After mounting the keys, there are no errors in the log file.
But there still seem to be problems with the connection: when triggering the task, there is no activity in the celery container.
That is a different problem, and I am tracking it separately here.
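For completeness, one way to check from the web container whether the broker connection and the worker are reachable is to ping them through the Celery app object; this is only a sketch (it assumes the celery app returned by make_celery() above):

# Hypothetical connectivity check, run from the web container.
with celery.connection() as conn:
    conn.ensure_connection(max_retries=3)  # raises if the rediss:// broker is unreachable
print(celery.control.ping(timeout=5))      # lists workers that respond; [] means no worker answered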
Source: https://stackoverflow.com/questions/60219700/cannot-make-a-secured-connection-from-celery-to-redis