Ansible: copying one unique file to each server in a group


For example, each host can look up its own position in the play's host list, zero-pad it, and copy the matching file:

  tasks:
    # Compute this host's zero-padded index within the play
    # (newer Ansible versions call this variable ansible_play_hosts).
    - set_fact:
        padded_host_index: "{{ '{0:03d}'.format(play_hosts.index(inventory_hostname)) }}"

    # Each host then copies only the split file matching its index.
    - copy:
        src: "/mine/split_{{ padded_host_index }}.xz"
        dest: /data/
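With three hosts in the play, the first host copies /mine/split_000.xz, the second /mine/split_001.xz, and so on. Note this assumes the data has already been split into exactly as many files as there are hosts in the play.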

You can do this with Ansible. However, this seems like the wrong general approach to me.

You have a number of jobs. You need them each to be processed, and you don't care which server processes which job as long as they only process each job once (and ideally do the whole batch as efficiently as possible). This is precisely the situation a distributed queueing system is designed to work in.

You'll have workers running on each server and one master node (which may run on one of the servers) that knows about all of the workers. When you have tasks that need doing, you queue them with the master, and the master hands them to workers as they become free - so the number of servers no longer has to match the number of jobs.

There are many, many options for this, including beanstalkd, Celery, Gearman, and SQS. You'll have to do the legwork to find out which one works best for your situation. But this is definitely the architecture best suited to your problem.
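For instance, with Celery the pattern might look like the sketch below. This is a minimal illustration, not the asker's setup: the module name (splits), the Redis broker URL, the job count, and the file paths are all assumptions.

    from celery import Celery

    # Broker URL is an assumption; any Celery-supported broker works.
    app = Celery("splits", broker="redis://localhost:6379/0")

    @app.task
    def process_split(path):
        # Decompress and process one split file on whichever
        # worker picked this job off the queue.
        print(f"processing {path}")

    if __name__ == "__main__":
        # Producer: queue one job per file; the broker hands each
        # job to the next free worker, so servers need not match
        # jobs one-to-one.
        for i in range(10):
            process_split.delay(f"/mine/split_{i:03d}.xz")

You'd start a worker with `celery -A splits worker` on each server, then run the producer once from anywhere; each split file gets processed exactly once, by whichever worker is free first.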
