Ansible wait_for module, start at end of file

Posted by 久未见 on 2019-11-29 12:24:55

The regular expression in search_regex of the wait_for module is multiline by default.
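
Because of this, ^ and $ anchor at the start and end of each line in the file, not only at the very beginning and end of its contents. As a minimal illustration (the path and the message below are made-up examples, not part of the original answer):

- name: Wait until a startup line appears anywhere in the log file
  wait_for:
    path: /var/log/myapp/app.log          # example path
    search_regex: '^Server startup complete$'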

You can register the contents of the last line and then search for the string appearing after that line (this assumes there are no duplicate lines in the log file, e.g. because each line contains a timestamp):

vars:
  log_file_to_check: <path_to_log_file>
  wanted_pattern: <pattern_to_match>

tasks:
  - name: Get the contents of the last line in {{ log_file_to_check }}
    shell: tail -n 1 {{ log_file_to_check }}
    register: tail_output

  - name: Create a variable with a meaningful name, just for clarity
    set_fact:
      last_line_of_the_log_file: "{{ tail_output.stdout }}"

  ### do some other tasks ###

  - name: Match "{{ wanted_pattern }}" appearing after "{{ last_line_of_the_log_file }}" in {{ log_file_to_check }}
    wait_for:
      path: "{{ log_file_to_check }}"
      search_regex: "{{ last_line_of_the_log_file }}\r(.*\r)*.*{{ wanted_pattern }}"

techraf's answer works only if every line in the log file is time-stamped. Otherwise, the log file may contain multiple lines that are identical to the last one.

A more robust approach is to check how many lines the log file currently has, and then search for the pattern occurring after that nth line.


vars:
  log_file: <path_to_log_file>
  pattern_to_match: <pattern_to_match>

tasks:
  - name: "Get contents of log file: {{ log_file }}"
    command: "cat {{ log_file }}"
    changed_when: false  # Do not show that state was "changed" since we are simply reading the file!
    register: cat_output

  - name: "Create variable to store line count (for clarity)"
    set_fact:
      line_count: "{{ cat_output.stdout_lines | length }}"

##### DO SOME OTHER TASKS (LIKE DEPLOYING APP) #####

  - name: "Wait until '{{ pattern_to_match}}' is found inside log file: {{ log_file }}"
    wait_for:
      path: "{{ log_file }}"
      search_regex: "^{{ pattern_to_skip_preexisting_lines }}{{ pattern_to_match }}$"
      state: present
    vars:
      pattern_to_skip_preexisting_lines: "(.*\\n){% raw %}{{% endraw %}{{ line_count }},{% raw %}}{% endraw %}"  # i.e. if line_count=100, this renders as "(.*\n){100,}"
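
The {% raw %} blocks are only there to emit the literal { and } characters around the line count. An equivalent way to build the same quantifier, if you find the raw blocks hard to read (a sketch reusing the same variables), is to produce the braces as Jinja string literals:

    vars:
      pattern_to_skip_preexisting_lines: "(.*\\n){{ '{' }}{{ line_count }},{{ '}' }}"  # also renders as "(.*\n){100,}" when line_count=100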

Actually, if you can force a log rotation on your Java app's log file, then a straightforward wait_for will achieve what you want, since there won't be any historical log lines to match.

I am using this approach with a rolling upgrade of MongoDB, waiting for "waiting for connections" in the mongod logs before proceeding.

sample tasks:

tasks:
  - name: Rotate mongod logs
    shell: kill -SIGUSR1 $(pidof mongod)
    args:
      executable: /bin/bash

  - name: Wait for mongod being ready
    wait_for:
      path: /var/log/mongodb/mongod.log
      search_regex: 'waiting for connections'
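
If mongod needs longer than wait_for's default 300-second timeout to start accepting connections (for example during a lengthy startup recovery), the timeout can be raised. A hedged variant of the last task; the 600-second value is only an example:

  - name: Wait for mongod being ready (longer timeout)
    wait_for:
      path: /var/log/mongodb/mongod.log
      search_regex: 'waiting for connections'
      timeout: 600   # seconds; the module default is 300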

One more method, using an intermediate temp file for tailing new records:

- name: Create tempfile for log tailing
  tempfile:
    state: file
  register: tempfile
- name: Asynchronously tail the log to the temp file
  shell: tail -n 0 -f /path/to/logfile > {{ tempfile.path }}
  async: 60
  poll: 0
- name: Wait for regex in log
  wait_for:
    path: "{{ tempfile.path }}"
    search_regex: 'some regex here'
- name: Remove tempfile
  file:
    path: "{{ tempfile.path }}"
    state: absent
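
The backgrounded tail should be terminated by the async watchdog once the 60-second limit expires, but the job record it leaves behind can also be cleaned up explicitly. A hedged sketch, assuming the tail task additionally registers its result as tail_job (a name introduced here for illustration):

- name: Asynchronously tail the log to the temp file
  shell: tail -n 0 -f /path/to/logfile > {{ tempfile.path }}
  async: 60
  poll: 0
  register: tail_job

- name: Clean up the async job record
  async_status:
    jid: "{{ tail_job.ansible_job_id }}"
    mode: cleanup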