Question
The following fragment will pick one server at a time. Is there a way to hit them all at once?
upstream backend {
    server 17.0.0.1:8000;
    server 17.0.0.1:8001;
    server 17.0.0.1:8002;
    server 17.0.0.1:8003;
}

server {
    location / {
        proxy_pass http://backend;
    }
}
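By default, nginx load-balances an upstream group with round-robin: each request is sent to exactly one backend, cycling through the list. As a minimal sketch of that selection behavior (illustration only, not nginx internals):

```python
from itertools import cycle

# Round-robin selection over the upstream group: each incoming
# request is routed to exactly one backend, in rotation.
backends = ["17.0.0.1:8000", "17.0.0.1:8001", "17.0.0.1:8002", "17.0.0.1:8003"]
pick = cycle(backends)

chosen = [next(pick) for _ in range(6)]
print(chosen)
# The first four requests hit each backend once, then the cycle repeats.
```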
Answer 1:
Here is a solution using ngx_http_mirror_module (available since nginx 1.13.4):
server {
    location / {
        proxy_pass http://17.0.0.1:8000;
        mirror /s1;
        mirror /s2;
        mirror /s3;
    }

    location /s1 { internal; proxy_pass http://17.0.0.1:8001$request_uri; }
    location /s2 { internal; proxy_pass http://17.0.0.1:8002$request_uri; }
    location /s3 { internal; proxy_pass http://17.0.0.1:8003$request_uri; }
}
nginx will:
- send the same request to all servers
- wait for all of them to finish
- respond with the http://17.0.0.1:8000 response (and ignore the others)
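Those mirror semantics can be sketched in plain Python (hypothetical handler functions stand in for the four backends; this illustrates the fan-out-and-discard behavior, not nginx internals):

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in backends: each "server" is just a function that handles a request.
def make_backend(name):
    def handler(request):
        return f"{name} handled {request}"
    return handler

primary = make_backend("17.0.0.1:8000")
mirrors = [make_backend(f"17.0.0.1:{port}") for port in (8001, 8002, 8003)]

def broadcast(request):
    # Fan the same request out to the primary and every mirror in parallel.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(h, request) for h in [primary] + mirrors]
        results = [f.result() for f in futures]  # wait for all to finish
    # Only the primary's response is returned; mirror responses are discarded.
    return results[0]

print(broadcast("GET /"))
```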
Answer 2:
If you've built nginx with the Lua module (or you're using OpenResty), the following snippet will let you broadcast a request to all servers in an upstream group.
Requests are made in parallel, and once they have all completed, the handler returns each server's HTTP status, response headers, and body.
upstream @api {
    server 10.0.0.1:80;
    server 10.0.0.2:80 backup;
}
server {
    server_name _;
    listen 443 ssl;

    location ~ ^/broadcast(?<proxy_path>/.*)$ {
        lua_need_request_body on;
        content_by_lua '
            local upstream = require "ngx.upstream"
            local servers = upstream.get_servers("@api")
            local requests = {}
            for _, srv in ipairs(servers) do
                local addr = srv.addr
                table.insert(requests, { "/proxy", {
                    method = ngx["HTTP_" .. ngx.var.request_method],
                    always_forward_body = true,
                    copy_all_vars = true,
                    vars = { proxy_host = addr },
                } })
            end
            local responses = { ngx.location.capture_multi(requests) }
            for i, res in ipairs(responses) do
                local addr = servers[i].addr
                ngx.say(addr, " HTTP/1.1 ", res.status)
                for header, value in pairs(res.header) do
                    ngx.say(header, ": ", value)
                end
                ngx.say()
                ngx.print(res.body)
                ngx.say()
            end
        ';
    }

    location /proxy {
        internal;
        proxy_pass http://$proxy_host$proxy_path$is_args$args;
    }
}
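For readers without OpenResty at hand, the fan-out-and-format logic of the Lua snippet can be approximated in Python (stub responses stand in for real upstream servers; `fetch` is a hypothetical placeholder for an actual proxied request):

```python
from concurrent.futures import ThreadPoolExecutor

# Stub upstream servers: addr -> (status, headers, body).
servers = {
    "10.0.0.1:80": (200, {"Content-Type": "text/plain"}, "ok"),
    "10.0.0.2:80": (200, {"Content-Type": "text/plain"}, "ok"),
}

def fetch(addr):
    # A real version would proxy the client's request to addr;
    # here we just return the canned response.
    status, headers, body = servers[addr]
    return addr, status, headers, body

# Issue all requests in parallel, like ngx.location.capture_multi.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(fetch, servers))

# Emit each server's status line, headers, and body in turn,
# mirroring the ngx.say/ngx.print loop above.
lines = []
for addr, status, headers, body in results:
    lines.append(f"{addr} HTTP/1.1 {status}")
    for header, value in headers.items():
        lines.append(f"{header}: {value}")
    lines.append("")
    lines.append(body)
output = "\n".join(lines)
print(output)
```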
Answer 3:
I ran into a similar situation today and wanted to broadcast a GET to several of my servers, each exposing an API endpoint that triggers a git pull. I couldn't find anything in nginx to handle this sort of behavior, and I didn't want to jump to writing an extension just yet.
Instead, I hacked together something using Python's Flask and Requests modules:
from flask import Flask, jsonify, request
import requests

app = Flask(__name__)

# add whatever methods you need here
@app.route("/http/replay/<method>", methods=["POST", "GET", "PUT", "OPTIONS"])
def http_replay(method):
    """Replay an incoming request of type <method> against the parameter list of endpoints."""
    endpoint_list = request.args.get("host", "").split(";")
    endpoint_list = [ep for ep in endpoint_list if ep]  # drop empty entries
    timeout = float(request.args.get("timeout", 5))
    if not endpoint_list:
        return jsonify(status=500,
                       message="Expected parameters in the form of ?host=http://host/path;host=http://host2/path")
    responses = []
    for ep in endpoint_list:
        try:
            _r = getattr(requests, method.lower())(ep, timeout=timeout)
            status_code = _r.status_code
        except requests.exceptions.Timeout:
            status_code = 408  # request timeout
        except requests.exceptions.RequestException:
            status_code = 520
        responses.append(status_code)
    # 200 only if every endpoint returned 200
    status = 200 if set(responses) == {200} else 520
    return jsonify(status=status)
This simply listens on a path and allows me to send a request, such as:
curl "http://localhost:5000/plugins/http/replay/POST?host=http://localhost:80/api;host=http://dev.localhost:8080/api?timeout=10"
and it returns an HTTP 200 status code only if every endpoint listed in the URL parameters also returned 200.
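The success rule in that handler boils down to a one-line aggregation, shown here in isolation (the `aggregate` helper is mine, not part of the original snippet):

```python
# All-or-nothing status aggregation: the replay reports success (200)
# only when every replayed endpoint itself returned 200; any timeout
# (408) or failure (520) downgrades the whole broadcast to 520.
def aggregate(status_codes):
    return 200 if set(status_codes) == {200} else 520

print(aggregate([200, 200, 200]))  # 200
print(aggregate([200, 408]))       # 520
```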
Source: https://stackoverflow.com/questions/22948017/is-there-a-way-to-configure-nginx-to-broadcast-incoming-requests-to-multiple-ups