Sending information to Nginx from PHP on the same server without HTTP

Submitted by 我与影子孤独终老i on 2019-12-04 03:07:22

I am assuming the following:

Current work flow:

  1. The user runs a PHP script from the command line, which communicates with a server-side script/CGI set up in Nginx via HTTP requests
  2. The server-side script/CGI in Nginx takes the incoming data, processes it, and either puts it in a database or sends it out to the end user

OP's concern:

Efficiency: the command-line PHP script communicates with the Nginx server-side script over the HTTP protocol, which may be overkill since the communication happens within the same server.

Proposal 1

  1. The command-line PHP script writes all the information into file(s), then sends a single HTTP request to the Nginx server-side CGI script
  2. The Nginx server-side CGI script, upon receiving the request, picks up all the information from the file(s) and processes it
  3. A ramfs (RAM disk) can be used to minimize I/O to the physical disk

Proposal 2

Combine your command-line PHP script into the Nginx server-side script and create a web interface for it. Current command-line users would log in to the web page to control the process they used to run with the command-line tool.

Pro: no more inter-script/inter-process communication; the whole workflow runs in one process. This may also be more scalable in the future, as multiple users can log in through the web interface and handle the process remotely. Additionally, they do not require OS-level accounts.

Con: May need more development time. (But you only have to maintain one code base instead of two.)

Why don't you consider using socket.io and Amazon SNS?

In our infrastructure, when we want to send a notification to a specific client subscribed to a socket.io channel, we send a payload to an Amazon SNS topic. This payload has a "channel" attribute and the "message" to send to the client. Here is a snippet from our code that is easy to understand:

$msg = array(
    'channel' => $receiver->getCometChannel(), // channel id of the client to send the message to
    'data'    => json_encode($payload)         // the message to send to the client
);
$client = $this->getSNSObject();
$client->publish(array(
    'TopicArn' => $topicArn,
    'Message'  => json_encode($msg)
));

We have a Node.js script that creates an endpoint on port 8002 (http://your_ip:8002/receive). When Amazon SNS receives a payload from the PHP backend, it forwards the payload to this endpoint, and then the only thing left to do is process the payload and send the message to the corresponding client via socket.io. Here is the Node.js script:

var fs = require('fs');

var options = {
    pfx: fs.readFileSync('/etc/ssl/certificate.pfx') // optional, for SSL support for socket.io (unused below as written)
};

var io = require('socket.io')(8001);
// open the socket connection
io.sockets.on('connection', function(socket) {
    socket.on('subscribe', function(data) { socket.join(data.channel); });
    socket.on('unsubscribe', function(data) { socket.leave(data.channel); });
    socket.on('message', function(data) {
        io.sockets.in(data.channel).emit('message', data.message);
    });
});

var http = require('http');
http.createServer(function(req, res) {
    if (req.method === 'POST' && req.url === '/receive') {
        return client(req, res); // 'client' is assigned below, before any request can arrive
    }
    res.writeHead(404);
    res.end('Not found.');
}).listen(8002);

var SNSClient = require('aws-snsclient');
var client = SNSClient(function(err, message) {
    try {
        var body = JSON.parse(message.Message);
        var channel = body.channel, data = body.data;
        console.log(channel);
        io.sockets.in(channel).emit('message', {channel: channel, data: data});
    } catch (e) {
        console.log(e);
    }
});

It may seem complicated, but the idea is clear.

Let me answer step by step:

  1. Is sending 30+ HTTP requests per PHP script run a good practice?

It's not a problem as long as you are satisfied with the speed. That solution has two potential problems:

a. the time spent re-establishing an HTTP connection for each request;
b. when concurrent requests reach their maximum, Nginx may drop some of your requests.
  2. What is the best practice to handle this kind of communication?

As you said, the best practice is to use a query interface. But I am not sure whether there is a way to handle it on the Nginx side (I am not clear on the technology you are using on the Nginx side).

You can also use a long-polling connection to send requests to Nginx, which will decrease the latency from problem (a), but it may introduce new problems of its own.

What about using PHP-FPM connected to Nginx over Unix domain sockets using the FastCGI protocol? That's the fastest way to do IPC between Nginx and PHP — there's very little IO overhead, compared to an Internet socket.
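For reference, the Unix-socket wiring described above looks roughly like this in the Nginx config; the socket path and location pattern are typical defaults, not taken from the question, and must match your PHP-FPM pool settings.

```nginx
# Hand .php requests to PHP-FPM over a Unix domain socket
location ~ \.php$ {
    include        fastcgi_params;
    fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass   unix:/run/php/php-fpm.sock;  # instead of 127.0.0.1:9000
}
```

The matching PHP-FPM pool would set `listen = /run/php/php-fpm.sock`; both sides must agree on the socket path and its permissions.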

Another solution we tried before was deploying an ejabberd server (which you can customize) and writing a small JavaScript client using the Strophe library. http://blog.wolfspelz.de/2010/09/website-chat-made-easy-with-xmpp-and.html?m=1 is a good blog post on the topic. If you want to develop a chat application, I would go with this option.

Another advantage: your users can also use XMPP clients to connect to your chat platform.
