load-balancing

Load balanced service serving both internal and external users GCP

戏子无情 submitted on 2020-04-14 14:08:42

Question: We are in the process of setting up a service on GCP that will serve requests both from the internet and from other services inside our VPC. We already have a global load balancer set up and want all traffic to our new service to be load-balanced as well. Is it advisable to have our internal services use the global LB address when trying to reach the new service? Or should we set up internal LBs behind the global LB for internal services to use? If we were to use the global LB for

HAProxy configuration file

六眼飞鱼酱① submitted on 2020-02-06 07:57:57

Question: I have two Spring Boot applications running on ports 9000 and 9001. I am also running HAProxy in a Docker container. My config file is as follows:

    global
    defaults
        mode http
        timeout connect 5000ms
        timeout client 5000ms
        timeout server 5000ms
    frontend http-in
        bind *:80
        acl has_web1 path_beg /web1
        acl has_web2 path_beg /web2
        use_backend web1 if has_web1
        use_backend web2 if has_web2
        default_backend web1
    backend web1
        server web1 127.0.0.1:9000 check
    backend web2
        server web2 127.0.0.1:9001 check

When
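The excerpt cuts off before the actual symptom, but one common pitfall with this exact setup (an assumption here, since the question is truncated): when HAProxy runs inside a Docker container, 127.0.0.1 refers to the container itself, so Spring Boot apps listening on the host are unreachable and the health checks fail. A hedged sketch of the backend sections using Docker Desktop's host alias:

```haproxy
backend web1
    # host.docker.internal resolves to the Docker host on Docker Desktop;
    # on Linux, use the host's IP or run the container with --network host
    server web1 host.docker.internal:9000 check

backend web2
    server web2 host.docker.internal:9001 check
```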

AWS Load Balancing a Node.js App on port 3000

跟風遠走 submitted on 2020-02-03 04:45:09

Question: I have a Node.js Express web app that uses the default port 3000 and responds fine on an Ubuntu EC2 instance via its Elastic IP. I'm trying to set up the load balancing built into AWS and can't seem to get a health check to pass. Setup:
- Two Ubuntu servers that serve the app fine on port 3000.
- Set the load balancer listener for port 80 to route to instance port 3000; also tried routing 3000 to 3000.
- Added the amazon-elb/amazon-elb-sg security group to my instance security groups just in case
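A quick way to see what the load balancer's health check sees is to probe the instance port directly. A minimal sketch in Python (the port 3000 and path "/" are taken from the question; adjust to whatever the health check actually targets):

```python
import http.client

def health_ok(host, port=3000, path="/", timeout=5):
    """Return True if a GET to host:port answers with a 2xx/3xx status,
    roughly what an ELB HTTP health check requires."""
    try:
        conn = http.client.HTTPConnection(host, port, timeout=timeout)
        conn.request("GET", path)
        status = conn.getresponse().status
        conn.close()
        return 200 <= status < 400
    except OSError:
        return False
```

If this returns True when run on the instance itself but False from elsewhere, the security group is usually the culprit: the instance's group must allow inbound TCP 3000 from the load balancer (for a classic ELB, the amazon-elb/amazon-elb-sg source group mentioned above).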

Load balancing server, how can I implement it?

半城伤御伤魂 submitted on 2020-02-01 04:51:42

Question: I googled load balancing, but the only thing I can find is the working theory, which at the moment is the "easy" part for me, and zero examples of how to implement one. I have several questions pertaining to load balancing: I have a domain (example.com), and behind it I have a load-balancing server (let's call it A) which, according to the theory, will ask the client to close the connection with A and connect to B, a sub-server, and carry on the request with B. Will the client, in a web
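The scheme described, A telling the client to carry on with B, can be prototyped as an HTTP redirect balancer: every request gets a 302 pointing at the next backend in a round-robin rotation. A minimal sketch in Python (the backend hostnames are placeholders):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from itertools import cycle

# rotate through the sub-servers; replace with real backend addresses
BACKENDS = cycle(["http://b1.example.com", "http://b2.example.com"])

class RedirectBalancer(BaseHTTPRequestHandler):
    """Answer every request with a 302 to the next backend in rotation."""
    def do_GET(self):
        target = next(BACKENDS) + self.path
        self.send_response(302)
        self.send_header("Location", target)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

# HTTPServer(("", 80), RedirectBalancer).serve_forever()  # run it (blocks)
```

Note the trade-off this question is circling: with redirects, the client really does open a new connection to B, and B's address is visible in the URL bar. A reverse proxy (HAProxy, nginx) instead keeps the client talking to A and forwards the traffic to B internally.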

Static Website Redirect HTTP to HTTPS with GCP Load Balancer

大憨熊 submitted on 2020-01-25 08:50:06

Question: I need to redirect my website from HTTP to HTTPS. I have my static website in a Google Cloud Storage bucket pointed at a load balancer with HTTP and HTTPS enabled. Example: http://ex.com => https://ex.com and http://www.ex.com => https://www.ex.com. Both https://ex.com and https://www.ex.com work just fine; I only need the HTTP redirect so that visitors reach the secure site. Since I have a static website, I hope I can handle this entirely in the load balancer. Can someone help me with this? Answer 1: At
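On GCP's external HTTP(S) load balancer, this is typically done with a redirect in the URL map attached to the HTTP (port 80) forwarding rule, so the bucket never sees plain-HTTP traffic. A sketch of such a URL map (the name is hypothetical), importable with gcloud compute url-maps import:

```yaml
kind: compute#urlMap
name: ex-lb-http-redirect
defaultUrlRedirect:
  httpsRedirect: true                              # rewrite the scheme to https
  redirectResponseCode: MOVED_PERMANENTLY_DEFAULT  # send a 301
  stripQuery: false                                # keep the query string
```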

nginx as load balancer: serve out a 404 page based on the HTTP response from the app server

无人久伴 submitted on 2020-01-25 04:55:11

Question: I'm using nginx as a load balancer with app servers behind it. If an app server returns a 404 Not Found response, I would like nginx to then serve out a 404 page that is local to that server. I would like to do this so my app servers don't get tied up serving a static HTML file and can instead just quickly return a response and let nginx handle serving the static content. Does anyone know how I can accomplish this? Basically I need some kind of conditional check based on the HTTP response.
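nginx supports exactly this with proxy_intercept_errors, which lets error_page take over when the upstream returns an error status. A minimal sketch (upstream addresses and the docroot path are placeholders):

```nginx
upstream app_servers {
    server 10.0.0.1:8080;
    server 10.0.0.2:8080;
}

server {
    listen 80;

    location / {
        proxy_pass http://app_servers;
        # hand 4xx/5xx upstream responses to error_page instead of relaying them
        proxy_intercept_errors on;
        error_page 404 /404.html;
    }

    location = /404.html {
        # serve the static error page from nginx's own disk
        root /var/www/errors;
        internal;
    }
}
```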

How do I get EC2 load balancing properly set up to allow for real time file syncing?

筅森魡賤 submitted on 2020-01-21 00:42:54

Question: I'm new to EC2. I have read a lot about it, watched many videos and tutorials, and am pretty familiar with how everything works. I still have a few questions that I can't seem to find direct answers to. If I have 3 instances (Linux) with load balancing, all serving the same site, and the site is dynamic and PHP/MySQL-driven, where users post files and forum threads every second, how are the database and files synced to all 3 instances in real time? Do I need to have the database on RDS, where every instance

HAProxy 1.4 - Don't log 2xx, only log 5xx

柔情痞子 submitted on 2020-01-14 02:22:06

Question: Good evening, I'm using HAProxy (ver 1.4.24) as a load balancer for ~3000 requests per second. I am trying to log only 5xx responses, but I am unable to achieve that. I am using the following configuration: http://pastebin.com/TsTk9GQE This configuration also logs 2xx requests, as well as 5xx and 4xx. I need only 5xx and 4xx, or just 5xx. Thanks. Answer 1: There is an option called "dontlog-normal" in HAProxy to log only errors (whatever the type of error). Just enable it in your defaults section.
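A sketch of the relevant defaults section with that option enabled (the rest of the configuration is in the linked pastebin and not reproduced here):

```haproxy
defaults
    mode http
    option httplog
    # suppress logging of "normal" (successful) requests; errors are still logged
    option dontlog-normal
```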

Load balancing requests to a Weblogic 10gR3 RMI server on Amazon EC2

﹥>﹥吖頭↗ submitted on 2020-01-13 13:10:27

Question: I am participating in the development of a distributed solution, based on RMI and deployed on multiple WebLogic 10gR3 (10.3.0.0) nodes. One of the nodes hosts an RMI server, and the other nodes access it through a foreign JNDI provider. While trying to improve our infrastructure by adding additional RMI servers, we faced some issues. Details of our infrastructure:
- The RMI server runs on a managed server, on port 7005.
- RMI clients access it through a remote JNDI provider which points to