Kubernetes automatically creates a load balancer for each Service on GCE. How can I do something similar on AWS?
A Kubernetes Service basically uses kube-proxy to route traffic to its pods; to put a cloud load balancer in front of it on AWS, set the Service's type to LoadBalancer.
Minimal example:
kind: Service
apiVersion: v1
metadata:
  name: my-service
spec:
  type: LoadBalancer
  selector:
    app: MyApp
  ports:
  - protocol: TCP
    port: 80
    targetPort: 9376
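Once the Service is created, the AWS cloud provider provisions an ELB and publishes its DNS name in the Service's status. A quick way to check it from kubectl (assuming the manifest above is saved as my-service.yaml, a hypothetical filename):

kubectl create -f my-service.yaml
kubectl get service my-service        # the ELB hostname appears under EXTERNAL-IP once provisioning finishes
kubectl describe service my-service   # the same hostname is shown as "LoadBalancer Ingress"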
The relevant docs: the Type LoadBalancer section of the Kubernetes Services documentation.
As of writing, the best way to learn about all the service.beta.kubernetes.io annotations is to read the source code of the AWS cloud provider in the Kubernetes repository.
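To give a flavour of what those annotations do, here is a hedged sketch of a Service metadata block using two of them; the keys exist in the in-tree AWS provider, but the values are illustrative, so verify them against the source for your Kubernetes version:

metadata:
  name: my-service
  annotations:
    # provision an internal (VPC-only) ELB instead of an internet-facing one
    service.beta.kubernetes.io/aws-load-balancer-internal: "true"
    # idle connection timeout on the ELB, in seconds
    service.beta.kubernetes.io/aws-load-balancer-connection-idle-timeout: "60"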
For the controller to be able to manage the ELB, it needs the appropriate permissions in the master instances' IAM role, e.g.:
...
{
  "Action": "elasticloadbalancing:*",
  "Resource": "*",
  "Effect": "Allow"
},
{
  "Action": [
    "ecr:GetAuthorizationToken",
    "ecr:BatchCheckLayerAvailability",
    "ecr:GetDownloadUrlForLayer",
    "ecr:GetRepositoryPolicy",
    "ecr:DescribeRepositories",
    "ecr:ListImages",
    "ecr:BatchGetImage"
  ],
  "Resource": "*",
  "Effect": "Allow"
},
...
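If the role is missing these statements, one way to attach them is with the AWS CLI. This is only a sketch; the role name, policy name, and file path below are hypothetical, and master-policy.json would contain the full policy document the snippet above is taken from:

aws iam put-role-policy \
  --role-name kubernetes-master \
  --policy-name kubernetes-cloud-provider \
  --policy-document file://master-policy.json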
The cloud provider should be set with --cloud-provider=aws on the kube-apiserver (and typically on the kube-controller-manager and kubelets as well).
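Where exactly that flag goes depends on how the cluster was installed (static pod manifests, systemd units, or an installer's config). A rough sketch of the relevant command lines, with all other flags omitted:

kube-apiserver --cloud-provider=aws ...
kube-controller-manager --cloud-provider=aws ...
kubelet --cloud-provider=aws ...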
In your Service definition, set the type field to LoadBalancer, and Kubernetes will automatically create an AWS Elastic Load Balancer for you if you're running on AWS. This feature works on GCE/GKE, AWS, and OpenStack.
For a working example, check out guestbook-go.