Spark executors not able to access ignite nodes inside kubernetes cluster

A similar issue was discussed here.

Most likely you need to grant more permissions to the service account that is used to run Ignite.

You can do that by creating an additional ClusterRole and binding it to the service account:

apiVersion: rbac.authorization.k8s.io/v1beta1
kind: ClusterRole
metadata:
  name: ignite
rules:
- apiGroups:
  - ""
  resources: # The resources this role grants access to
  - pods
  - endpoints
  verbs: # The operations allowed on those resources
  - get
  - list
  - watch
---

apiVersion: rbac.authorization.k8s.io/v1beta1
kind: ClusterRoleBinding
metadata:
  name: ignite
roleRef:
  kind: ClusterRole
  name: ignite
  apiGroup: rbac.authorization.k8s.io
subjects:
- kind: ServiceAccount
  name: <service account name>
  namespace: default
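
Once these manifests are applied (for example with kubectl apply -f, the file name is up to you), you can check that the service account now has the needed permissions with kubectl auth can-i list pods --as=system:serviceaccount:default:<service account name>.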

Also, if your namespace is not default, you need to update it in the yaml files above and specify it in the TcpDiscoveryKubernetesIpFinder configuration.
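
For reference, here is a minimal sketch of what pointing the IP finder at a custom namespace could look like in Java; the namespace and service name values ("my-namespace", "ignite") are placeholders for your own setup:

import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi;
import org.apache.ignite.spi.discovery.tcp.ipfinder.kubernetes.TcpDiscoveryKubernetesIpFinder;

public class IgniteKubernetesConfig {
    public static IgniteConfiguration config() {
        // Point the Kubernetes IP finder at the namespace and service
        // that expose the Ignite pods (both values are placeholders).
        TcpDiscoveryKubernetesIpFinder ipFinder = new TcpDiscoveryKubernetesIpFinder();
        ipFinder.setNamespace("my-namespace");
        ipFinder.setServiceName("ignite");

        TcpDiscoverySpi discoSpi = new TcpDiscoverySpi();
        discoSpi.setIpFinder(ipFinder);

        return new IgniteConfiguration().setDiscoverySpi(discoSpi);
    }

    public static void main(String[] args) {
        // Start a node (e.g. the one embedded in a Spark executor) in client mode.
        Ignition.start(config().setClientMode(true));
    }
}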
