
Dask

This setup follows the Dask Helm guide:

https://docs.dask.org/en/latest/setup/kubernetes-helm.html#launch-kubernetes-cluster
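Before installing anything, a quick sanity check that the cluster and Helm are both reachable (not part of the guide, just a habit):

kubectl get nodes    # the cluster answers
helm version         # Helm 3 is available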

cat extra-config.yaml
worker:
  replicas: 4
  resources:
    limits:
      cpu: 1
      memory: 0.5G
    requests:
      cpu: 1
      memory: 0.5G
  env:
    - name: EXTRA_CONDA_PACKAGES
      value: numba xarray -c conda-forge
    - name: EXTRA_PIP_PACKAGES
      value: scikit-learn matplotlib s3fs dask-ml --upgrade

# Keep the Jupyter environment in line with the workers;
# dask-kubernetes is only needed on the notebook side
jupyter:
  enabled: true
  serviceType: NodePort
  env:
    - name: EXTRA_CONDA_PACKAGES
      value: numba xarray matplotlib -c conda-forge
    - name: EXTRA_PIP_PACKAGES
      value: dask-kubernetes s3fs dask-ml --upgrade
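Both variables are consumed by the daskdev/dask image's startup script, which conda- and pip-installs them every time a container boots, so worker and Jupyter pods take noticeably longer to become ready, and any upgrade that changes these values restarts the pods.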

Install Dask on Kubernetes
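The chart lives in the Dask Helm repository; if it isn't registered yet:

helm repo add dask https://helm.dask.org
helm repo update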

# install the release with the custom config
helm install k3sdask dask/dask -f extra-config.yaml
# after editing extra-config.yaml, roll the changes out with:
helm upgrade k3sdask dask/dask -f extra-config.yaml
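Once the pods report Ready, the Jupyter NodePort can be looked up (helm status also prints the connection hints from the chart notes):

kubectl get pods -w      # watch the scheduler, worker and jupyter pods come up
kubectl get services     # note the NodePort assigned to the jupyter service
helm status k3sdask      # prints the same connection details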

In a Jupyter Notebook

from dask_kubernetes import KubeCluster
from dask.distributed import Client

cluster = KubeCluster.from_yaml('pod.yaml')  # pod.yaml: worker pod spec
cluster.scale(1)                             # start one worker pod
client = Client(cluster)                     # route Dask work to the cluster
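A small dask.array computation makes a useful smoke test that the worker actually picks up tasks (a minimal sketch, not from the original post):

import dask.array as da

# 100 chunks of ~8 MB each; the mean is computed across the cluster
x = da.random.random((10000, 10000), chunks=(1000, 1000))
print(x.mean().compute())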

Jan Toth

I have been in DevOps-related jobs for the past 6 years, dealing mainly with Kubernetes in AWS and on-premises. I spent quite a lot …
