skaffold with kaniko: registry address is not resolved
0 votes
/ April 14, 2020

I deployed a registry service in the registry namespace:

$ helm install registry stable/docker-registry

The service:

$ kubectl get service
NAME                       TYPE        CLUSTER-IP     EXTERNAL-IP   PORT(S)    AGE
registry-docker-registry   ClusterIP   10.43.119.11   <none>        5000/TCP   18h

This is my skaffold.yaml:

apiVersion: skaffold/v2beta1
kind: Config
metadata:
  name: spring-boot-slab
build:
  artifacts:
  - image: skaffold-covid-backend
    kaniko:
      dockerfile: Dockerfile-multistage
      image: gcr.io/kaniko-project/executor:debug
      cache: {}
  cluster: {}
deploy:
  kubectl:
    manifests:
    - k8s/*

Everything works fine until kaniko tries to push the image to the registry above:

Get "http://registry-docker-registry.registry.svc.cluster.local:5000/v2/": dial tcp: lookup registry-docker-registry.registry.svc.cluster.local on 127.0.0.53:53: no such host

The Skaffold command:

$ skaffold build --default-repo=registry-docker-registry.registry.svc.cluster.local:5000 

This is the log:

$ skaffold build --default-repo=registry-docker-registry.registry.svc.cluster.local:5000 
INFO[0000] Skaffold &{Version:v1.7.0 ConfigVersion:skaffold/v2beta1 GitVersion: GitCommit:145f59579470eb1f0a7f40d8e0924f8716c6f05b GitTreeState:clean BuildDate:2020-04-02T21:49:58Z GoVersion:go1.14 Compiler:gc Platform:linux/amd64} 
DEBU[0000] validating yamltags of struct SkaffoldConfig 
DEBU[0000] validating yamltags of struct Metadata       
DEBU[0000] validating yamltags of struct Pipeline       
DEBU[0000] validating yamltags of struct BuildConfig    
DEBU[0000] validating yamltags of struct Artifact       
DEBU[0000] validating yamltags of struct ArtifactType   
DEBU[0000] validating yamltags of struct KanikoArtifact 
DEBU[0000] validating yamltags of struct KanikoCache    
DEBU[0000] validating yamltags of struct TagPolicy      
DEBU[0000] validating yamltags of struct GitTagger      
DEBU[0000] validating yamltags of struct BuildType      
DEBU[0000] validating yamltags of struct ClusterDetails 
DEBU[0000] validating yamltags of struct DeployConfig   
DEBU[0000] validating yamltags of struct DeployType     
DEBU[0000] validating yamltags of struct KubectlDeploy  
DEBU[0000] validating yamltags of struct KubectlFlags   
INFO[0000] Using kubectl context: k3s-traefik-v2        
DEBU[0000] Using builder: cluster                       
DEBU[0000] setting Docker user agent to skaffold-v1.7.0 
Generating tags...
 - skaffold-covid-backend -> DEBU[0000] Running command: [git describe --tags --always] 
DEBU[0000] Command output: [c5dfd81
]                   
DEBU[0000] Running command: [git status . --porcelain]  
DEBU[0000] Command output: [ M Dockerfile-multistage
 M skaffold.yaml
?? k8s/configmap.yaml
?? kaniko-pod.yaml
?? run_in_docker.sh
] 
registry-docker-registry.registry.svc.cluster.local:5000/skaffold-covid-backend:c5dfd81-dirty
INFO[0000] Tags generated in 3.479451ms                 
Checking cache...
DEBU[0000] Found dependencies for dockerfile: [{pom.xml /tmp true} {src /tmp/src true}] 
 - skaffold-covid-backend: Not found. Building
INFO[0000] Cache check complete in 3.995675ms           
Building [skaffold-covid-backend]...
DEBU[0000] getting client config for kubeContext: ``    
INFO[0000] Waiting for kaniko-rjsn5 to be initialized   
DEBU[0001] Running command: [kubectl --context k3s-traefik-v2 exec -i kaniko-rjsn5 -c kaniko-init-container -n registry -- tar -xf - -C /kaniko/buildcontext] 
DEBU[0001] Found dependencies for dockerfile: [{pom.xml /tmp true} {src /tmp/src true}] 
DEBU[0001] Running command: [kubectl --context k3s-traefik-v2 exec kaniko-rjsn5 -c kaniko-init-container -n registry -- touch /tmp/complete] 
INFO[0001] Waiting for kaniko-rjsn5 to be complete      
DEBU[0001] unable to get kaniko pod logs: container "kaniko" in pod "kaniko-rjsn5" is waiting to start: PodInitializing 
DEBU[0002] unable to get kaniko pod logs: container "kaniko" in pod "kaniko-rjsn5" is waiting to start: PodInitializing 
DEBU[0000] Getting source context from dir:///kaniko/buildcontext 
DEBU[0000] Build context located at /kaniko/buildcontext 
DEBU[0000] Copying file /kaniko/buildcontext/Dockerfile-multistage to /kaniko/Dockerfile 
DEBU[0000] Skip resolving path /kaniko/Dockerfile       
DEBU[0000] Skip resolving path /kaniko/buildcontext     
DEBU[0000] Skip resolving path /cache                   
DEBU[0000] Skip resolving path                          
DEBU[0000] Skip resolving path                          
DEBU[0000] Skip resolving path                          
INFO[0000] Resolved base name maven:3-jdk-8-slim to maven:3-jdk-8-slim 
INFO[0000] Resolved base name java:8-jre-alpine to java:8-jre-alpine 
INFO[0000] Resolved base name maven:3-jdk-8-slim to maven:3-jdk-8-slim 
INFO[0000] Resolved base name java:8-jre-alpine to java:8-jre-alpine 
INFO[0000] Retrieving image manifest maven:3-jdk-8-slim 
DEBU[0003] No file found for cache key sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f stat /cache/sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f: no such file or directory 
DEBU[0003] Image maven:3-jdk-8-slim not found in cache  
INFO[0003] Retrieving image manifest maven:3-jdk-8-slim 
INFO[0005] Retrieving image manifest java:8-jre-alpine  
DEBU[0007] No file found for cache key sha256:6a8cbe4335d1a5711a52912b684e30d6dbfab681a6733440ff7241b05a5deefd stat /cache/sha256:6a8cbe4335d1a5711a52912b684e30d6dbfab681a6733440ff7241b05a5deefd: no such file or directory 
DEBU[0007] Image java:8-jre-alpine not found in cache   
INFO[0007] Retrieving image manifest java:8-jre-alpine  
DEBU[0009] Resolved /tmp/target/*.jar to /tmp/target/*.jar 
DEBU[0009] Resolved /app/spring-boot-application.jar to /app/spring-boot-application.jar 
INFO[0009] Built cross stage deps: map[0:[/tmp/target/*.jar]] 
INFO[0009] Retrieving image manifest maven:3-jdk-8-slim 
DEBU[0011] No file found for cache key sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f stat /cache/sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f: no such file or directory 
DEBU[0011] Image maven:3-jdk-8-slim not found in cache  
INFO[0011] Retrieving image manifest maven:3-jdk-8-slim 
DEBU[0012] Resolved pom.xml to pom.xml                  
DEBU[0012] Resolved /tmp/ to /tmp/                      
DEBU[0012] Getting files and contents at root /kaniko/buildcontext for /kaniko/buildcontext/pom.xml 
DEBU[0012] Using files from context: [/kaniko/buildcontext/pom.xml] 
DEBU[0012] optimize: composite key for command COPY pom.xml /tmp/ {[sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f COPY pom.xml /tmp/ 7176510dcac61a3d406beab8d864708f21db23201dba11185866015a8dcd55b0]} 
DEBU[0012] optimize: cache key for command COPY pom.xml /tmp/ fc6a0ec8876277261e83ab9b647595b1df258352ba9acf92ec19c761415fb23e 
INFO[0012] Checking for cached layer registry-docker-registry.registry.svc.cluster.local:5000/skaffold-covid-backend/cache:fc6a0ec8876277261e83ab9b647595b1df258352ba9acf92ec19c761415fb23e... 
INFO[0012] Using caching version of cmd: COPY pom.xml /tmp/ 
DEBU[0012] optimize: composite key for command RUN mvn -B dependency:go-offline -f /tmp/pom.xml -s /usr/share/maven/ref/settings-docker.xml {[sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f COPY pom.xml /tmp/ 7176510dcac61a3d406beab8d864708f21db23201dba11185866015a8dcd55b0 RUN mvn -B dependency:go-offline -f /tmp/pom.xml -s /usr/share/maven/ref/settings-docker.xml]} 
DEBU[0012] optimize: cache key for command RUN mvn -B dependency:go-offline -f /tmp/pom.xml -s /usr/share/maven/ref/settings-docker.xml 18ffc2eda5a9ef5481cc865da06e9a4e3d543bf9befb35bd7ac3cb9dc3b62fc7 
INFO[0012] Checking for cached layer registry-docker-registry.registry.svc.cluster.local:5000/skaffold-covid-backend/cache:18ffc2eda5a9ef5481cc865da06e9a4e3d543bf9befb35bd7ac3cb9dc3b62fc7... 
INFO[0012] Using caching version of cmd: RUN mvn -B dependency:go-offline -f /tmp/pom.xml -s /usr/share/maven/ref/settings-docker.xml 
DEBU[0012] Resolved src to src                          
DEBU[0012] Resolved /tmp/src/ to /tmp/src/              
DEBU[0012] Using files from context: [/kaniko/buildcontext/src] 
DEBU[0012] optimize: composite key for command COPY src /tmp/src/ {[sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f COPY pom.xml /tmp/ 7176510dcac61a3d406beab8d864708f21db23201dba11185866015a8dcd55b0 RUN mvn -B dependency:go-offline -f /tmp/pom.xml -s /usr/share/maven/ref/settings-docker.xml COPY src /tmp/src/ 13724ad65fa9678727cdfb4446f71ed586605178d3252371934493e90d7fc7c5]} 
DEBU[0012] optimize: cache key for command COPY src /tmp/src/ 177d8852ce5ec30e7ac1944b43363857d249c3fb4cdb4a26724ea88660102e52 
INFO[0012] Checking for cached layer registry-docker-registry.registry.svc.cluster.local:5000/skaffold-covid-backend/cache:177d8852ce5ec30e7ac1944b43363857d249c3fb4cdb4a26724ea88660102e52... 
INFO[0012] Using caching version of cmd: COPY src /tmp/src/ 
DEBU[0012] optimize: composite key for command WORKDIR /tmp/ {[sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f COPY pom.xml /tmp/ 7176510dcac61a3d406beab8d864708f21db23201dba11185866015a8dcd55b0 RUN mvn -B dependency:go-offline -f /tmp/pom.xml -s /usr/share/maven/ref/settings-docker.xml COPY src /tmp/src/ 13724ad65fa9678727cdfb4446f71ed586605178d3252371934493e90d7fc7c5 WORKDIR /tmp/]} 
DEBU[0012] optimize: cache key for command WORKDIR /tmp/ cc93f6a4e941f6eb0b907172ea334a00cdd93ba12f07fe5c6b2cddd89f1ac16c 
DEBU[0012] optimize: composite key for command RUN mvn -B -s /usr/share/maven/ref/settings-docker.xml package {[sha256:53ce0b73ff3596b4feb23cd8417cf458276fd72464c790c4f732124878e6038f COPY pom.xml /tmp/ 7176510dcac61a3d406beab8d864708f21db23201dba11185866015a8dcd55b0 RUN mvn -B dependency:go-offline -f /tmp/pom.xml -s /usr/share/maven/ref/settings-docker.xml COPY src /tmp/src/ 13724ad65fa9678727cdfb4446f71ed586605178d3252371934493e90d7fc7c5 WORKDIR /tmp/ RUN mvn -B -s /usr/share/maven/ref/settings-docker.xml package]} 
DEBU[0012] optimize: cache key for command RUN mvn -B -s /usr/share/maven/ref/settings-docker.xml package f09ec8d47c0476fe4623fbb7bedd628466d43cd623c82a298c84d43c028c4518 
INFO[0012] Checking for cached layer registry-docker-registry.registry.svc.cluster.local:5000/skaffold-covid-backend/cache:f09ec8d47c0476fe4623fbb7bedd628466d43cd623c82a298c84d43c028c4518... 
INFO[0012] Using caching version of cmd: RUN mvn -B -s /usr/share/maven/ref/settings-docker.xml package 
DEBU[0012] Mounted directories: [{/kaniko false} {/etc/mtab false} {/tmp/apt-key-gpghome true} {/var/run false} {/proc false} {/dev false} {/dev/pts false} {/dev/mqueue false} {/sys false} {/sys/fs/cgroup false} {/sys/fs/cgroup/systemd false} {/sys/fs/cgroup/cpu,cpuacct false} {/sys/fs/cgroup/devices false} {/sys/fs/cgroup/net_cls,net_prio false} {/sys/fs/cgroup/pids false} {/sys/fs/cgroup/rdma false} {/sys/fs/cgroup/memory false} {/sys/fs/cgroup/freezer false} {/sys/fs/cgroup/cpuset false} {/sys/fs/cgroup/perf_event false} {/sys/fs/cgroup/blkio false} {/sys/fs/cgroup/hugetlb false} {/busybox false} {/kaniko/buildcontext false} {/etc/hosts false} {/dev/termination-log false} {/etc/hostname false} {/etc/resolv.conf false} {/dev/shm false} {/var/run/secrets/kubernetes.io/serviceaccount false} {/proc/asound false} {/proc/bus false} {/proc/fs false} {/proc/irq false} {/proc/sys false} {/proc/sysrq-trigger false} {/proc/acpi false} {/proc/kcore false} {/proc/keys false} {/proc/timer_list false} {/proc/sched_debug false} {/proc/scsi false} {/sys/firmware false}] 
DEBU[0014] Not adding /dev because it is whitelisted    
DEBU[0014] Not adding /etc/hostname because it is whitelisted 
DEBU[0014] Not adding /etc/resolv.conf because it is whitelisted 
DEBU[0018] Not adding /proc because it is whitelisted   
DEBU[0019] Not adding /sys because it is whitelisted    
DEBU[0026] Not adding /var/run because it is whitelisted 
DEBU[0080] Whiting out /var/lib/apt/lists/.wh.auxfiles  
DEBU[0080] not including whiteout files                 
INFO[0085] Taking snapshot of full filesystem...        
INFO[0085] Resolving paths                              
FATA[0095] build failed: building [skaffold-covid-backend]: getting image: Get "http://registry-docker-registry.registry.svc.cluster.local:5000/v2/": dial tcp: lookup registry-docker-registry.registry.svc.cluster.local on 127.0.0.53:53: no such host

Meanwhile, while the kaniko Pod was running, I was able to exec into it and run some checks:

$ kubectl exec -ti kaniko-8nph4 -c kaniko -- sh
/ # wget registry-docker-registry.registry.svc.cluster.local:5000/v2/_catalog
Connecting to registry-docker-registry.registry.svc.cluster.local:5000 (10.43.119.11:5000)
saving to '_catalog'
_catalog             100% |**************************************************************************************************************|    75  0:00:00 ETA
'_catalog' saved
/ # cat _catalog
{"repositories":["skaffold-covid-backend","skaffold-covid-backend/cache"]}

So it seems it can connect to the registry, yet the logs say it cannot.

Any ideas on how to reach this registry deployed in the same Kubernetes cluster?

I also tried to access the registry from another pod:

$ kubectl exec -ti graylog-1 -- curl registry-docker-registry.registry:5000/v2/_catalog
{"repositories":["skaffold-covid-backend","skaffold-covid-backend/cache"]}

As you can see, it can reach the registry.

I also looked at /etc/resolv.conf inside the container:

$ kubectl exec -ti kaniko-zqhgf -c kaniko -- cat /etc/resolv.conf
search registry.svc.cluster.local svc.cluster.local cluster.local
nameserver 10.43.0.10
options ndots:5
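Not part of the original post, but the resolv.conf above hints at what is going on: the kaniko pod uses the cluster DNS (10.43.0.10), where `*.svc.cluster.local` names resolve, while the FATA error mentions 127.0.0.53:53, i.e. systemd-resolved on the local machine, which knows nothing about cluster-internal names. A quick way to confirm where the lookup fails (hostnames and the DNS IP are taken from the question; yours may differ):

```shell
# Inside the cluster: the service name resolves via the cluster DNS.
kubectl run dns-test --rm -it --restart=Never --image=busybox -- \
  nslookup registry-docker-registry.registry.svc.cluster.local

# On the local machine: the same lookup goes to systemd-resolved (127.0.0.53),
# which cannot resolve cluster-internal names -- matching the FATA error above.
nslookup registry-docker-registry.registry.svc.cluster.local 127.0.0.53
```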

I also checked the connections while the container was running:

$ kubectl exec -ti kaniko-sgs5x -c kaniko -- netstat
Active Internet connections (w/o servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State       
tcp        0    210 kaniko-sgs5x:40104      104.18.124.25:443       ESTABLISHED 
tcp        0      0 kaniko-sgs5x:46006      registry-docker-registry.registry.svc.cluster.local:5000 ESTABLISHED 
tcp        0      0 kaniko-sgs5x:45884      registry-docker-registry.registry.svc.cluster.local:5000 ESTABLISHED 
tcp        0      0 kaniko-sgs5x:39772      ec2-52-3-104-67.compute-1.amazonaws.com:443 ESTABLISHED 
Active UNIX domain sockets (w/o servers)
Proto RefCnt Flags       Type       State         I-Node Path

As you can see, it did establish connections to registry-docker-registry.registry.svc.cluster.local:5000. However, when it tries to push to the registry, the error appears...

This is really strange.

1 Answer

0 votes
/ April 14, 2020

If you look at the log timestamps, they jump from 0026 to 0080. I suspect the final lines (the FATA[0095] error) come from your local Skaffold, which is trying to fetch the image details from the remote registry, and that registry is not reachable from your machine.

You may want to describe your situation on this issue: https://github.com/GoogleContainerTools/skaffold/issues/3841#issuecomment-603582206
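Not stated in the original answer, but a sketch of the kind of workaround discussed around that issue: make the in-cluster registry name resolvable and reachable from the local machine, so that Skaffold can fetch the image digest after kaniko pushes it. The service and namespace names are the ones from the question.

```shell
# 1. Forward the registry Service to localhost:5000:
kubectl port-forward -n registry service/registry-docker-registry 5000:5000 &

# 2. Point the cluster-internal hostname at localhost, so the same
#    --default-repo value works both inside the cluster and locally:
echo '127.0.0.1 registry-docker-registry.registry.svc.cluster.local' | \
  sudo tee -a /etc/hosts

# 3. Re-run the build with the same default repo:
skaffold build --default-repo=registry-docker-registry.registry.svc.cluster.local:5000
```

Remember to remove the /etc/hosts entry and stop the port-forward afterwards.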

...