Error lines from build-log.txt
... skipping 261 lines ...
2024/11/18 18:05:11 Building github.com/tektoncd/pipeline/cmd/webhook for linux/amd64
2024/11/18 18:05:11 git is in a dirty state
Please check in your pipeline what can be changing the following files:
?? kind.yaml
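The dirty-state warning means the working tree is not clean: git status reports an untracked file, "?? kind.yaml", presumably left behind by the kind cluster setup earlier in the job. Below is a minimal sketch of running the same kind of cleanliness check locally, assuming only that a git binary is on PATH; it is not the exact check the Tekton build script performs.

package main

// Hypothetical local reproduction of the "git is in a dirty state" check:
// list every modified or untracked path and fail if any exist.
import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

func main() {
	// "git status --porcelain" prints one line per changed or untracked file;
	// untracked entries are prefixed with "??", e.g. "?? kind.yaml".
	out, err := exec.Command("git", "status", "--porcelain").Output()
	if err != nil {
		fmt.Fprintln(os.Stderr, "git status failed:", err)
		os.Exit(1)
	}
	if dirty := strings.TrimSpace(string(out)); dirty != "" {
		fmt.Println("git is in a dirty state:")
		fmt.Println(dirty)
		os.Exit(1)
	}
	fmt.Println("working tree is clean")
}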
2024/11/18 18:05:11 Building github.com/tektoncd/pipeline/cmd/resolvers for linux/amd64
Error: error processing import paths in "config/resolvers/resolvers-deployment.yaml": error resolving image references: build: go build: exit status 1: # github.com/tektoncd/pipeline/pkg/resolution/resolver/bundle
pkg/resolution/resolver/bundle/bundle.go:19:2: "errors" imported and not used
ERROR: Pipeline image resolve failed
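The compile failure itself is Go's standard unused-import error: bundle.go line 19 imports the "errors" package but nothing in the file references it, so the "go build" that ko runs while resolving image references refuses to compile the resolvers binary. The fix is to use the import or delete it. A minimal, self-contained sketch of the error class and its fix follows; it is a hypothetical main package, not the contents of the Tekton bundle resolver.

package main

import (
	"errors"
	"fmt"
)

func main() {
	// If the errors.New call below were removed while the import stayed,
	// "go build" would fail with exactly the message in the log:
	//   "errors" imported and not used
	// The fix is either to reference the package (as here) or to drop the
	// import line entirely; goimports will remove it automatically.
	err := errors.New("bundle resolution failed")
	fmt.Println(err)
}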
***************************************
*** E2E TEST FAILED ***
*** Start of information dump ***
***************************************
>>> All resources:
NAMESPACE     NAME                           READY   STATUS    RESTARTS   AGE
kube-system   pod/coredns-76f75df574-64thb   1/1     Running   0          4m3s
kube-system   pod/coredns-76f75df574-kjptg   1/1     Running   0          4m3s
... skipping 92 lines ...
kube-system 3m54s Normal Started pod/kindnet-k2rkg Started container kindnet-cni
kube-system 4m1s Normal Scheduled pod/kindnet-kzng8 Successfully assigned kube-system/kindnet-kzng8 to kind-worker
kube-system 3m59s Normal Pulled pod/kindnet-kzng8 Container image "docker.io/kindest/kindnetd:v20240513-cd2ac642" already present on machine
kube-system 3m57s Normal Created pod/kindnet-kzng8 Created container kindnet-cni
kube-system 3m56s Normal Started pod/kindnet-kzng8 Started container kindnet-cni
kube-system 4m4s Normal Scheduled pod/kindnet-l72d6 Successfully assigned kube-system/kindnet-l72d6 to kind-control-plane
kube-system 4m4s Warning FailedMount pod/kindnet-l72d6 MountVolume.SetUp failed for volume "kube-api-access-qnrqk" : configmap "kube-root-ca.crt" not found
kube-system 4m3s Normal Pulled pod/kindnet-l72d6 Container image "docker.io/kindest/kindnetd:v20240513-cd2ac642" already present on machine
kube-system 4m2s Normal Created pod/kindnet-l72d6 Created container kindnet-cni
kube-system 4m1s Normal Started pod/kindnet-l72d6 Started container kindnet-cni
kube-system 3m59s Normal Scheduled pod/kindnet-ndc7m Successfully assigned kube-system/kindnet-ndc7m to kind-worker2
kube-system 3m56s Normal Pulled pod/kindnet-ndc7m Container image "docker.io/kindest/kindnetd:v20240513-cd2ac642" already present on machine
kube-system 3m54s Normal Created pod/kindnet-ndc7m Created container kindnet-cni
... skipping 13 lines ...
kube-system 3m55s Normal Started pod/kube-proxy-4n562 Started container kube-proxy
kube-system 3m59s Normal Scheduled pod/kube-proxy-mgmrr Successfully assigned kube-system/kube-proxy-mgmrr to kind-worker2
kube-system 3m56s Normal Pulled pod/kube-proxy-mgmrr Container image "registry.k8s.io/kube-proxy:v1.29.4" already present on machine
kube-system 3m55s Normal Created pod/kube-proxy-mgmrr Created container kube-proxy
kube-system 3m55s Normal Started pod/kube-proxy-mgmrr Started container kube-proxy
kube-system 4m4s Normal Scheduled pod/kube-proxy-srw6g Successfully assigned kube-system/kube-proxy-srw6g to kind-control-plane
kube-system 4m4s Warning FailedMount pod/kube-proxy-srw6g MountVolume.SetUp failed for volume "kube-api-access-t6jj4" : configmap "kube-root-ca.crt" not found
kube-system 4m3s Normal Pulled pod/kube-proxy-srw6g Container image "registry.k8s.io/kube-proxy:v1.29.4" already present on machine
kube-system 4m2s Normal Created pod/kube-proxy-srw6g Created container kube-proxy
kube-system 4m2s Normal Started pod/kube-proxy-srw6g Started container kube-proxy
kube-system 4m4s Normal SuccessfulCreate daemonset/kube-proxy Created pod: kube-proxy-srw6g
kube-system 4m1s Normal SuccessfulCreate daemonset/kube-proxy Created pod: kube-proxy-44mrs
kube-system 3m59s Normal SuccessfulCreate daemonset/kube-proxy Created pod: kube-proxy-4n562
... skipping 12 lines ...
metallb-system 3m32s Normal Pulled pod/controller-5c6b6c8447-vq28f Successfully pulled image "quay.io/metallb/controller:v0.13.10" in 5.762s (20.991s including waiting)
metallb-system 3m32s Normal Created pod/controller-5c6b6c8447-vq28f Created container controller
metallb-system 3m32s Normal Started pod/controller-5c6b6c8447-vq28f Started container controller
metallb-system 3m56s Normal SuccessfulCreate replicaset/controller-5c6b6c8447 Created pod: controller-5c6b6c8447-vq28f
metallb-system 3m56s Normal ScalingReplicaSet deployment/controller Scaled up replica set controller-5c6b6c8447 to 1
metallb-system 3m51s Normal Scheduled pod/speaker-f7vvs Successfully assigned metallb-system/speaker-f7vvs to kind-worker2
metallb-system 3m50s Warning FailedMount pod/speaker-f7vvs MountVolume.SetUp failed for volume "memberlist" : failed to sync secret cache: timed out waiting for the condition
metallb-system 3m50s Warning FailedMount pod/speaker-f7vvs MountVolume.SetUp failed for volume "metallb-excludel2" : failed to sync configmap cache: timed out waiting for the condition
metallb-system 3m49s Normal Pulling pod/speaker-f7vvs Pulling image "quay.io/metallb/speaker:v0.13.10"
metallb-system 3m35s Normal Pulled pod/speaker-f7vvs Successfully pulled image "quay.io/metallb/speaker:v0.13.10" in 13.832s (13.832s including waiting)
metallb-system 3m35s Normal Created pod/speaker-f7vvs Created container speaker
metallb-system 3m34s Normal Started pod/speaker-f7vvs Started container speaker
metallb-system 3m56s Normal Scheduled pod/speaker-hmq6d Successfully assigned metallb-system/speaker-hmq6d to kind-control-plane
metallb-system 3m55s Normal Pulling pod/speaker-hmq6d Pulling image "quay.io/metallb/speaker:v0.13.10"
... skipping 12 lines ...
metallb-system 3m37s Normal Started pod/speaker-p8jf7 Started container speaker
metallb-system 3m56s Normal SuccessfulCreate daemonset/speaker Created pod: speaker-hmq6d
metallb-system 3m55s Normal SuccessfulCreate daemonset/speaker Created pod: speaker-p8jf7
metallb-system 3m51s Normal SuccessfulCreate daemonset/speaker Created pod: speaker-m79w8
metallb-system 3m51s Normal SuccessfulCreate daemonset/speaker Created pod: speaker-f7vvs
***************************************
*** E2E TEST FAILED ***
*** End of information dump ***
***************************************
+ EXIT_VALUE=1
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
... skipping 4 lines ...