Acceptance tests exercising different clients concurrently; environment deployed from packages.

Build: #3228 failed

Job: Permission POSIX was successful

Job result summary

Completed
Duration: 19 minutes
Agent: bamboo-agent-ad-08
Revision: d704831f382422ac250de6a2a1b215bef0202e9c
Total tests: 18
Successful since: #3064

Tests

  • 18 tests in total
  • 9 minutes taken in total

Error summary

The build generated some errors. See the full build log for more details.

(curl progress meter omitted: 4822 bytes received at ~52 KB/s)
Unable to find image 'alpine:latest' locally
latest: Pulling from library/alpine
Digest: sha256:0a4eaa0eecf5f8c050e5bba433f58c052be7587ee8af3e8b3910ef9ab5fbe9f5
Status: Downloaded newer image for alpine:latest
Error: Kubernetes cluster unreachable: Get "http://localhost:8080/version?timeout=32s": dial tcp 127.0.0.1:8080: connect: connection refused
E0905 10:45:24.074098    8315 memcache.go:265] couldn't get current server API group list: Get "http://localhost:8080/api?timeout=32s": dial tcp 127.0.0.1:8080: connect: connection refused
The connection to the server localhost:8080 was refused - did you specify the right host or port?
(the two lines above repeat for five consecutive kubectl invocations, PIDs 8315-8344; each logs the refused GET five times before giving up)
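The "connection refused on localhost:8080" pattern is kubectl's fallback behaviour when it finds no kubeconfig at all (no `--kubeconfig` flag, no `KUBECONFIG`, no `~/.kube/config`). A minimal pre-flight check could fail fast with a clear message instead of five silent retries; this is a sketch, and the function name is illustrative:

```shell
# Fail fast when kubectl has no cluster configuration, instead of letting it
# fall back to the localhost:8080 default and retry against nothing.
check_kubeconfig() {
  cfg="${KUBECONFIG:-$HOME/.kube/config}"
  if [ -r "$cfg" ]; then
    echo "using kubeconfig: $cfg"
  else
    echo "no readable kubeconfig at $cfg; kubectl would fall back to localhost:8080" >&2
    return 1
  fi
}
```

Calling this at the top of each log-collection step would turn the repeated dial errors into a single actionable line.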
Error response from daemon: Cannot kill container: fc8393e1d3f7: Container fc8393e1d3f7afc61d5948af69200ee9da7c06634905ae50533b0f7daf1b56b4 is not running
Unable to find image 'ubuntu:14.10' locally
14.10: Pulling from library/ubuntu
[DEPRECATION NOTICE] Docker Image Format v1, and Docker Image manifest version 2, schema 1 support will be removed in an upcoming release. Suggest the author of docker.io/library/ubuntu:14.10 to upgrade the image to the OCI Format, or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/
Digest: sha256:6341c688b4b0b82ec735389b3c97df8cf2831b8cb8bd1856779130a86574ac5c
Status: Downloaded newer image for ubuntu:14.10
W0905 10:47:48.206470   12565 out.go:239] ! The 'none' driver is designed for experts who need to integrate with an existing VM
W0905 10:47:48.206478   12565 out.go:239] * Most users should use the newer 'docker' driver instead, which does not require root!
W0905 10:47:48.206485   12565 out.go:239] * For more information, see: https://minikube.sigs.k8s.io/docs/reference/drivers/none/
W0905 10:47:48.206531   12565 out.go:239] ! kubectl and minikube configuration will be stored in /root
W0905 10:47:48.206541   12565 out.go:239] ! To use kubectl or minikube commands as your own user, you may need to relocate them. For example, to overwrite your own settings, run:
W0905 10:47:48.206575   12565 out.go:239]   - sudo mv /root/.kube /root/.minikube $HOME
W0905 10:47:48.206585   12565 out.go:239]   - sudo chown -R $USER $HOME/.kube $HOME/.minikube
W0905 10:47:48.206604   12565 out.go:239] * This can also be done automatically by setting the env var CHANGE_MINIKUBE_NONE_USER=true
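The warning block already names the automatic fix; as a sketch, CI scripts using the 'none' driver typically export the variable before `minikube start` so the root-owned config is handed back to the invoking user (the commented start invocation is an assumption, not taken from this log):

```shell
# With the 'none' driver minikube runs as root, so without this setting the
# generated /root/.kube and /root/.minikube stay owned by root.
export CHANGE_MINIKUBE_NONE_USER=true
# sudo -E minikube start --driver=none   # -E preserves the variable for the root session
```

The manual alternative is the mv/chown pair shown in the warning above.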
I0905 10:47:48.206456   12565 kubeconfig.go:92] found "minikube" server: "https://10.87.23.82:8443"
I0905 10:47:48.211431   12565 api_server.go:166] Checking apiserver status ...
I0905 10:47:48.211475   12565 exec_runner.go:51] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I0905 10:47:48.234978   12565 exec_runner.go:51] Run: sudo egrep ^[0-9]+:freezer: /proc/14111/cgroup
I0905 10:47:48.244234   12565 exec_runner.go:51] Run: sudo egrep ^[0-9]+:freezer: /proc/14111/cgroup
I0905 10:47:48.252983   12565 api_server.go:182] apiserver freezer: "8:freezer:/kubepods/burstable/pod45e0f6854c56fb3a8249741829170341/09d9563dc70ed11067873d9da734d75ca986bd123bde9cdca52db018c95ecbac"
I0905 10:47:48.253141   12565 exec_runner.go:51] Run: sudo cat /sys/fs/cgroup/freezer/kubepods/burstable/pod45e0f6854c56fb3a8249741829170341/09d9563dc70ed11067873d9da734d75ca986bd123bde9cdca52db018c95ecbac/freezer.state
I0905 10:47:48.257468   12565 api_server.go:182] apiserver freezer: "8:freezer:/kubepods/burstable/pod45e0f6854c56fb3a8249741829170341/09d9563dc70ed11067873d9da734d75ca986bd123bde9cdca52db018c95ecbac"
I0905 10:47:48.257536   12565 exec_runner.go:51] Run: sudo cat /sys/fs/cgroup/freezer/kubepods/burstable/pod45e0f6854c56fb3a8249741829170341/09d9563dc70ed11067873d9da734d75ca986bd123bde9cdca52db018c95ecbac/freezer.state
I0905 10:47:48.275764   12565 api_server.go:204] freezer state: "THAWED"
I0905 10:47:48.275826   12565 api_server.go:253] Checking apiserver healthz at https://10.87.23.82:8443/healthz ...
I0905 10:47:48.282636   12565 api_server.go:204] freezer state: "THAWED"
I0905 10:47:48.282668   12565 api_server.go:253] Checking apiserver healthz at https://10.87.23.82:8443/healthz ...
I0905 10:47:48.290694   12565 api_server.go:279] https://10.87.23.82:8443/healthz returned 200:
ok
I0905 10:47:48.321393   12565 addons.go:231] Setting addon default-storageclass=true in "minikube"
I0905 10:47:48.321447   12565 host.go:66] Checking if "minikube" exists ...
I0905 10:47:48.323066   12565 kubeconfig.go:92] found "minikube" server: "https://10.87.23.82:8443"
I0905 10:47:48.323085   12565 api_server.go:166] Checking apiserver status ...
I0905 10:47:48.323124   12565 exec_runner.go:51] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I0905 10:47:48.326253   12565 api_server.go:279] https://10.87.23.82:8443/healthz returned 200:
ok
I0905 10:47:48.357559   12565 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
I0905 10:47:48.344879   12565 kapi.go:248] "coredns" deployment in "kube-system" namespace and "minikube" context rescaled to 1 replicas
I0905 10:47:48.355407   12565 exec_runner.go:51] Run: sudo egrep ^[0-9]+:freezer: /proc/14111/cgroup
I0905 10:47:48.381565   12565 exec_runner.go:51] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.1/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           127.0.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.28.1/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
I0905 10:47:48.384545   12565 addons.go:423] installing /etc/kubernetes/addons/storage-provisioner.yaml
I0905 10:47:48.384578   12565 exec_runner.go:144] found /etc/kubernetes/addons/storage-provisioner.yaml, removing ...
I0905 10:47:48.384588   12565 exec_runner.go:203] rm: /etc/kubernetes/addons/storage-provisioner.yaml
I0905 10:47:48.384847   12565 start.go:223] Will wait 6m0s for node &{Name: IP:10.87.23.82 Port:8443 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}
I0905 10:47:48.384877   12565 exec_runner.go:151] cp: memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
I0905 10:47:48.385160   12565 exec_runner.go:51] Run: sudo cp -a /tmp/minikube4140070512 /etc/kubernetes/addons/storage-provisioner.yaml
I0905 10:47:48.401569   12565 api_server.go:182] apiserver freezer: "8:freezer:/kubepods/burstable/pod45e0f6854c56fb3a8249741829170341/09d9563dc70ed11067873d9da734d75ca986bd123bde9cdca52db018c95ecbac"
I0905 10:47:48.404817   12565 exec_runner.go:51] Run: sudo cat /sys/fs/cgroup/freezer/kubepods/burstable/pod45e0f6854c56fb3a8249741829170341/09d9563dc70ed11067873d9da734d75ca986bd123bde9cdca52db018c95ecbac/freezer.state
I0905 10:47:48.405046   12565 out.go:177] * Verifying Kubernetes components...
I0905 10:47:48.447264   12565 exec_runner.go:51] Run: sudo systemctl is-active --quiet service kubelet
I0905 10:47:48.423802   12565 api_server.go:204] freezer state: "THAWED"
I0905 10:47:48.447377   12565 api_server.go:253] Checking apiserver healthz at https://10.87.23.82:8443/healthz ...
I0905 10:47:48.425573   12565 exec_runner.go:51] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
I0905 10:47:48.454827   12565 api_server.go:279] https://10.87.23.82:8443/healthz returned 200:
ok
I0905 10:47:48.455381   12565 addons.go:423] installing /etc/kubernetes/addons/storageclass.yaml
I0905 10:47:48.455423   12565 exec_runner.go:151] cp: memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
I0905 10:47:48.455570   12565 exec_runner.go:51] Run: sudo cp -a /tmp/minikube3886492365 /etc/kubernetes/addons/storageclass.yaml
I0905 10:47:48.469639   12565 api_server.go:52] waiting for apiserver process to appear ...
I0905 10:47:48.469701   12565 exec_runner.go:51] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I0905 10:47:48.485137   12565 exec_runner.go:51] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
I0905 10:47:48.498802   12565 api_server.go:72] duration metric: took 113.890189ms to wait for apiserver process to appear ...
I0905 10:47:48.498831   12565 api_server.go:88] waiting for apiserver healthz status ...
I0905 10:47:48.498852   12565 api_server.go:253] Checking apiserver healthz at https://10.87.23.82:8443/healthz ...
I0905 10:47:48.507972   12565 api_server.go:279] https://10.87.23.82:8443/healthz returned 200:
ok
I0905 10:47:48.509595   12565 api_server.go:141] control plane version: v1.28.1
I0905 10:47:48.509632   12565 api_server.go:131] duration metric: took 10.792284ms to wait for apiserver health ...
I0905 10:47:48.509643   12565 system_pods.go:43] waiting for kube-system pods to appear ...
I0905 10:47:48.516959   12565 system_pods.go:59] 4 kube-system pods found
I0905 10:47:48.517205   12565 system_pods.go:61] "etcd-bamboo-agent-ad-08" [3f570dba-f5d0-4b43-8614-a7370399038a] Pending
I0905 10:47:48.517385   12565 system_pods.go:61] "kube-apiserver-bamboo-agent-ad-08" [b9389aee-0d03-470d-9cd5-7f4dc2250cd6] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
I0905 10:47:48.517514   12565 system_pods.go:61] "kube-controller-manager-bamboo-agent-ad-08" [6142e627-d078-4bf4-bc64-ac23aaf92b8b] Pending
I0905 10:47:48.517621   12565 system_pods.go:61] "kube-scheduler-bamboo-agent-ad-08" [ad8faea0-d8da-47aa-bfdb-bf2127d4dbdd] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
I0905 10:47:48.517745   12565 system_pods.go:74] duration metric: took 8.090632ms to wait for pod list to return data ...
I0905 10:47:48.523240   12565 kubeadm.go:581] duration metric: took 138.318091ms to wait for : map[apiserver:true system_pods:true] ...
I0905 10:47:48.523386   12565 node_conditions.go:102] verifying NodePressure condition ...
I0905 10:47:48.527022   12565 node_conditions.go:122] node storage ephemeral capacity is 25215872Ki
I0905 10:47:48.527065   12565 node_conditions.go:123] node cpu capacity is 4
I0905 10:47:48.527080   12565 node_conditions.go:105] duration metric: took 3.607221ms to run NodePressure ...
I0905 10:47:48.527093   12565 start.go:228] waiting for startup goroutines ...
I0905 10:47:49.386769   12565 exec_runner.go:84] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.1/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           127.0.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.28.1/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (1.004611358s)
I0905 10:47:49.386800   12565 start.go:901] {"host.minikube.internal": 127.0.0.1} host record injected into CoreDNS's ConfigMap
I0905 10:47:49.623294   12565 exec_runner.go:84] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (1.175733515s)
I0905 10:47:49.628024   12565 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
I0905 10:47:49.633081   12565 addons.go:502] enable addons completed in 1.436326706s: enabled=[default-storageclass storage-provisioner]
I0905 10:47:49.633137   12565 start.go:233] waiting for cluster config update ...
I0905 10:47:49.633155   12565 start.go:242] writing updated cluster config ...
I0905 10:47:49.633518   12565 exec_runner.go:51] Run: rm -f paused
I0905 10:47:49.698730   12565 start.go:600] kubectl: 1.28.1, cluster: 1.28.1 (minor skew: 0)
I0905 10:47:49.704490   12565 out.go:177] * Done! kubectl is now configured to use "minikube" cluster and "default" namespace by default
Submodule 'automation-examples' (ssh://git@git.onedata.org:7999/vfs/automation-examples.git) registered for path 'automation-examples'
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/automation-examples'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-MAOPT-CPPT/onedata-acceptance/onezone_swagger/bamboos'...
Unable to find image 'docker.onedata.org/one_env:v39' locally
v39: Pulling from one_env
Digest: sha256:e39dbddda3d96a874a4a2aebf2f188bfeb8ffdeb25e3fd27678ab5197f387940
Status: Downloaded newer image for docker.onedata.org/one_env:v39
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
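The "already exists" failure comes from a bare `kubectl create`, which is not idempotent across build re-runs. A common pattern is to render the object with a client-side dry-run and pipe it through `kubectl apply`, which creates or updates without erroring. The `--clusterrole` and `--group` values below are assumptions, since the log does not show the original command:

```shell
# Re-runnable creation: 'create --dry-run=client -o yaml' only renders the
# manifest; 'apply' then creates or updates it, so a pre-existing binding is
# not an error.
create_sa_crb() {
  kubectl create clusterrolebinding serviceaccounts-cluster-admin \
    --clusterrole=cluster-admin \
    --group=system:serviceaccounts \
    --dry-run=client -o yaml | kubectl apply -f -
}
```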
Unable to find image 'docker.onedata.org/swagger-aggregator:1.5.0' locally
1.5.0: Pulling from swagger-aggregator
Digest: sha256:e2e8e762a03a0acdd49e63c4168157cb4e0e79f31f4e815561e9f4c65dbf8ac8
Status: Downloaded newer image for docker.onedata.org/swagger-aggregator:1.5.0
Unable to find image 'swaggerapi/swagger-codegen-cli:2.4.20' locally
2.4.20: Pulling from swaggerapi/swagger-codegen-cli
Digest: sha256:e961c734f4a232ea050293e9b16aed4cc131ffecf4a7d8671f15f1d79bca8796
Status: Downloaded newer image for swaggerapi/swagger-codegen-cli:2.4.20
/bin/sh: 2: [[: not found
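The `[[: not found` line means a bash-only construct was executed under `/bin/sh` (dash on Debian/Ubuntu agents), where only the POSIX single-bracket test exists. A minimal sketch with an illustrative variable:

```shell
mode="release"
# bash-only, breaks under dash:   if [[ $mode == release ]]; then ...
# POSIX sh equivalent, works under any /bin/sh:
if [ "$mode" = "release" ]; then
  result="ok"
fi
```

Alternatively, a script that genuinely needs `[[` can declare `#!/bin/bash` instead of being run via `sh`.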
v12: Pulling from onedata/acceptance_mixed
(layer download/extract progress omitted)
Digest: sha256:96db83e9518bd8b75168f158bad519b54bb4b61336e7ba414e24be2483b152bb
Status: Downloaded newer image for onedata/acceptance_mixed:v12
/usr/local/lib/python3.8/dist-packages/pytest_selenium/drivers/crossbrowsertesting.py:72: SyntaxWarning: "is not" with a literal. Did you mean "!="?
  if report.when == 'setup' or info.get('test_score') is not 'fail':
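The SyntaxWarning is a real bug in that plugin line: `is not` compares object identity, so `info.get('test_score') is not 'fail'` can be true even when the two strings are equal in value. The guard should use `!=`; a quick demonstration (the dict contents are illustrative):

```shell
python3 - <<'EOF'
info = {'test_score': 'pass'}
# corrected guard: compare values with '!=' rather than identity with 'is not'
print(info.get('test_score') != 'fail')
EOF
```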
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
(the notice above repeats for every kubectl exec into the oneclient pods; repeats omitted)
id: ‘user1’: no such user
command terminated with exit code 1
id: ‘user2’: no such user
command terminated with exit code 1
(eight further exec calls terminated with exit code 1, two of them again reporting ‘user1’ and ‘user2’ as missing)
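The "no such user" failures suggest the tests expect `user1`/`user2` to be provisioned inside the oneclient container. A guarded helper that creates the account only when it is missing could make the step re-runnable; the `useradd` flags are assumptions, and in practice this would run inside the container (e.g. via `kubectl exec`):

```shell
# Create a user only if 'id' cannot resolve it; safe to call repeatedly.
ensure_user() {
  name="$1"
  if id "$name" >/dev/null 2>&1; then
    echo "$name exists"
  else
    useradd --no-create-home "$name" && echo "$name created"
  fi
}
```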
cp: cannot stat 'onedata-acceptance/one_env/sources_info.yaml': No such file or directory
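The hard `cp` failure on `sources_info.yaml` could be softened with a guarded copy, so a missing optional artifact logs a warning instead of erroring the collection step (function name illustrative):

```shell
# Copy an artifact only when it exists; warn and continue otherwise.
copy_if_present() {
  src="$1"; dst="$2"
  if [ -f "$src" ]; then
    cp "$src" "$dst"
  else
    echo "warning: $src not found, skipping" >&2
  fi
}
```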
(curl progress meter omitted: 4822 bytes received at ~53 KB/s)
Error: No such object: 062191d874f5
Error: No such object: 062191d874f5
Error: No such object: ebe9a2bea75c
Error: No such object: ebe9a2bea75c
Error: No such object: 66d68c610dfa
Error: No such object: 66d68c610dfa
Error: No such object: 8ee90881518e
Error: No such object: 8ee90881518e
Error: No such object: 145835c2c7e3
Error: No such object: 2736a3b95f32
Error: No such object: 2736a3b95f32
Error: No such object: dd76504412d5
Error: No such object: ab4f07505fa3
Error: No such object: ab4f07505fa3
Error response from daemon: Cannot kill container: 802f945b2c09: Container 802f945b2c096039c4d2c7534d210543f1e3a8642a382aaf3e8982b772e3dccf is not running
Error response from daemon: Cannot kill container: 062191d874f5: No such container: 062191d874f5
Error response from daemon: No such container: 062191d874f5
Error response from daemon: Cannot kill container: ebe9a2bea75c: No such container: ebe9a2bea75c
Error response from daemon: No such container: ebe9a2bea75c
Error response from daemon: Cannot kill container: 66d68c610dfa: No such container: 66d68c610dfa
Error response from daemon: No such container: 66d68c610dfa
Error response from daemon: Cannot kill container: 8ee90881518e: No such container: 8ee90881518e
Error response from daemon: No such container: 8ee90881518e
Error response from daemon: Cannot kill container: 145835c2c7e3: No such container: 145835c2c7e3
Error response from daemon: No such container: 145835c2c7e3
Error response from daemon: Cannot kill container: a955bd406213: No such container: a955bd406213
Error response from daemon: No such container: a955bd406213
Error response from daemon: Cannot kill container: 2736a3b95f32: No such container: 2736a3b95f32
Error response from daemon: No such container: 2736a3b95f32
Error response from daemon: Cannot kill container: 7eae78db27b7: No such container: 7eae78db27b7
Error response from daemon: No such container: 7eae78db27b7
Error response from daemon: Cannot kill container: dd76504412d5: No such container: dd76504412d5
Error response from daemon: No such container: dd76504412d5
Error response from daemon: Cannot kill container: ab4f07505fa3: No such container: ab4f07505fa3
Error response from daemon: No such container: ab4f07505fa3
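The "Cannot kill container" / "No such container" noise at teardown comes from unconditionally killing IDs that may already have exited or been removed. A guarded variant inspects the container's state first; this is a sketch only, assuming `docker` is on PATH:

```shell
# Kill a container only if the daemon reports it as running; otherwise just
# note the skip, so teardown stays quiet on already-gone containers.
kill_if_running() {
  cid="$1"
  if [ "$(docker inspect -f '{{.State.Running}}' "$cid" 2>/dev/null)" = "true" ]; then
    docker kill "$cid"
  else
    echo "skip $cid: not running or already removed"
  fi
}
```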