Build: #1610 was successful

Job: 20.02.1 - current was successful

Job result summary

Completed
Duration: 33 minutes
Agent: bamboo-agent-amd-01-do-not-disable
Revision: c18700de471305139e99164a5a310688c8d70eab
Total tests: 2
Successful since: #1609

Tests

  • 2 tests in total
  • 32 minutes taken in total

Error summary

The build generated some errors. See the full build log for more details.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  4822  100  4822    0     0  50757      0 --:--:-- --:--:-- --:--:-- 50757
Error response from daemon: Cannot kill container: 0d8197ba7cdb: Container 0d8197ba7cdb6778f2feadaa10c15788526f2107d9447e1af4d695bf2ae5290d is not running
Submodule 'automation-examples' (ssh://git@git.onedata.org:7999/vfs/automation-examples.git) registered for path 'automation-examples'
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/automation-examples'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onezone_swagger/bamboos'...
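The submodule registrations and clones above are the output of a recursive submodule checkout. A minimal sketch of the step that produces this output (the actual Bamboo task configuration is not part of this log):

    # inside the onedata working copy, at the revision listed in the summary above
    git submodule update --init --recursive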
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
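The clusterrolebinding error is usually harmless on a re-used cluster: the binding was left behind by an earlier run, and a plain kubectl create is not idempotent. A sketch of an idempotent variant; the --clusterrole and --group values are assumptions inferred from the binding's name, not taken from this log:

    kubectl create clusterrolebinding serviceaccounts-cluster-admin \
      --clusterrole=cluster-admin --group=system:serviceaccounts \
      --dry-run=client -o yaml | kubectl apply -f -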
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"bec6b405-bda1-4d96-bd45-f6944d6fe9b5", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"bec6b405-bda1-4d96-bd45-f6944d6fe9b5", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), 
VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", TerminationGracePeriodSeconds:(*int64)(0xc00e2fea20), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc00e833170), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"bec6b405-bda1-4d96-bd45-f6944d6fe9b5", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"bec6b405-bda1-4d96-bd45-f6944d6fe9b5", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), 
VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", TerminationGracePeriodSeconds:(*int64)(0xc00cefed60), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc00d02fc20), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
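The repeated coalesce.go warnings are Helm values-merge noise: for the external-volumes, external-volume-mounts and external-storages keys, one values source holds a table (map) while another holds a non-table value, so Helm warns and keeps one of them; these warnings are generally harmless. The UPGRADE FAILED error is different: spec.template of a batch Job is immutable, so helm upgrade cannot patch the existing dev-cross-support-job-3p-users Job. A sketch of the usual workaround, assuming the Job can simply be recreated by the next upgrade (the actual helm upgrade invocation is not shown in this log):

    # assumption: the users Job is safe to delete and will be recreated on upgrade
    kubectl delete job dev-cross-support-job-3p-users
    # ...then re-run the same helm upgrade command for the "dev" release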
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
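The repeated "Defaulted container \"oneclient\"" notices mean kubectl exec was run without an explicit -c/--container flag, so it picked the first container in the pod; passing -c oneclient silences them. The "id: 'user1'/'user2': no such user" lines show that the expected POSIX accounts do not exist inside the oneclient container. A sketch of a tolerant check, assuming the commands are issued via kubectl exec as the notices suggest (POD is a placeholder):

    kubectl exec "$POD" -c oneclient -- \
      sh -c 'id user1 >/dev/null 2>&1 || echo "user1 missing in oneclient container"'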
rsync: change_dir#3 "/tmp/logs" failed: No such file or directory (2)
rsync error: errors selecting input/output files, dirs (code 3) at main.c(828) [sender=3.1.3]
command terminated with exit code 3
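The rsync failure (exit code 3) means the source directory /tmp/logs did not exist on the sender side, so there was nothing to collect. Since the sender is rsync 3.1.3, --ignore-missing-args is available and turns a missing source argument into a skip instead of an error; a sketch assuming logs are pulled with rsync over a remote-shell transport (SRC and the destination path are placeholders, the real invocation is not shown in this log):

    rsync -a --ignore-missing-args "$SRC":/tmp/logs/ ./pod-logs/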
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731021420.881956/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731021447.6923122/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op-compatibility-overlay/after_upgrade.upgrade_meta_test.1731022759.573505/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731022213.2394114/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731022256.7069783/images.yaml'
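The cp failure indicates onedata/one_env/sources_info.yaml was never generated in this run, and the mv refusals come from a no-clobber move: several test runs each produced an images.yaml, but only the first can land at onedata/tests/upgrade/logs/images.yaml. A sketch of a tolerant archiving step (file names are taken from the log; the destination layout is an assumption):

    # copy sources_info.yaml only if this run produced it
    [ -f onedata/one_env/sources_info.yaml ] && \
        cp onedata/one_env/sources_info.yaml onedata/tests/upgrade/logs/
    # keep every per-run images.yaml in its run-specific directory instead of
    # moving them all onto a single target path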
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  4822  100  4822    0     0  54179      0 --:--:-- --:--:-- --:--:-- 54179
Error response from daemon: Cannot kill container: f9dd4970afb0: Container f9dd4970afb0be32d5042cda2ef68e065415b2b15884d39c39f501ccacf3037e is not running
Error response from daemon: Cannot kill container: 8a7613269930: Container 8a7613269930346a705423a7e9bf0bbec82169bd640b0b6573a8649c7f785635 is not running
Error response from daemon: Cannot kill container: b032965fb324: Container b032965fb324d32ce135125124bd7ccceff19b4c331166951799ef6a0409ca3a is not running
Error response from daemon: Cannot kill container: 47776cb18061: Container 47776cb18061ac4cb4f267a268bc0dfba8d242b6e8b71b4c1294ab18d83502c6 is not running
Error response from daemon: Cannot kill container: d49999f553f0: Container d49999f553f07105651daea672439dd98f073c747b857ab3a7ca781348a1bfa5 is not running
Error response from daemon: Cannot kill container: b2815973dfec: Container b2815973dfecb7aaff5d5aa2fbe89676eafaf0d165ae1598abf2885006132fa3 is not running
Error response from daemon: Cannot kill container: 3fa93140646d: Container 3fa93140646df387a73761e28553e18f9c1fed570ec29ad9dc3f50d54920285d is not running
Error response from daemon: Cannot kill container: e4fce010d2d0: Container e4fce010d2d0f1b0bd0e08cd461ebfcf3a4feb97c10a2ae41256854e1c8be8ce is not running
Error response from daemon: Cannot kill container: 2d3b5f3a00b4: Container 2d3b5f3a00b47f732fa17d56ad44e836295aea7397b29b2bc1be2b8f079d4427 is not running
Error response from daemon: Cannot kill container: 665a66e1f22f: Container 665a66e1f22f62a5b75fba6e376b916dc1d6e219c0c8f6d1193750af8bdefdd8 is not running
Error response from daemon: Cannot kill container: 3e3e97d762bb: Container 3e3e97d762bbf6d8ebf347f76af16f49c2f238cde97681e04cec9e5fded24145 is not running
Error response from daemon: Cannot kill container: ea5806d36770: Container ea5806d367705f16c08a94cd99ebf1425fec982f7fc96f347a9443f16233150a is not running
Error response from daemon: Cannot kill container: 3f1ddb6c4034: Container 3f1ddb6c4034fe8ae02306d31d1b09ae731820e972a359a55ca1ce4fe695cfbf is not running
Error response from daemon: Cannot kill container: eb3e14abcdfb: Container eb3e14abcdfba8f079c8a424afe40e527996e530411c049835922d815f6f8ca4 is not running
Error response from daemon: Cannot kill container: c4eefbb85063: Container c4eefbb850632d291332aefe31a7419c8f1a1696bcc65c5ede03ace95d12c181 is not running
Error response from daemon: Cannot kill container: f9e3e6e64222: Container f9e3e6e6422215af897c44f7b82e0219c714a8e82af6a1d002bae19ca7f20215 is not running
Error response from daemon: Cannot kill container: 9e21125fd9d3: Container 9e21125fd9d30cc35dcb83ba3ddb585d138572bcedfaa483b25c1c96cc2f5690 is not running
Error response from daemon: Cannot kill container: 5641d7fca767: Container 5641d7fca767d133d671a9baf2489438fea7dd99c256f9ef781c1cba3aed40de is not running
Error response from daemon: Cannot kill container: 4fec489d9381: Container 4fec489d938182cef5d6dfd083536e6ef90af3155cc948f02a70d518d6b5e74a is not running
Error response from daemon: Cannot kill container: d3dd3666eeea: Container d3dd3666eeeae33c03be697468cd906051845cd12dcc35f571cfb79544611b29 is not running
Error response from daemon: Cannot kill container: 7049bedcb1c8: Container 7049bedcb1c89b65906fdf9363912a030f7b2e1ee20e1cec6ef55fc6116c3ae9 is not running
Error response from daemon: Cannot kill container: 5cde0779ef28: Container 5cde0779ef288e557658cc6eb9dd0385241e1cc315d19e16497d74caa5f5ae71 is not running
Error response from daemon: Cannot kill container: 79c4d3affcef: Container 79c4d3affcef4d0162813181922bc395264a1a4c59da56785b4bec552b133712 is not running
Error response from daemon: Cannot kill container: d96b5391a6e2: Container d96b5391a6e250462df30fd3a38c2dc4d2f97b426e62a09f03274fdb074f4946 is not running
Error response from daemon: Cannot kill container: 491e7f8fe3d7: Container 491e7f8fe3d7cdb27d8a44f443480a8c9a4e95f027acf4424a298a0d34dc5ce3 is not running
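The "Cannot kill container ... is not running" responses at the start and end of the log come from a teardown step that kills every container on the agent, including ones that have already exited. A tolerant sketch of such a cleanup (an assumption about the teardown, not the actual Bamboo task):

    # kill only containers that are still running, so stopped ones do not error
    docker ps -q | xargs -r docker kill
    # then remove everything, running or not
    docker ps -aq | xargs -r docker rm -f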