Build: #1613 was successful

Job: 20.02.19 - current was successful

Job result summary

Completed
Duration: 21 minutes
Agent: bamboo-agent-ad-08
Revision: e08cf96f4a472cf11845ec3b04a704bc1a7e3253
Total tests: 2
Successful since: #1611

Tests

  • 2 tests in total
  • 20 minutes taken in total

Error summary

The build generated some errors. See the full build log for more details.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  4822  100  4822    0     0  60275      0 --:--:-- --:--:-- --:--:-- 60275
Error response from daemon: Cannot kill container: bad4031629f7: Container bad4031629f7f5e14cefd3011a84175f2a1edda18e7a3b6d98e16e8b36e6fdb9 is not running
Submodule 'automation-examples' (ssh://git@git.onedata.org:7999/vfs/automation-examples.git) registered for path 'automation-examples'
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/automation-examples'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
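
This "already exists" failure is benign on re-runs but still surfaces as an error. A minimal sketch of an idempotent variant, assuming the binding only grants cluster-admin to service accounts (the actual subjects are not visible in this log):

    # Hypothetical idempotent creation: render the binding locally and apply it,
    # so an existing "serviceaccounts-cluster-admin" binding no longer fails the step.
    kubectl create clusterrolebinding serviceaccounts-cluster-admin \
      --clusterrole=cluster-admin \
      --group=system:serviceaccounts \
      --dry-run=client -o yaml | kubectl apply -f -
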
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"66107d06-1146-4afe-a299-578180c251be", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"66107d06-1146-4afe-a299-578180c251be", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), 
VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", TerminationGracePeriodSeconds:(*int64)(0xc013a13660), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc010ad2630), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
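
The UPGRADE FAILED message comes down to its last phrase: a Job's spec.template is immutable, so Helm cannot patch dev-cross-support-job-3p-users in place. A minimal sketch of one workaround, assuming the finished Job is safe to recreate (the release and Job names are taken from the error; $CHART and any values flags are placeholders):

    # Hypothetical remediation: delete the old Job so the upgrade recreates it
    # with the new pod template instead of patching an immutable field.
    kubectl delete job dev-cross-support-job-3p-users --ignore-not-found
    helm upgrade dev "$CHART"
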
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"66107d06-1146-4afe-a299-578180c251be", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"66107d06-1146-4afe-a299-578180c251be", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), 
VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", TerminationGracePeriodSeconds:(*int64)(0xc00e576470), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc00fdaf5f0), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
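
The id failures above mean the per-user commands were exec'd before user1/user2 existed inside the oneclient container. A minimal sketch of a guarded variant, assuming the commands go through kubectl exec ($POD and the user name are placeholders):

    # Hypothetical guard: only run the per-user step once the account is present.
    if kubectl exec "$POD" -c oneclient -- id user1 >/dev/null 2>&1; then
        kubectl exec "$POD" -c oneclient -- su user1 -c 'echo ready'   # stand-in for the real per-user command
    else
        echo "user1 not provisioned yet in $POD" >&2
    fi
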
rsync: change_dir#3 "/tmp/logs" failed: No such file or directory (2)
rsync error: errors selecting input/output files, dirs (code 3) at main.c(828) [sender=3.1.3]
command terminated with exit code 3
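
rsync exits with code 3 here because /tmp/logs was never created in that pod, which aborts log collection. A minimal sketch of a more tolerant step, assuming the logs are pulled out of the oneclient container ($POD and $DEST are placeholders):

    # Hypothetical guard: make sure the source directory exists, then copy it,
    # tolerating an empty or missing log directory so teardown keeps going.
    kubectl exec "$POD" -c oneclient -- mkdir -p /tmp/logs
    kubectl cp "$POD":/tmp/logs "$DEST"/logs -c oneclient || true
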
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op/before_upgrade.upgrade_meta_test.1731521037.956627/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op/before_upgrade.upgrade_meta_test.1731521057.7705758/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op/after_upgrade.upgrade_meta_test.1731521882.2994225/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op/before_upgrade.upgrade_meta_test.1731521564.2646592/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op/before_upgrade.upgrade_meta_test.1731521594.4798646/images.yaml'
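
The cp/mv messages are artifact-collection noise: sources_info.yaml is absent in this run, and each test run tries to drop its own images.yaml onto the same destination path. A minimal sketch of a tolerant collection step, assuming these files are optional ($LOG_DIR and $RUN_DIR are placeholders):

    # Hypothetical tolerant collection of optional metadata files.
    [ -f onedata/one_env/sources_info.yaml ] && cp onedata/one_env/sources_info.yaml "$LOG_DIR"/
    # Keep one copy per run instead of fighting over a single images.yaml:
    cp "$RUN_DIR"/images.yaml "$LOG_DIR/images.$(basename "$RUN_DIR").yaml"
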
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  4822  100  4822    0     0  57404      0 --:--:-- --:--:-- --:--:-- 57404
Error response from daemon: Cannot kill container: 67ca91efce90: Container 67ca91efce906257b0f2dbdc3c3ce890f04107c75428f8f448356a1de9a12417 is not running
Error response from daemon: Cannot kill container: 3fa1972a7e90: Container 3fa1972a7e9078e8fe8471ec52c1482e407ce916dfb3b97998b43d528bb42d8e is not running
Error response from daemon: Cannot kill container: b0954a7cca63: Container b0954a7cca63ca75da307fc05762d05d828e5fe030caecf058c324ced87f9cce is not running
Error response from daemon: Cannot kill container: b877703171ef: Container b877703171ef58a76283b9c1daa009270bf98f7148b86ac3309392eb78c95db6 is not running
Error response from daemon: Cannot kill container: 4f5a30355d64: Container 4f5a30355d645d11cf486a327a1917193d4427470d7d4caad68ff0387ad971f7 is not running
Error response from daemon: Cannot kill container: e9bb0ff1c4d6: Container e9bb0ff1c4d6ea127ecaa0de0398639b360e9d51c6bf965f083da1f417cab515 is not running
Error response from daemon: Cannot kill container: 80a67259438b: Container 80a67259438bb8aebf90b14d3763d347dc9e9964fb91928873e045d5aea50109 is not running
Error response from daemon: Cannot kill container: b343b674ce84: Container b343b674ce84d8ecf53e0fd3c431ab497cc2446cbb5c766e645c6228458f6648 is not running
Error response from daemon: Cannot kill container: ee482ff72544: Container ee482ff725440f8387f5c2ac58b0d405a8a8836f76b118b8d1eaffb37ea69182 is not running
Error response from daemon: Cannot kill container: fc01db7bf008: Container fc01db7bf0083a10bf5cfe7aed339ca87d2efad7d1052f74f588c43e2594ddb4 is not running
Error response from daemon: Cannot kill container: 719a30a299af: Container 719a30a299afbc5e3b9d847552ea7520bf618ce35bec7fa8da7d216b3ab35405 is not running
Error response from daemon: Cannot kill container: 069df43e1188: Container 069df43e1188ce748e743790d140e50449993eca8d88c1b4a0da5b4bc7a199a7 is not running
Error response from daemon: Cannot kill container: f2e7fe07aa15: Container f2e7fe07aa15dfbcfbc898f9caf90e55232ea6f4af8ec0a75015f8fe2c442a54 is not running
Error response from daemon: Cannot kill container: 18336430ac22: Container 18336430ac22a253cafef007da5ce8af793515db65ebcea67c3dcd926b175d9b is not running
Error response from daemon: Cannot kill container: d5783a01098f: Container d5783a01098f785dc212d562042374f09b5944d6cc952ce9ed2458a99b40b2b6 is not running
Error response from daemon: Cannot kill container: 8006c92bb9a8: Container 8006c92bb9a8049cb7821f30cee8e01ce2be6f3663958648a2b8975847df0567 is not running
Error response from daemon: Cannot kill container: d88772e6f81b: Container d88772e6f81b9398383f35cd87ffd982fa8acdc0d302aa67253fd1a41fc6861f is not running
Error response from daemon: Cannot kill container: 123f94712960: Container 123f947129607ad7526c29ae562ce8e4c7687134d60743ca73227db5c74b96a8 is not running
Error response from daemon: Cannot kill container: df520e030db3: Container df520e030db3cc43ae154e0fe75df1b054a695681566a66260a6876f5a1d4192 is not running
Error response from daemon: Cannot kill container: 8c6e2f581a03: Container 8c6e2f581a0337e443a7fda81e9d42ba9e81dfcbaff7345499db3feeec72a42d is not running
Error response from daemon: Cannot kill container: e25e7d896679: Container e25e7d8966791954eb356233ed2c226bf0df545f65d5651b99504b211d18ae18 is not running
Error response from daemon: Cannot kill container: 135516e23e31: Container 135516e23e319245d5be291c07b3ed9baf7a63627fe8c44f59879878650b13fe is not running
Error response from daemon: Cannot kill container: b40407343a91: Container b40407343a911d0f1dd215e05246407a2f51bccf693ac048f4f2367c68da93a9 is not running
Error response from daemon: Cannot kill container: 265aeda6b5be: Container 265aeda6b5beb080a96ed3db33eafa693a1a57f0028322baddec3a894a7bcc1b is not running
Error response from daemon: Cannot kill container: c91e19874f72: Container c91e19874f72727720746b3763b88930d416cd1593d06d48d5a44d6f3770c8e1 is not running
Error response from daemon: Cannot kill container: cee9e3a6d290: Container cee9e3a6d2908f0598fc008071d35634184c029b7be37442df954d51a275d378 is not running
Error response from daemon: Cannot kill container: 570c825eceef: Container 570c825eceef48887e34ed739069a88f5ee462f1ce7f91432449238dd5a86be6 is not running
Error response from daemon: Cannot kill container: b94d0538b163: Container b94d0538b163a15b3bc76047c46ecb0ab40532facde47f1c747283fc180c894c is not running
Error response from daemon: Cannot kill container: f851dee3c4b6: Container f851dee3c4b6cb1f9884ed7469de82cecd7a515e590c0f9181c00ca49957c031 is not running
Error response from daemon: Cannot kill container: 794e4c2f1408: Container 794e4c2f140863ade43bfeb62e53a87e32dc7d4f841284e3039a991386b1a453 is not running
Error response from daemon: Cannot kill container: 8cfeb16ee83c: Container 8cfeb16ee83c327d7b9424f66a977e823fbf6e79809bb726b00eb705bc78bc48 is not running
Error response from daemon: Cannot kill container: 542f1bc23f6f: Container 542f1bc23f6f42664ceb7e78987608a40f889f9c7400b81e7854f574ae899b50 is not running
Error response from daemon: Cannot kill container: cee81d363c3c: Container cee81d363c3cc838a0f958d42a56ce1a215b521b9e25c31ec5ec0a37c4e367aa is not running
Error response from daemon: Cannot kill container: 00298e4bcc80: Container 00298e4bcc80f6eac4cfddfb7c8e19a35dac28bc70568b8bd88caf070cec64ba is not running
Error response from daemon: Cannot kill container: 574e95be8a76: Container 574e95be8a76315e8f2d3d9b4b1e626d71553d40630d09dfe2f282acc6c61f90 is not running
Error response from daemon: Cannot kill container: 228c49c2566c: Container 228c49c2566cfe23074aaf7d91bb188e5781bdc3e9d26ffd7536ffa4c010422a is not running
Error response from daemon: Cannot kill container: bd771352937c: Container bd771352937c56e51508d31a0d934fc2f1ca75f6d2147a3ab855ab440b211f29 is not running
Error response from daemon: Cannot kill container: 1ca099e0ee7f: Container 1ca099e0ee7f1f09000924fffaf6470842e3aab1bdd6e2390bef4131e577806f is not running
Error response from daemon: Cannot kill container: eeb0385b7f23: Container eeb0385b7f2357bb5f3932987a5f5f189f8b730bad2eeffb0053e6f83bfb43be is not running
Error response from daemon: Cannot kill container: 2c6d4b04765b: Container 2c6d4b04765b3cc09f6b286c5ebddebcf3a73f75c51434931e58c30ce06d2183 is not running
Error response from daemon: Cannot kill container: d1c481f76689: Container d1c481f76689def59537adf3dd4e340b550e7465ae65eaf727c3aa509de4a095 is not running
Error response from daemon: Cannot kill container: bbf1871ccdee: Container bbf1871ccdeeac8da9f9c27287ca93ff07aa5780039d5d637e5ae84f3199fdbb is not running
Error response from daemon: Cannot kill container: 5a7b550b54b8: Container 5a7b550b54b8028527943d5ec0da6d8a2da4fcfcdd96169edb53464c28a0f15e is not running
Error response from daemon: Cannot kill container: a67f86b86a1d: Container a67f86b86a1d2d417719a04f93664499643bc874a7fc29bbfe9d154f0e2e6334 is not running
Error response from daemon: Cannot kill container: 9989bd5af340: Container 9989bd5af3405da9d9c7eacb64c7429177967ee8723addf2a47906f24557dad6 is not running
Error response from daemon: Cannot kill container: 4031f2622b96: Container 4031f2622b968102d028623bdb9c1f0c78bedcbf0211594b26f76b18608312e6 is not running
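
The trailing wall of "Cannot kill container ... is not running" messages is the post-build cleanup trying to kill containers that have already exited. A minimal sketch of a quieter teardown, assuming everything left over from the build should simply be removed:

    # Hypothetical teardown: remove leftover containers and ignore the ones
    # that have already stopped on their own.
    docker ps -aq | xargs -r docker rm -f >/dev/null 2>&1 || true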