Build: #1613 was successful

Job: 20.02.1 - current was successful

Job result summary

Completed
Duration: 21 minutes
Agent: bamboo-agent-ad-09
Revision: e08cf96f4a472cf11845ec3b04a704bc1a7e3253
Total tests: 2
Successful since: #1609

Tests

  • 2 tests in total
  • 20 minutes taken in total

Error summary

The build generated some errors. See the full build log for more details.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  4822  100  4822    0     0  60275      0 --:--:-- --:--:-- --:--:-- 60275
Error response from daemon: Cannot kill container: 112d334660c4: Container 112d334660c42859b952fde4eb6041c3efdd37b822f5ff1ef3e6ecec306775b3 is not running
Submodule 'automation-examples' (ssh://git@git.onedata.org:7999/vfs/automation-examples.git) registered for path 'automation-examples'
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/automation-examples'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-RFCD/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"e252f8d0-d474-4df3-95d4-7559b33c4bc8", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"e252f8d0-d474-4df3-95d4-7559b33c4bc8", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), 
VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", TerminationGracePeriodSeconds:(*int64)(0xc00ed2fdc0), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc00cbe5830), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"e252f8d0-d474-4df3-95d4-7559b33c4bc8", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"e252f8d0-d474-4df3-95d4-7559b33c4bc8", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), 
VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", TerminationGracePeriodSeconds:(*int64)(0xc0104207e0), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc00f0c14d0), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
rsync: change_dir#3 "/tmp/logs" failed: No such file or directory (2)
rsync error: errors selecting input/output files, dirs (code 3) at main.c(828) [sender=3.1.3]
command terminated with exit code 3
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731520961.689328/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731520981.4083748/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op-compatibility-overlay/after_upgrade.upgrade_meta_test.1731521777.0974734/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731521456.6366632/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op-compatibility-overlay/before_upgrade.upgrade_meta_test.1731521486.372124/images.yaml'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  4822  100  4822    0     0  58804      0 --:--:-- --:--:-- --:--:-- 58804
Error response from daemon: Cannot kill container: fb4b0d8d5cfc: Container fb4b0d8d5cfce001c77011b1d832536f00308ae2e247b13938877d8579be2616 is not running
Error response from daemon: Cannot kill container: 84fae848dfdf: Container 84fae848dfdfc62d15da2fc644ac9742cdd5b0a55ea680c361af80c1f928298d is not running
Error response from daemon: Cannot kill container: 7aef81abcbce: Container 7aef81abcbce00e94f08c99c097a514aa8eeb355593134fc60bcb79438826a93 is not running
Error response from daemon: Cannot kill container: dcbc5f699d20: Container dcbc5f699d20004a970734cc273a65e5fde1187320c44e72399af53b3eccabef is not running
Error response from daemon: Cannot kill container: 8f65da473dfb: Container 8f65da473dfbccf9a2890c9fa0d72c277a43358e547242a8fa992f14c051d817 is not running
Error response from daemon: Cannot kill container: 43b71d5e67ab: Container 43b71d5e67ab0f7c4a9466aba949d756a24700def02aa978daed8fb784b72ac7 is not running
Error response from daemon: Cannot kill container: 31fa5860493e: Container 31fa5860493e01e923c809d1722f032a452e8a2a2bd655816a2c9ce3066c2fd1 is not running
Error response from daemon: Cannot kill container: 61bc2874f014: Container 61bc2874f0141dc2bbd81ff861864b1926ae419f59b9ab7bdf3906ec836862a3 is not running
Error response from daemon: Cannot kill container: fd3b18b1f0e0: Container fd3b18b1f0e057b28d16f35262ad50fe1e30d0dee10ae3e4f93bbdd3d0307201 is not running
Error response from daemon: Cannot kill container: f82fbf452bec: Container f82fbf452becc28a56402cee1d86039af5d11a69fa875735aa93926927e346d8 is not running
Error response from daemon: Cannot kill container: f197b8bc0cbb: Container f197b8bc0cbb1158bcd5cedf68ee784f2c28aeae9c17af5d7ca04f3a1618b088 is not running
Error response from daemon: Cannot kill container: 046e3ac179f7: Container 046e3ac179f7dc73a06753becff704ed6f5e81f6b321c49eb21d4e368d0845ce is not running
Error response from daemon: Cannot kill container: 9829ceafe9bc: Container 9829ceafe9bc3d991a4f21c7b873fb05dcc9ab4f875e32e96a7be74ee1023f96 is not running
Error response from daemon: Cannot kill container: eceb55e9f383: Container eceb55e9f383f6b64cf66206f15f3955272988a60f94a3069d86b6deb4065315 is not running
Error response from daemon: Cannot kill container: 3fb0fba7b63a: Container 3fb0fba7b63a4dda916681f19b4d43f7362dcc8fa49c9adce72399fef0a914a7 is not running
Error response from daemon: Cannot kill container: b5fcc2ec7f43: Container b5fcc2ec7f43741fe7e832ff1731cf70bad15cd4f724defc8e97320ce81c8013 is not running
Error response from daemon: Cannot kill container: 2cf292f3eb46: Container 2cf292f3eb4601af9fa644cb952e9ac177fee5dc015e5972f0c16b481ea0c0a7 is not running
Error response from daemon: Cannot kill container: 675ff2cbaaab: Container 675ff2cbaaab3c866419a4f3b1de07081beb8dc64a5ecc256895c7363102f22e is not running
Error response from daemon: Cannot kill container: 4a720df99a0e: Container 4a720df99a0ecc30dd2c6755f3cfc6f09abeb3742db7a80ea774933ac9e41a5b is not running
Error response from daemon: Cannot kill container: 71305736b25e: Container 71305736b25ef955ffe5bbca9c2ae3e6863fcbbf46e609bede0ae2b722d50cf1 is not running
Error response from daemon: Cannot kill container: f11e07614b65: Container f11e07614b65fbf41068d56e667efab27f2aa7918cb060f6464fc3e9e1275cd5 is not running
Error response from daemon: Cannot kill container: c7ac8879e05d: Container c7ac8879e05d7bc4cd0200b80f8985d0c14e0e41fcb7ac51f514d8cf1ee4a753 is not running
Error response from daemon: Cannot kill container: 98110297ab32: Container 98110297ab32eceebcf1a46be57d61d7497ae21f4b5995a6d5b2a4b9ba571f49 is not running
Error response from daemon: Cannot kill container: c20c9bcfdfdd: Container c20c9bcfdfdd69145c5be29ee0ba14ae75837e231f9c58821048f6221e7a3faf is not running
Error response from daemon: Cannot kill container: 6a945df3f76e: Container 6a945df3f76ec5d7c503c5aa65f58368a7289bb599d69f430c132b3c3c9a9b58 is not running