Build: #1615 was successful

Job: 20.02.19 - current was successful

Job result summary

Completed
Duration
26 minutes
Agent
bamboo-agent-os-13
Revision
e08cf96f4a472cf11845ec3b04a704bc1a7e3253
Total tests
2
Successful since
#1613

Tests

  • 2 tests in total
  • 24 minutes taken in total

Error summary

The build generated some errors. See the full build log for more details.

Error response from daemon: Cannot kill container: c6c35a05b007: Container c6c35a05b007e8d68ce68e27d55455563d03f189c55e52fd64ac8a153aedfc7e is not running
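The "Cannot kill container ... is not running" message is harmless at teardown (the container already exited), but it can be silenced by checking the container state before killing. A sketch, using the container ID from the log; the label filter and cleanup context are assumptions:

```shell
# Only kill containers that are actually still running; skip exited ones
for id in c6c35a05b007; do
  if [ "$(docker inspect -f '{{.State.Running}}' "$id" 2>/dev/null)" = "true" ]; then
    docker kill "$id"
  fi
done
```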
Submodule 'automation-examples' (ssh://git@git.onedata.org:7999/vfs/automation-examples.git) registered for path 'automation-examples'
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/automation-examples'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-UP-U20026DEV/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
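`kubectl create` fails when the object already exists. Piping a client-side dry-run through `kubectl apply` makes the step idempotent. The binding name is taken from the log; the role and group arguments are assumptions:

```shell
# Idempotent creation: apply instead of create, so "already exists" is not an error
kubectl create clusterrolebinding serviceaccounts-cluster-admin \
  --clusterrole=cluster-admin --group=system:serviceaccounts \
  --dry-run=client -o yaml | kubectl apply -f -
```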
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
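Helm's `coalesce.go` warnings mean a values override sets `external-volumes` (and the related keys) to a non-map value while the chart's defaults declare them as maps (tables). Keeping the map type in the override silences the warnings. A sketch; the file name and exact keys to set are assumptions based on the warning text:

```shell
# Write an override that preserves the map type the chart defaults expect
cat > values-override.yaml <<'EOF'
external-volumes: {}
external-volume-mounts: {}
external-storages: {}
EOF
```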
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"72bfd77b-596d-4738-a6d8-f3c8b839368c", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"72bfd77b-596d-4738-a6d8-f3c8b839368c", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", 
TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", 
TerminationGracePeriodSeconds:(*int64)(0xc00a16eb70), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc00f8f6f30), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-storages (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volumes (map[])
coalesce.go:200: warning: cannot overwrite table with non table for external-volume-mounts (map[])
Error: UPGRADE FAILED: cannot patch "dev-cross-support-job-3p-users" with kind Job: Job.batch "dev-cross-support-job-3p-users" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"dev", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"dev-cross-support-job-3p", "batch.kubernetes.io/controller-uid":"72bfd77b-596d-4738-a6d8-f3c8b839368c", "batch.kubernetes.io/job-name":"dev-cross-support-job-3p-users", "chart":"cross-support-job-3p", "component":"cross-support-job-3p-users", "controller-uid":"72bfd77b-596d-4738-a6d8-f3c8b839368c", "dependency-level":"4", "heritage":"Helm", "job":"dev-cross-support-job-3p-users", "job-name":"dev-cross-support-job-3p-users", "release":"dev"}, Annotations:map[string]string{"version":"0.2.18-rc78"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"wait-for-release", Image:"groundnuty/k8s-wait-for:v2.0", Command:[]string(nil), Args:[]string{"pod", "-l release in (dev), chart notin (cross-support-job-3p, oneclient, jupyter-notebook)\n"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar(nil), Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", 
TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"cross-support-job-3p", Image:"onedata/rest-cli:20.02.12", Command:[]string{"bash", "-c", "set -e; echo \"-k\" > ~/.curlrc ; echo \"-f\" >> ~/.curlrc ; export KEYCLOAK_VARS_INITIALIZED=\"False\" ;\nexit 0;\n"}, Args:[]string(nil), WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource(nil), Env:[]core.EnvVar{core.EnvVar{Name:"ONEZONE_HOST", Value:"https://dev-onezone", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_BASIC_AUTH", Value:"onepanel:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEZONE_BASIC_AUTH", Value:"admin:password", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPROVIDER_HOST", Value:"https://dev-oneprovider", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"ONEPANEL_HOST", Value:"https://dev-onezone:9443", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"KEYCLOAK_HOST", Value:"http://dev-keycloak.default.svc.cluster.local", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"TERM", Value:"xterm-256color", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil), Claims:[]core.ResourceClaim(nil)}, ResizePolicy:[]core.ContainerResizePolicy(nil), RestartPolicy:(*core.ContainerRestartPolicy)(nil), VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"Never", 
TerminationGracePeriodSeconds:(*int64)(0xc00f574800), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc00d437680), ImagePullSecrets:[]core.LocalObjectReference{}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil), OS:(*core.PodOS)(nil), SchedulingGates:[]core.PodSchedulingGate(nil), ResourceClaims:[]core.PodResourceClaim(nil)}}: field is immutable
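The "field is immutable" failure above is expected Kubernetes behavior: a Job's `spec.template` cannot be patched after creation, so `helm upgrade` cannot modify the existing Job. Deleting the stale Job first lets the upgrade recreate it. The Job name comes from the log; the release name and chart path are assumptions:

```shell
# Jobs are immutable once created; remove the old one so helm can recreate it
kubectl delete job dev-cross-support-job-3p-users --ignore-not-found
helm upgrade dev ./cross-support-job-3p
```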
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user1’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
id: ‘user2’: no such user
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
command terminated with exit code 1
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
Defaulted container "oneclient" out of: oneclient, wait-for-onezone (init), wait-for-token-dispenser (init)
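The repeated `id: ‘user1’: no such user` / `id: ‘user2’: no such user` failures mean the test users were never provisioned inside the oneclient container, so every command run as those users exits with code 1. A pre-flight check gives a clearer error; the pod and container names here are assumptions based on the log:

```shell
# Fail fast with a readable message if the test users are missing
for u in user1 user2; do
  kubectl exec deploy/dev-oneclient -c oneclient -- id "$u" >/dev/null 2>&1 \
    || echo "test user $u is not provisioned in the oneclient container" >&2
done
```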
rsync: change_dir#3 "/tmp/logs" failed: No such file or directory (2)
rsync error: errors selecting input/output files, dirs (code 3) at main.c(828) [sender=3.1.3]
command terminated with exit code 3
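rsync exit code 3 ("errors selecting input/output files") here means the sender-side source directory `/tmp/logs` did not exist. Creating it before the log-collection step avoids the failure. A sketch of the fix, with the rsync invocation itself shown as a comment since the real destination is not in the log:

```shell
# Ensure the log directory exists on the sender before collection runs
mkdir -p /tmp/logs
# ... rsync -a /tmp/logs/ <destination>/ would now find the directory
```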
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
Defaulted container "oneprovider" out of: oneprovider, wait-for-volume-s3-init (init), onezone-registration-token (init)
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
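`sources_info.yaml` is evidently not produced in every run, so the unconditional `cp` surfaces as an error. Guarding the copy keeps the log clean; the paths are taken from the log and the directories below are stand-ins created for illustration:

```shell
# Guard an optional artifact so its absence is a note, not an error
mkdir -p onedata/one_env onedata/tests/upgrade/logs
if [ -f onedata/one_env/sources_info.yaml ]; then
  cp onedata/one_env/sources_info.yaml onedata/tests/upgrade/logs/
else
  echo "sources_info.yaml not present; skipping copy"
fi
```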
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op/before_upgrade.upgrade_meta_test.1732044630.4654696/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/1op/before_upgrade.upgrade_meta_test.1732044650.1477342/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op/after_upgrade.upgrade_meta_test.1732045652.6141431/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op/before_upgrade.upgrade_meta_test.1732045335.0176504/images.yaml'
mv: will not overwrite just-created 'onedata/tests/upgrade/logs/images.yaml' with 'onedata/tests/upgrade/logs/2op/before_upgrade.upgrade_meta_test.1732045364.4258323/images.yaml'
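The "will not overwrite just-created" messages come from `mv` running in no-clobber mode while several per-run directories all contain an `images.yaml`; only the first file survives. Giving each collected copy a unique name avoids the conflict. A sketch; the directory names below are stand-ins for the timestamped run directories in the log:

```shell
# Collect one images.yaml per run under a unique name instead of clobbering
mkdir -p run_a run_b merged
touch run_a/images.yaml run_b/images.yaml   # stand-ins for the real files
i=0
for src in run_a/images.yaml run_b/images.yaml; do
  i=$((i + 1))
  mv -n "$src" "merged/images.$i.yaml"
done
```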
Error response from daemon: Cannot kill container: 192fe183f56f: Container 192fe183f56fa70d4cbfaf5b2b6ff0f05c5268904a559d740b449f8bcb59795b is not running
Error response from daemon: Cannot kill container: 7c3c3153a078: Container 7c3c3153a078692b30e477d391647d40b374033c9e47ba78dcc33caea3b8f274 is not running
Error response from daemon: Cannot kill container: 1f4cb54e5619: Container 1f4cb54e5619947985f7843c56dc27da722ac6f756a578022fbbe4ce0296acfe is not running
Error response from daemon: Cannot kill container: 271e9e6a22f3: Container 271e9e6a22f3daf437407655b188d1f8ffe152866ff21520aad83dc49f49a3d5 is not running
Error response from daemon: Cannot kill container: dcb507b03bb3: Container dcb507b03bb3ede3a93039b8287033d56ebda6e0c234acc2050836280d78d127 is not running
Error response from daemon: Cannot kill container: 2cb36511a2aa: Container 2cb36511a2aa56bbe453afd5c8dcaaf547bb4fd11d2cf2640c2b01474a3badbd is not running
Error response from daemon: Cannot kill container: 543b556f575c: Container 543b556f575c686cebdd304b6db180f9e185b94226e7e007333008ff5cc896dd is not running
Error response from daemon: Cannot kill container: 6f31f1b2dbaa: Container 6f31f1b2dbaa47c510f18e1f6450a9bc1b55c55b40d842ee66d7d8c813841fb2 is not running
Error response from daemon: Cannot kill container: 0355a231c833: Container 0355a231c833971da11a74ed9538837e66ecea24b2745d701d49921323a0b4a1 is not running
Error response from daemon: Cannot kill container: 0229cc89de5f: Container 0229cc89de5f4f9342cb650d1f449665387c2c080ef453e3489b668cc280397a is not running
Error response from daemon: Cannot kill container: 74f93e6720a0: Container 74f93e6720a0d580254521688b176ef75d65ca20c38e68b9e6dfea338a992671 is not running
Error response from daemon: Cannot kill container: 96aa68f67254: Container 96aa68f672540648db6ac78c0432fd462e2923f0b7b0ab18561eb046b4d383e1 is not running
Error response from daemon: Cannot kill container: fd91fbf65f15: Container fd91fbf65f15521ef9beaa675470a6ec32644b52d0c3155db7e4ab1efaebcb81 is not running
Error response from daemon: Cannot kill container: 6c9c54b29ce9: Container 6c9c54b29ce91776661e06461227933f33a4927b95532e4757cad8b73b6099ad is not running
Error response from daemon: Cannot kill container: 529f34b315a9: Container 529f34b315a9801539a315000bc9c340b516200b2ebeff896ab95c5f9973ad84 is not running
Error response from daemon: Cannot kill container: 3c5fee4d4c8e: Container 3c5fee4d4c8e828e489b6e09ef97fffeb38415367e0e2f3d4596d07004c02b5e is not running
Error response from daemon: Cannot kill container: 926f185be1ce: Container 926f185be1ce0813ee22c0976cdbfda13ac349ce3062e183b55598bb5261597b is not running
Error response from daemon: Cannot kill container: a8e8f5cde248: Container a8e8f5cde2488ff31c09003dfc799cc12dc34730c6d573a06829fec410254ca4 is not running
Error response from daemon: Cannot kill container: de3e3f7157ba: Container de3e3f7157ba0e7aa9294cf7c0abd8eb8634a0c009ef78b06cd71c86a8abcc27 is not running
Error response from daemon: Cannot kill container: c744e600d610: Container c744e600d610472f87429be97c1e741c0b00212bf699781ddb4100ec62e284e3 is not running