GUI acceptance tests using an environment deployed from packages.

Build: #2433 failed

Job: Onezone harvesters effective privileges failed

Stages & jobs

  1. Acceptance Test

Job result summary

  • Status: Completed
  • Duration: 23 minutes
  • Revision: 8b65125d3c634b17aaaefeb7943a7b46e5b82b38
  • Total tests: 4
  • Fixed in: #2434 (Child of ODSRV-OZP-1818)

Tests

  • 4 tests in total
  • 2 tests failed
  • 2 failures are new
  • 17 minutes taken in total
New test failures: 2

Status | Test | Duration
Failed | test_onezone_harvesters_effective_privileges test_user_sees_that_user_effective_privileges_are_the_sum_of_its_direct_parent_direct_privileges_and_its_direct_privileges[1oz_1op_elasticsearch] | 3 mins
RuntimeError: no  item found in MembersItemRow in MembersList in MembersPage in DiscoveryPage in Onezone page
web_elem_root = <selenium.webdriver.remote.webelement.WebElement (session="44b684ef907e363286349953d56da48d", element="30a0c4b3-68bd-48c5-8067-c347ffaf1a49")>
css_sel = '.record-name-general'
err_msg = 'no  item found in MembersItemRow in MembersList in MembersPage in DiscoveryPage in Onezone page'

    def find_web_elem(web_elem_root, css_sel, err_msg):
        try:
            _scroll_to_css_sel(web_elem_root, css_sel)
(306 more lines...)
Failed | test_onezone_harvesters_effective_privileges test_user_sees_that_user_effective_privileges_are_the_sum_of_its_direct_parents_direct_privileges[1oz_1op_elasticsearch] | 4 mins
RuntimeError: no  item found in MembersItemRow in MembersList in MembersPage in DiscoveryPage in Onezone page
web_elem_root = <selenium.webdriver.remote.webelement.WebElement (session="62d842271c574586f884c8d6e65c19a0", element="df38d80c-1db8-41e3-8177-482af0443327")>
css_sel = '.record-name-general'
err_msg = 'no  item found in MembersItemRow in MembersList in MembersPage in DiscoveryPage in Onezone page'

    def find_web_elem(web_elem_root, css_sel, err_msg):
        try:
            _scroll_to_css_sel(web_elem_root, css_sel)
(306 more lines...)
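Both new failures raise the same RuntimeError from find_web_elem when no element matching `.record-name-general` is present under the members list. One common mitigation for this kind of flaky lookup is to poll for the element with a timeout instead of failing on the first miss. A minimal, framework-agnostic sketch (the helper name, timeout values, and retry policy are assumptions, not part of the actual test suite):

```python
import time

def find_with_retry(find_fn, err_msg, timeout=10.0, interval=0.5):
    """Call find_fn repeatedly until it returns a truthy result.

    Raises RuntimeError with err_msg once timeout (seconds) expires,
    mirroring the 'no item found ...' failure shown in the log above.
    """
    deadline = time.monotonic() + timeout
    while True:
        result = find_fn()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise RuntimeError(err_msg)
        time.sleep(interval)
```

In the Selenium context above, find_fn would wrap a lookup on the root web element that returns an empty result while the members list is still rendering, so a slow-loading row no longer fails the test immediately.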

Error summary

The build generated some errors. See the full build log for more details.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  4822  100  4822    0     0  45065      0 --:--:-- --:--:-- --:--:-- 45065
Error response from daemon: Cannot kill container: 1bbb5ebde631: Container 1bbb5ebde631fec5539e49ac9a31152ea17e360b4b40b2e6eda377b1e5d2dbee is not running
Error response from daemon: remove 1f7559c0d063e80853d1ae97ad6e30432879b1a528ff1d29aefe8e48a58ab24b: volume is in use - [752aca98b2e8398fe2aad4bc9e0439db3a2253f34134fa0bb4ab7d9342e174bb]
Error response from daemon: remove 4931fce1b39a59c38ce67aae77ff77395b8c106bd1df4f20f3239ad9663d6cc4: volume is in use - [752aca98b2e8398fe2aad4bc9e0439db3a2253f34134fa0bb4ab7d9342e174bb]
I1202 11:20:58.914996 3364747 out.go:286] Setting OutFile to fd 1 ...
I1202 11:20:58.915161 3364747 out.go:333] TERM=unknown,COLORTERM=, which probably does not support color
I1202 11:20:58.915169 3364747 out.go:299] Setting ErrFile to fd 2...
I1202 11:20:58.915178 3364747 out.go:333] TERM=unknown,COLORTERM=, which probably does not support color
I1202 11:20:58.915408 3364747 root.go:312] Updating PATH: /root/.minikube/bin
I1202 11:20:58.916803 3364747 cli_runner.go:115] Run: docker ps -a --filter label=name.minikube.sigs.k8s.io --format {{.Names}}
I1202 11:20:58.984464 3364747 delete.go:228] DeleteProfiles
I1202 11:20:58.984499 3364747 delete.go:256] Deleting minikube
I1202 11:20:58.984541 3364747 delete.go:261] minikube configuration: &{Name:minikube KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.25@sha256:6f936e3443b95cd918d77623bf7b595653bb382766e280290a02b4a349e88b79 Memory:3600 CPUs:2 DiskSize:20000 VMDriver: Driver:none HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.99.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.21.3 ClusterName:minikube Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[{Component:kubelet Key:resolv-conf Value:/home/bamboo/.minikube-kubelet-resolv.conf} {Component:kubelet Key:eviction-hard Value:imagefs.available<5%,nodefs.available<5%,}] ShouldLoadCachedImages:false EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name:m01 IP:10.87.23.69 Port:8443 KubernetesVersion:v1.21.3 ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: MultiNodeRequested:false}
I1202 11:20:58.991557 3364747 out.go:165] * Uninstalling Kubernetes v1.21.3 using kubeadm ...
I1202 11:20:58.991647 3364747 host.go:66] Checking if "minikube" exists ...
I1202 11:20:58.992193 3364747 exec_runner.go:52] Run: systemctl --version
I1202 11:20:58.995069 3364747 exec_runner.go:52] Run: docker ps --filter status=paused --filter=name=k8s_ --format={{.ID}}
W1202 11:20:59.059935 3364747 pause.go:91] no paused containers found
I1202 11:20:59.060003 3364747 exec_runner.go:52] Run: sudo systemctl daemon-reload
I1202 11:20:59.498508 3364747 exec_runner.go:52] Run: sudo systemctl start kubelet
I1202 11:20:59.526842 3364747 exec_runner.go:52] Run: /bin/bash -c "sudo env PATH=/var/lib/minikube/binaries/v1.21.3:$PATH kubeadm reset --cri-socket /var/run/dockershim.sock --force"
I1202 11:21:03.608638 3364747 exec_runner.go:85] Completed: /bin/bash -c "sudo env PATH=/var/lib/minikube/binaries/v1.21.3:$PATH kubeadm reset --cri-socket /var/run/dockershim.sock --force": (4.081719581s)
I1202 11:21:03.608726 3364747 exec_runner.go:52] Run: sudo systemctl stop -f kubelet
I1202 11:21:03.629022 3364747 exec_runner.go:52] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
W1202 11:21:03.700213 3364747 none.go:130] unable to get port: "minikube" does not appear in /root/.kube/config
I1202 11:21:03.700269 3364747 api_server.go:164] Checking apiserver status ...
I1202 11:21:03.700315 3364747 exec_runner.go:52] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
W1202 11:21:03.720005 3364747 api_server.go:168] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: exit status 1
stdout:

stderr:
I1202 11:21:03.720085 3364747 exec_runner.go:52] Run: sudo systemctl is-active --quiet service kubelet
I1202 11:21:03.814899 3364747 out.go:165] * Deleting "minikube" in none ...
I1202 11:21:03.815069 3364747 exec_runner.go:52] Run: sudo systemctl stop -f kubelet
I1202 11:21:03.836341 3364747 exec_runner.go:52] Run: docker ps -a --filter=name=k8s_ --format={{.ID}}
I1202 11:21:03.903257 3364747 none.go:185] Removing: [/var/tmp/minikube /etc/kubernetes/manifests /var/lib/minikube]
I1202 11:21:03.903347 3364747 exec_runner.go:52] Run: sudo rm -rf /var/tmp/minikube /etc/kubernetes/manifests /var/lib/minikube
I1202 11:21:03.974194 3364747 out.go:165] * Removed all traces of the "minikube" cluster.
*
! The 'none' driver is designed for experts who need to integrate with an existing VM
* Most users should use the newer 'docker' driver instead, which does not require root!
* For more information, see: https://minikube.sigs.k8s.io/docs/reference/drivers/none/
*
! kubectl and minikube configuration will be stored in /root
! To use kubectl or minikube commands as your own user, you may need to relocate them. For example, to overwrite your own settings, run:
*
  - sudo mv /root/.kube /root/.minikube $HOME
  - sudo chown -R $USER $HOME/.kube $HOME/.minikube
*
* This can also be done automatically by setting the env var CHANGE_MINIKUBE_NONE_USER=true
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COHEP/onedata-acceptance/onezone_swagger/bamboos'...
/usr/local/lib/python3.8/dist-packages/pytest_selenium/drivers/crossbrowsertesting.py:72: SyntaxWarning: "is not" with a literal. Did you mean "!="?
  if report.when == 'setup' or info.get('test_score') is not 'fail':
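The SyntaxWarning flagged above comes from comparing against a string literal with `is not`, which tests object identity rather than equality; whether two equal strings are the same object is a CPython interning detail, so the condition is unreliable. The intended comparison is `!=`. A small illustration (the dict contents are hypothetical):

```python
info = {'test_score': 'fail'}

# Buggy form (from the warning):  info.get('test_score') is not 'fail'
# `is not` compares object identity; whether equal strings share one
# object depends on interning, so the result is unreliable and Python
# emits a SyntaxWarning for identity checks against literals.

# Correct form: compare values with `!=`
should_skip = info.get('test_score') != 'fail'
```

The same fix applies anywhere `is`/`is not` is used against a string or numeric literal; reserve identity checks for singletons such as `None`.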
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
W1202 11:40:09.863363     730 warnings.go:70] policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
cp: cannot stat 'onedata-acceptance/one_env/sources_info.yaml': No such file or directory
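The cp failure above means onedata-acceptance/one_env/sources_info.yaml was never produced by an earlier build step. If the file is genuinely optional, the copy can be guarded so a missing file is logged as a warning instead of an error; a hedged shell sketch (the destination and the optionality of the file are assumptions):

```shell
# Guarded copy: skip cleanly when sources_info.yaml was not generated.
src=onedata-acceptance/one_env/sources_info.yaml
if [ -f "$src" ]; then
  cp "$src" .
else
  echo "warning: $src not found; skipping copy" >&2
fi
```

If the file is expected to always exist, the right fix is instead in the step that should generate it; the guard above only keeps a known-optional artifact from polluting the error summary.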
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  4822  100  4822    0     0  45490      0 --:--:-- --:--:-- --:--:-- 45490
Error response from daemon: Cannot kill container: 22aebc8dd255: Container 22aebc8dd2557abec8e4c533fbf3e4445e6588d9f5694ebbd9f34fa46be6b9eb is not running
Error response from daemon: Cannot kill container: 88733e559106: Container 88733e55910608944f6c1c98c97dec833d7ffb968a2cc11b5861f040dc06609c is not running
Error response from daemon: Cannot kill container: c71cec80e40d: Container c71cec80e40df55f675061fff7196ad4842259e00586b26dfc6d41de061f5dff is not running
Error response from daemon: Cannot kill container: b0997b96e945: Container b0997b96e945dbbcf3eee41e420c90357c558e1b9f0b1bcb04fa1860895f8e81 is not running
Error response from daemon: Cannot kill container: 824042341f6e: Container 824042341f6e1b0bdc160517a175745faad4fd9d84703f919864823a9a73d5f1 is not running
Error response from daemon: Cannot kill container: 05e38d599e7a: Container 05e38d599e7a74589a733c0a432504d5edf7990162719b1f9fa19d6085a634e6 is not running
Error response from daemon: Cannot kill container: b5bb733685b5: Container b5bb733685b5cdcce577b7b70a7cb0ec87168aba6f6eed6869b70ed2a0d7e5b1 is not running
Error response from daemon: Cannot kill container: 46ee7605e275: Container 46ee7605e27519916637168dbfcfe72c11ce19fbd8e3e5757af544e1a740df4c is not running
Error response from daemon: Cannot kill container: a6c6c5bfbc3c: Container a6c6c5bfbc3c92c724d38ceb2a91cda80ad69ee1d2f75080c32ebd44bd4e5011 is not running
Error response from daemon: Cannot kill container: d45999701471: Container d459997014710763cc79cb580302a38d6660ac5e184aebd3a95a5ca46b232077 is not running
Error response from daemon: Cannot kill container: ccb8bde5a320: Container ccb8bde5a320a0805951cffcbd4c57ff11eec34c6fcdb94dbc0a372e59a6cb7d is not running
Error response from daemon: Cannot kill container: 7d3ba3b1f737: Container 7d3ba3b1f73784cee4282ad271c4096157faf5b773d9cb2c780cbde49199603a is not running
Error response from daemon: Cannot kill container: c0af7894ead1: Container c0af7894ead1ee7937d2d9f141bbd6235b090ef000b7520bd670de0970c2fbc7 is not running
Error response from daemon: Cannot kill container: 837341c44731: Container 837341c447310c6d30969fe86bba044cdc5ed7d1264bb4e86494cb77abe1f779 is not running
Error response from daemon: Cannot kill container: e82afa4c9655: Container e82afa4c96557d41ea7b42286fef57665b480714c125cbb23e0a0625ef73874d is not running
Error response from daemon: Cannot kill container: b025440b03e8: Container b025440b03e8fa64c2de44c041d4302230f6675566661b8a0e3aed75b3a4673f is not running