GUI acceptance tests using an environment deployed from packages.

Build: #2261 failed

Job: Oneprovider ACL remove file failed

Stages & jobs

  1. Acceptance Test

Job result summary

  • Completed
  • Duration: 5 minutes
  • Revision: 32a8637a9a01b830c75d008c42ff27e7892513dd
  • Total tests: 8
  • Fixed in: #2262 (Child of ODSRV-OPRPM-2186)

Tests

  • 8 tests in total
  • 8 tests failed
  • 8 failures are new
  • 2 minutes taken in total.
New test failures: 8
Status Test Duration
Failed test_oneprovider_acl_files test_read_files_acl[1oz_1op_deployed-group-group1-fails-all except [acl:read acl]]
15 secs
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_read_files_acl[1oz_1op_deployed-group-group1-fails-all except [acl:read acl]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed test_oneprovider_acl_files test_read_files_acl[1oz_1op_deployed-group-group1-succeeds-[acl:read acl]]
1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_read_files_acl[1oz_1op_deployed-group-group1-succeeds-[acl:read acl]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed test_oneprovider_acl_files test_read_files_acl[1oz_1op_deployed-user-user1-fails-all except [acl:read acl]]
1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_read_files_acl[1oz_1op_deployed-user-user1-fails-all except [acl:read acl]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed test_oneprovider_acl_files test_read_files_acl[1oz_1op_deployed-user-user1-succeeds-[acl:read acl]]
1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_read_files_acl[1oz_1op_deployed-user-user1-succeeds-[acl:read acl]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed test_oneprovider_acl_files test_rename_file[1oz_1op_deployed-group-group1-fails-all except [general:delete]]
1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_rename_file[1oz_1op_deployed-group-group1-fails-all except [general:delete]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed test_oneprovider_acl_files test_rename_file[1oz_1op_deployed-group-group1-succeeds-[general:delete]]
1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_rename_file[1oz_1op_deployed-group-group1-succeeds-[general:delete]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed test_oneprovider_acl_files test_rename_file[1oz_1op_deployed-user-user1-fails-all except [general:delete]]
1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_rename_file[1oz_1op_deployed-user-user1-fails-all except [general:delete]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed test_oneprovider_acl_files test_rename_file[1oz_1op_deployed-user-user1-succeeds-[general:delete]]
2 mins
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_rename_file[1oz_1op_deployed-user-user1-succeeds-[general:delete]]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
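
All eight failures share the same root cause: the setup REST call is rejected with HTTP 400 before any scenario step runs, and the innermost error is badToken ("Provided token could not be understood by the server"), wrapped in badValueToken and errorOnNodes. The ACL steps themselves are never reached. A minimal sketch of unwrapping such a nested Onedata error payload to surface the root cause (the helper name is hypothetical, not part of the test suite):

    import json

    # Hedged sketch (not part of the test suite): unwrap the nested Onedata
    # error payload reported by all eight failures, so the innermost cause is
    # shown instead of the outer "errorOnNodes" wrapper.
    def innermost_error(body: str) -> str:
        error = json.loads(body)["error"]
        # Each wrapper nests the real cause under details.error.
        while isinstance(error.get("details"), dict) and "error" in error["details"]:
            error = error["details"]["error"]
        return "{}: {}".format(error.get("id"), error.get("description", ""))

    # Abridged payload from the failures above:
    body = ('{"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0"],'
            '"error":{"id":"badValueToken","details":{"key":"token"},'
            '"description":"Bad value: provided \\"token\\" is not a valid token (see details)."}},'
            '"description":"..."}}')
    print(innermost_error(body))
    # -> badValueToken: Bad value: provided "token" is not a valid token (see details).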

Error summary

The build generated some errors. See the full build log for more details.

Error response from daemon: Cannot kill container: 9b56ccb14e4b: Container 9b56ccb14e4b9aa1558f95f33f2d406f4c4ed4aae3ff7481e6ffd58471a4c01a is not running
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COARF/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Error: could not find tiller
Error: could not find tiller
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
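
The "serviceaccounts-cluster-admin" failures mean the clusterrolebinding is left over from a previous run on the same cluster, and "could not find tiller" points at Helm: tiller is Helm 2's server-side component, so the deployment scripts appear to expect a Helm 2 setup that is not present. A hedged sketch of making the clusterrolebinding creation idempotent (the --group value is an assumption inferred from the binding's name, not taken from the build scripts):

    import subprocess

    # Hedged sketch: create-or-update the clusterrolebinding so a rerun on a
    # shared cluster does not fail with "already exists". The subject
    # (--group=system:serviceaccounts) is an assumption inferred from the name.
    manifest = subprocess.run(
        ["kubectl", "create", "clusterrolebinding", "serviceaccounts-cluster-admin",
         "--clusterrole=cluster-admin", "--group=system:serviceaccounts",
         "--dry-run=client", "-o", "yaml"],
        capture_output=True, check=True,
    ).stdout
    subprocess.run(["kubectl", "apply", "-f", "-"], input=manifest, check=True)
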
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
(the same rsync protocol-mismatch error and exit code 126 repeated five more times)
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
(the same rsync protocol-mismatch error and exit code 126 repeated five more times)
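
Each rsync failure is the standard "is your shell clean?" complaint: rsync aborts when the remote side of the transport emits any output before the protocol handshake, and the accompanying exit code 126 conventionally means the invoked command could not be executed. A hedged diagnostic sketch, assuming the transfer is tunnelled through kubectl exec into the pod named in the failures above (namespace and container left at their defaults):

    import subprocess

    # Hedged sketch (per the rsync man page's suggested check): run a no-op
    # command over the same transport and confirm it produces no stray output,
    # which would corrupt the rsync protocol handshake.
    result = subprocess.run(
        ["kubectl", "exec", "dev-oneprovider-krakow-0", "--", "/bin/true"],
        capture_output=True,
    )
    if result.stdout or result.stderr:
        print("transport is not clean; stray output would break rsync:")
        print((result.stdout + result.stderr).decode(errors="replace"))
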
W0816 20:05:35.272090     723 warnings.go:70] policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
Error from server (NotFound): pods "dev-oneprovider-krakow-0" not found
Error response from daemon: Cannot kill container: d75172f2f28a: Container d75172f2f28a6a7955f70ee8778125598f743d4e35c6f05e7e32fb3d87099f95 is not running
Error response from daemon: Cannot kill container: fdbe3031baeb: Container fdbe3031baeb78006eab378aca85dc697b29f182960c0fb8be99704789ab80c8 is not running
Error response from daemon: Cannot kill container: c78c0201bbfd: Container c78c0201bbfdf905f65f92e36c9e24de98f5c1406d6d0eac3322310983a7828c is not running
Error response from daemon: Cannot kill container: 9c0e5012174c: Container 9c0e5012174c48ddd508f7e48c4a1e7933fff76d722cb8ad65efdefa87502fb5 is not running
Error response from daemon: Cannot kill container: 406b66ddb1a5: Container 406b66ddb1a5907bc227467619e7743fa4cdb8a5067b77e74dc82f4484ccd210 is not running
Error response from daemon: Cannot kill container: f5e3594543c0: Container f5e3594543c0c3d64750ec5a6184b288fc6139848a225ec2aee7da47cc0bd033 is not running
Error response from daemon: Cannot kill container: 2ead3cbd1306: Container 2ead3cbd1306682da23a9f5ee9b2115cbaa3ab7051364632317a839e31272f9a is not running
Error response from daemon: Cannot kill container: e30780ef92be: Container e30780ef92be7f8a72a6e2b23726d7c7ebdc3af1657f9fe4dbd7b44bbc075396 is not running
Error response from daemon: Cannot kill container: a222ae0a2b46: Container a222ae0a2b466f360304f0cd01bccaa8420cedc575710c9ed85bfbb718b6e75c is not running
Error response from daemon: Cannot kill container: 741a93209d3a: Container 741a93209d3a69f39b0600cca84bebfce21b275d9dcd2c53cc16266da6e1cbc2 is not running
Error response from daemon: Cannot kill container: b7be571b9f49: Container b7be571b9f4993e0bd459fefef1711d823da33c485f0b1715ceb93afab1c90fb is not running
Error response from daemon: Cannot kill container: f23c8cb9ce74: Container f23c8cb9ce74f4018e74fdd598c5beccfc4018253bfe3c47f9a34c80605f60aa is not running
Error response from daemon: Cannot kill container: 0b4ac94a0ab2: Container 0b4ac94a0ab2c584720712f6179fe483f6d6313a401698b79473a5c2d481104f is not running
Error response from daemon: Cannot kill container: d9b976f4180b: Container d9b976f4180bfa382199bd354de90ed4b4230d80c99cbddf4885685cf3e4587c is not running
Error response from daemon: Cannot kill container: 5a7f46938201: Container 5a7f46938201ff403a886c5d582bdc91091c8ee7516b86d13b7b18ada89fcd2f is not running
Error response from daemon: Cannot kill container: bcacb76bec91: Container bcacb76bec919ad1fa78333de6017e38b6f2a010c80be63c1a453306cd84885e is not running
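
The trailing "Cannot kill container" messages come from teardown issuing docker kill against containers that have already exited; they are noisy rather than harmful. A hedged sketch of a cleanup step that tolerates already-stopped containers (not the plan's actual teardown script; the IDs below are examples from this log):

    import subprocess

    # Hedged sketch: force-remove containers whether or not they are still
    # running; `docker rm -f` stops a running container first and does not
    # fail the way `docker kill` does on containers that have already exited.
    def remove_containers(container_ids):
        for cid in container_ids:
            subprocess.run(["docker", "rm", "-f", cid],
                           stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

    remove_containers(["9b56ccb14e4b", "d75172f2f28a"])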