GUI acceptance tests using an environment deployed from packages.

Build: #2261 failed

Job: Oneprovider archives incremental and nested failed

Stages & jobs

  1. Acceptance Test

Job result summary

Status: Completed
Duration: 4 minutes
Revision: 32a8637a9a01b830c75d008c42ff27e7892513dd
Total tests: 8
Fixed in: #2262 (Child of ODSRV-OPRPM-2186)

Tests

  • 8 tests in total
  • 8 tests failed
  • 8 failures are new
  • 1 minute taken in total
New test failures: 8
Status / Test / Duration
Failed  test_oneprovider_archives_incremental_and_nested  test_user_creates_incremental_archive_that_has_chosen_base_archive[1oz_1op_deployed]
< 1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request:
{
  "error": {
    "id": "errorOnNodes",
    "details": {
      "hostnames": ["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],
      "error": {
        "id": "badValueToken",
        "details": {
          "tokenError": {
            "id": "badToken",
            "description": "Provided token could not be understood by the server."
          },
          "key": "token"
        },
        "description": "Bad value: provided \"token\" is not a valid token (see details)."
      }
    },
    "description": "Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."
  }
}
request = <FixtureRequest for <Function 'test_user_creates_incremental_archive_that_has_chosen_base_archive[1oz_1op_deployed]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed  test_oneprovider_archives_incremental_and_nested  test_user_sees_name_of_base_archive_after_creating_incremental_archive[1oz_1op_deployed]
< 1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request (identical badValueToken/badToken error and pytest-bdd traceback as the first failure)
Failed  test_oneprovider_archives_incremental_and_nested  test_user_sees_real_directory_tree_of_downloaded_tar_generated_for_nested_archive[1oz_1op_deployed]
13 secs
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request (identical badValueToken/badToken error and pytest-bdd traceback as the first failure)
Failed  test_oneprovider_archives_incremental_and_nested  test_user_sees_symbolic_links_on_child_datasets_after_creating_nested_archive_on_parent[1oz_1op_deployed]
< 1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request (identical badValueToken/badToken error and pytest-bdd traceback as the first failure)
Failed  test_oneprovider_archives_incremental_and_nested  test_user_sees_that_dataset_has_more_archives_than_its_parent_after_creating_nested_archive_on_child_dataset[1oz_1op_deployed]
< 1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request (identical badValueToken/badToken error and pytest-bdd traceback as the first failure)
Failed  test_oneprovider_archives_incremental_and_nested  test_user_sees_that_files_that_did_not_change_since_creating_last_archive_have_2_hardlinks_tag_after_creating_new_incremental_archive[1oz_1op_deployed]
1 min
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request (identical badValueToken/badToken error and pytest-bdd traceback as the first failure)
Failed  test_oneprovider_archives_incremental_and_nested  test_user_sees_that_files_that_did_not_change_since_creating_last_two_base_archives_have_3_hardlinks_tag_after_creating_new_incremental_archive[1oz_1op_deployed]
< 1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request (identical badValueToken/badToken error and pytest-bdd traceback as the first failure)
Failed  test_oneprovider_archives_incremental_and_nested  test_user_sees_that_the_base_archive_in_create_archive_modal_is_the_latest_created_archive_after_enabling_incremental_toggle[1oz_1op_deployed]
< 1 sec
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request (identical badValueToken/badToken error and pytest-bdd traceback as the first failure)
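All eight failures share a single root cause: every REST call made during test setup was rejected by the Oneprovider node with a badValueToken/badToken error, meaning the access token sent by the tests could not be parsed by the server at all (typically a stale, truncated, or wrongly formatted token rather than a flaw in the scenarios themselves). For illustration only, a minimal sketch of digging the token error out of the nested Onedata error envelope shown above (plain JSON parsing; the helper name is hypothetical, not part of the test suite):

    import json

    # Abridged response body, as reported in the failures above.
    body = json.loads("""
    {"error": {"id": "errorOnNodes",
               "details": {"error": {"id": "badValueToken",
                                     "details": {"tokenError": {"id": "badToken"},
                                                 "key": "token"}}}}}
    """)

    def nested_token_error(body):
        # Walk the errorOnNodes envelope down to the inner token error id.
        inner = body["error"]["details"]["error"]
        if inner["id"] == "badValueToken":
            return inner["details"]["tokenError"]["id"]
        return None

    print(nested_token_error(body))  # -> badToken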

Error summary

The build generated some errors. See the full build log for more details.

[curl transfer progress output omitted: 4,653 bytes downloaded]
Error response from daemon: Cannot kill container: 9a84640ffbe0: Container 9a84640ffbe041762fe33d02b1e7eab8a3d9ae4c1830cedb49ce5cba5ff49a73 is not running
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAIN/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Error: could not find tiller
Error: could not find tiller
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
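Two bootstrap problems are visible here: "kubectl create clusterrolebinding" is not idempotent, so re-running against a cluster that already has the binding fails with "already exists", and "could not find tiller" means Helm 2's server-side component (tiller) is not deployed in the target cluster. A sketch of a tolerant bootstrap step, for illustration only (helper names and the serviceaccount/clusterrole values are hypothetical, not the actual one-env code):

    import subprocess

    def create_clusterrolebinding_idempotent(name, clusterrole, serviceaccount):
        # Create the binding, but treat "already exists" as success on reruns.
        result = subprocess.run(
            ["kubectl", "create", "clusterrolebinding", name,
             "--clusterrole=" + clusterrole,
             "--serviceaccount=" + serviceaccount],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True,
        )
        if result.returncode != 0 and "already exists" not in result.stderr:
            raise RuntimeError("kubectl create failed: " + result.stderr.strip())

    def tiller_is_deployed():
        # Helm 2 installs tiller as the "tiller-deploy" deployment in kube-system.
        result = subprocess.run(
            ["kubectl", "get", "deployment", "tiller-deploy", "-n", "kube-system"],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE,
        )
        return result.returncode == 0

    create_clusterrolebinding_idempotent(
        "serviceaccounts-cluster-admin", "cluster-admin", "default:default")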
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
[the rsync error above occurs 6 times in this span; every occurrence except the first is followed by the exit-code line]
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
[this four-line block occurs 6 times in this span]
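The repeated rsync failure means something in the remote command path is writing extra bytes into the stream rsync uses for its protocol handshake; "command terminated with exit code 126" suggests these transfers run through kubectl exec and the remote command was found but could not be executed. The classic diagnostic, per the rsync man page, is to confirm a remote no-op produces no output at all; a sketch under that assumption (pod name taken from the hostnames in the log, for illustration):

    import subprocess

    def remote_is_clean(pod):
        # A clean transport yields zero bytes of output for a remote no-op,
        # mirroring the rsync man page's `ssh remotehost /bin/true > out.dat` check.
        result = subprocess.run(
            ["kubectl", "exec", pod, "--", "/bin/true"],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE,
        )
        return result.returncode == 0 and result.stdout == b""

    print(remote_is_clean("dev-oneprovider-krakow-0"))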
W0816 20:14:04.587513     733 warnings.go:70] policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
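The cp failure is a downstream symptom rather than an independent error: because the deployment never completed, one_env never wrote sources_info.yaml, so the artifact-collection step had nothing to copy. A copy step tolerant of optional artifacts might look like this sketch (source path taken from the log; the helper and destination are illustrative):

    import shutil
    from pathlib import Path

    def copy_optional_artifact(src, dst):
        # Warn instead of failing when an optional build artifact is absent.
        src = Path(src)
        if src.exists():
            shutil.copy(str(src), dst)
        else:
            print("warning: optional artifact %s not found; skipping" % src)

    copy_optional_artifact("onedata/one_env/sources_info.yaml", ".")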
[curl transfer progress output omitted: 4,653 bytes downloaded]
Error response from daemon: Cannot kill container: 57a95c101347: Container 57a95c10134756043f2e0d9bf11918d04c3160fe39c59799c0f37b2bf25374ce is not running
Error response from daemon: Cannot kill container: 360967bf0e10: Container 360967bf0e10015370db56b13f794d492315039f9282e4c2b3381aaf7407ad96 is not running
Error response from daemon: Cannot kill container: 937c771eaf19: Container 937c771eaf190c6e00e4e2ae89edce918990d21e1a31339d96f106646d7e1753 is not running
Error response from daemon: Cannot kill container: cf8f44da8890: Container cf8f44da88903c5062ff6f2f6cafee08db69e5aa18bff84285c75ab1f4f84caa is not running
Error response from daemon: Cannot kill container: c1b6695278d0: Container c1b6695278d034a2182cb9631e6f2a1684ef324fe7163ae6cd5dab983b76df5c is not running
Error response from daemon: Cannot kill container: 1d1be6b79221: Container 1d1be6b792219ae9ee6413492210239d6800e2f3cfadd4a349179212d279d423 is not running
Error response from daemon: Cannot kill container: 672130c193ce: Container 672130c193ce6f8d088a9b6675a791a833ada3590a208b23305578d92e99eec9 is not running
Error response from daemon: Cannot kill container: 99aca8893d53: Container 99aca8893d5399475509afd2dc2d62a9d4df609b626f259899e3b98caa230a01 is not running
Error response from daemon: Cannot kill container: f2aaa351808c: Container f2aaa351808c02260c33691f10baff844245b448c0506537e6806979a5cc6378 is not running
Error response from daemon: Cannot kill container: 220c9c01c712: Container 220c9c01c712c354150e0c56f91a7a99112bee454fb64c2d97ce7b579996a290 is not running
Error response from daemon: Cannot kill container: 92a536ca5439: Container 92a536ca5439d5d3f8ae16c4fb90e47e70fdfca4249a29f603a71ed63162c6bd is not running
Error response from daemon: Cannot kill container: ee3024425800: Container ee3024425800960a9880eba7eccce6dfcdf94419f608cd115cc1a3bf38487fea is not running
Error response from daemon: Cannot kill container: 9249d8ca7897: Container 9249d8ca7897d8bfbbdd59e5257ef269ffdf9eaef2913fc8cdfecfc43bfaa57a is not running
Error response from daemon: Cannot kill container: f6d436f2ccfd: Container f6d436f2ccfd7d0dc0e42e0c8d7688a1626fdff56aa5c9087420234d12339488 is not running
Error response from daemon: Cannot kill container: 998e8964033b: Container 998e8964033bdeca1188e701be6ce9665c2e5008d7898124d2bdbcf2b5992170 is not running
Error response from daemon: Cannot kill container: e31a28154507: Container e31a28154507a3f00fe51512740a5fc9230b385b679a564ec046d463d076bef7 is not running
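The trailing "Cannot kill container ... is not running" messages come from the build teardown issuing docker kill against containers that have already exited; they are noise rather than a distinct failure. A teardown sketch that probes container state first (IDs taken from the log; the loop is illustrative):

    import subprocess

    def kill_if_running(container_id):
        # Only kill containers that are still running; skip ones that exited.
        probe = subprocess.run(
            ["docker", "inspect", "-f", "{{.State.Running}}", container_id],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True,
        )
        if probe.returncode == 0 and probe.stdout.strip() == "true":
            subprocess.run(["docker", "kill", container_id])

    for cid in ["9a84640ffbe0", "57a95c101347"]:
        kill_if_running(cid)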