GUI acceptance tests using an environment deployed from packages.

Build: #2261 failed

Job: Chrome transfers multi browser tests failed

Stages & jobs

  1. Acceptance Test

Job result summary

Completed
Duration: 4 minutes
Revision: 32a8637a9a01b830c75d008c42ff27e7892513dd
Total tests: 7
Fixed in: #2262 (Child of ODSRV-OPRPM-2186)

Configuration changes

The job 'Chrome transfers multi browser tests' (key ODSRV-GAPT-CTMBT2) no longer exists.

Tests

  • 7 tests in total
  • 7 tests failed
  • 7 failures are new
  • 2 minutes taken in total

New test failures: 7

Status  Test  Duration
Failed  test_oneprovider_transfers_multi  test_nonspaceowner_successfully_schedules_eviction_if_he_got_transfer_management_privileges[1oz_2op_deployed]
18 secs
tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
request = <FixtureRequest for <Function 'test_nonspaceowner_successfully_schedules_eviction_if_he_got_transfer_management_privileges[1oz_2op_deployed]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: 
(47 more lines...)
Failed  test_oneprovider_transfers_multi  test_nonspaceowner_successfully_schedules_replication_if_he_got_transfer_management_privileges[1oz_2op_deployed]
1 sec
(same HTTPBadRequest badValueToken error and pytest_bdd traceback as the first failure above)
Failed  test_oneprovider_transfers_multi  test_nonspaceowner_successfully_views_transfers_tab_in_menu_if_he_got_view_transfers_privilege[1oz_2op_deployed]
1 sec
(same HTTPBadRequest badValueToken error and pytest_bdd traceback as the first failure above)
Failed  test_oneprovider_transfers_multi  test_user_migrates_directory_with_2_files_on_different_providers_to_current_provider[1oz_2op_deployed]
1 sec
(same HTTPBadRequest badValueToken error and pytest_bdd traceback as the first failure above)
Failed  test_oneprovider_transfers_multi  test_user_migrates_file_from_remote_provider_to_current_provider[1oz_2op_deployed]
1 sec
(same HTTPBadRequest badValueToken error and pytest_bdd traceback as the first failure above)
Failed  test_oneprovider_transfers_multi  test_user_replicates_directory_with_2_files_on_different_providers_to_current_provider[1oz_2op_deployed]
1 sec
(same HTTPBadRequest badValueToken error and pytest_bdd traceback as the first failure above)
Failed  test_oneprovider_transfers_multi  test_user_replicates_file_from_remote_provider_to_current_provider[1oz_2op_deployed]
1 min
(same HTTPBadRequest badValueToken error and pytest_bdd traceback as the first failure above)
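All seven failures share a single root cause: the Oneprovider node rejects the access token used during setup with a badValueToken error, so each scenario aborts before exercising the GUI. A minimal sketch of how a test-utils layer typically turns such a reply into the HTTPBadRequest seen in the traceback (the exception name mirrors tests.utils.http_exceptions; the raise_for_onedata_errors helper is hypothetical):

    import requests


    class HTTPBadRequest(Exception):
        """Mirrors tests.utils.http_exceptions.HTTPBadRequest."""


    def raise_for_onedata_errors(response: requests.Response) -> requests.Response:
        # Hypothetical helper: surface the server's JSON error document --
        # e.g. the nested badValueToken payload quoted above -- as a typed
        # exception instead of a bare requests.HTTPError.
        if response.status_code == 400:
            raise HTTPBadRequest(f"[400] Bad Request: {response.text}")
        response.raise_for_status()
        return response

A token the server "could not understand" during setup usually points at the environment bootstrap rather than at the scenarios themselves, which matches the deployment errors collected below.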

Error summary

The build generated some errors. See the full build log for more details.

(curl progress meter: 4653 bytes downloaded)
Error response from daemon: Cannot kill container: 6ce3f94e0ea0: Container 6ce3f94e0ea09591fc470e681cce545e4df0d6dd42591c1618770a4774d2e175 is not running
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CTMBT2/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Error: could not find tiller
Error: could not find tiller
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
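The "could not find tiller" errors come from Helm 2, whose CLI needs its server-side tiller pod running in the cluster, and the pre-existing "serviceaccounts-cluster-admin" clusterrolebinding suggests leftovers from an earlier run on the same agent. A quick hedged check for the tiller deployment (kube-system is Helm 2's default tiller namespace; that it applies to this cluster is an assumption):

    import subprocess

    # Helm 2 installs its server component as the "tiller-deploy" deployment.
    # If this lookup comes back empty, helm commands fail exactly as above.
    result = subprocess.run(
        ["kubectl", "-n", "kube-system", "get", "deployment", "tiller-deploy"],
        capture_output=True, text=True,
    )
    print(result.stdout or result.stderr)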
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
(the rsync failure above repeats 18 times in the log, with one more duplicate of the "serviceaccounts-cluster-admin" clusterrolebinding error interleaved)
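Per the rsync man page, "is your shell clean?" means the remote shell injected unexpected bytes into the protocol stream, and exit code 126 (command found but not executable) fits rsync failing to start inside the pods. A hedged reproduction, assuming the copies run over kubectl exec into the Kraków pod named in the hostnames above:

    import subprocess

    # Launch a no-op in the pod the same way rsync would be started and
    # count the bytes that come back; anything non-zero corrupts the rsync
    # handshake. The pod name is taken from the hostnames reported above.
    result = subprocess.run(
        ["kubectl", "exec", "dev-oneprovider-krakow-0", "--", "/bin/true"],
        capture_output=True,
    )
    print(len(result.stdout), "unexpected bytes; exit code", result.returncode)

The cp failure for onedata/one_env/sources_info.yaml just below is consistent with these copies never completing.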
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
(curl progress meter: 4653 bytes downloaded)
Error from server (NotFound): pods "dev-oneprovider-paris-0" not found
Error response from daemon: Cannot kill container: 65f62bcdb677: Container 65f62bcdb67734156b7e9d696f88e1b4402e65aa04390852ede4ec43344f2bcd is not running
Error response from daemon: Cannot kill container: a8875e02c83a: Container a8875e02c83a8cd62788f2cdf50d8933bd5e6464721121a53adc9a1ccea5e738 is not running
Error response from daemon: Cannot kill container: 2c85460ce6fa: Container 2c85460ce6fa82573788184d4ee710051269222278ac909469d1085811c0f31a is not running
Error response from daemon: Cannot kill container: 109b88039218: Container 109b880392183567b8a71d436573b5b358deabc1631a57e4655a58cf4068cff1 is not running
Error response from daemon: Cannot kill container: 2c4bf029df34: Container 2c4bf029df34ba7844e64189e1a2b8ff79cb5192bb218c80debb1361e0032b21 is not running
Error response from daemon: Cannot kill container: b993b08ee74b: Container b993b08ee74b786af6a86c225ed2e0a6f866b26a0ee19f8f4e532ad01ee398e1 is not running
Error response from daemon: Cannot kill container: f78e2c327bf6: Container f78e2c327bf6b1ebd02246d4510e8231bbce14d6c182aabce6aa4a57524adb1b is not running
Error response from daemon: Cannot kill container: 130bb98b7e5d: Container 130bb98b7e5dfb5bed1edd625ad1882d306b07ba6ed0b307076535c49db0aacb is not running
Error response from daemon: Cannot kill container: e95b3149b19d: Container e95b3149b19d98f0f2dccbea106be126ae0e0a2efe66d10c9412525f38735307 is not running
Error response from daemon: Cannot kill container: 71ceb5125396: Container 71ceb51253969858936ebb7c2df5c85bf8c3051c54f2921b367df0ea4227c11a is not running
Error response from daemon: Cannot kill container: 99aa70d23955: Container 99aa70d2395501d898e35f66756cb3047897ac33140203138fc2b4bb47e971ba is not running
Error response from daemon: Cannot kill container: ab3531226fcc: Container ab3531226fcc2d99493b42221b9313230eecfb129888f2546aa1903750baeee3 is not running
Error response from daemon: Cannot kill container: d6cc64a3aede: Container d6cc64a3aedea413e9d29c92571657f2525583c7b5b79c8f1ef32397df40a457 is not running
Error response from daemon: Cannot kill container: 72191574fd0f: Container 72191574fd0f3b9b7b057c571996bb9118389f3a5dea0baccf1b7df277d8b450 is not running
Error response from daemon: Cannot kill container: 595e6fe97850: Container 595e6fe978504ceaf9301b73a65fd7715bcf085dd62626477548a64b90167353 is not running
Error response from daemon: Cannot kill container: 68b779c27977: Container 68b779c279772262fd7c8e8333b0035a75599f4034dd5945946fb165dbcd86d3 is not running
Error response from daemon: Cannot kill container: 75acd5b1607d: Container 75acd5b1607d8da23f96e8de4bc4bac24e6641c4467fc6498de23e2542dee139 is not running
Error response from daemon: Cannot kill container: e170c6726d04: Container e170c6726d04a6bb7dbd4ac97f971163280d3c203a4aaef6bc3fca43b61a8022 is not running
Error response from daemon: Cannot kill container: 7ddca1df5e75: Container 7ddca1df5e7535c11db17fc308ded0fe8ceafba5a2bb2b7bdb5750c34760de84 is not running