GUI acceptance tests using environment deployed from packages.
Build: #2261 failed
Job: Oneprovider datasets failed
Job result summary
- Completed
- Duration: 6 minutes
- Revision: 32a8637a9a01b830c75d008c42ff27e7892513dd
- Total tests: 17
- Fixed in: #2262 (Child of ODSRV-OPRPM-2186)
Tests
- 17 tests in total
- 17 tests failed
- 17 failures are new
- 2 minutes taken in total.
Status | Test | Duration | Error
---|---|---|---
Failed | test_oneprovider_datasets :: test_user_does_not_see_dataset_tag_after_removing_dataset_in_dataset_browser[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_does_not_see_dataset_tag_in_file_browser_and_see_directory_in_detached_tab_after_detaching_dataset[1oz_1op_deployed] | < 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_fails_to_delete_child_directory_after_marking_parent_directory_dataset_data_write_protection[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_fails_to_reattach_dataset_after_deleting_directory[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_both_data_and_metadata_protection_tags_on_hardlinks_if_hardlinked_files_have_these_flags_separately_set[1oz_1op_deployed] | < 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_both_data_and_metadata_protection_tags_on_hardlinks_if_hardlinked_files_inherit_these_flags_from_their_parents_separately[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_data_protection_tag_in_dataset_modal_for_hardlink_of_data_protected_file[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_dataset_in_attached_tab_after_reattaching_detached_dataset[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_dataset_in_detached_tab_after_deleting_directory[1oz_1op_deployed] | < 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_directory_tree_in_dataset_browser_after_marking_directories_as_dataset[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_directory_tree_in_detached_tab_after_detaching_directories[1oz_1op_deployed] | < 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_inherited_dataset_status_tag_after_marking_its_parent_directory_as_dataset[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_metadata_data_write_protection_toggles_checked_on_ancestors_list_in_directory_dataset_modal_after_marking_its_parent_directories[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_path_to_dataset_root_file_after_creating_datasets_and_entering_them[1oz_1op_deployed] | < 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_proper_list_of_datasets_when_their_names_have_common_prefix_and_end_with_digit[1oz_1op_deployed] | 1 sec | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_same_paths_to_detached_datasets_root_files_after_deleting_dataset_root_file_and_recreating_file_marked_as_dataset_with_the_same_name[1oz_1op_deployed] | 17 secs | HTTPBadRequest: [400] badValueToken (full traceback below)
Failed | test_oneprovider_datasets :: test_user_sees_that_file_has_dataset_tag_set_after_marking_it_as_dataset[1oz_1op_deployed] | 2 mins | HTTPBadRequest: [400] badValueToken (full traceback below)

All 17 tests fail with the same exception; only the test name in the FixtureRequest line of the traceback differs:

tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}

@pytest.mark.usefixtures(*function_args)
def scenario_wrapper(request):
>   _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: (47 more lines...)
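Every failure carries the same nested JSON error body: an errorOnNodes wrapper around badValueToken, whose tokenError is badToken ("Provided token could not be understood by the server."). When triaging reports like this one it can help to unwrap that nesting programmatically; the sketch below is a standalone illustration only, and the function name and structure are assumptions, not the actual tests.utils.http_exceptions code.

```python
import json

# Illustrative sketch (not the real test utilities): walk the nested Onedata
# error body shown above and return the innermost error id, e.g. "badToken",
# so identical failures can be recognized at a glance.
def innermost_error(body: str) -> str:
    node = json.loads(body).get("error", {})
    while True:
        details = node.get("details", {})
        # "errorOnNodes" nests the real error under details.error;
        # "badValueToken" nests it under details.tokenError.
        nested = details.get("error") or details.get("tokenError")
        if not nested:
            return node.get("id", "unknown")
        node = nested

example = ('{"error":{"id":"errorOnNodes","details":{"error":{"id":"badValueToken",'
           '"details":{"tokenError":{"id":"badToken"},"key":"token"}}}}}')
print(innermost_error(example))  # -> badToken
```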
Error summary
The build generated some errors. See the full build log for more details.
Error response from daemon: Cannot kill container: 2ab9bb8a4905: Container 2ab9bb8a4905a08e03ac541016c21d4e06782b399d75d99ed177b5ea87fffddf is not running
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-CODT/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Error: could not find tiller
Error: could not find tiller
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
command terminated with exit code 126
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
W0816 20:07:49.980681 725 warnings.go:70] policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
Error from server (NotFound): pods "dev-oneprovider-krakow-0" not found
Error response from daemon: Cannot kill container: 9b51b8ff86ea: Container 9b51b8ff86ea21a93ffe712b897648cf3d6d3908eb71f0b60d7f2baa45641a61 is not running
Error response from daemon: Cannot kill container: 4155c16ebc54: Container 4155c16ebc54bb4b53cfd5a305c97e7e8251a6c27fa388834b6570ed432e9fde is not running
Error response from daemon: Cannot kill container: 8c04401c5a07: Container 8c04401c5a07ed954f512e7c25f113345c8a57de0ff91bbebee37fe7d8f6ebe7 is not running
Error response from daemon: Cannot kill container: ef93868283c2: Container ef93868283c27e418f7661f1dd37a2199cc05f555837762232e88de0baf960b1 is not running
Error response from daemon: Cannot kill container: 3104aa81832b: Container 3104aa81832b5a72be16303b79e4077e217cffcfdeb017186251abd9aadf5af5 is not running
Error response from daemon: Cannot kill container: 54a46dabc245: Container 54a46dabc2454d6099b8606ef955ee4cac09fc1bdd8b7f2dabd19c61cad55c4f is not running
Error response from daemon: Cannot kill container: cb76489b0b3b: Container cb76489b0b3b927780577fb8c954a4093d20cddb7b7fe4f8e64d99ac92c93e07 is not running
Error response from daemon: Cannot kill container: 2298d17a9c1e: Container 2298d17a9c1ef899642e6487aaff472f47ec3d9c69820c6f75423392fd29207e is not running
Error response from daemon: Cannot kill container: f8844791b15a: Container f8844791b15a881182eec0ac2ffba5e29d7a9a8c358fd9d24ef4fbdec13bff7b is not running
Error response from daemon: Cannot kill container: fbf08b6a14b3: Container fbf08b6a14b3b240605f62d1e076dd7e166e4f50978f270654bb56b56e93081a is not running
Error response from daemon: Cannot kill container: 4629da540db3: Container 4629da540db301626f48d03bfc4220c4db45034e561cf2f348c2ddc3100414ba is not running
Error response from daemon: Cannot kill container: 19382b8d63a5: Container 19382b8d63a5ec1f65539d26386ceca5c337240fea8bfd94b7dccad8c8c5700e is not running
Error response from daemon: Cannot kill container: 0f174b221182: Container 0f174b2211826f94d7b39b5e609aa516c239e31a9ecbd133b734a7203241a7b2 is not running
Error response from daemon: Cannot kill container: 38b8b0f45a4a: Container 38b8b0f45a4a7187582eab72024274add8d0ded845bdaa6fc3891480d7232614 is not running
Error response from daemon: Cannot kill container: 234bbd30373d: Container 234bbd30373d20b13cd576a19753a99e80541090840380e517dbed56ddca8c0d is not running
Error response from daemon: Cannot kill container: 2d991b2f73e1: Container 2d991b2f73e14c883d79f8a3f9c4f408621f93fe77a224b0ca3e85693e2f6bb7 is not running
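Beyond the test failures, the log repeats a small set of infrastructure errors: the rsync "protocol version mismatch -- is your shell clean?" / exit code 126 pair, "could not find tiller", the pre-existing serviceaccounts-cluster-admin clusterrolebinding, and the missing dev-oneprovider-krakow-0 pod. A rough way to compare a future build log against this known-bad pattern is to count those signatures; the helper below is a sketch for manual triage and is not part of the build tooling.

```python
import re
import sys
from collections import Counter

# Triage sketch: count recurring error signatures taken verbatim from this report.
SIGNATURES = {
    "bad token":           r'"id":"badToken"',
    "rsync protocol":      r"rsync error: protocol incompatibility \(code 2\)",
    "exit code 126":       r"command terminated with exit code 126",
    "tiller missing":      r"Error: could not find tiller",
    "clusterrolebinding":  r'"serviceaccounts-cluster-admin" already exists',
    "pod not found":       r'pods "dev-oneprovider-krakow-0" not found',
}

def triage(log_text: str) -> Counter:
    """Return how many times each known signature appears in the log text."""
    return Counter({name: len(re.findall(pattern, log_text))
                    for name, pattern in SIGNATURES.items()})

if __name__ == "__main__":
    # Usage: python triage.py build.log
    text = open(sys.argv[1], encoding="utf-8", errors="replace").read()
    for name, count in triage(text).most_common():
        print(f"{count:4d}  {name}")
```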