GUI acceptance tests using environment deployed from packages.
Build: #2261 failed
Job: Onezone spaces memberships privileges failed
Job result summary
- Status: Completed
- Duration: 5 minutes
- Revision: 32a8637a9a01b830c75d008c42ff27e7892513dd
- Total tests: 12
- Fixed in: #2262 (Child of ODSRV-OPRPM-2186)
Tests
- 12 tests in total
- 12 tests failed
- 12 failures are new
- 2 minutes taken in total
All 12 tests belong to test_onezone_spaces_memberships_privileges and failed with the same error.

Status | Test | Duration
---|---|---
Failed | test_appropriate_tabs_are_disabled_after_removing_some_of_user_privileges[1oz_1op_deployed] | 1 sec
Failed | test_nonspaceowner_fails_to_generate_group_invite_token_because_of_lack_in_privileges[1oz_1op_deployed] | 2 secs
Failed | test_nonspaceowner_generates_group_invite_token_to_join_space[1oz_1op_deployed] | 2 secs
Failed | test_nonspaceowner_successfully_creates_directory_if_he_got_write_files_privilege[1oz_1op_deployed] | 1 sec
Failed | test_nonspaceowner_successfully_creates_share_if_he_got_manage_shares_privilege[1oz_1op_deployed] | 1 sec
Failed | test_nonspaceowner_successfully_generates_space_invite_token_if_he_got_user_management_privilege[1oz_1op_deployed] | 14 secs
Failed | test_nonspaceowner_successfully_views_data_if_he_got_read_files_privilege[1oz_1op_deployed] | 2 secs
Failed | test_user_fails_to_generate_space_invite_token_because_of_lack_in_privileges[1oz_1op_deployed] | 2 secs
Failed | test_user_fails_to_remove_group_from_space_without_remove_group_privileges[1oz_1op_deployed] | 2 secs
Failed | test_user_fails_to_remove_other_user_from_given_space_because_of_lack_in_privileges[1oz_1op_deployed] | 2 secs
Failed | test_user_fails_to_see_privileges_of_another_user_until_he_is_granted_all_privileges_by_becoming_an_owner[1oz_1op_deployed] | 2 mins
Failed | test_user_sees_and_modifies_privileges_to_his_space[1oz_1op_deployed] | 2 secs

Every test raised the identical exception; only the test name inside the FixtureRequest differs:

tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}

request = <FixtureRequest for <Function '...[1oz_1op_deployed]'>>

@pytest.mark.usefixtures(*function_args)
def scenario_wrapper(request):
>   _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: (47 more lines...)
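The badToken rejection happens while the environment is being configured, before any GUI interaction, which is why every scenario dies in setup at the same point. A minimal sketch of how a [400] reply surfaces as this exception and how the failing call could be replayed outside the suite; the endpoint URL, the X-Auth-Token header, and the exception shape are assumptions for illustration, not the suite's actual code:

```python
# Hypothetical repro helper -- names mirror the report, not the suite's code.
import requests


class HTTPBadRequest(Exception):
    """Wraps a [400] reply the way the report shows it: '[400] Bad Request: <body>'."""
    def __init__(self, body):
        super().__init__('[400] Bad Request: {}'.format(body))


def post_with_token(url, token, payload):
    # The panel rejects the request at token-decoding time ("badToken"),
    # so the payload is irrelevant when the token itself is malformed.
    resp = requests.post(
        url,
        headers={'X-Auth-Token': token},  # assumption: header-based token auth
        json=payload,
        verify=False,  # dev clusters typically use self-signed certificates
    )
    if resp.status_code == 400:
        raise HTTPBadRequest(resp.text)
    resp.raise_for_status()
    return resp
```

Replaying the exact call with the token the fixture generated would distinguish a genuinely malformed token from one minted against a stale deployment (e.g. a token issued before the cluster was redeployed).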
Error summary
The build generated some errors. See the full build log for more details.
Error response from daemon: Cannot kill container: f3a6b884159a: Container f3a6b884159a174e37d2ed45d0f7be8d5062d4c3f7ac73fc6579d5e42b47d0fd is not running
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COSMP/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Error: could not find tiller
Error: could not find tiller
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
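Both errors above come from environment setup, not the tests: the clusterrolebinding survives from a previous run on the same agent, and Helm 2's tiller is queried before it is installed. A sketch of an idempotent version of that step, assuming kubectl 1.18+ and Helm 2 (the exact flags the build uses are not visible in the log):

```python
# Idempotent cluster preparation sketch (assumes kubectl >= 1.18 and Helm 2).
import subprocess


def sh(cmd):
    print('+', cmd)
    subprocess.run(cmd, shell=True, check=True)


# 'kubectl create' fails when the object already exists; rendering the same
# object with --dry-run and piping it through 'kubectl apply' succeeds both
# on first creation and on re-runs.
sh('kubectl create clusterrolebinding serviceaccounts-cluster-admin'
   ' --clusterrole=cluster-admin --group=system:serviceaccounts'
   ' --dry-run=client -o yaml | kubectl apply -f -')

# Helm 2 only: install tiller (assumes a 'tiller' service account exists) and
# block until it is ready, so the next 'helm' call cannot race it
# ("could not find tiller").
sh('helm init --service-account tiller --wait')
```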
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
(the four lines above repeat 6 times in a row)
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
(the same four-line rsync error then repeats 6 more times)
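Exit code 126 means the remote command could not even be invoked, and "protocol version mismatch" means something wrote to the stream before rsync's own handshake bytes. The rsync man page's standard check is to run the remote shell with a no-op and confirm it emits nothing; a sketch, assuming an ssh-style transport (the build most likely tunnels rsync through kubectl exec, which follows the same rule):

```python
# Shell-cleanliness check from the rsync man page, scripted for Python 3.6+.
import subprocess


def remote_shell_is_clean(transport_cmd):
    """transport_cmd e.g. ['ssh', 'host'] or ['kubectl', 'exec', '-i', 'pod', '--']."""
    r = subprocess.run(transport_cmd + ['/bin/true'],
                       stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    # Any stray bytes (a motd, a banner, an echo in a dotfile) are read by
    # rsync as protocol data and produce exactly this "protocol version
    # mismatch" failure.
    return r.returncode == 0 and r.stdout == b''


print(remote_shell_is_clean(['ssh', 'dev-oneprovider-krakow-0']))  # hostname from the log
```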
W0816 20:06:08.137756 739 warnings.go:70] policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
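The PodDisruptionBudget warning is only a deprecation notice here, but it becomes a hard failure once the cluster reaches Kubernetes 1.25. A one-off migration sketch over rendered manifests, assuming PyYAML is available; the glob path is illustrative:

```python
# Move PodDisruptionBudget manifests from policy/v1beta1 to policy/v1.
import glob
import yaml  # assumes PyYAML is installed

for path in glob.glob('rendered-manifests/**/*.yaml', recursive=True):
    with open(path) as f:
        docs = list(yaml.safe_load_all(f))
    changed = False
    for d in docs:
        if d and d.get('kind') == 'PodDisruptionBudget' \
               and d.get('apiVersion') == 'policy/v1beta1':
            # spec.minAvailable / spec.maxUnavailable carry over unchanged.
            d['apiVersion'] = 'policy/v1'
            changed = True
    if changed:
        with open(path, 'w') as f:
            yaml.safe_dump_all(docs, f)
```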
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
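The cp failure is a symptom, not a cause: sources_info.yaml is only produced when the one_env deployment finishes, which it did not. A guard keeps the artifact-collection step from adding noise; everything except the source path is illustrative:

```python
# Collect sources_info.yaml only if the deployment step produced it.
import os
import shutil

src = 'onedata/one_env/sources_info.yaml'
if os.path.isfile(src):
    os.makedirs('artifacts', exist_ok=True)  # destination is illustrative
    shutil.copy(src, 'artifacts/')
else:
    print('skipping {}: not generated (deployment failed earlier)'.format(src))
```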
Error from server (NotFound): pods "dev-oneprovider-krakow-0" not found
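The NotFound error comes from teardown querying a pod that was already deleted (or was never scheduled). kubectl's --ignore-not-found flag, or an existence probe, silences it; a sketch:

```python
# Probe for a pod before exec/logs so teardown does not spam NotFound errors.
import subprocess


def pod_exists(name, namespace='default'):
    # 'kubectl get --ignore-not-found' prints nothing and exits 0 when the
    # object is absent, so empty output means "gone".
    r = subprocess.run(
        ['kubectl', 'get', 'pod', name, '-n', namespace,
         '--ignore-not-found', '-o', 'name'],
        stdout=subprocess.PIPE)
    return bool(r.stdout.strip())


if pod_exists('dev-oneprovider-krakow-0'):
    subprocess.run(['kubectl', 'logs', 'dev-oneprovider-krakow-0'])
```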
Error response from daemon: Cannot kill container: cafe2648cb38: Container cafe2648cb38d77dd16226f19a837eeae2ffa924a96c647a48625eba4bc0e396 is not running
Error response from daemon: Cannot kill container: 11eb67e8a0df: Container 11eb67e8a0dfc3115856ab0cef81892ea17c25fc42bfb441b0761440202d911d is not running
Error response from daemon: Cannot kill container: 667ef93e73f4: Container 667ef93e73f4adf9446c1daca2bf4534227319c848321915bc4f84b7ccb9eaf5 is not running
Error response from daemon: Cannot kill container: f188758fe6c1: Container f188758fe6c132f1b9c9f1763ce09d6b9a038b1072537def3357ece63fb211cf is not running
Error response from daemon: Cannot kill container: 3e41af41ac8d: Container 3e41af41ac8d0f13fd7d4d1fd7dd1e88dee43eee65be903add08b55c7c9eca4d is not running
Error response from daemon: Cannot kill container: ec05c7b9e8eb: Container ec05c7b9e8ebccc028ab66b0bb29dca795d51fbdfd98a7c5b14e48c6cbe48bc9 is not running
Error response from daemon: Cannot kill container: 628deadde4b1: Container 628deadde4b1395abe92c664cb6e4ae9aae1b0faa9888f2c0682bba45b79a6ae is not running
Error response from daemon: Cannot kill container: ef8d9bf4b4f7: Container ef8d9bf4b4f7d8fdcf625eefb5efd785f332c51a72c2147849816db5f73d9902 is not running
Error response from daemon: Cannot kill container: c848b16feff7: Container c848b16feff7b9589527f0f6baa745d34abf92ff782fe9397ad390e3d93cfdae is not running
Error response from daemon: Cannot kill container: 43e001b7f7a5: Container 43e001b7f7a5593bd3fe5b736e2c477e4f2ed418157f42d3e2ffe32e3a9153e2 is not running
Error response from daemon: Cannot kill container: 5d78d1a7c5f6: Container 5d78d1a7c5f6e4a53b3ddbf605869a4be3fd65a894757d62dbc0f8cb10886ae7 is not running
Error response from daemon: Cannot kill container: 9c1c22a6fae7: Container 9c1c22a6fae7efb34cdf182af7b7e6837db35d0adaf9572cf32c87a0b25299fb is not running
Error response from daemon: Cannot kill container: b7216daee70f: Container b7216daee70f2a155adb1d48228bc0ce3d084221246394b159097aa123a5d389 is not running
Error response from daemon: Cannot kill container: ec5e805383a2: Container ec5e805383a21e41473641f7a5792d75bbf89f8160e96e348b15615575a1feb5 is not running
Error response from daemon: Cannot kill container: ed4648dbf5c7: Container ed4648dbf5c72150a45d53744b76d15479382636873789787ae9c5f8c74dcb75 is not running
Error response from daemon: Cannot kill container: 2e36b77648c9: Container 2e36b77648c907fb04a99ae7b6e6104180f52ca57964cd6d7c0a1504e705572b is not running
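The long run of "Cannot kill container" messages is teardown noise: the containers had already exited by the time docker kill reached them. docker rm -f removes a container whether or not it is still running, so a quieter cleanup looks like this (IDs taken from the log, list truncated):

```python
# Remove build containers regardless of state; 'docker rm -f' kills first
# if needed and tolerates containers that already exited.
import subprocess

containers = ['f3a6b884159a', 'cafe2648cb38', '11eb67e8a0df']  # from the log, truncated
for cid in containers:
    subprocess.run(['docker', 'rm', '-f', cid],
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
```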