GUI acceptance tests using an environment deployed from packages.
Build: #2261 failed
Job: Oneprovider ACL basic failed
Job result summary
- Completed
- Duration: 6 minutes
- Revision: 32a8637a9a01b830c75d008c42ff27e7892513dd
- Total tests: 9
- Fixed in: #2262 (Child of ODSRV-OPRPM-2186)

Tests
- 9 tests in total
- 9 tests failed
- 9 failures are new
- 3 minutes taken in total
All 9 failures occurred in test_oneprovider_acl_basic and share a single root cause: the Oneprovider node rejected the token supplied by the test environment (HTTP 400, badValueToken/badToken).

Status | Test | Duration
---|---|---
Failed | test_user_cancels_acl_editing[1oz_1op_deployed] | 1 sec
Failed | test_user_changes_order_of_acl_entries[1oz_1op_deployed-move down-first] | 22 secs
Failed | test_user_changes_order_of_acl_entries[1oz_1op_deployed-move up-second] | 1 sec
Failed | test_user_removes_acl_record[1oz_1op_deployed-[acl:read acl]-user-space-owner-user] | 1 sec
Failed | test_user_saves_acl_entries_for_user_and_group[1oz_1op_deployed] | 1 sec
Failed | test_user_sets_acl_for_multiple_directories[1oz_1op_deployed-[allow, acl:read acl]-user-space-owner-user] | 1 sec
Failed | test_user_sets_acl_for_multiple_files[1oz_1op_deployed-[allow, acl:read acl]-user-space-owner-user] | 1 sec
Failed | test_user_sets_one_acl_record_for_directory_in_edit_permissions_modal[1oz_1op_deployed-[allow, acl:read acl]-group-group1] | 2 mins
Failed | test_user_sets_one_acl_record_for_file_in_edit_permissions_modal[1oz_1op_deployed-[allow, acl:read acl]-group-group1] | 1 sec

Every test fails with the same exception, shown once below; each traceback continues for 47 more lines in the full log.

@pytest.mark.usefixtures(*function_args)
def scenario_wrapper(request):
>   _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227

tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}
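The actionable detail (badToken) is buried two levels deep in the error payload. A minimal sketch, not part of the test suite, for pulling the innermost cause out of such a response:

```python
import json

# Truncated sample of the response body from the failing requests above.
body = '''{"error":{"id":"errorOnNodes","details":{"hostnames":
["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],
"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken",
"description":"Provided token could not be understood by the server."},
"key":"token"},"description":"Bad value: provided \\"token\\" is not a valid token (see details)."}},
"description":"..."}}'''

def root_cause(error: dict) -> dict:
    """Follow nested 'details.error' links down to the innermost error."""
    while isinstance(error.get("details"), dict) and "error" in error["details"]:
        error = error["details"]["error"]
    # Token errors nest one level further, under details.tokenError.
    details = error.get("details") or {}
    return details.get("tokenError", error)

cause = root_cause(json.loads(body)["error"])
print(cause["id"], "-", cause["description"])
# badToken - Provided token could not be understood by the server.
```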
Error summary
The build generated some errors. See the full build log for more details.
Error response from daemon: Cannot kill container: b9b240bfaacc: Container b9b240bfaacceee1c7c1f38c93d49ee9a26c6f2ec3968261c1e84721cff8f08e is not running
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COAB/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Error: could not find tiller
Error: could not find tiller
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
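"Error: could not find tiller" comes from a Helm 2 client whose server-side component (tiller) is absent from the cluster. A minimal probe, assuming kubectl access to the same cluster and the conventional kube-system install location:

```python
import subprocess

# Helm 2 runs its server component as the "tiller-deploy" deployment,
# conventionally in kube-system. If it is missing, every helm call fails
# with "could not find tiller".
probe = subprocess.run(
    ["kubectl", "get", "deployment", "tiller-deploy", "--namespace", "kube-system"],
    capture_output=True, text=True,
)
print("tiller found" if probe.returncode == 0 else probe.stderr.strip())
```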
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
(the four rsync lines above repeat six times)
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
(the four rsync lines above repeat six more times)
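The rsync man page attributes "protocol version mismatch" to a transport that emits extra bytes before rsync's handshake. A minimal sketch of that check over kubectl (the pod name is hypothetical; the kubectl transport is an assumption based on the surrounding kubectl/helm log lines):

```python
import subprocess

# rsync's own diagnosis for "is your shell clean?": run a no-op through the
# same transport rsync uses and confirm it produces no output. Any stray
# bytes (profile banners, motd, warnings) corrupt the protocol handshake.
POD = "dev-oneprovider-krakow-0"  # hypothetical target pod

probe = subprocess.run(
    ["kubectl", "exec", POD, "--", "/bin/true"],
    capture_output=True, text=True,
)
if probe.stdout or probe.stderr:
    print("transport is not clean; stray output:\n" + probe.stdout + probe.stderr)
else:
    # Exit code 126 in the log is a separate hint: the command was found
    # but could not be executed inside the container.
    print("transport is clean; check that rsync is installed and executable in the pod")
```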
W0816 20:04:52.272678 720 warnings.go:70] policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
Error response from daemon: Cannot kill container: 4b4d0864648b: Container 4b4d0864648b320b50f8ef48b06904e9fee2ba978529d2d38719ace50d57765b is not running
Error response from daemon: Cannot kill container: 9f6bcab51acb: Container 9f6bcab51acb4cd83b553d4bd2936b20d9ca836f813f52db2ec145c9d9acb199 is not running
Error response from daemon: Cannot kill container: 3b4766023e7a: Container 3b4766023e7ab014bd677634309cc7d6bbbf6699c72c70fcc50c321653bf2861 is not running
Error response from daemon: Cannot kill container: df607663e3e6: Container df607663e3e625f496950e3625d049245ef6f79413cfc21585146ba69590fb3d is not running
Error response from daemon: Cannot kill container: 2eec4f963d45: Container 2eec4f963d45ffe5a2a24c984c089f7b62d49a93527074b54fc6cdb013c3e71a is not running
Error response from daemon: Cannot kill container: ec68bcd9507c: Container ec68bcd9507cf5701686017c26782c5e260771f3102214da9fbbac87b615ad9a is not running
Error response from daemon: Cannot kill container: d791291bc617: Container d791291bc617164eaa380b1c136b15ef8a98269eaadc7ba280332ca7233045cd is not running
Error response from daemon: Cannot kill container: 396b51561f07: Container 396b51561f07665600e4e49f0b803a8e9c10f4761d70d886cec324f8108f1af8 is not running
Error response from daemon: Cannot kill container: a015de020ead: Container a015de020ead8cecc1cff81903751e129756c43550112f5701b083041734f2a6 is not running
Error response from daemon: Cannot kill container: ed7db1a55133: Container ed7db1a55133b8627bb2afa8f055220b16ff27db85599ebeb7300c7cd93de7f5 is not running
Error response from daemon: Cannot kill container: fe28b092709f: Container fe28b092709fffab38e1f6a76ef013fc31508d0ed98d5d82010b9eb4ea9f1a8a is not running
Error response from daemon: Cannot kill container: 775b2852a9b5: Container 775b2852a9b536bbfea26e13eefd74bb125883b59e7c7d77817a5c0b49228d46 is not running
Error response from daemon: Cannot kill container: 54b451744e51: Container 54b451744e51a8bd1392664d48cc2509f99342a6ea63a1b8f3be41803d4d7f25 is not running
Error response from daemon: Cannot kill container: 2bf4de2cc9c8: Container 2bf4de2cc9c8aea767c0484eb8de214bced188dd9db3cee304ca16c33f2c5e95 is not running
Error response from daemon: Cannot kill container: 98aeec507c16: Container 98aeec507c1602f0ccc1d145c1b271fbc2ee77cb5dd1003dccdd544d76d6d1dc is not running
Error response from daemon: Cannot kill container: e3f6c267c56b: Container e3f6c267c56b2a5282bbe04eb5425a3973145f9c06333adc3b3ee9e1b0a69dff is not running