GUI acceptance tests using an environment deployed from packages.
Build: #2261 failed
Job: Oneprovider hardlinks failed
Job result summary
- Status: Completed
- Duration: 6 minutes
- Revision: 32a8637a9a01b830c75d008c42ff27e7892513dd
- Total tests: 9
- Fixed in: #2262 (Child of ODSRV-OPRPM-2186)
Tests
- 9 tests in total
- 9 tests failed
- 9 failures are new
- 3 minutes taken in total.
Status | Test | Duration
---|---|---
Failed | test_oneprovider_hardlinks :: test_hardlink_info_is_no_longer_visible_after_hardlink_removal[1oz_1op_deployed] | 1 sec
Failed | test_oneprovider_hardlinks :: test_hardlink_tag_opens_file_details_modal_with_hardlinks_information[1oz_1op_deployed] | 2 secs
Failed | test_oneprovider_hardlinks :: test_new_hardlink_name_is_visible_after_hardlink_rename[1oz_1op_deployed] | 1 sec
Failed | test_oneprovider_hardlinks :: test_user_creates_hardlink_of_file_in_the_same_directory_in_file_browser_and_checks_its_presence[1oz_1op_deployed] | 3 mins
Failed | test_oneprovider_hardlinks :: test_user_creates_hardlink_of_hardlink[1oz_1op_deployed] | 1 sec
Failed | test_oneprovider_hardlinks :: test_user_creates_hardlinks_in_other_directories_than_original_files[1oz_1op_deployed] | 1 sec
Failed | test_oneprovider_hardlinks :: test_user_downloads_hardlink_of_file[1oz_1op_deployed] | 2 secs
Failed | test_oneprovider_hardlinks :: test_user_sees_change_of_hardlink_acl_permission_after_first_hardlink_permissions_change[1oz_1op_deployed] | 24 secs
Failed | test_oneprovider_hardlinks :: test_user_sees_change_of_hardlink_posix_permission_after_second_hardlink_permissions_change[1oz_1op_deployed] | 1 sec

All nine scenarios failed with the same HTTP 400 response from the Oneprovider node; only the scenario named in the pytest fixture request differs between them. Representative failure (from the first scenario):

tests.utils.http_exceptions.HTTPBadRequest: [400] Bad Request: {"error":{"id":"errorOnNodes","details":{"hostnames":["dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local"],"error":{"id":"badValueToken","details":{"tokenError":{"id":"badToken","description":"Provided token could not be understood by the server."},"key":"token"},"description":"Bad value: provided \"token\" is not a valid token (see details)."}},"description":"Error on nodes dev-oneprovider-krakow-0.dev-oneprovider-krakow.default.svc.cluster.local: Bad value: provided \"token\" is not a valid token (see details)."}}

request = <FixtureRequest for <Function 'test_hardlink_info_is_no_longer_visible_after_hardlink_removal[1oz_1op_deployed]'>>

@pytest.mark.usefixtures(*function_args)
def scenario_wrapper(request):
>   _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.6/dist-packages/pytest_bdd/scenario.py:227: (47 more lines...)
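Because every failure above is the same badValueToken rejection from dev-oneprovider-krakow-0, the problem points at the access token used by the tests' REST client rather than at the individual hardlink scenarios. A pre-flight check along the lines below could separate "the token is rejected" from "a scenario is broken" before the suite runs. This is a minimal sketch under stated assumptions: the endpoint path, the X-Auth-Token header name, the host, and the use of a self-signed certificate are illustrative placeholders, not details taken from the build or the documented Onedata API.

```python
# Minimal pre-flight sketch, assuming the suite talks to the provider over HTTPS
# with a bearer-style token. Endpoint path, header name and host are hypothetical.
import requests

PROVIDER_URL = "https://dev-oneprovider-krakow.default.svc.cluster.local"  # assumed host
ACCESS_TOKEN = "<token generated for the test user>"  # placeholder


def token_is_accepted(provider_url: str, token: str) -> bool:
    """Return True if an authenticated no-op request succeeds with this token."""
    resp = requests.get(
        f"{provider_url}/api/v3/oneprovider/configuration",  # hypothetical endpoint
        headers={"X-Auth-Token": token},                      # assumed header name
        verify=False,   # assumption: self-signed certs in the test deployment
        timeout=30,
    )
    # A 400 whose body mentions badValueToken mirrors the failures above and
    # means the token itself was rejected, independent of any hardlink action.
    if resp.status_code == 400 and "badValueToken" in resp.text:
        return False
    return resp.ok


if __name__ == "__main__":
    print("token accepted:", token_is_accepted(PROVIDER_URL, ACCESS_TOKEN))
```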
Error summary
The build generated some errors. See the full build log for more details.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 4653 100 4653 0 0 40460 0 --:--:-- --:--:-- --:--:-- 40815
Error response from daemon: Cannot kill container: ebfcc238d9b8: Container ebfcc238d9b8229e34166160a998089c45b8727567dbf68b9440a4fd83987bba is not running
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'bamboos'
Submodule 'cdmi_swagger' (ssh://git@git.onedata.org:7999/vfs/cdmi-swagger.git) registered for path 'cdmi_swagger'
Submodule 'one_env' (ssh://git@git.onedata.org:7999/vfs/one-env.git) registered for path 'one_env'
Submodule 'onepanel_swagger' (ssh://git@git.onedata.org:7999/vfs/onepanel-swagger.git) registered for path 'onepanel_swagger'
Submodule 'oneprovider_swagger' (ssh://git@git.onedata.org:7999/vfs/oneprovider-swagger.git) registered for path 'oneprovider_swagger'
Submodule 'onezone_swagger' (ssh://git@git.onedata.org:7999/vfs/onezone-swagger.git) registered for path 'onezone_swagger'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/bamboos'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/cdmi_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/one_env'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/onepanel_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/oneprovider_swagger'...
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/onezone_swagger'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onepanel_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/onepanel_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'oneprovider_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/oneprovider_swagger/bamboos'...
Submodule 'bamboos' (ssh://git@git.onedata.org:7999/vfs/bamboos.git) registered for path 'onezone_swagger/bamboos'
Cloning into '/mnt/storage/bamboo-agent-home/xml-data/build-dir/ODSRV-GAPT-COH/onedata/onezone_swagger/bamboos'...
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
Error: could not find tiller
Error: could not find tiller
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
error: failed to create clusterrolebinding: clusterrolebindings.rbac.authorization.k8s.io "serviceaccounts-cluster-admin" already exists
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(178) [Receiver=3.1.2]
command terminated with exit code 126
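The recurring pair "protocol version mismatch -- is your shell clean?" and "command terminated with exit code 126" indicates that the receiving end of each rsync either printed something or failed to execute before the rsync protocol handshake could start. The check suggested by the rsync man page is to run the same remote command without rsync and confirm it exits cleanly and prints nothing. The sketch below adapts that idea to a pod; it assumes the copies go over "kubectl exec" (the log does not show the exact command), and the pod and namespace names are simply reused from the log lines above.

```python
# Sketch of the rsync man page's "is your shell clean?" check, adapted to a pod.
# Assumption: sources are copied with rsync over "kubectl exec"; the exact copy
# command used by the build is not shown in the log and is guessed here.
import subprocess

POD = "dev-oneprovider-krakow-0"
NAMESPACE = "default"


def remote_command_is_clean(pod: str, namespace: str) -> bool:
    """Run a no-op in the pod and verify it exits 0 and produces no output."""
    result = subprocess.run(
        ["kubectl", "exec", "-n", namespace, pod, "--", "/bin/true"],
        capture_output=True,
    )
    if result.returncode != 0:
        # Exit code 126 ("found but cannot execute") matches the failures above.
        print(f"remote no-op failed with exit code {result.returncode}")
        return False
    if result.stdout or result.stderr:
        print("remote side produced unexpected output, which corrupts the rsync stream:")
        print((result.stdout + result.stderr).decode(errors="replace"))
        return False
    return True


if __name__ == "__main__":
    print("clean:", remote_command_is_clean(POD, NAMESPACE))
```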
W0816 20:12:26.130059 721 warnings.go:70] policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
cp: cannot stat 'onedata/one_env/sources_info.yaml': No such file or directory
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 4653 100 4653 0 0 30411 0 --:--:-- --:--:-- --:--:-- 30611
Error from server (NotFound): pods "dev-oneprovider-krakow-0" not found
Error: No such object: 3b8e55ed417f
Error: No such object: ed9dd0706344
Error: No such object: ed9dd0706344
Error: No such object: cf8aebd513a6
Error: No such object: cf8aebd513a6
Error: No such object: 368e7b8e88c0
Error: No such object: 368e7b8e88c0
Error: No such object: 936f3854fdf1
Error: No such object: 936f3854fdf1
Error: No such object: 3a42391a5303
Error: No such object: 3a42391a5303
Error: No such object: 040af239af96
Error: No such object: 040af239af96
Error: No such object: f842a741a7ab
Error: No such object: f842a741a7ab
Error: No such object: 5a1539f8969e
Error: No such object: 5a1539f8969e
Error: No such object: ec8f67806660
Error: No such object: ec8f67806660
Error: No such object: 06dd196cbb05
Error: No such object: 06dd196cbb05
Error: No such object: 3821e5b4ff83
Error: No such object: 3821e5b4ff83
Error: No such object: 55f7a34aa595
Error: No such object: 55f7a34aa595
Error: No such object: 7c5c5615d979
Error: No such object: 7c5c5615d979
Error: No such object: e06f41e99123
Error: No such object: e06f41e99123
Error response from daemon: Cannot kill container: 3c9026c3c4e4: Container 3c9026c3c4e476254eed7328ef5d75bbb5c0ed8a049d83bcaa29fcc4dadc2592 is not running
Error response from daemon: Cannot kill container: 3b8e55ed417f: No such container: 3b8e55ed417f
Error: No such container: 3b8e55ed417f
Error response from daemon: Cannot kill container: ed9dd0706344: No such container: ed9dd0706344
Error: No such container: ed9dd0706344
Error response from daemon: Cannot kill container: cf8aebd513a6: No such container: cf8aebd513a6
Error: No such container: cf8aebd513a6
Error response from daemon: Cannot kill container: 368e7b8e88c0: No such container: 368e7b8e88c0
Error: No such container: 368e7b8e88c0
Error response from daemon: Cannot kill container: 936f3854fdf1: No such container: 936f3854fdf1
Error: No such container: 936f3854fdf1
Error response from daemon: Cannot kill container: 3a42391a5303: No such container: 3a42391a5303
Error: No such container: 3a42391a5303
Error response from daemon: Cannot kill container: 040af239af96: No such container: 040af239af96
Error: No such container: 040af239af96
Error response from daemon: Cannot kill container: f842a741a7ab: No such container: f842a741a7ab
Error: No such container: f842a741a7ab
Error response from daemon: Cannot kill container: 5a1539f8969e: No such container: 5a1539f8969e
Error: No such container: 5a1539f8969e
Error response from daemon: Cannot kill container: ec8f67806660: No such container: ec8f67806660
Error: No such container: ec8f67806660
Error response from daemon: Cannot kill container: 06dd196cbb05: No such container: 06dd196cbb05
Error: No such container: 06dd196cbb05
Error response from daemon: Cannot kill container: 3821e5b4ff83: No such container: 3821e5b4ff83
Error: No such container: 3821e5b4ff83
Error response from daemon: Cannot kill container: 55f7a34aa595: No such container: 55f7a34aa595
Error: No such container: 55f7a34aa595
Error response from daemon: Cannot kill container: 7c5c5615d979: No such container: 7c5c5615d979
Error: No such container: 7c5c5615d979
Error response from daemon: Cannot kill container: e06f41e99123: No such container: e06f41e99123
Error: No such container: e06f41e99123