Build: #5 was successful
Job: Workflows with input files was successful
Test case result: user sees successful execution of all workflows from automationexamples with their example input files[1oz 1op openfaas]
The following summarizes the result of the test "user sees successful execution of all workflows from automationexamples with their example input files[1oz 1op openfaas]" in build #5 of Onedata Products - mixed acceptance pkg - feature-VFS-9425-mixed-tests-test-api-for-share-from-file-details-modal - Workflows with input files.
- Description: user sees successful execution of all workflows from automationexamples with their example input files[1oz 1op openfaas]
- Test class: mixed.scenarios.test_workflows_with_input_files
- Method: test_user_sees_successful_execution_of_all_workflows_from_automationexamples_with_their_example_input_files[1oz_1op_openfaas]
- Duration: 14 mins
- Status: Failed (Existing Failure)
Error Log
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f89f9bc7fe0>
method = 'POST', url = '/api/v3/oneprovider/lookup-file-id/space1/dir1'
body = '{}'
headers = HTTPHeaderDict({'Accept': 'application/json', 'X-Auth-Token': 'MDAzM2xvY2F00aW9uIGRldi1vbmV6b25lLmRlZmF1bHQuc3ZjLmNsdX...K16r002iiH400xOcHPaoiXBDvJbul78K', 'User-Agent': 'Swagger-Codegen/21.02.7/python', 'Content-Type': 'application/json'})
retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
redirect = False, assert_same_host = False, timeout = None, pool_timeout = None
release_conn = True, chunked = False, body_pos = None, preload_content = True
decode_content = True, response_kw = {}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/v3/oneprovider/lookup-file-id/space1/dir1', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False

    def urlopen(  # type: ignore[override]
        self,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | bool | int | None = None,
        redirect: bool = True,
        assert_same_host: bool = True,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        pool_timeout: int | None = None,
        release_conn: bool | None = None,
        chunked: bool = False,
        body_pos: _TYPE_BODY_POSITION | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        **response_kw: typing.Any,
    ) -> BaseHTTPResponse:
        """
        Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details.

        .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`.

        .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility.

        :param method: HTTP request method (such as GET, POST, PUT, etc.)
        :param url: The URL to perform the request on.
        :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers.
        :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned.
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
        :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too.
        :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts.
        :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`.
        :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period.
        :param bool preload_content: If True, the response's body will be preloaded into memory.
        :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header.
        :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``.
        :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False.
        :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed.
        """
        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme

        if headers is None:
            headers = self.headers

        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)

        if release_conn is None:
            release_conn = preload_content

        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)

        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = to_str(_encode_target(url))
        else:
            url = to_str(parsed_url.url)

        conn = None

        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/urllib3/urllib3/issues/651>
        release_this_conn = release_conn

        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )

        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()  # type: ignore[attr-defined]
            headers.update(self.proxy_headers)  # type: ignore[union-attr]

        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None

        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False

        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)

        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)

            conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]

            # Is this a closed/new connection that requires CONNECT tunnelling?
            if self.proxy is not None and http_tunnel_required and conn.is_closed:
                try:
                    self._prepare_proxy(conn)
                except (BaseSSLError, OSError, SocketTimeout) as e:
                    self._raise_timeout(
                        err=e, url=self.proxy.url, timeout_value=conn.timeout
                    )
                    raise

            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None

            # Make the request on the HTTPConnection object
>           response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
                retries=retries,
                response_conn=response_conn,
                preload_content=preload_content,
                decode_content=decode_content,
                **response_kw,
            )

/usr/local/lib/python3.12/dist-packages/urllib3/connectionpool.py:789:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.12/dist-packages/urllib3/connectionpool.py:536: in _make_request
    response = conn.getresponse()
/usr/local/lib/python3.12/dist-packages/urllib3/connection.py:464: in getresponse
    httplib_response = super().getresponse()
/usr/lib/python3.12/http/client.py:1428: in getresponse
    response.begin()
/usr/lib/python3.12/http/client.py:331: in begin
    version, status, reason = self._read_status()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <http.client.HTTPResponse object at 0x7f89f9891c90>

    def _read_status(self):
        line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
        if len(line) > _MAXLINE:
            raise LineTooLong("status line")
        if self.debuglevel > 0:
            print("reply:", repr(line))
        if not line:
            # Presumably, the server closed the connection before
            # sending a valid response.
>           raise RemoteDisconnected("Remote end closed connection without" " response")
E           http.client.RemoteDisconnected: Remote end closed connection without response

/usr/lib/python3.12/http/client.py:300: RemoteDisconnected

During handling of the above exception, another exception occurred:

fixturefunc = <function execute_all_workflows at 0x7f89f9ef91c0>
request = <FixtureRequest for <Function test_user_sees_successful_execution_of_all_workflows_from_automationexamples_with_their_example_input_files[1oz_1op_openfaas]>>
kwargs = {'groups': {'group1': '1b3ac690383169bb55709f6a868c9536ch04c4'}, 'host': 'oneprovider-1', 'hosts': {'oneprovider-1': {...': 'dev-onezone.default.svc.cluster.local', 'ip': '10.244.29.21', 'name': 'dev-onezone', ...}}, 'space': 'space1', ...}

    def call_fixture_func(
        fixturefunc: "_FixtureFunc[FixtureValue]", request: FixtureRequest, kwargs
    ) -> FixtureValue:
        if is_generator(fixturefunc):
            fixturefunc = cast(
                Callable[..., Generator[FixtureValue, None, None]], fixturefunc
            )
            generator = fixturefunc(**kwargs)
            try:
                fixture_result = next(generator)
            except StopIteration:
                raise ValueError(f"{request.fixturename} did not yield a value") from None
            finalizer = functools.partial(_teardown_yield_fixture, fixturefunc, generator)
            request.addfinalizer(finalizer)
        else:
            fixturefunc = cast(Callable[..., FixtureValue], fixturefunc)
>           fixture_result = fixturefunc(**kwargs)

/usr/local/lib/python3.12/dist-packages/_pytest/fixtures.py:913:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/utils/bdd_utils.py:78: in wrapper
    return fun(*ba.args, **ba.kwargs)
tests/mixed/steps/rest/onezone/automation.py:421: in execute_all_workflows
    example_initial_store_content, input_files = getattr(
tests/mixed/utils/example_workflow_executions.py:89: in download_files
    "destination": {"fileId": self.resolve_file_id(destination)},
tests/mixed/steps/rest/oneprovider/data.py:37: in _lookup_file_id
    file_id = resolve_file_path_api.lookup_file_id(path).file_id
tests/mixed/oneprovider_client/api/file_path_resolution_api.py:55: in lookup_file_id
    (data) = self.lookup_file_id_with_http_info(path, **kwargs)  # noqa: E501
tests/mixed/oneprovider_client/api/file_path_resolution_api.py:115: in lookup_file_id_with_http_info
    return self.api_client.call_api(
tests/mixed/oneprovider_client/api_client.py:326: in call_api
    return self.__call_api(resource_path, method,
tests/mixed/oneprovider_client/api_client.py:158: in __call_api
    response_data = self.request(
tests/mixed/oneprovider_client/api_client.py:368: in request
    return self.rest_client.POST(url,
tests/mixed/oneprovider_client/rest.py:269: in POST
    return self.request("POST", url,
tests/mixed/oneprovider_client/rest.py:162: in request
    r = self.pool_manager.request(
/usr/local/lib/python3.12/dist-packages/urllib3/_request_methods.py:144: in request
    return self.request_encode_body(
/usr/local/lib/python3.12/dist-packages/urllib3/_request_methods.py:279: in request_encode_body
    return self.urlopen(method, url, **extra_kw)
/usr/local/lib/python3.12/dist-packages/urllib3/poolmanager.py:443: in urlopen
    response = conn.urlopen(method, u.request_uri, **kw)
/usr/local/lib/python3.12/dist-packages/urllib3/connectionpool.py:843: in urlopen
    retries = retries.increment(
/usr/local/lib/python3.12/dist-packages/urllib3/util/retry.py:474: in increment
    raise reraise(type(error), error, _stacktrace)
/usr/local/lib/python3.12/dist-packages/urllib3/util/util.py:38: in reraise
    raise value.with_traceback(tb)
/usr/local/lib/python3.12/dist-packages/urllib3/connectionpool.py:789: in urlopen
    response = self._make_request(
/usr/local/lib/python3.12/dist-packages/urllib3/connectionpool.py:536: in _make_request
    response = conn.getresponse()
/usr/local/lib/python3.12/dist-packages/urllib3/connection.py:464: in getresponse
    httplib_response = super().getresponse()
/usr/lib/python3.12/http/client.py:1428: in getresponse
    response.begin()
/usr/lib/python3.12/http/client.py:331: in begin
    version, status, reason = self._read_status()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <http.client.HTTPResponse object at 0x7f89f9891c90>

    def _read_status(self):
        line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
        if len(line) > _MAXLINE:
            raise LineTooLong("status line")
        if self.debuglevel > 0:
            print("reply:", repr(line))
        if not line:
            # Presumably, the server closed the connection before
            # sending a valid response.
>           raise RemoteDisconnected("Remote end closed connection without" " response")
E           urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

/usr/lib/python3.12/http/client.py:300: ProtocolError
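The locals dump above shows the POST to /api/v3/oneprovider/lookup-file-id/space1/dir1 was issued with retries = Retry(total=3, connect=None, read=None, redirect=None, status=None), and the second traceback shows retries.increment() re-raising the error. With urllib3's defaults, POST is not in Retry.DEFAULT_ALLOWED_METHODS, so a connection dropped while waiting for the response (RemoteDisconnected) is surfaced as ProtocolError instead of being retried. The snippet below is a minimal sketch only, not the test suite's rest.py: the retry counts, backoff, URL and token placeholder are illustrative assumptions. It shows one way a urllib3 Retry could be configured to retry this kind of read error, provided the endpoint is safe to call more than once.

# A minimal sketch, assuming urllib3 2.x; values and the example URL/token are
# illustrative assumptions, not taken from the test suite.
import urllib3
from urllib3.util.retry import Retry

retry = Retry(
    total=3,
    connect=3,
    read=3,                      # also retry read errors such as RemoteDisconnected
    backoff_factor=0.5,          # exponential backoff between attempts
    # POST is not in Retry.DEFAULT_ALLOWED_METHODS, so by default a request that
    # was already sent is not retried after the server drops the connection;
    # only opt POST in when the endpoint is safe to call more than once.
    allowed_methods=frozenset({"GET", "POST"}),
)

http = urllib3.PoolManager(retries=retry)
response = http.request(
    "POST",
    "https://oneprovider.example/api/v3/oneprovider/lookup-file-id/space1/dir1",
    body="{}",
    headers={
        "Content-Type": "application/json",
        "X-Auth-Token": "<access token>",  # placeholder, not a real token
    },
)
print(response.status, response.data)

Retrying a POST this way is only safe if lookup-file-id is idempotent on the Oneprovider side, which the sketch assumes rather than verifies; otherwise the dropped connection points at the service restarting or being overloaded during the 14-minute run and should be investigated there.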