Acceptance tests using different clients concurrently. Environment deployed from packages.
Build: #3187 failed
Job: Workflows with input files was successful
Test case result: user sees successful execution of all workflows from automationexamples with their example input files[1oz 1op openfaas]
The following summarizes the result of the test "user sees successful execution of all workflows from automationexamples with their example input files[1oz 1op openfaas]" in build #3187 of Onedata Products - mixed acceptance pkg - Workflows with input files.
- Description: user sees successful execution of all workflows from automationexamples with their example input files[1oz 1op openfaas]
- Test class: mixed.scenarios.test_workflows_with_input_files
- Method: test_user_sees_successful_execution_of_all_workflows_from_automationexamples_with_their_example_input_files[1oz_1op_openfaas]
- Jira Issue: (none)
- Duration: 6 mins
- Status: Failed (Existing Failure)
Error Log
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f2fe6efe490>
method = 'POST', url = '/api/v3/oneprovider/lookup-file-id/space1/dir1', body = '{}'
headers = {'Accept': 'application/json', 'Content-Type': 'application/json', 'User-Agent': 'Swagger-Codegen/21.02.5/python', 'X-...c00YzMzNGNoNzdhMgowMDFhY2lkIHRpbWUgPCAxNzUyMTUzNjY4CjAwMmZzaWduYXR1cmUgzfcTdxuqq4RBHd1RXjztRyf8bnKDXiwDVW102wtE00u3AK'}
retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
redirect = False, assert_same_host = False, timeout = None, pool_timeout = None
release_conn = True, chunked = False, body_pos = None
response_kw = {'preload_content': True, 'request_url': 'https://dev-oneprovider-krakow.default.svc.cluster.local:443/api/v3/oneprovider/lookup-file-id/space1/dir1'}
conn = None, release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x7f2fe6f07dc0>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw):
        ...
        # Make the request on the httplib connection object.
>       httplib_response = self._make_request(conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked)

/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:597:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f2fe6efe490>
conn = <urllib3.connection.VerifiedHTTPSConnection object at 0x7f2fe6efe100>
method = 'POST', url = '/api/v3/oneprovider/lookup-file-id/space1/dir1'
timeout = <urllib3.util.timeout.Timeout object at 0x7f2fe6f07dc0>, chunked = False
httplib_request_kw = {'body': '{}', 'headers': {'Accept': 'application/json', 'Content-Type': 'application/json', 'User-Agent': 'Swagger-Co...00YzMzNGNoNzdhMgowMDFhY2lkIHRpbWUgPCAxNzUyMTUzNjY4CjAwMmZzaWduYXR1cmUgzfcTdxuqq4RBHd1RXjztRyf8bnKDXiwDVW102wtE00u3AK'}}
timeout_obj = <urllib3.util.timeout.Timeout object at 0x7f2fe6f0f040>
read_timeout = None

    def _make_request(self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw):
        ...
>       six.raise_from(e, None)

/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:384:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

value = RemoteDisconnected('Remote end closed connection without response'), from_value = None

>   ???

<string>:2:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def _make_request(self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw):
        ...
>       httplib_response = conn.getresponse()

/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:380:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x7f2fe6efe100>

    def getresponse(self):
        ...
>       response.begin()

/usr/lib/python3.8/http/client.py:1348:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <http.client.HTTPResponse object at 0x7f2fe6fd69a0>

    def begin(self):
        ...
>       version, status, reason = self._read_status()

/usr/lib/python3.8/http/client.py:316:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <http.client.HTTPResponse object at 0x7f2fe6fd69a0>

    def _read_status(self):
        ...
        # Presumably, the server closed the connection before
        # sending a valid response.
>       raise RemoteDisconnected("Remote end closed connection without" " response")
E       http.client.RemoteDisconnected: Remote end closed connection without response

/usr/lib/python3.8/http/client.py:285: RemoteDisconnected

During handling of the above exception, another exception occurred:

request = <FixtureRequest for <Function 'test_user_sees_successful_execution_of_all_workflows_from_automationexamples_with_their_example_input_files[1oz_1op_openfaas]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
>       _execute_scenario(feature, scenario, request, encoding)

/usr/local/lib/python3.8/dist-packages/pytest_bdd/scenario.py:227:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/dist-packages/pytest_bdd/scenario.py:189: in _execute_scenario
    _execute_step_function(request, scenario, step, step_func)
/usr/local/lib/python3.8/dist-packages/pytest_bdd/scenario.py:130: in _execute_step_function
    step_func(**kwargs)
tests/utils/bdd_utils.py:78: in wrapper
    return fun(*ba.args, **ba.kwargs)
tests/mixed/steps/rest/onezone/automation.py:101: in execute_all_workflows
    example_initial_store_content, input_files = getattr(
tests/mixed/utils/example_workflow_executions.py:36: in bagit_uploader
    return [{
tests/mixed/utils/example_workflow_executions.py:38: in <listcomp>
    'destination-directory': {'fileId': self.resolve_file_id(dest_dir)}
tests/mixed/steps/rest/oneprovider/data.py:32: in _lookup_file_id
    file_id = resolve_file_path_api.lookup_file_id(path).file_id
tests/mixed/oneprovider_client/api/file_path_resolution_api.py:55: in lookup_file_id
    (data) = self.lookup_file_id_with_http_info(path, **kwargs)  # noqa: E501
tests/mixed/oneprovider_client/api/file_path_resolution_api.py:115: in lookup_file_id_with_http_info
    return self.api_client.call_api(
tests/mixed/oneprovider_client/api_client.py:326: in call_api
    return self.__call_api(resource_path, method,
tests/mixed/oneprovider_client/api_client.py:158: in __call_api
    response_data = self.request(
tests/mixed/oneprovider_client/api_client.py:368: in request
    return self.rest_client.POST(url,
tests/mixed/oneprovider_client/rest.py:269: in POST
    return self.request("POST", url,
tests/mixed/oneprovider_client/rest.py:162: in request
    r = self.pool_manager.request(
/usr/local/lib/python3.8/dist-packages/urllib3/request.py:70: in request
    return self.request_encode_body(method, url, fields=fields,
/usr/local/lib/python3.8/dist-packages/urllib3/request.py:150: in request_encode_body
    return self.urlopen(method, url, **extra_kw)
/usr/local/lib/python3.8/dist-packages/urllib3/poolmanager.py:322: in urlopen
    response = conn.urlopen(method, u.request_uri, **kw)
/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:637: in urlopen
    retries = retries.increment(method, url, error=e, _pool=self,
/usr/local/lib/python3.8/dist-packages/urllib3/util/retry.py:367: in increment
    raise six.reraise(type(error), error, _stacktrace)
/usr/local/lib/python3.8/dist-packages/urllib3/packages/six.py:685: in reraise
    raise value.with_traceback(tb)
/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:597: in urlopen
    httplib_response = self._make_request(conn, method, url,
/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:384: in _make_request
    six.raise_from(e, None)
<string>:2: in raise_from
    ???
/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:380: in _make_request
    httplib_response = conn.getresponse()
/usr/lib/python3.8/http/client.py:1348: in getresponse
    response.begin()
/usr/lib/python3.8/http/client.py:316: in begin
    version, status, reason = self._read_status()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <http.client.HTTPResponse object at 0x7f2fe6fd69a0>

    def _read_status(self):
        ...
        # Presumably, the server closed the connection before
        # sending a valid response.
>       raise RemoteDisconnected("Remote end closed connection without" " response")
E       urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

/usr/lib/python3.8/http/client.py:285: ProtocolError
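For context, the failing step does not run any workflow logic itself: while preparing the bagit_uploader execution it resolves the file id of space1/dir1 via the Oneprovider REST endpoint /api/v3/oneprovider/lookup-file-id, and the provider dropped the connection before returning a status line. The sketch below is a minimal, hypothetical diagnostic (not part of the test suite) that re-issues that request with a short retry loop, to help tell a one-off dropped connection apart from a provider that is persistently refusing the call. The host and path come from the log above; the X-Auth-Token header name, the verify=False flag, and the fileId response field are assumptions, since those details are truncated or not shown in the log.

# Hypothetical diagnostic sketch; values marked "assumed" are not taken from the log.
import time

import requests
from requests.exceptions import ConnectionError as RequestsConnectionError

HOST = "https://dev-oneprovider-krakow.default.svc.cluster.local"
PATH = "/api/v3/oneprovider/lookup-file-id/space1/dir1"
TOKEN = "<access token>"  # placeholder; the real token is truncated in the log


def lookup_file_id(attempts: int = 3, backoff: float = 2.0) -> str:
    """POST the lookup-file-id request, retrying on dropped connections."""
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            response = requests.post(
                HOST + PATH,
                json={},  # the failing request sent an empty JSON body
                headers={"X-Auth-Token": TOKEN},  # assumed header name
                verify=False,  # assumed: dev deployment with self-signed certs
                timeout=30,
            )
            response.raise_for_status()
            # assumed response shape, based on the .file_id attribute used by
            # the Swagger-generated client in the traceback
            return response.json()["fileId"]
        except RequestsConnectionError as err:  # wraps RemoteDisconnected
            last_error = err
            print(f"attempt {attempt} failed: {err}")
            time.sleep(backoff * attempt)
    raise last_error


if __name__ == "__main__":
    print(lookup_file_id())

If the call succeeds on a later attempt, the failure above was most likely a transient drop on the provider side (or a pod restart) rather than a broken endpoint; if every attempt fails the same way, the Oneprovider logs around the request time are the next place to look.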