GUI acceptance tests using environment deployed from packages.
Build: #2520 failed
Job: Atm workflow execution failed
Test results
- 21 tests in total
- 3 tests failed
- 3 failures are new
- 48 minutes total duration
Build #2520 has 3 errors, all of which are new failures since the previous build.
Status | Test | Duration
---|---|---
Failed | test_oneprovider_atm_workflows_execution :: test_user_checks_pods_activity_events_after_checksumcountingdifferentlambdas_workflow_execution[1oz_1op_openfaas] | 6 mins
Failed | test_oneprovider_atm_workflows_execution :: test_user_checks_time_series_charts_after_execution_of_uploaded_countingdifferentchecksums_workflow[1oz_1op_openfaas] | 2 mins
Failed | test_oneprovider_atm_workflows_execution :: test_user_sees_lane_run_indicators_and_statuses_after_rerunning_workflow[1oz_1op_openfaas] | 3 mins

Error details

test_user_checks_pods_activity_events_after_checksumcountingdifferentlambdas_workflow_execution[1oz_1op_openfaas]:

    RuntimeError: no item found in Task in ParallelBox in WorkflowLane in WorkflowVisualiser in WorkflowExecutionPage in Oneprovider page

    web_elem_root = <selenium.webdriver.remote.webelement.WebElement (session="6d0847dd0721526b59f0c10b87b6293e", element="f34e04d6-188b-4441-aaa7-e5378d1207ca")>
    css_sel = '.view-task-pods-activity-action-trigger'
    err_msg = 'no item found in Task in ParallelBox in WorkflowLane in WorkflowVisualiser in WorkflowExecutionPage in Oneprovider page'

    def find_web_elem(web_elem_root, css_sel, err_msg):
        try:
            _scroll_to_css_sel(web_elem_root, css_sel)
    (310 more lines...)
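The first failure is a lookup miss on the `.view-task-pods-activity-action-trigger` element inside the task widget. For comparison, below is a minimal sketch of a lookup helper that waits for the element instead of failing on the first query. It assumes plain Selenium `WebDriverWait` and is not the suite's actual `find_web_elem` implementation; the helper name `wait_for_web_elem` and the 10-second timeout are illustrative assumptions.

```python
# Hypothetical sketch only; the real find_web_elem is not shown beyond the
# excerpt above. Uses standard Selenium explicit-wait APIs.
from selenium.common.exceptions import TimeoutException
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait


def wait_for_web_elem(web_elem_root, css_sel, err_msg, timeout=10):
    """Poll for a descendant element instead of failing on the first lookup.

    web_elem_root may be the driver or a parent WebElement; the expected
    condition only calls find_element on whatever object it is given.
    """
    try:
        return WebDriverWait(web_elem_root, timeout).until(
            EC.presence_of_element_located((By.CSS_SELECTOR, css_sel))
        )
    except TimeoutException as exc:
        # Surface the same kind of error the report shows, with context chained.
        raise RuntimeError(err_msg) from exc
```

A wait like this only absorbs slow rendering of the pods-activity trigger; if the element never appears at all, the RuntimeError above still points at a genuine regression.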
test_user_checks_time_series_charts_after_execution_of_uploaded_countingdifferentchecksums_workflow[1oz_1op_openfaas]:

    AssertionError: Workflow status is not equal to Finished

    request = <FixtureRequest for <Function 'test_user_checks_time_series_charts_after_execution_of_uploaded_countingdifferentchecksums_workflow[1oz_1op_openfaas]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
    >       _execute_scenario(feature, scenario, request, encoding)
    /usr/local/lib/python3.8/dist-packages/pytest_bdd/scenario.py:227:
    (29 more lines...)
test_user_sees_lane_run_indicators_and_statuses_after_rerunning_workflow[1oz_1op_openfaas]:

    Exception: After awaiting for workflow "checksum-counting-different-lambdas" for 30 seconds its status is not Finished as expected

    request = <FixtureRequest for <Function 'test_user_sees_lane_run_indicators_and_statuses_after_rerunning_workflow[1oz_1op_openfaas]'>>

    @pytest.mark.usefixtures(*function_args)
    def scenario_wrapper(request):
    >       _execute_scenario(feature, scenario, request, encoding)
    /usr/local/lib/python3.8/dist-packages/pytest_bdd/scenario.py:227:
    (33 more lines...)
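The second and third failures follow the same pattern: the workflow never reaches the Finished status within the wait window (30 seconds in the last case). Below is a minimal polling sketch, assuming a hypothetical `get_workflow_status` callable and an adjustable timeout; it is not the suite's actual step implementation, but it shows how reporting the last observed status makes this kind of timeout easier to triage.

```python
# Hypothetical sketch: get_workflow_status, the timeout, and the poll interval
# are assumptions for illustration, not values taken from the test suite.
import time


def await_workflow_status(get_workflow_status, expected="Finished",
                          timeout=30, interval=2):
    """Poll the workflow status until it matches, reporting the last seen value."""
    deadline = time.monotonic() + timeout
    last_status = None
    while time.monotonic() < deadline:
        last_status = get_workflow_status()
        if last_status == expected:
            return last_status
        time.sleep(interval)
    raise AssertionError(
        f"Workflow status is {last_status!r}, expected {expected!r} "
        f"after {timeout} seconds"
    )
```

For example, a step could call `await_workflow_status(lambda: execution_page.status, timeout=60)` (again with an assumed page object) so that a slow OpenFaaS pod start-up is distinguishable from a workflow that is genuinely stuck.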