1. Obtain API information in scenarios
1. The id of each scene has already been obtained through the business id. We use the scene id to access the APIs in the current scene; since each scene contains many APIs, we obtain them by traversal.
2. The required data is extracted and filtered with regular expressions.
```python
import re
import requests

# Traverse the APIs in each scene
for TestReporTGGet in range(0, TestReporTgCount001):
    TestReporTGName = TestReporTG[TestReporTGGet]
    # print(TestReporTGName, "type is:", type(TestReporTGName))
    # Filter special symbols out of the report id
    TestReporTGNameFiltration = re.sub('[’!"#$%&\'()*+,./:;<=>?@，。？★、…【】《》？“”‘！[\\]^_`{|}~\s]+', "", TestReporTGName)
    # Filter backslashes
    TestReporTGNameFiltration01 = re.sub(r'\\', "", TestReporTGNameFiltration)
    # Build the URL of the response body for this node
    ResponseUrl = "http://ms.***.*****:****/api/scenario/report/selectReportContent/" + TestReporTGNameFiltration01
    print("Node address in the scene:" + ResponseUrl)
    ResponseUrlResp = requests.get(url=ResponseUrl, headers=TestReportHeaders)
    ResponseUrlResp.encoding = "utf-8"
    ResponseUrlRespGet = ResponseUrlResp.text
    # Get the name of each process node and filter it
    TestResponseResp = re.findall('","name":"(.+?)",', ResponseUrlRespGet)
    TestResponseRespGet = str(TestResponseResp)
    TestResponseRespGetFiltration = re.sub('[a-zA-Z0-9’!"#$%&\'()*+,-./:;<=>?@，。？★、…【】《》？“”‘！[\\]^_`{|}~\s]+', "", TestResponseRespGet)
    print("Root node processing report name:" + TestResponseRespGetFiltration)
    # Whether the root node succeeded or failed
    RootNodeSucceed = re.findall('"pass":(.+?)}]}}}', ResponseUrlRespGet)
    RootNodeSucceedGet = str(RootNodeSucceed)
    print("Root node failed or successful status identities:" + RootNodeSucceedGet)
    # Get the root node return parameter data
    # ReturnParameter = re.findall('"vars":"(.+?)}]}}}', ResponseUrlRespGet)
    # ReturnParameter = re.findall('"data":(.+?)}}}', ResponseUrlRespGet)
    ReturnParameter = re.findall('"body":"{(.+?)]}}}', ResponseUrlRespGet)
    ReturnParameterGet = str(ReturnParameter)
    ReturnParameterGettest = re.sub(r'\\', "", ReturnParameterGet)
    print("Filtered message fields" + ReturnParameterGettest)
    Namenge_list = list(ReturnParameterGet)
    # Success marker: the success value configured in the MeterSphere assertions
    is_contain_ch29 = '"error":0' in ResponseUrlRespGet
    # Get the root node status code
    # RootState = re.findall('"responseCode":"(.+?)","', ResponseUrlRespGet)
    # RootStateGet = str(RootState)
    # print("root node status code:" + RootStateGet)
    # Get the insurance policy number
    InsureGain = re.findall('proposalNo:(.+?)nqueryCode', ResponseUrlRespGet)
    InsureGainGet = str(InsureGain)
    InsureGainGetDispose = re.sub('[’!"#$%&\'()*+,-./:;<=>?@，。？★、…【】《》？“”‘！[\\]^_`{|}~\s]+', "", InsureGainGet)
    InsureGainGetDispose01 = re.sub(r'\\', "", InsureGainGetDispose)
    print("Insurance policy number:", InsureGainGetDispose01)
    # Get the quotation number
    PriceSheet = re.findall('quotationNo:(.+?)nquotationNo1:', ResponseUrlRespGet)
    PriceSheetGet = str(PriceSheet)
    PriceSheetGetDispose = re.sub('[’!"#$%&\'()*+,-./:;<=>?@，。？★、…【】《》？“”‘！[\\]^_`{|}~\s]+', "", PriceSheetGet)
    PriceSheetGetDispose01 = re.sub(r'\\', "", PriceSheetGetDispose)
    print("Quote number:", PriceSheetGetDispose01)
    # Get orderMessageList
    orderMessageList = re.findall('"assertions":(.+?)"pass":true', ResponseUrlRespGet)
    orderMessageListGet = str(orderMessageList)
    print("Get orderMessageList:", orderMessageListGet)
    print("===============================================================================")
    # print("root node return parameter:" + ReturnParameterGet)
    print("Whether it is successful:", is_contain_ch29)
    print("The interface of the root node:" + ReportParticularsUrl)
    print("===============================================================================")
```
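Each field above follows the same fetch → `re.findall` → filter pattern. As a minimal sketch of how that pattern could be factored into helpers (the names `fetch_node_report` and `extract_first` are my own; the endpoint and `TestReportHeaders` come from the snippet above):

```python
import re
import requests

def fetch_node_report(report_id, headers):
    """GET the raw report content for one node id (endpoint as in the snippet above)."""
    url = "http://ms.***.*****:****/api/scenario/report/selectReportContent/" + report_id
    resp = requests.get(url=url, headers=headers)
    resp.encoding = "utf-8"
    return resp.text

def extract_first(pattern, text, default=""):
    """Return the first regex capture found in text, with escape backslashes removed."""
    matches = re.findall(pattern, text)
    return re.sub(r"\\", "", matches[0]) if matches else default

# Example usage (the report id is hypothetical):
# report_text = fetch_node_report("<node-report-id>", TestReportHeaders)
# policy_no   = extract_first("proposalNo:(.+?)nqueryCode", report_text)
# quote_no    = extract_first("quotationNo:(.+?)nquotationNo1:", report_text)
```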
2. After the preliminary crawler work is complete, we mainly deal with storing the results.
1. Read the spreadsheet generated automatically in the previous section and locate the row to append to.
2. `is_contain_ch29 = '"error":0' in ResponseUrlRespGet` determines whether this node's execution succeeded or failed; the substring is the success value configured in MeterSphere, as sketched below.
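A minimal sketch of that check (the `node_status` helper is mine, not part of the original script; adjust the marker to whatever success value your MeterSphere assertions produce):

```python
# '"error":0' is the success marker configured in the MeterSphere assertions.
def node_status(report_text, success_marker='"error":0'):
    """Map a node's raw report text to the state label stored in the sheet."""
    return "success" if success_marker in report_text else "Failed or not executed"

# Example:
# node_status('{"responseResult": ... "error":0 ... }')   -> "success"
# node_status('{"responseResult": ... "error":1 ... }')   -> "Failed or not executed"
```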
```python
import xlrd
from xlutils.copy import copy

# GetSceneNameGain, GetStateGain, current_time, etc. are produced by the earlier crawling steps.
# Read the original form
workbook = xlrd.open_workbook("")
# Get the name of the first sheet in the original form (I only have 1 sheet here)
all_sheet = workbook.sheet_names()
first_sheet = workbook.sheet_by_name(all_sheet[0])
# Get the number of rows already written to the first sheet of the original table
rows = first_sheet.nrows
# Copy a new excel and append the new row starting at index rows
new_workbook = copy(workbook)
new_sheet = new_workbook.get_sheet(0)
# Create the title row
new_sheet.write(0, 0, "Scene")                    # Scene
new_sheet.write(0, 1, "Report Name")              # Report name
new_sheet.write(0, 2, "interface")                # Interface
new_sheet.write(0, 3, "Return status code")       # Return status code
new_sheet.write(0, 4, "Insurance policy number")  # Insurance policy number
new_sheet.write(0, 5, "Quote Number")             # Quote number
new_sheet.write(0, 6, "state")                    # State
new_sheet.write(0, 7, "orderMessageList")         # orderMessageList
new_sheet.write(0, 8, "path")                     # Returned message
new_sheet.write(0, 9, "Document time")            # Document time
# (row, column, value)
new_sheet.write(rows, 0, GetSceneNameGain)               # Scene
new_sheet.write(rows, 1, TestResponseRespGetFiltration)  # Report name
new_sheet.write(rows, 2, ReportParticularsUrl)           # Interface
new_sheet.write(rows, 3, GetStateGain)                   # Return status code
new_sheet.write(rows, 4, InsureGainGetDispose01)         # Insurance policy number
new_sheet.write(rows, 5, PriceSheetGetDispose01)         # Quote number
if is_contain_ch29:
    new_sheet.write(rows, 6, "success")                  # State
    new_sheet.write(rows, 8, "")                         # Returned message
else:
    print("ReturnParameterGettest length is:", len(ReturnParameterGettest))
    if len(ReturnParameterGettest) <= 32000:
        new_sheet.write(rows, 8, ReturnParameterGettest)  # Returned message
    else:
        new_sheet.write(rows, 8, "Execution failed, the message is too long to store in the document")  # Returned message
    new_sheet.write(rows, 6, "Failed or not executed")   # State
new_sheet.write(rows, 7, orderMessageListGet)            # orderMessageList
new_sheet.write(rows, 9, str(current_time))              # Current system time
new_workbook.save("")
```
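The `len(ReturnParameterGettest) <= 32000` guard exists because the .xls format written by xlwt cannot hold more than 32,767 characters in a single cell, and writing a longer string raises an exception. As a minimal sketch of how that guard could be factored out (the helper name and the truncation behaviour are my own choices, not part of the original script):

```python
# Keep cell text under the .xls/xlwt limit of 32,767 characters.
XLS_CELL_LIMIT = 32000  # safety margin below the hard 32,767-character limit

def write_cell_safely(sheet, row, col, text):
    """Write text into a cell, truncating it when it would exceed the xls limit."""
    text = str(text)
    if len(text) <= XLS_CELL_LIMIT:
        sheet.write(row, col, text)
    else:
        sheet.write(row, col, text[:XLS_CELL_LIMIT] + " ...[truncated]")
```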
The above is the detailed walkthrough of crawling MeterSphere platform execution reports with a Python crawler. For more information about Python crawlers and MeterSphere, please follow my other related articles!