Predictive Experiment graph not implemented in web-service

  • Question

  • I have trained a model using some feature pre-processing in a Python module. When I deploy this model as a predictive experiment web service, the Python-module pre-processing seems to be skipped.

    When I test the web service, the features from the python pre-processing are missing.

    How do I fix this?  I would love to post images, but I am not verified.

    Friday, August 2, 2019 7:41 PM

All replies

  • Hello Lucas,

    Based on the description of the issue, it looks like you are using the Execute Python Script module to pre-process your data before it is used by your trained model. In the deployed web service, the web service input may be connected to a module after the Python module, which would skip the pre-processing of your data. Could you please enable logging for your web service to check for more details?

    It would be great if you could post more details along with a screenshot of the experiment.


    Monday, August 5, 2019 10:51 AM
  • I currently have the web service input feeding directly into the Python script. As I said above, I cannot post images because my user account is not "verified" yet.

    I am using Visual Interface for ML Services, so I don't see a configure option to turn on logging.


    Monday, August 5, 2019 6:02 PM
  • Hi Lucas,

    Once the web service is deployed you will have the option to enable logging from the configure tab. Please check the steps mentioned here.

    Set the Enable Logging option to Error (to log only errors) or All (for full logging).

    [Screenshot: Select logging level]

    Tuesday, August 6, 2019 4:31 AM
  • Thank you for the response. I do see the logging option from your screenshot in ML Studio.  However, as I stated above, I am using Visual Interface for ML Service.  That web-service screen does not have a "Configure" tab, and does not have the option in your screenshot.  Please address all responses to Visual Interface for ML Service.
    Wednesday, August 7, 2019 1:50 PM
  • Hello Lucas,

    Apologies for referring to the ML Studio v1 service screen. Yes, the logging option does not currently show for the visual interface.

    Regarding the custom Python pre-processing: when the predictive experiment is set up, the module's steps might not be included in the Apply Transformation module of your predictive experiment. You can check this module's log by right-clicking it -> View Log -> Output Log. If your custom processing does not appear in the log, it may not be included in the predictive experiment. You can try adding the same module from the training experiment into the predictive experiment, running it again, and then re-publishing the web service.

    When re-publishing, choose to overwrite the existing web service.

    Friday, August 9, 2019 10:39 AM
  • Yes, the python module is connected to the web service input, and I do not see an Apply Transformation module in either of my graphs.

    What I am doing in Python is the same procedure as the "Convert to Indicator Values" module, using pd.get_dummies(). So, I switched to using the Convert to Indicator Values module instead of the Python code module, but I still have the same issue. My imported training data has an indicator column "indcr" that can have Y or N in it. My training dataset after the indicator module then has two columns, indcr-N and indcr-Y. However, when I train a model, deploy it as a web service, and send one row to test, say with indcr=N, I get an error and a note that the indcr-Y field is missing. When I change the value to Y, I also get the error and a note that the indcr-N field is now missing.
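    As a minimal sketch of how this mismatch arises (assuming pandas, with prefix_sep="-" chosen here only to match the indcr-N / indcr-Y column names above):

    ```python
    import pandas as pd

    # Training data: "indcr" contains both Y and N, so get_dummies
    # produces both indicator columns.
    train = pd.DataFrame({"indcr": ["Y", "N", "Y"]})
    train_encoded = pd.get_dummies(train, columns=["indcr"], prefix_sep="-")
    print(list(train_encoded.columns))  # ['indcr-N', 'indcr-Y']

    # A single scoring row has only one value, so only one indicator
    # column is produced -- the other is "missing" from the schema,
    # which is the error the web service reports.
    score = pd.DataFrame({"indcr": ["N"]})
    score_encoded = pd.get_dummies(score, columns=["indcr"], prefix_sep="-")
    print(list(score_encoded.columns))  # ['indcr-N']
    ```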

    This appears to be a bug on your part, although correcting it could be quite challenging.  I would need to pull up the data schema and fill in 0s for all possible values from the training set that do not come from the single web service row. 

    Can you rectify on your side?
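    The workaround described above (fill in 0s for training-set values absent from the single scoring row) can be sketched with pandas reindex; the column names are taken from the thread, and prefix_sep="-" / dtype=int are assumptions for illustration:

    ```python
    import pandas as pd

    # Indicator columns produced at training time (the schema to conform to).
    train_columns = ["indcr-N", "indcr-Y"]

    # One-row scoring input: get_dummies only emits the column for the
    # value actually present in the row.
    row = pd.DataFrame({"indcr": ["N"]})
    encoded = pd.get_dummies(row, prefix_sep="-", dtype=int)

    # Conform to the training schema, filling the absent indicators with 0.
    encoded = encoded.reindex(columns=train_columns, fill_value=0)
    print(encoded.to_dict("records"))  # [{'indcr-N': 1, 'indcr-Y': 0}]
    ```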


    • Edited by Lucas-Finco Thursday, August 15, 2019 11:55 PM grammar
    Thursday, August 15, 2019 11:46 PM