Score Model (AFx Library) : table: The data set being scored must contain all features used during training, missing feature(s): 'hello'.

  • Question

  • Hi All,

    I created a text category classification experiment based on the classifier below.

    https://gallery.azure.ai/Experiment/BBC-News-Classifier-1

    The dataset is a CSV file with two columns, Text and Category. I was able to build the training experiment and the predictive experiment, and publish a "New" web service. Everything works fine, and I tested the web service using the web service test page.

    I am now implementing model retraining, following the steps in the article below.

    https://docs.microsoft.com/en-us/azure/machine-learning/studio/retrain-machine-learning-model

    I was able to update the ilearner file in the existing web service. The issue: I used a new dataset that contains a new keyword, "hello", under the "Text" column (the category is an existing category). After training with the retraining Batch web service and updating the ilearner file in the existing prediction web service, I tested the web service and got the error below. "Hello" is a new keyword that doesn't exist in the original dataset but is included in the new dataset used during retraining. Please advise how to solve this error.

    Score Model (AFx Library) : table: The data set being scored must contain all features used during training, missing feature(s): 'hello'.

    Wednesday, October 30, 2019 8:36 AM

Answers

  • Hi Rohit,

    Found the root cause: I just needed to update the source file of the Import Data module with the updated data file (the one containing the "hello" keyword). After that, the error was resolved. The reason is that I used a data file different from the Import Data module's source file and didn't update that source file when retraining the model.

    Every call to the prediction web service recreates the dictionary from the source file of the Import Data module. Since that source file didn't include the "Hello" keyword, no entry was created for it in the dictionary. After updating the source file of the Import Data module, the web service works fine.

    I believe the current design costs more compute hours than necessary. The solution should therefore be changed to save the dictionary and reuse it, instead of creating a new dictionary every time the system requests a prediction.

    Anyway, thank you for your help. 


    Wednesday, November 20, 2019 9:52 AM

All replies

  • Hello EK1985,

    If you have followed the steps in the retraining documentation mentioned above, I think the web service may not have been updated correctly to point to the new ilearner file.

    The "Update the web service" section of that document provides a command to update the service, but without the ServiceUpdates parameter, which carries the new configuration and the ilearner file. The command should instead be:

    Update-AzMlWebService -Name '<web_Service_names>' -ResourceGroupName 'RG_Name' -ServiceUpdates $wsd
    

    Please try to update the service again and check if the error shown by the web service goes away. 

    -----------------------------------------------------------------------------------------------------------
    If you found this post helpful, please give it a "Helpful" vote. 
    Please remember to mark the replies as answers if they help. 
    Friday, November 1, 2019 5:16 AM
    Moderator
  • Hi RohitMungi,

    Thank you for the reply.

    I used the Update-AzMlWebService command with "$wsd". The new ilearner file was updated; I exported the JSON and verified it. The error is thrown because of the new ilearner file; with the old ilearner file, the error was not thrown.


    • Edited by EK1985 Tuesday, November 5, 2019 1:22 AM
    Tuesday, November 5, 2019 1:11 AM
  • Hello EK1985,

    I have tried to run the experiment from the gallery and followed the steps to re-run it, along with running the BES to get a prediction. According to the experiment in the gallery, the web service output does not include a trained model. If you have modified the experiment to output a model, could you please publish your experiment so we can check for any issues?

    For the standard experiment from gallery here is what I have run.

    1. Corrected the experiment so that running the web service provides the correct output.

    2. As per the retraining web service document, I used the BES to provide an input dataset with a new category, and the output CSV file included the output for the new category too.

    Input:

     

    Output:

    According to the published experiment, you can run the BES steps or a single request-response to predict for a new category. Since the default experiment does not provide an ilearner file as output, could you please publish your experiment so we can check for any issues in the training and predictive experiments?

    -Rohit

    Tuesday, November 5, 2019 10:53 AM
    Moderator
  • Hi Rohit,

    Thank you for the reply. 

    I added a web service output under "Tune Model Hyperparameters", and then the ilearner file is generated. I am not sure whether that is the correct way. If the default experiment can't generate an ilearner file, do you have any idea how I can retrain the model with a new dataset programmatically?

    Here is my training experiment in the gallery:

    https://gallery.cortanaintelligence.com/Experiment/EKFeedbackClassifier 

    I added the web service input and output for retraining based on this article (https://docs.microsoft.com/en-us/azure/machine-learning/studio/retrain-machine-learning-model).

    I created the predictive experiment based on this article:

    https://social.msdn.microsoft.com/Forums/SqlServer/en-US/6acd5004-1a9b-4585-be9d-a4ce2d0ef0d1/experiment-runs-ok-but-web-service-api-test-fails-failedtoevaluaterscriptinternal-library-error?forum=MachineLearning 

    I ran the training experiment and published it as a "new" web service, then ran the predictive experiment and published it as a "new" web service for prediction. I tested the prediction using the test page (Request-Response) of the published predictive experiment web service, and it works fine.

    However, I need to automate the retraining process. Therefore, I followed the steps from the article below.

    https://docs.microsoft.com/en-us/azure/machine-learning/studio/retrain-machine-learning-model 

    I will create a .NET console application to retrain the model by copying the C# code from the "Batch" tab under the "Consume" tab of the training experiment web service (the retraining web service).

    I added the record below to the original CSV file to retrain the model using the console application.

    Feedback,Category

    Hello good service,Positive

    After retraining the model, I updated the .ilearner file path from the console application's output into the predictive experiment web service using a PowerShell command.

    I tested the prediction using the test page (Request-Response) of the published predictive experiment web service whose ilearner file had been updated, and got the error below.

    Score Model (AFx Library) : table: The data set being scored must contain all features used during training, missing feature(s): 'hello'.

    When I instead used the new dataset file to run the training experiment and predictive experiment and publish the predictive experiment web service in ML Studio, the prediction test on the published web service's test page worked.

    Appreciate your help. 




    • Edited by EK1985 Wednesday, November 6, 2019 12:32 AM
    Tuesday, November 5, 2019 3:03 PM
  • Hi Rohit,

    I posted this on the retrain-the-model article page as well.

    Let me know whether my understanding is correct.

    Publishing predictive web service in ML studio

    1. Create ML solution in Training Experiment in ML studio

    2. After developing the solution in Training Experiment, run Training Experiment in ML studio

    3. Create Predictive Experiment from Training Experiment in ML studio.

    4. Add web service (input and output) in Predictive Experiment in ML studio.

    5. Run Predictive Experiment in ML studio

    6. Publish "new" web service in Predictive Experiment.

    Publishing retrain web service in ML studio

    1. Under Training Experiment, add web service inputs and outputs according to https://docs.microsoft.com/en-us/azure/machine-learning/studio/retrain-machine-learning-model

    2. Run Training Experiment in ML studio

    3. Publish "new" web service in Training Experiment

    Retraining ML model

    1. Use the sample code from the "Batch" tab of the "Consume" page of the retraining web service (the training experiment web service).

    2. The code uploads the new data file to Azure Blob storage and calls the retraining web service, which generates the ilearner file.

    3. Use the Import-AzMlWebService and Update-AzMlWebService PowerShell commands to update the new ilearner path in the predictive experiment web service.

    Based on this, I believe the code in step 2 is similar to the "Run Training Experiment in ML studio" step, and step 3 updates the retrained model in the predictive web service. If so, the process is missing a "Run Predictive Experiment in ML studio" step.

    If I follow steps 1 to 6 under "Publishing predictive web service in ML studio" with the new data file, the predictive web service doesn't throw an error.

    How can we run the Predictive Experiment programmatically instead of through ML Studio? My goal is to implement the retraining solution in a .NET console application so that the model stays up to date with new data.

    Wednesday, November 6, 2019 1:34 AM
  • Hello EK1985,

    Yes, the steps mentioned above are the correct way to create a new model file and update it on the web service that uses it.

    In this case, I downloaded your experiment and ran the training and predictive experiments before trying to create a new model (ilearner) file to update against the web service. As seen in the screenshots, the model file does not get created, because the web output for the trained model is not available. If I try to specify an ilearner file for download in the code, the web service returns the following error.

    Are you able to locate the new model file in your storage container after running the BES code snippet? The current experiment does not have a trained model as a web output to update for a retraining web service.

    -Rohit

    Wednesday, November 6, 2019 9:35 AM
    Moderator
  • Hi Rohit,

    There are two web service outputs: trainedModelOutput and evaluationModelOutput. The .ilearner file must be set under trainedModelOutput, and evaluationModelOutput should be a .csv file.

    At my end, the .ilearner file is generated after running the code below, and I can find it in the blob storage container that I configured in the code. The call reaches the ProcessResults function, so the web service call completes successfully. However, I have an error in the SaveBlobToFile function, which saves the file from blob storage to the local machine; I can fix that later.

    // This code requires the Nuget package Microsoft.AspNet.WebApi.Client to be installed.
    // Instructions for doing this in Visual Studio:
    // Tools -> Nuget Package Manager -> Package Manager Console
    // Install-Package Microsoft.AspNet.WebApi.Client
    //
    // Also, add a reference to Microsoft.WindowsAzure.Storage.dll for reading from and writing to the Azure blob storage

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.Globalization;
    using System.IO;
    using System.Linq;
    using System.Net.Http;
    using System.Net.Http.Formatting;
    using System.Net.Http.Headers;
    using System.Runtime.Serialization;
    using System.Text;
    using System.Threading;
    using System.Threading.Tasks;

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;

    namespace MLRetrainingTestApp
    {
        public class AzureBlobDataReference
        {
            // Storage connection string used for regular blobs. It has the following format:
            // DefaultEndpointsProtocol=https;AccountName=ACCOUNT_NAME;AccountKey=ACCOUNT_KEY
            // It's not used for shared access signature blobs.
            public string ConnectionString { get; set; }

            // Relative uri for the blob, used for regular blobs as well as shared access
            // signature blobs.
            public string RelativeLocation { get; set; }

            // Base url, only used for shared access signature blobs.
            public string BaseLocation { get; set; }

            // Shared access signature, only used for shared access signature blobs.
            public string SasBlobToken { get; set; }
        }

        public enum BatchScoreStatusCode
        {
            NotStarted,
            Running,
            Failed,
            Cancelled,
            Finished
        }

        public class BatchScoreStatus
        {
            // Status code for the batch scoring job
            public BatchScoreStatusCode StatusCode { get; set; }

            // Locations for the potential multiple batch scoring outputs
            public IDictionary<string, AzureBlobDataReference> Results { get; set; }

            // Error details, if any
            public string Details { get; set; }
        }

        public class BatchExecutionRequest
        {

            public IDictionary<string, AzureBlobDataReference> Inputs { get; set; }

            public IDictionary<string, string> GlobalParameters { get; set; }

            // Locations for the potential multiple batch scoring outputs
            public IDictionary<string, AzureBlobDataReference> Outputs { get; set; }
        }

        class Program
        {
            static void Main(string[] args)
            {
                InvokeBatchExecutionService().Wait();
            }

            static async Task WriteFailedResponse(HttpResponseMessage response)
            {
                Console.WriteLine(string.Format("The request failed with status code: {0}", response.StatusCode));

                // Print the headers - they include the request ID and the timestamp, which are useful for debugging the failure
                Console.WriteLine(response.Headers.ToString());

                string responseContent = await response.Content.ReadAsStringAsync();
                Console.WriteLine(responseContent);
            }

            static void SaveBlobToFile(AzureBlobDataReference blobLocation, string resultsLabel)
            {
                const string OutputFileLocation = @"D:\xxx\outputfeedbackdata.csv"; // Replace this with the location you would like to use for your output file, and valid file extension (usually .csv for scoring results, or .ilearner for trained models)

                var credentials = new StorageCredentials(blobLocation.SasBlobToken);
                var blobUrl = new Uri(new Uri(blobLocation.BaseLocation), blobLocation.RelativeLocation);
                var cloudBlob = new CloudBlockBlob(blobUrl, credentials);

                Console.WriteLine(string.Format("Reading the result from {0}", blobUrl.ToString()));
                cloudBlob.DownloadToFile(OutputFileLocation, FileMode.Create);

                Console.WriteLine(string.Format("{0} have been written to the file {1}", resultsLabel, OutputFileLocation));
            }

            static void UploadFileToBlob(string inputFileLocation, string inputBlobName, string storageContainerName, string storageConnectionString)
            {
                // Make sure the file exists
                if (!File.Exists(inputFileLocation))
                {
                    throw new FileNotFoundException(
                        string.Format(
                            CultureInfo.InvariantCulture,
                            "File {0} doesn't exist on local computer.",
                            inputFileLocation));
                }

                Console.WriteLine("Uploading the input to blob storage...");

                var blobClient = CloudStorageAccount.Parse(storageConnectionString).CreateCloudBlobClient();
                var container = blobClient.GetContainerReference(storageContainerName);
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(inputBlobName);
                blob.UploadFromFile(inputFileLocation);
            }

            static void ProcessResults(BatchScoreStatus status)
            {
                bool first = true;
                foreach (var output in status.Results)
                {
                    var blobLocation = output.Value;
                    Console.WriteLine(string.Format("The result '{0}' is available at the following Azure Storage location:", output.Key));
                    Console.WriteLine(string.Format("BaseLocation: {0}", blobLocation.BaseLocation));
                    Console.WriteLine(string.Format("RelativeLocation: {0}", blobLocation.RelativeLocation));
                    Console.WriteLine(string.Format("SasBlobToken: {0}", blobLocation.SasBlobToken));
                    Console.WriteLine();

                    // Save the first output to disk
                    if (first)
                    {
                        first = false;
                        SaveBlobToFile(blobLocation, string.Format("The results for {0}", output.Key));
                    }
                }
            }

            static async Task InvokeBatchExecutionService()
            {
                // How this works:
                //
                // 1. Assume the input is present in a local file (if the web service accepts input)
                // 2. Upload the file to an Azure blob - you'd need an Azure storage account
                // 3. Call the Batch Execution Service to process the data in the blob. Any output is written to Azure blobs.
                // 4. Download the output blob, if any, to local file

                const string BaseUrl = "https://xxx/jobs";

                const string StorageAccountName = "xxx"; // Replace this with your Azure Storage Account name
                const string StorageAccountKey = "xxx"; // Replace this with your Azure Storage Key
                const string StorageContainerName = "xxx"; // Replace this with your Azure Storage Container name
                string storageConnectionString = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}", StorageAccountName, StorageAccountKey);

                const string apiKey = "xxx"; // Replace this with the API key for the web service

                // set a time out for polling status
                const int TimeOutInMilliseconds = 120 * 1000; // Set a timeout of 2 minutes

                UploadFileToBlob(@"D:\xxx\feedbacktext.csv" /*Replace this with the location of your input file, and valid file extension (usually .csv)*/,
                        "feedbacktext.csv" /*Replace this with the name you would like to use for your Azure blob; this needs to have the same extension as the input file */,
                        StorageContainerName, storageConnectionString);

                using (HttpClient client = new HttpClient())
                {
                    var request = new BatchExecutionRequest()
                    {
                        Inputs = new Dictionary<string, AzureBlobDataReference>() {
                            {
                                "input1",
                                 new AzureBlobDataReference()
                                 {
                                     ConnectionString = storageConnectionString,
                                     RelativeLocation = string.Format("{0}/feedbacktext.csv", StorageContainerName)
                                 }
                            },
                        },

                        Outputs = new Dictionary<string, AzureBlobDataReference>() {
                            {
                                "trainedModelOutput",
                                new AzureBlobDataReference()
                                {
                                    ConnectionString = storageConnectionString,
                                    RelativeLocation = string.Format("{0}/feedbacktexttrainedModelOutputresults.ilearner", StorageContainerName) /*Replace this with the location you would like to use for your output file, and valid file extension (usually .csv for scoring results, or .ilearner for trained models)*/
                                }
                            },
                            {
                                "evaluationModelOutput",
                                new AzureBlobDataReference()
                                {
                                    ConnectionString = storageConnectionString,
                                    RelativeLocation = string.Format("{0}/feedbacktextevaluationModelOutputresults.csv", StorageContainerName) /*Replace this with the location you would like to use for your output file, and valid file extension (usually .csv for scoring results, or .ilearner for trained models)*/
                                }
                            },
                        },

                        GlobalParameters = new Dictionary<string, string>()
                        {
                        }
                    };

                    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);

                    // WARNING: The 'await' statement below can result in a deadlock
                    // if you are calling this code from the UI thread of an ASP.Net application.
                    // One way to address this would be to call ConfigureAwait(false)
                    // so that the execution does not attempt to resume on the original context.
                    // For instance, replace code such as:
                    //      result = await DoSomeTask()
                    // with the following:
                    //      result = await DoSomeTask().ConfigureAwait(false)

                    Console.WriteLine("Submitting the job...");

                    // submit the job
                    var response = await client.PostAsJsonAsync(BaseUrl + "?api-version=2.0", request);

                    if (!response.IsSuccessStatusCode)
                    {
                        await WriteFailedResponse(response);
                        return;
                    }

                    string jobId = await response.Content.ReadAsAsync<string>();
                    Console.WriteLine(string.Format("Job ID: {0}", jobId));

                    // start the job
                    Console.WriteLine("Starting the job...");
                    response = await client.PostAsync(BaseUrl + "/" + jobId + "/start?api-version=2.0", null);
                    if (!response.IsSuccessStatusCode)
                    {
                        await WriteFailedResponse(response);
                        return;
                    }

                    string jobLocation = BaseUrl + "/" + jobId + "?api-version=2.0";
                    Stopwatch watch = Stopwatch.StartNew();
                    bool done = false;
                    while (!done)
                    {
                        Console.WriteLine("Checking the job status...");
                        response = await client.GetAsync(jobLocation);
                        if (!response.IsSuccessStatusCode)
                        {
                            await WriteFailedResponse(response);
                            return;
                        }

                        BatchScoreStatus status = await response.Content.ReadAsAsync<BatchScoreStatus>();
                        if (watch.ElapsedMilliseconds > TimeOutInMilliseconds)
                        {
                            done = true;
                            Console.WriteLine(string.Format("Timed out. Deleting job {0} ...", jobId));
                            await client.DeleteAsync(jobLocation);
                        }
                        switch (status.StatusCode)
                        {
                            case BatchScoreStatusCode.NotStarted:
                                Console.WriteLine(string.Format("Job {0} not yet started...", jobId));
                                break;
                            case BatchScoreStatusCode.Running:
                                Console.WriteLine(string.Format("Job {0} running...", jobId));
                                break;
                            case BatchScoreStatusCode.Failed:
                                Console.WriteLine(string.Format("Job {0} failed!", jobId));
                                Console.WriteLine(string.Format("Error details: {0}", status.Details));
                                done = true;
                                break;
                            case BatchScoreStatusCode.Cancelled:
                                Console.WriteLine(string.Format("Job {0} cancelled!", jobId));
                                done = true;
                                break;
                            case BatchScoreStatusCode.Finished:
                                done = true;
                                Console.WriteLine(string.Format("Job {0} finished!", jobId));
                                ProcessResults(status);
                                break;
                        }

                        if (!done)
                        {
                            Thread.Sleep(1000); // Wait one second
                        }
                    }
                }
            }
        }
    }


    • Edited by EK1985 Wednesday, November 6, 2019 10:07 AM
    Wednesday, November 6, 2019 10:01 AM
  • Hi Rohit,

    Are you able to replicate the issue based on the above code?

    A new keyword generates a new feature column; if there is no new keyword during retraining, everything works fine. Do you know how I can get those new feature columns (created by the new keywords) into the Score Model module during the retraining process?
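    To illustrate the question above, here is a toy Python sketch (purely illustrative; the real ML Studio N-gram/feature-extraction modules differ in detail) of how a new keyword becomes a new feature column that the scorer then insists on:

```python
# Illustration only (plain Python, not AFx / ML Studio code): a toy
# bag-of-words featurizer showing why a new keyword adds a feature
# column, and why scoring then demands that column.

def build_dictionary(corpus):
    """Derive the feature columns (vocabulary) from a corpus."""
    return sorted({w.lower().strip(".,!") for text in corpus for w in text.split()})

def featurize(text, vocab):
    """Turn text into a count vector over a fixed vocabulary."""
    words = [w.lower().strip(".,!") for w in text.split()]
    return [words.count(term) for term in vocab]

def score(vocab, model_vocab):
    """A scorer needs every feature column the model was trained on."""
    missing = [f for f in model_vocab if f not in vocab]
    if missing:
        raise ValueError("The data set being scored must contain all features "
                         f"used during training, missing feature(s): {missing}")

original_corpus = ["Very good service", "Bad customer service"]
retrain_corpus = original_corpus + ["Hello good service"]

old_vocab = build_dictionary(original_corpus)  # no 'hello' column
new_vocab = build_dictionary(retrain_corpus)   # adds a 'hello' column

# The retrained model expects 'hello', but the scoring pipeline still
# builds its dictionary from the old Import Data source file:
try:
    score(old_vocab, new_vocab)
except ValueError as e:
    print(e)  # mirrors the Score Model (AFx Library) error above
```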

    Friday, November 8, 2019 3:36 AM
  • Hello EK1985,

    I was able to create the web service with the ilearner output file after creating a web service directly from the training experiment. Previously, I was updating the predictive experiment and deploying a web service from it, but during the update of that web service the Update-AzMlWebService command fails.

    Could you please confirm which asset you updated with the ilearner file info? I have tried to use the new ilearner file for the Score Model module, as we do not have a Train Model module as mentioned in the retraining documentation.

    Also, for this experiment's scenario I will check internally whether all the new features used during training are required while scoring, since the Train Model module is missing during the retraining process.

    -Rohit

    Friday, November 8, 2019 11:18 AM
    Moderator
  • Hi Rohit,

    The ilearner file is generated from Tune Model Hyperparameters as a web service output, because there is no Train Model module in this experiment.

    After getting the ilearner file from Tune Model Hyperparameters by running the retraining web service, I updated "asset7" of the predictive experiment web service using a PowerShell script.

    =============

      "asset7": {
            "name": "Feedback Classifier [trained model]",
            "type": "Resource",
            "locationInfo": {
              "uri": "xxxx/feedbacktexttrainedModelOutputresults.ilearner"
            },
            "outputPorts": {
              "Results dataset": {
                "type": "Dataset"
              }
            }
          }

    ========================
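    For reference, the Export-AzMlWebService / Update-AzMlWebService round-trip effectively patches this uri in the exported definition. A minimal Python sketch of that JSON patch ("asset7" and the new uri come from the post above; the helper name, surrounding structure, and old uri value are hypothetical):

```python
import json

# Sketch of what the PowerShell round-trip amounts to: point the
# trained-model asset at the new ilearner blob in the exported
# web service definition, then push the definition back.
# (Hypothetical, simplified definition; real exports contain more keys.)

service_definition = {
    "assets": {
        "asset7": {
            "name": "Feedback Classifier [trained model]",
            "type": "Resource",
            "locationInfo": {"uri": "xxxx/old.ilearner"},  # hypothetical old value
        }
    }
}

def update_trained_model(definition, asset_id, new_ilearner_uri):
    """Return a copy of the definition with the asset's ilearner uri replaced."""
    patched = json.loads(json.dumps(definition))  # cheap deep copy
    patched["assets"][asset_id]["locationInfo"]["uri"] = new_ilearner_uri
    return patched

patched = update_trained_model(
    service_definition, "asset7",
    "xxxx/feedbacktexttrainedModelOutputresults.ilearner")
print(patched["assets"]["asset7"]["locationInfo"]["uri"])
```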




    • Edited by EK1985 Monday, November 11, 2019 2:12 AM
    Monday, November 11, 2019 2:12 AM
  • Hello EK1985,

    I have created a new experiment and web service, tested it with the updated ilearner file, and did not get the error seen in your case. Here are the steps I followed.

    1. Created and ran the following training experiment.

    2. Created the following predictive experiment

    3. Exported and updated the ilearner file reference.


    4. Tested the words not in the training and the service provided the following values.

    Since we are unable to repro the issue, could you please check whether recreating the experiment works for you as well?

    -Rohit 

    Tuesday, November 12, 2019 9:21 AM
    Moderator
  • Hi Rohit,

    Thank you for the reply.

    I am still getting the error after creating another experiment. Can I confirm that you used a different data file for training in ML Studio than for retraining the model with the code and PowerShell? It works fine if I use the same data file for both. I get the error on the Predictive Web Service test page after adding a new record (with a new keyword) to the data file used when retraining the model with the code and PowerShell. The new keyword (Hello) was not included in the data file used when running the experiments in ML Studio.

    The data file below is used when running the training experiment and predictive experiment in ML Studio.

    Feedback,Category
    A very good service provided.,Positive
    Overall is good.,Positive
    Fast and clear instruction and giving a very pleasant experience.,Positive
    Seamless and efficient service.,Positive
    Very Good.,Positive
    Thank u so much for the quick reply. This is really helpful.,Positive
    Excellent service & quick response to customer's application! Good.,Positive
    Thank you very much for being very informative! Extremely helpful and kind.,Positive
    Make waiting time shorter.,Negative
    Bad customer service.,Negative

    The data file below is used when calling the retraining web service from code:

    Feedback,Category
    A very good service provided.,Positive
    Overall is good.,Positive
    Fast and clear instruction and giving a very pleasant experience.,Positive
    Seamless and efficient service.,Positive
    Very Good.,Positive
    Thank u so much for the quick reply. This is really helpful.,Positive
    Excellent service & quick response to customer's application! Good.,Positive
    Thank you very much for being very informative! Extremely helpful and kind.,Positive
    Make waiting time shorter.,Negative
    Bad customer service.,Negative
    Hello good service,Positive

    Tuesday, November 12, 2019 11:03 AM
  • Hello EK1985,

    I have retried this experiment with different datasets and was able to repro the error you have seen. Initially it looked like an error with the PowerShell cmdlets not updating the predictive service correctly, but retraining with the original data using the batch script works, so I think this could be an issue with the way the experiment is built, which might not work with the retraining scenario mentioned in the document. We will try to check for other experiments that work with retraining datasets and modify this experiment to check whether it works as expected. Thanks for helping us understand this scenario!

    -Rohit

    Thursday, November 14, 2019 12:17 PM
    Moderator
  • Hi Rohit,

    Thank you for the reply. 

    Do you know if there is an API to programmatically run the existing predictive experiment and publish the web service, as in ML Studio? If we have that API, I believe we can solve this issue. It looks like the Score Model module in the predictive experiment is not updated in the published web service.

    Additionally, do you have any other solution that would work for text analytics? I need to retrain the model every day so that it has up-to-date data for predicting the next day's text; it is not feasible for a user to run this manually every day.

    Friday, November 15, 2019 12:16 AM
  • Hi Rohit,

    Could you please try to replicate it in other experiments? If you can, would that make it a bug in ML Studio?

    Tuesday, November 19, 2019 2:12 AM
  • Hi Rohit,

    Found the root cause: I just needed to update the source file of the Import Data module with the updated data file (the one containing the "hello" keyword). After that, the error was resolved. The reason is that I used a data file different from the Import Data module's source file and didn't update that source file when retraining the model.

    Every call to the prediction web service recreates the dictionary from the source file of the Import Data module. Since that source file didn't include the "Hello" keyword, no entry was created for it in the dictionary. After updating the source file of the Import Data module, the web service works fine.

    I believe the current design costs more compute hours than necessary. The solution should therefore be changed to save the dictionary and reuse it, instead of creating a new dictionary every time the system requests a prediction.
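    That save-and-reuse idea can be sketched as follows (plain Python with a hypothetical file path, not an ML Studio API):

```python
import json
import os
import tempfile

def build_dictionary(corpus):
    """Build the vocabulary once, at (re)training time."""
    return sorted({w.lower().strip(".,!") for text in corpus for w in text.split()})

def featurize(text, vocab):
    """Score-time featurization reuses the persisted vocabulary, so the
    feature columns always match the ones the model was trained with."""
    words = [w.lower().strip(".,!") for w in text.split()]
    return [words.count(term) for term in vocab]

# Retraining: rebuild the dictionary and persist it alongside the model.
vocab = build_dictionary(["Very good service", "Hello good service"])
dict_path = os.path.join(tempfile.gettempdir(), "dictionary.json")  # hypothetical path
with open(dict_path, "w") as f:
    json.dump(vocab, f)

# Prediction: load the saved dictionary instead of rebuilding it from
# the Import Data source file on every request.
with open(dict_path) as f:
    loaded = json.load(f)

print(featurize("Hello service", loaded))  # -> [0, 1, 1, 0]
```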

    Anyway, thank you for your help. 


    Wednesday, November 20, 2019 9:52 AM