Remove unnecessary segments

  • Question

  • Hi all,

I would like to modify my EDI schemas to only contain the expected segments from trading partners. This should reduce the schema file size. I used to do this with the Mercator\Ascential DataStage Type Tree. I know I can do the same in BizTalk; however, how can I test the schema against the incoming EDI file? I tried right-clicking a map and selecting "Test Map", but I got all sorts of errors even though the EDI Notepad application had no issues analyzing the file. Is there any other way to test my schema against the incoming EDI file?

    Thanks in advance.

    Wednesday, March 26, 2014 6:57 PM

Answers

  • Yes, you can customize any EDI Schema.  But you must then assign a custom namespace to differentiate it from other flavors of that transaction.

Testing modified Schemas is no different than testing the out-of-the-box Schemas.  You can Validate Instance against the ST...SE content on the .xsd or Test Map with the input type set to Native.  Again, only the ST...SE content can appear in the instance.

    • Marked as answer by admdev Wednesday, March 26, 2014 8:55 PM
    Wednesday, March 26, 2014 7:29 PM
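As a quick illustration of the "only ST...SE can appear in the instance" point, a full interchange can be trimmed down to just the transaction set before running Validate Instance. This is a minimal sketch (plain Python, not BizTalk code), assuming "~" as the segment terminator and "*" as the element separator; adjust for your partner's delimiters.

```python
# Trim a full X12 interchange down to the ST...SE transaction set,
# since Validate Instance expects only that content (no ISA/GS/GE/IEA).
def extract_st_se(interchange: str, seg_term: str = "~", elem_sep: str = "*") -> str:
    # Split into segments and drop empty trailing pieces.
    segs = [s for s in interchange.split(seg_term) if s.strip()]
    # Locate the first ST segment and the first SE segment.
    start = next(i for i, s in enumerate(segs) if s.startswith("ST" + elem_sep))
    end = next(i for i, s in enumerate(segs) if s.startswith("SE" + elem_sep))
    # Rejoin only the ST...SE span, restoring the terminator.
    return seg_term.join(segs[start:end + 1]) + seg_term

# Hypothetical sample interchange (envelope segments abbreviated):
sample = "ISA*00*x~GS*PO*x~ST*850*0001~BEG*00*NE*123~SE*3*0001~GE*1*1~IEA*1*000000001~"
print(extract_st_se(sample))
```

The result contains only the ST...SE content, which is what the validator wants to see.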

All replies

  • Thanks boatseller.

After a few errors, I believe I am able to validate now. As you suggested, I got it to validate by right-clicking the schema and selecting "Validate Instance". You also suggested removing the ISA, GS, GE, and IEA segments, which seemed to do the trick; however, when one selects "Validate Instance" a window pops up asking for ISA-related information. Is this only used as a reference for the rest of the data?

    Thanks.

    Wednesday, March 26, 2014 8:17 PM
The popup just asks for information that would be in the ISA segment (delimiters, etc.).  It's not persisted anywhere.
    Wednesday, March 26, 2014 8:28 PM
  • Got it.

    Thanks.

    Wednesday, March 26, 2014 8:55 PM
  • boatseller,

Another question - I am trying to validate data for another trading partner. The data delimiters are different from those of the partner I previously validated. Below is the sample data.

    ISA~00~          ~00~          ~01~000000000      ~12~123456789TST  ~131112~1236~U~00401~000000001~0~T~>^
    GS~PO~000000000~123456789TST~20131112~1236~1~X~004010^

For this data, what values should I enter in the EDI Instance Properties popup? I entered "~" for the Data Element Separator, ">" for the Component Element Separator, "^" for the Segment Separator, and "LF" (line feed) for the Segment Separator Suffix. However, I am getting this error: "Segment separator (106th char of ISA): Delimiter value is duplicate.".

    Any ideas?

    Thanks

    Thursday, March 27, 2014 5:48 PM
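For reference, the ISA segment is fixed-width (106 characters including the terminator), so the delimiters can be read straight from known character positions rather than guessed. A minimal sketch (plain Python, not BizTalk code; the sample ISA below is constructed with proper field padding for illustration):

```python
# Read the X12 delimiters from their fixed positions in the ISA segment:
# element separator = 4th char, component separator = 105th char,
# segment terminator = 106th char.
def read_isa_delimiters(edi: str) -> dict:
    return {
        "element_separator": edi[3],      # 4th character
        "component_separator": edi[104],  # 105th character
        "segment_terminator": edi[105],   # 106th character
    }

# Constructed sample ISA, padded to the spec's fixed field widths:
isa = ("ISA~00~" + " " * 10 + "~00~" + " " * 10
       + "~01~" + "000000000".ljust(15)
       + "~12~" + "123456789TST".ljust(15)
       + "~131112~1236~U~00401~000000001~0~T~>^")
print(read_isa_delimiters(isa))
```

If the 106th character is not the segment terminator you expect (for example, because a field is short a space), the parser will report delimiter errors like the one above.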
All your choices look correct.  Double check: ~ is the default Segment Separator.
    Thursday, March 27, 2014 6:59 PM
  • Default Segment Separator where?

    Thanks.

    Thursday, March 27, 2014 7:33 PM
No, I mean the Segment Separator defaults to ~, so make sure all the changes stick.
    Thursday, March 27, 2014 7:38 PM
  • If I remove the "^" from the segments, then it works. I need to use this terminator. Is it possible that when validating a schema, it's always expecting a line feed? I will run a test to map some data and see if it will actually work after I deploy the application.

    Any ideas why this is failing?

    Thanks.

    Thursday, March 27, 2014 8:35 PM
  • Try removing any/all CR/LF and try with no suffix.

It works for sure, but all the delimiters have to be set perfectly.

    Thursday, March 27, 2014 9:49 PM
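The suggestion above, removing all CR/LF and validating with no suffix, can be sketched as a simple pre-processing step (plain Python, not BizTalk code; the sample data is hypothetical):

```python
# Strip stray CR/LF characters so the declared segment terminator ("^" here)
# is the only thing separating segments, matching the "no suffix" setting.
def strip_crlf(edi_text: str) -> str:
    return edi_text.replace("\r", "").replace("\n", "")

raw = "GS~PO~000000000~123456789TST~20131112~1236~1~X~004010^\r\nST~850~0001^\n"
print(strip_crlf(raw))
```

After this, the file should validate with the Segment Separator Suffix left unset.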