SI_Documents

Sterling Integrator related documents.
Mirjana's picture

Perimeter Server - ports


Assign Service - bpml code

There is an Assign Service that we can use for multiple assigns; this is usually a better solution than many separate Assign elements.
You can see the Assign Service in the service configurations, but not in the Graphical Process Modeler.
It must be added manually to the BPML code:
 
<process name="Assign_test_process">

  <operation name="AssignService">
    <participant name="AssignService"/>
    <output message="AssignOutputMessage">
      <assign to="." from="*"/>

      <!-- add constant value -->
      <assign to="name1">value</assign>

      <!-- add xpath value -->
      <assign to="name2" from="pathToElement/text()"></assign>
    </output>
    <input message="AssignInputMessage">
      <assign to="." from="*"/>
    </input>
  </operation>

</process>
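For comparison, the same two values could be set with standalone assign activities. This is a minimal sketch using only elements shown above; the process name is hypothetical:

```xml
<process name="Assign_test_elements">
  <sequence>
    <!-- each assign element is a separate activity in the BP -->
    <assign to="name1">value</assign>
    <assign to="name2" from="pathToElement/text()"/>
  </sequence>
</process>
```

With many values to set, a single Assign Service keeps the process shorter and appears as one step in process monitoring, which is why it is usually the better option.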
 


SFG Static Route, Dynamic Route, Custom Layer and Custom Protocol

This document explains how to create a simple Custom Layer, Custom Protocol and Dynamic Route. All of them will run the same BP, which only translates the input file. The goal is to compare the different options for processing an input file, and to see how and where translation is handled, as well as how it is reported in the case of errors.
A Static Route example is also included, but it is less important, as a static route normally does not do any processing on the file.
 
This is the flow that shows how the file, depending on its name, will be routed and processed:
 
 
  • Input File
input-des-21.des (this is just for my reference, can be any input you want)
 
  • Map

 
PRES_map_inhouseToDESADV.map (can be any map that translates the input file)
 
  • Business Process (that will be used in custom layer, custom protocol and dynamic route)
VDC_SFG_translation.bp
 
<process name="default">
 <sequence name="mainStart">
    <operation name="Translation">
      <participant name="Translation"/>
      <output message="TranslationTypeInputMessage">
        <assign to="." from="*"></assign>
        <assign to="map_name">PRES_map_inhouseToDESADV</assign>
      </output>
     <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
    <operation name="FileGatewayRouteEventServiceType">
      <participant name="FileGatewayRouteEventService"/>
      <output message="FileGatewayRouteEventServiceTypeInputMessage">
        <assign to="." from="*"></assign>
        <assign to="." from="RouteEntityType"></assign>
        <assign to="." from="RouteEntityKey"></assign>
        <assign to="." from="RouteMetaData"></assign>
        <assign to="." from="RouteDataflowId"></assign>
        <assign to="EventCode">CUST_0001</assign>
        <assign to="EventAttributes/test">PRES_map_inhouseToDESADV.map</assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
    <onFault>
      <sequence name="errorStart">
        <operation name="FileGatewayRouteEventServiceType">
          <participant name="FileGatewayRouteEventService"/>
          <output message="FileGatewayRouteEventServiceTypeInputMessage">
            <assign to="." from="*"></assign>
            <assign to="." from="RouteEntityType"></assign>
            <assign to="." from="RouteEntityKey"></assign>
            <assign to="." from="RouteMetaData"></assign>
            <assign to="." from="RouteDataflowId"></assign>
            <assign to="EventCode">CUST_0051</assign>
            <assign to="EventAttributes/test">PRES_map_inhouseToDESADV.map</assign>
          </output>
          <input message="inmsg">
            <assign to="." from="*"></assign>
          </input>
        </operation>
 
      </sequence>
    </onFault>
 </sequence>
</process>
 
  • Custom Events
Two custom events used in the previous BP (CUST_0001 and CUST_0051) will be defined in customer_overrides.properties. CUST_0001 marks a successful translation and CUST_0051 a failed one.
 
My Example for custom events in customer_overrides.properties:
 
#Event Codes for BP that will be used in dynamic route, custom layer and custom protocol
 
filegateway_eventcodes.CUST_0001.name=Translation in custom layer/protocol/route successful
filegateway_eventcodes.CUST_0001.attributes=ProducerName,ConsumerFilename
filegateway_eventcodes.CUST_0001.text=File {0} translated
filegateway_eventcodes.CUST_0001.description=Message Translation - PASSED
filegateway_eventcodes.CUST_0001.permissions=producer,consumer,subscription
 
 
filegateway_eventcodes.CUST_0051.name=Translation in custom layer/protocol/route Failed
filegateway_eventcodes.CUST_0051.attributes=ProducerName,ConsumerFilename
filegateway_eventcodes.CUST_0051.text=File {0} NOT translated
filegateway_eventcodes.CUST_0051.description=Message Translation - FAILED
filegateway_eventcodes.CUST_0051.permissions=producer,consumer,subscription
 
  • Custom Protocol
 
    1. Add a protocol in AFTExtensionsCustomer.xml (a copy of AFTExtensions.xml)
 
[installFolder]\install\container\Applications\aft\WEB-INF\classes\resources\xml\AFTExtensionsCustomer.xml
 
<AFTExtensions>
...
<!-- adding custom protocol for translation -->

<AFTExtension name="translation-protocol" type="consumer-delivery-protocol" label="translation.protocol.label.translationprotocol" bp="VDC_SFG_translation">
  <GROUP title="translation.instance.group1.title">
    <VARDEF varname="MapName" type="String" htmlType="text" label="translation.label.translationprotocol.mapname" size="30" maxsize="250" validator="ALPHANUMERIC" required="yes"/>
  </GROUP>
</AFTExtension>

</AFTExtensions>
 
    2. Add labels for the protocol in AFTExtensionsCustomer.properties (a copy of AFTExtensions.properties)
 
[installFolder]\install\container\Applications\aft\WEB-INF\classes\resources\AFTExtensionsCustomer.properties
 
#######################################################
# Translation custom protocol
#######################################################
translation.protocol.label.translationprotocol = Translation Delivery
translation.instance.group1.title = Translation Delivery
translation.label.translationprotocol.mapname = Map Name
 
Stop Sterling File Gateway.
Run <install_dir>\bin\setupfiles.sh.
Run <install_dir>\bin\deployer.sh.
Start Sterling File Gateway.
Note: once a custom protocol has been added, you must first add it to the Community that the Partner belongs to; only then will a new partner see the new protocol.
 
  • Custom Layer

 
Input for creating a custom layer:
 
<MultiApi>
         <API Name="manageFgProducerFileLayerType">
                   <!-- This API affects the second Producer File Layer Type
and the Parameter types associated with it. See the
FG_P_FLR_TYPE, FG_P_FLR_PRM_TYPE, and FG_P_FLR_TRANS
table information below. -->
                   <Input>
                            <FgProducerFileLayerType ContainsName="N" Description="Translation" DisplayLabel="Translation" IsContainer="N" LayerType="FGC_TXNCN" ProducerFileLayerTypeKey="TXPNCN_GUID">
                                      <FgProducerFileLayerParameterTypeList TotalNumberOfRecords="2">
                                               <FgProducerFileLayerParameterType DefaultValue=".+" Description="File name pattern as regular expression" DisplayLabel="File name pattern" DisplayType="String" Ordinal="0" ParameterName="FILENAME_PATTERN" ProducerFileLayerParameterTypeKey="TXPNCN_P1_GUID" ProducerFileLayerTypeKey="TXPNCN_GUID"/>
                                               <FgProducerFileLayerParameterType Description="File name pattern group fact names,
comma delimited" DisplayLabel="File name pattern fact names" DisplayType="String" Ordinal="1" ParameterName="FILENAME_PATTERN_FACTNAMES" ProducerFileLayerParameterTypeKey="TXPNCN_P2_GUID" ProducerFileLayerTypeKey="TXPNCN_GUID"/>
                                      </FgProducerFileLayerParameterTypeList>
                            </FgProducerFileLayerType>
                   </Input>
         </API>
         <API Name="manageFgConsumerFileLayerType">
                   <!-- This API affects the second Consumer File Layer Type
(non-container type) and the Parameter types associated
with it. See the FG_C_FLR_TYPE, FG_C_FLR_PRM_TYPE, and
FG_C_FLR_TRANS table information below.
-->
                   <Input>
                            <FgConsumerFileLayerType ConsumerFileLayerTypeKey="TXCNCN_GUID" Description="Translation" DisplayLabel="Translation" IsContainer="N" LayerType="FGC_TXCNCN">
                                      <FgConsumerFileLayerParameterTypeList TotalNumberOfRecords="2">
                                               <FgConsumerFileLayerParameterType ConsumerFileLayerParameterTypeKey="TXCNCON_P1_GUID" ConsumerFileLayerTypeKey="TXCNCN_GUID" DefaultValue="${ProducerFilename}" Description="File name format" DisplayLabel="File name format" DisplayType="String" Ordinal="0" ParameterName="FILENAME_FORMAT"/>
                                               <FgConsumerFileLayerParameterType ConsumerFileLayerParameterTypeKey="TXCNCON_P2_GUID" ConsumerFileLayerTypeKey="TXCNCN_GUID" Description="Map Name" DisplayLabel="Map Name" DisplayType="String" Ordinal="1" ParameterName="Map_Name"/>
                                      </FgConsumerFileLayerParameterTypeList>
                            </FgConsumerFileLayerType>
                   </Input>
         </API>
         <API Name="manageFgConsumerFileLayerTranslation">
                   <Input>
                            <FgConsumerFileLayerTranslation BusinessProcessName="VDC_SFG_translation" ConsumerFileLayerTranslationKey="TXCTRAN1_GUID" ConsumerFileLayerTypeKey="TXCNCN_GUID" ProducerFileLayerTypeKey="TXPNCN_GUID"/>
                   </Input>
         </API>
</MultiApi>
 
This file is the input for the XAPI Service, with the api action multiApi. After running that service, the custom layer will be added in SFG:
 
<process name="VDC_SFG_XAPI">
          <sequence>
                   <operation name="XAPI Service">
                             <participant name="XAPIService"/>
                             <output message="XAPIServiceTypeInputMessage">
                                      <assign to="." from="*"/>
                                      <assign to="api">multiApi</assign>
                             </output>
                             <input message="inmsg">
                                      <assign to="." from="*"/>
                             </input>
                   </operation>
          </sequence>
</process>
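One way to feed the MultiApi document above into this BP is to collect it as the primary document first, e.g. with a File System Adapter configured for collection. This is a sketch; the adapter instance name, folder and file name are hypothetical:

```xml
<process name="VDC_SFG_XAPI_with_collect">
  <sequence>
    <!-- collect the MultiApi XML file as the primary document;
         the adapter instance name and folder are hypothetical -->
    <operation name="Collect input">
      <participant name="MyFileSystemAdapter"/>
      <output message="FileSystemInputMessage">
        <assign to="." from="*"/>
        <assign to="Action">FS_COLLECT</assign>
        <assign to="collectionFolder">/tmp/xapi_input</assign>
        <assign to="filter">custom_layer_input.xml</assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"/>
      </input>
    </operation>
    <!-- run the multiApi action against the collected document -->
    <operation name="XAPI Service">
      <participant name="XAPIService"/>
      <output message="XAPIServiceTypeInputMessage">
        <assign to="." from="*"/>
        <assign to="api">multiApi</assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"/>
      </input>
    </operation>
  </sequence>
</process>
```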
 
I will show
 
·         Routing Channel Templates
·         Routing Channels
·         Arrived File Events
·         Route Events
·         Delivery Events
·         Message in a MBX (dashboard interface)
·         Consumer's file in myFileGateway
·         Error in BP used for processing, shown in filegateway and dashboard interface
 
... for the following routing and delivery scenarios:
 
·         Static Route
·         Dynamic Route
·         Custom Layer and
·         Custom Protocol

 


OnFault - error handling in Sterling Integrator BPs

Questions:
 
*******************************************************************
 
Q1) In a BP that has many sequences (some with their own onFault, some without), if an error occurs in a sequence, which onFault captures it: one, or more than one?
 
A1) It is always the onFault attached to the sequence in which the error happened.
 
*******************************************************************
 
Q2) How do we instruct the process to go to a specific onFault upon encountering a specific error?
 
A2) The first topic in the explanation below answers this question.
 
*******************************************************************
 
Q3) If an error occurs in a sequence (with its own onFault) that is contained in another sequence, is the error handled only by the onFault of the sequence in which the error occurred, or by its own onFault as well as the onFault of the containing (parent) sequence?
 
A3) The error is handled by the onFault attached to that sequence, but if an error happens in the onFault part itself, that error will be handled by the outermost sequence (if one exists) and also by a parent process (if one exists and errors are propagated to the parent process). Explained in the 4th and 5th topics below.
 
*******************************************************************
 
Q4) What happens if there is an error in executing the code contained within an onFault? Which onFault would handle that error?
 
A4)
  • You can have a generic onFault that will always work, because you can put just a simple assign in it, as explained in the 4th topic.
  • An onFault inside an onFault will not work and will not give the expected result (6th topic).
  • My opinion is not to force handling of every error, e.g. errors that happen in the onFault part itself. For example, if we send a mail through onFault and the mail server is down, the process will finish in Halted state; but SI has a system BP, BPRecovery, that finds such processes and tries to Restart/Resume them every 10 minutes (depending on the setting in the BP). Sometimes it is also good to have a BP in Halted state for manual intervention, as some problems must be resolved manually. It all depends on the process and the expected errors: whether they can be resolved automatically by another restart, or need manual intervention.
 
*******************************************************************
 
Q5) If this BP is called from another BP, does the process exit after completing the execution of the code in onFault, or is control handed back to the parent process?
 
A5) Explained in 5th topic.
 
*******************************************************************
There are 6 topics below that explain the different scenarios for error handling in Sterling Integrator …
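The scoping described in Q1 and Q3 can be sketched in BPML; this is a minimal sketch, where the operation and configuration names are placeholders:

```xml
<process name="OnFault_scope_example">
  <sequence name="outer">
    <sequence name="inner">
      <!-- if a step in "inner" fails, inner's onFault handles it -->
      <operation name="SomeStep">
        <participant name="SomeServiceConfig"/>
        <output message="outmsg">
          <assign to="." from="*"/>
        </output>
        <input message="inmsg">
          <assign to="." from="*"/>
        </input>
      </operation>
      <onFault>
        <!-- handles errors raised in the "inner" sequence only -->
        <assign to="errorHandledBy">inner onFault</assign>
      </onFault>
    </sequence>
    <onFault>
      <!-- handles errors raised directly in "outer", and errors
           that happen inside inner's own onFault block -->
      <assign to="errorHandledBy">outer onFault</assign>
    </onFault>
  </sequence>
</process>
```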
 

MESA Skin Editor - instructions

 
This document is based on tests done on SI 5.0 build 5006. It is a basic version of the steps that must be done to use the MESA Skin Editor. For a detailed explanation, you should consult Sterling's documentation!
 
 
1.    Install eclipse
 
Once SI is installed, download and install Eclipse version
3.3.x. For more information, see http://eclipse.org/downloads/index.php.
 
2.    Download and install Java 2 SDK
 
Download and install Java 2 SDK Standard Edition 5.0 (JDK 1.5.0_14 or higher) on the same PC that you installed Eclipse. It is important that you have the full JDK and not just the JRE.
 
3.    Set JAVA_HOME variable to JDK previously installed
 
4.    Set correct java used by eclipse
 
Window menu >> select Preferences >> Expand the Java section and select Installed JREs
 
 
 
5.    Start WebDAV server
 
runDAVServer.sh
 
The default WebDAV port is the base install port + 46.
On Windows systems, the WebDAV server is started automatically by startWindowsService.cmd.
 
6.    Installing MESA Developer Studio
 
You must download and install MESA Developer Studio Eclipse plug-in components from your application instance.
 
SI and Web DAV must be up and running.
 
7.    Install MESA plugins
 
Eclipse Help menu >> Software Updates >> Find and install >>
Search for new features to install >> Next >> New Remote Site
 
  • Name - type a descriptive name for the remote application server.
  • URL - type the server name or IP address, followed by a colon and the WebDAV port number, followed by a slash (/) and the word "eclipse", in this format: server_name:WebDAV_port_number/eclipse
 
 
Finish!!!
 
The system verifies the selected site and displays the results. On the search results page, expand the update site node and select from the following plug-ins, according to your licenses:
• MESA Studio
• MESA Developer Studio SDK
• MESA Developer Studio Skin Editor
• Reporting Services (automatically selects all three Reporting Services plug-ins: Fact Model Editor, Report Editor, and Report Format Editor)
 
8.    Set Up an Application Instance
 
Note: MESA Studio must be used, and an instance configured in it, before you can connect to an application instance and use the MESA Developer Studio Skin Editor!
 
From the Window menu, select Open Perspective >> Other >> Select MESA Studio >> OK
 
 
 
9.    Create a New instance in MESA Studio
 
In the MESA Studio view in the upper left, right-click and select New instance, and set details for server connection
 
 
 
Once you have set up a new instance, you can check the connection (green light), then right-click the instance and choose the Change skin option.
 
Note: the Change skin option is not available if you have just created the instance for the first time and never downloaded a skin. In that case, go to the MESA Developer Studio Skin Editor and download the skin first, by the following procedure:
 
Windows >> Open Perspective >> Other >> MESA Developer Studio Skin Editor >> Skin (menu) >> Download skin
 
 
 
You must check (right-click the instance and Refresh) whether there is a connection to the server (WebDAV), in order to be able to make skin modifications for any of the following applications:
  • myAFT
  • Login
  • MBI
  • Community Management
  • Administration
  • Dashboard
  • AFT
 
You can download an existing skin from the server:
 
Skin menu >> Download skin
 
After making changes …
 
 
 
 … you can deploy it back:
 
Skin menu >> Deploy changes
 
The result is:
 
 
 
10.    Problem
 
I found a situation where I could change the colours of the dashboard but could not change the logo. Whatever I put as a new image for the logo and deployed back, the original logo stayed in the site.
I made three changes and finally got it working:
 
  • Installed a new eclipse with a new workspace
  • Renamed dashboard.war (in noapp/deploy) to dashboard_OLD.war, to get a completely new dashboard.war deployed from eclipse
  • Set java 1.5.0_16 in eclipse, instead of java 1.6 (which I used previously)

Code Lists (Map and TP)

Trading Partner Code List

Here is the definition of Code List from the documentation:

"Sterling Integrator uses code pairs in code lists to identify items in transactions between two or more trading partners. A trading partner code list consists of one or many pairs of code values containing a sender code and a receiver code. Each code pair has one description and up to four additional codes relating to the pair. Code lists are dynamic and are stored in a database."

You can always export it and use it in another system.

Map Code List

The Code List that we use in a map is something completely different: it is not saved in a database, there is no interface to add or change it through the GIS dashboard, and that kind of Code List is saved in the map only. It is used only to check whether a specific value in a field is listed in the code list allocated to that field. If you apply the 'Use Code' standard rule to a field, the value of that field will be validated against the values from that Code List, and if it does not match, an error is thrown.

Here is a part from the documentation:

"The Use Code standard rule enables you to use code lists to validate the contents of a field and use as a reference to look up an associated description for a field. A code list is a list of values and their corresponding descriptions. Code lists seldom change and are stored within the map itself. A field with a Use Code standard rule enables values to be either checked against or selected from the codes in a specified code list. Code lists are typically used to qualify another field.
You create code lists in the Map Editor as part of a map and manage them through the Map Editor. You can import and export code lists and copy and paste code lists between maps.

Code lists differ from the trading partner code lists used by the Select standard rule. Map Code Lists are generally static and stored within the map file."


Correlation Service

Service description (taken from the Sterling Commerce documentation):

 

Adds a record to the correlation table to enable you to track a document or business process.

Collects the information for a specific name and value pair from either documents that pass through a business process, or from the business process itself. The correlation name and value pairs are saved in the correlation table.

 

Configuration parameters description (taken from the Sterling Commerce documentation):

 

Field

Description

Config

Name of the service configuration. Required.

NAME

Name associated with this correlation. For example, PONUMBER.

Object_ID

ID of the document or business process that correlates with a specific name/value pair. Generally, this field is left blank.

Type

The information this correlation will track. Valid values are Document and Business Process.

Value

Value of the correlation. For example, a purchase order number such as 12345.

 BP used for test:

 

<process name="default">
  <sequence name="Sequence Start">
    <assign to="PurchaseOrderNumber">32</assign>
    <operation name="Correlation Service">
      <participant name="CorrelationService"/>
      <output message="CorrelationServiceTypeInputMessage">
        <assign to="." from="*"/>
        <assign to="TYPE">DOCUMENT</assign>
        <assign to="FOR_UPDATE">true</assign>
        <assign to="OBJECT_ID" from="string(substring-before(PrimaryDocument/@SCIObjectID,':'))"/>
        <assign to="VALUE" from="PurchaseOrderNumber/text()"/>
        <assign to="NAME">PurchaseOrderNumber</assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"/>
      </input>
    </operation>
  </sequence>
</process>

 

Test results (different combinations of options for TYPE, OBJECT_ID parameter and PrimaryDocument):


I tested it, and it seems OBJECT_ID can be taken from the SCIObjectID attribute when you have a PrimaryDocument in the process; without a PrimaryDocument you will get "Mandatory parameters for the service are invalid or missing."

The result of my test is:

1.

TYPE -> DOCUMENT
OBJECT_ID -> N/A (empty)
PrimaryDocument -> N/A

The Correlation Service will throw the error "Mandatory parameters for the service are invalid or missing.", and the correlation will not be written.

2.

TYPE -> DOCUMENT
OBJECT_ID -> N/A (empty)
PrimaryDocument -> Available

The correlation is written; a correlation search will show a Document Name that is null (with link), and the Business Process also with a link.

3.

TYPE -> DOCUMENT
OBJECT_ID -> constant (e.g. objectId)
PrimaryDocument -> N/A

The correlation is written; the search will show a DocumentName of objectId (Archived/Purged) (with link), but the Business Process field is EMPTY!

4.

TYPE -> DOCUMENT
OBJECT_ID -> constant (e.g. objectId)
PrimaryDocument -> Available

The Correlation Service finishes without error, but I cannot find the new correlation when searching Correlations.

******************************************************************
5.

TYPE -> BUSINESS PROCESS
OBJECT_ID -> N/A (empty)
PrimaryDocument -> N/A

Correlation can be found when clicking on link of correlation (BP Monitor for all the processes containing correlations will be shown)

6.

TYPE -> BUSINESS PROCESS
OBJECT_ID -> N/A (empty)
PrimaryDocument -> Available

Correlation can be found when clicking on link of correlation (BP Monitor for all the processes containing correlations will be shown)

7.

TYPE -> BUSINESS PROCESS
OBJECT_ID -> constant (e.g. objectId)
PrimaryDocument -> N/A

The correlation will not be written, although there is no error in the service. It seems OBJECT_ID should not be defined when TYPE is BUSINESS PROCESS.

8.

TYPE -> BUSINESS PROCESS
OBJECT_ID -> constant (e.g. objectId)
PrimaryDocument -> Available

The correlation will not be written, although there is no error in the service. It seems OBJECT_ID should not be defined when TYPE is BUSINESS PROCESS.

9.

TYPE -> BUSINESS PROCESS
OBJECT_ID -> XPath (e.g. string(PrimaryDocument/@SCIObjectID))
PrimaryDocument -> N/A

Correlation is written and found by search.

10.

TYPE -> BUSINESS PROCESS
OBJECT_ID -> XPath (e.g. string(PrimaryDocument/@SCIObjectID))
PrimaryDocument -> Available

The correlation will not be written, although there is no error in the service.

My conclusion:

The NAME and VALUE parameters are the same in all the options.

It seems options 2, 5, 6 and 9 really write the correlation, and a search returns the result with all the values populated.


Primary Document and Process Data

BPML
 
The BPML specification provides an abstract model and XML syntax for expressing business processes and supporting entities. BPML itself does not define any application semantics such as particular processes or application of processes in a specific domain; rather it defines an abstract model and grammar for expressing generic processes. This allows BPML to be used for a variety of purposes that include, but are not limited to, the definition of enterprise business processes, the definition of complex Web services, and the definition of multi-party collaborations.
 
BPML is a meta-language for the modeling of business processes, just as XML is a meta-language for the modeling of business data.
 
BPML code includes activities and elements. An activity is a step in a business process, and may be comprised of multiple elements. Elements are defined components of code that provide structure and instructions regarding the activity they embody.
 
 
Process
 
A process is a progressively continuing procedure that consists of a series of controlled activities systematically directed toward a particular result or end. A process is defined as performing activities of varying complexity.
Any entity that a process communicates with is defined as a participant. Participants can be business applications (e.g., ERP, CRM), customers, partners, and other processes. They can be either static or dynamic. A static participant is defined in the process and is referenced by activities in the process. A dynamic participant is retrieved from process data using an XPath expression.
 
Activity
 
An activity is a component that performs a specific function within the process, for example invoking another process. Activities can be as simple as sending or receiving a message, or as complex as coordinating the execution of other processes and activities.
Activities are either atomic or complex.
 
As a simple activity is always based on sending and receiving messages, which are always in XML format, it is important to understand XML and the ways XML can be handled in a business process, by functions and/or services in a business process in GIS.
 
Context
 
Activities always execute within a context. The context retains an association between the activities and information and services they utilize during execution, for example, properties that they can access, security credentials, transaction, exception handling, etc.
 
Context maintains the state of the business process from service to service. It contains, among other things, the document being manipulated by the business process. This is also where each service reports errors and status. The GIS infrastructure is designed to persist the context between steps.
The context contains several components:
  • Input Parameters - Retrieve parameters before beginning the operation
  • Workflow Document Body - Set up the document body
  • Error Reporting - Set up status and error reporting
The context can be process data, a message that the process sends to a participant, or a message the process receives from a participant.
 
Messages
 
Messages are used to exchange information between a process and its participants. All messages are in XML format, and a schema is used to define the format. Upon receiving a message, an activity can use rules to decide whether it will consume the message. Consumed messages are transformed into process data using assignment. XPath is used to retrieve and transform messages.
 
 
Properties
 
A property definition is used to define a property within a particular context. The context can be part of a process.
 
A property is a named value. Activities can access a property’s value or establish a new value. Properties are accessed as part of the context in which an activity is executed, also known as its current context. Properties are communicated between loosely coupled processes by performing operations and mapping properties to the messages exchanged in these operations.
 
Components that do the job in a business process are Services or Adapters.
 
  • every service has 2 areas at its disposal for processing: one is Process Data and the other is the Primary Document – these 2 areas are explained in the following sections.
  • properties in the output message, the XML message sent from a business process to a service, can be used by the service
  • every service also produces an XML that we call the input message. We can also manipulate properties from an input message.
 
 
Primary Document and Process Data
 
 
Definition and explanation of primary document and process data
 
Primary Document
 
The primary document is the core document in a business process: the document that is supposed to be processed by the business process, or only transferred without change.
 
It can be changed by translation services (such as Translation Service, XSLT Service or XML Encoder).
 
The Primary Document is passed between the services in a business process; if we do not use any service that can change its content, it will remain the same in all the steps of the business process.
 
Process Data
 
We can say it is a dedicated section of storage space for saving variables and evaluating syntax.
Process data is also the area where information from activities or services is saved during the execution of a business process. So all the data related to a business process is collected into process data. Process data is always in XML format and is saved under the root element <ProcessData>.
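As an illustration, after the Correlation Service test BP shown earlier runs, the process data might look roughly like this. This is a hypothetical snapshot; the SCIObjectID value is invented:

```xml
<ProcessData>
  <!-- the document currently treated as primary;
       the SCIObjectID value here is invented -->
  <PrimaryDocument SCIObjectID="node1:1234567890abcdef"/>
  <!-- a variable set by an assign activity -->
  <PurchaseOrderNumber>32</PurchaseOrderNumber>
</ProcessData>
```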
 
Services and activities put information into process data, and they can also access and use information from process data to complete the business process activities.
 
Both areas, the primary document and process data, are important for processing data in a business process.
Some services use only the primary document and do not use process data.
Some services only put some information into process data and do not do anything with the primary document.
 
Also, there are services that use both the primary document and process data, and both areas are changed or updated after such services finish execution.
 
A short summary ...
 
Process Data can include the following:
 
  • data extracted from a primary document
  • data assigned in a business process explicitly by assign element
  •  data placed by a service
 
  •  Meaning and explanation of the name PrimaryDocument
 
    • About services that use PrimaryDocument
 
There are some services that use the primary document for processing, or operate on the primary document. Any service that needs a document for processing knows how to take the document named PrimaryDocument. While process data can contain many documents, only one can be named PrimaryDocument; the others have other names. It is also possible that none of them has the name PrimaryDocument after processing.
If we have documents with names different from PrimaryDocument, we have to rename one of them to the right name, and the service will then recognize it and take it for processing.
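A common idiom for this renaming is an assign that points the PrimaryDocument name at the other document's object ID. This is a sketch, where the document name MyOtherDoc is hypothetical, and SCIObjectID is the internal document ID attribute that Sterling keeps in process data:

```xml
<!-- make the document stored under MyOtherDoc the new PrimaryDocument -->
<assign to="PrimaryDocument" from="MyOtherDoc/@SCIObjectID"/>
```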
 
There are services that work on the primary document and need a document with that exact name, PrimaryDocument, in the process data. Such services are:
 
-          Translation Service
-          XSLT Service (if configured to translate the primary document, not the process data)
-          SMTP Send Adapter
-          Document Extraction Service
-          Document Keyword Replace Service
-          FTP Client PUT Service
-          SFTP Client PUT Service
-          HTTP Client POST Service
-          CD Server CopyTo Service
-          Command Line Adapter (if the useInput parameter is set to Yes)
-          EDI Deenvelope Service
-          EDI Encode Service
-          EDI Envelope Service
-          XML Encoder (in modes: Encode non-XML document and Use existing XML document)
-          XML Validation Service (if xml_input_from is set to PrimaryDocument)
 
... and some other services.
 
A typical error, found in the Status Report, when a service expects a PrimaryDocument but cannot find it is:

com.sterlingcommerce.woodstock.workflow.WorkFlowException: There is no Primary Document, this service operates on the Primary Document

... or for EDI Encode Service ...

Error encoding primary document - EDI ENCODER SERVICE: There is no document to encode.

… or for EDI Envelope Service …

com.sterlingcommerce.woodstock.workflow.WorkFlowException: No document to envelope

    • About services that need PrimaryDocument for an input and/or produce a PrimaryDocument
 
As mentioned above, some services use only the primary document and do not use process data, some services only put information into process data and do not touch the primary document, and there are other combinations. Here are a few examples of different usages from practice, mainly regarding the PrimaryDocument.
 
- There are a lot of services that need the PrimaryDocument to operate on and produce only one document, whose name is always PrimaryDocument.
 
For example
 
  • Translation Service - when we translate a document with the Translation Service, it always uses the PrimaryDocument as the input for processing, and the result of processing goes into the PrimaryDocument as well (unless we direct it into another document, but more on that in the section about Input/Output messages).
  • XSLT Service - similarly, if we translate a document by XSLT, the input can be the PrimaryDocument (although process data can also be used as the input for translation) and the result is saved in the PrimaryDocument.
  • XML Encoder - in the mode 'Encode a non-XML document', the XML Encoder also uses the PrimaryDocument as input and places the result into the PrimaryDocument.
  • Document Keyword Replace Service - uses the PrimaryDocument as input, and the result of the replacement is placed back into the PrimaryDocument.
  • Mail Mime Service - depending on its settings, it will use the PrimaryDocument for creating a raw mail MIME message, and the produced MIME message will be put back into the PrimaryDocument.
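
As an illustration of the first item, a minimal Translation Service call in BPML could look like the sketch below. The participant and message names follow the usual SI convention, and MyMap is a placeholder for a checked-in map name:

```xml
<operation name="Translation">
  <participant name="Translation"/>
  <output message="TranslationTypeInputMessage">
    <!-- MyMap is a placeholder for the name of a checked-in map -->
    <assign to="map_name">MyMap</assign>
    <!-- the PrimaryDocument travels along with the rest of process data -->
    <assign to="." from="*"></assign>
  </output>
  <input message="inmsg">
    <!-- the translated result comes back as the new PrimaryDocument -->
    <assign to="." from="*"></assign>
  </input>
</operation>
```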
 
 
- Some other services only need the PrimaryDocument as input for processing but do not change it, that is, they do not produce a new PrimaryDocument in the process data area as the output:
 
For example
 
  • SMTP Send Adapter - simply uses the PrimaryDocument for sending mail, but does not produce any result back into the PrimaryDocument.
  • XML Encoder - in the mode 'Use existing XML document', the XML document from the PrimaryDocument is taken and placed into the process data. So the PrimaryDocument is used as the input, but a new PrimaryDocument is not created as the output.
  • XML Validation Service - validates the PrimaryDocument.
  • FTP Client PUT Service - takes the PrimaryDocument and sends it to an FTP server. There is no change in the PrimaryDocument after the service finishes processing.
  • HTTP Client POST Service - needs a PrimaryDocument for processing and does not produce a new one or make any change to the PrimaryDocument.
  • CD Server CopyTo Service - similar to any other communication service that sends out a document; it needs a PrimaryDocument for processing but does not produce a new one after finishing.
 
 
    • About services that produce a PrimaryDocument or documents with other names (different from the PrimaryDocument):
 
Some services need the PrimaryDocument to operate on; other services, or even the same ones mentioned above, also produce, as the result of their activity, a document named PrimaryDocument or documents with other names.
 
Examples of services and processes that produce more than one document in the process data follow ...
 
  1. File System Adapter
 
When the File System Adapter collects multiple files from the file system, more than one document will be placed in the process data with names FSA_Document1, FSA_Document2, …
 
Business process example:
 
<process name="default">
 <sequence>
    <operation name="File System Adapter">
      <participant name="FSA_name"/>
      <output message="FileSystemInputMessage">
        <assign to="Action">FS_COLLECT</assign>
        <assign to="collectionFolder">C:\collectionDirectory</assign>
        <assign to="collectMultiple">true</assign>
        <assign to="deleteAfterCollect">false</assign>
        <assign to="filter">*</assign>
        <assign to="." from="*"></assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
 </sequence>
</process>
 
Result of the File System Adapter (multiple collect) in the process data:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <FSA_Document1 SCIObjectID="serverName:578ceb:1184f7a12a5:e05" filename="fileName_1.txt"/>
 <FSA_Document2 SCIObjectID="serverName:578ceb:1184f7a12a5:e06" filename="filename_2.txt"/>
 <FSA_Document3 SCIObjectID="serverName:578ceb:1184f7a12a5:e07" filename="filename_3.txt"/>
 <FSA_DocumentCount>3</FSA_DocumentCount>
</ProcessData>
 
We can see that there is more than one document in the process data, and none of them has the name PrimaryDocument. The documents are saved under the tags FSA_Document[n], where n is the ordinal number of the collected document. Also, there is a 'filename' attribute in every element, with values that correspond to the file names from the file system.
 
  2. FTP Client GET Service
 
After a multiple GET (mget) of files from an FTP server, we also get more than one document; none of them has the name PrimaryDocument, but they are saved under tags with the original file names retrieved from the server.
 
Business process example:
 
<process name="default">
 <sequence>
    <operation name="FTP Client Begin Session Service">
      <participant name="FTPClientBeginSession"/>
      <output message="FTPClientBeginSessionServiceTypeInputMessage">
        <assign to="FTPClientAdapter">FTPClientAdapter</assign>
        <assign to="RemoteHost">serverName</assign>
        <assign to="RemotePasswd">password</assign>
        <assign to="RemotePort">21</assign>
        <assign to="RemoteUserId">userId</assign>
        <assign to="UsingRevealedPasswd">true</assign>
        <assign to="." from="*"></assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
    <operation name="FTP Client GET Service">
      <participant name="FTPClientGet"/>
      <output message="FTPClientGetServiceTypeInputMessage">
        <assign to="RemoteFilePattern">*</assign>
        <assign to="SessionToken" from="SessionToken/text()"></assign>
        <assign to="." from="*"></assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
    <operation name="FTP Client End Session Service">
      <participant name="FTPClientEndSession"/>
      <output message="FTPClientEndSessionServiceTypeInputMessage">
        <assign to="SessionToken" from="SessionToken/text()"></assign>
        <assign to="." from="*"></assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
 </sequence>
</process>
 
Result of the FTP Client GET Service (mget), in the process data:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <SessionBeginTime>2008-02-26 11:59:36.218</SessionBeginTime>
 <SessionToken>FTPClientAdapter_FTPClientAdapter_node1_12040235762181001:5222</SessionToken>
 <Status>0</Status>
 <ServerResponse>
    <Code>230</Code>
    <Text>230 User userName logged in.</Text>
 </ServerResponse>
 <TranscriptDocumentId> serverName:578ceb:1184f7a12a5:e60</TranscriptDocumentId>
 <TranscriptDocument_1 SCIObjectID="serverName:578ceb:1184f7a12a5:e63"/>
 <TranscriptDocument_3 SCIObjectID="serverName:578ceb:1184f7a12a5:eb7"/>
 <fileName_1.txt SCIObjectID="serverName:578ceb:1184f7a12a5:eb8"/>
 <fileName_2.txt SCIObjectID="serverName:578ceb:1184f7a12a5:eb9"/>
 <fileName_3.txt SCIObjectID="serverName:578ceb:1184f7a12a5:eba"/>
 <Status>0</Status>
 <TranscriptDocumentId>
serverName:578ceb:1184f7a12a5:ead</TranscriptDocumentId>
 <ServerResponse>
    <Code>226</Code>
    <Text>226 Transfer complete.</Text>
 </ServerResponse>
 <DocumentList>
    <DocumentId>serverName:578ceb:1184f7a12a5:e81</DocumentId>
    <DocumentId>serverName:578ceb:1184f7a12a5:e8a</DocumentId>
    <DocumentId>serverName:578ceb:1184f7a12a5:e8d</DocumentId>
 </DocumentList>
</ProcessData>
 
TranscriptDocument_[n] contains the FTP server's responses in particular steps. Each document contains the response from one step of the FTP session, i.e. the response to one FTP command.
Documents retrieved by the mget command from the FTP site are saved under tags named with the original file names.
There is also a list of DocumentId(s) under the DocumentList tag. These are the unique ids under which documents can be found in this particular workflow context. Some services can process a document based on its DocumentId. For example, the FTP Client PUT Service can be given a DocumentId parameter; based on that value, the document will be taken from the current workflow context and put to the FTP site by the service. If no value for DocumentId is specified, the service will put the primary document to the remote server.
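
For example, a PUT step that sends a document by its DocumentId instead of the PrimaryDocument could look like the sketch below. This is a sketch only: the participant and message names follow the pattern of the GET example above, and the XPath picking the first id from DocumentList is an assumption:

```xml
<operation name="FTP Client PUT Service">
  <participant name="FTPClientPut"/>
  <output message="FTPClientPutServiceTypeInputMessage">
    <assign to="SessionToken" from="SessionToken/text()"></assign>
    <!-- assumption: take the first id from the DocumentList produced by mget -->
    <assign to="DocumentId" from="DocumentList/DocumentId[1]/text()"></assign>
    <assign to="." from="*"></assign>
  </output>
  <input message="inmsg">
    <assign to="." from="*"></assign>
  </input>
</operation>
```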
 
 
  3. B2B Mail Client Adapter
 
One more example shows documents retrieved from a mail server. We can find the body of the mail in the PrimaryDocument and the attachments under the Mail_Mime_DOC_[n] tags.
 
Business process example that parses a raw mail message taken from the mail server, containing a body and 3 attachments:
 
<process name="default">
 <sequence>
    <operation name="Mail Mime Service">
      <participant name="MailMimeService"/>
      <output message="MailMimeServiceInputMessage">
        <assign to="mail-mime-operation">parse</assign>
        <assign to="parse">true</assign>
        <assign to="." from="*"></assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
 </sequence>
</process>
 
Result of the Mail Mime Service that parses mail MIME message:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <Mail_Client>
    <Headers>
      <X-MimeOLE>Produced By Microsoft MimeOLE V6.00.2900.3198</X-MimeOLE>
      <To><[email protected]></To>
      <From>"sender" <[email protected]></From>
      <Received>from [127.0.0.1]</Received>
      <Content-Type>multipart/mixed;
          boundary="----=_NextPart_000_0003_01C87889.481382B0"</Content-Type>
      <Date>Tue, 26 Feb 2008 15:07:19 +0100</Date>
      <Attachment_Count>3</Attachment_Count>
      <X-Mailer>Microsoft Outlook Express 6.00.2900.3138</X-Mailer>
      <MIME-Version>1.0</MIME-Version>
      <Message-ID><[email protected]></Message-ID>
      <Subject>a</Subject>
      <X-Priority>3</X-Priority>
      <X-MSMail-Priority>Normal</X-MSMail-Priority>
    </Headers>
    <Attachments>
      <Filenames>
        <Filename3>attachmentName_3.txt</Filename3>
        <Filename2>attachmentName_2.txt</Filename2>
        <Filename1>attachmentName_1.txt</Filename1>
      </Filenames>
      <FileExtensions>
        <FileExtension3>txt</FileExtension3>
        <FileExtension2>txt</FileExtension2>
        <FileExtension1>txt</FileExtension1>
      </FileExtensions>
      <ContentTypes>
        <Content_Type3>text/plain;
          name=" attachmentName_3.txt"</Content_Type3>
        <Content_Type2>text/plain;
          name=" attachmentName_2.txt"</Content_Type2>
        <Content_Type1>text/plain;
          name=" attachmentName_1.txt"</Content_Type1>
      </ContentTypes>
    </Attachments>
 </Mail_Client>
 <b2b-raw-message>true</b2b-raw-message>
 <b2b-protocol>smtp</b2b-protocol>
 <PrimaryDocument SCIObjectID="serverName:578ceb:1184f7a12a5:1eb4"/>
 <Mail_Mime_DOC_2 SCIObjectID="serverName:578ceb:1184f7a12a5:1eb7"/>
 <Mail_Mime_DOC_3 SCIObjectID="serverName:578ceb:1184f7a12a5:1eba"/>
 <Mail_Mime_DOC_4 SCIObjectID="servername:578ceb:1184f7a12a5:1ebd"/>
 <Mail_Mime>
    <Total_Message_Content>4</Total_Message_Content>
 </Mail_Mime>
</ProcessData>
 
 
This is one more example where more than one document is created in the process data. But in this case we can find the PrimaryDocument as well as other documents. The body of the mail is placed in the PrimaryDocument, and the attachments can be found in the documents under the Mail_Mime_DOC_[n] tag names.
 
 
We saw 3 different examples where services produce more than one document in the process data.
 
In case we want to feed those documents, one by one, to a service that operates on the PrimaryDocument, we can create a loop; in every iteration of the loop, the current document is saved under the name PrimaryDocument so that the service operating on it can recognize the name. There is no service that can take, e.g., the FSA_Document1 document from the process data. Such a document always has to be made the PrimaryDocument before being given to a service that operates on a document.
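
Such a loop can be sketched in BPML roughly as follows. This is a sketch only, not a tested process: the rule condition and the dynamic XPath built with name()/concat() are assumptions that would need to be verified in your environment:

```xml
<process name="Loop_over_collected_docs">
  <!-- loop while the counter has not passed FSA_DocumentCount -->
  <rule name="docsRemaining">
    <condition>number(counter/text()) &lt;= number(FSA_DocumentCount/text())</condition>
  </rule>
  <sequence>
    <!-- ... File System Adapter multiple collect goes here ... -->
    <assign to="counter">1</assign>
    <sequence name="loop">
      <choice>
        <select>
          <case ref="docsRemaining" activity="processOneDoc"/>
        </select>
        <sequence name="processOneDoc">
          <!-- make the current FSA_Document[n] the PrimaryDocument -->
          <assign to="PrimaryDocument"
                  from="//*[name()=concat('FSA_Document', /ProcessData/counter/text())]/@SCIObjectID"/>
          <!-- ... service that operates on the PrimaryDocument goes here ... -->
          <assign to="counter" from="counter/text() + 1"/>
          <repeat name="again" ref="loop"/>
        </sequence>
      </choice>
    </sequence>
  </sequence>
</process>
```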
 
A section about renaming a document in the process data follows.
 
 
  • Renaming document and its purpose
 
    • Taking a document from the PrimaryDocument and saving under another name
 
If we want to take a document saved under the PrimaryDocument tag and save it under another tag name, e.g. TEMP_Storage, it can be done with the following assign statement:
 
    • Assign statement
 
<assign to="TEMP_Storage" from="PrimaryDocument/@SCIObjectID"></assign>
 
    • Configuration in the Graphical Process Modeler:
 
 
    • Result in the process data
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <PrimaryDocument SCIObjectID="serverName:4741d6:11859e01178:-3e47"/>
 <TEMP_Storage SCIObjectID="serverName:4741d6:11859e01178:-3e47"/>
</ProcessData>
 
    • Purpose for such an action
 
We now know that some services create a PrimaryDocument in the process data area. If we imagine that we have 2 or more services, each of which places its result into the PrimaryDocument, we will see that the second service overwrites the result of the first one in the PrimaryDocument, the third service overwrites the result of the second one, and so on ...
If we want to preserve the result of a specific service and not allow another service to overwrite it, we simply take the content of the PrimaryDocument and save the whole document under another tag name (in our example it is TEMP_Storage). Note that both tags containing these 2 documents then exist in the process data. When we save a document under another name, nothing disappears from the process data; another element is just added with a link to the document. If we want or need to remove the PrimaryDocument, we have to release it explicitly with the Release Service. Rather than saying that we rename a document in the process data, it is better to say that we create a new one. No action in a business process removes an element from the process data except the Release Service.
 
 
    • Releasing the PrimaryDocument after creating the TEMP_Storage (if necessary)
 
Configuration of the Release Service to release the PrimaryDocument is:
 
<operation name="Release Service">
      <participant name="ReleaseService"/>
      <output message="ReleaseServiceTypeInputMessage">
        <assign to="TARGET">PrimaryDocument</assign>
        <assign to="." from="*"></assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
</operation>
 
 
 
    • Taking a document from a tag which name is different than the PrimaryDocument and saving under PrimaryDocument
 
If we want to take a document saved under the TEMP_Storage tag and save it under the PrimaryDocument, it can be done by the following assign statement:
 
    • Assign statement
 
<assign to="PrimaryDocument" from="TEMP_Storage/@SCIObjectID"></assign>
 
    • Configuration in the Graphical Process Modeler:
 
 
    • Result in the process data
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <TEMP_Storage SCIObjectID="MIRJANA:4741d6:11859e01178:-2b08"/>
 <PrimaryDocument SCIObjectID="MIRJANA:4741d6:11859e01178:-2b08"/>
</ProcessData>
 
    • Purpose for such an action
 
If the content of a document saved under a name different from PrimaryDocument has to be used by a service in a business process, we first have to save it under the name PrimaryDocument. As mentioned before, such a service in GIS recognizes only the name PrimaryDocument, not any other name in the process data area.
Once we return the content of the document saved under the TEMP_Storage tag to the PrimaryDocument, we may want to release the TEMP_Storage element where the document had been saved.
 
    • Releasing the TEMP_Storage after creating the PrimaryDocument (if necessary)
 
Configuration of the Release Service to release the TEMP_Storage is:
 
<operation name="Release Service">
      <participant name="ReleaseService"/>
      <output message="ReleaseServiceTypeInputMessage">
        <assign to="TARGET">TEMP_Storage</assign>
        <assign to="." from="*"></assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
</operation>
 
 
 
In both of these cases, we do not rename a document; we create a new element in the process data, under a different tag name, that points to the same document.

There are other ways to preserve a document without using an assign to create a new element explicitly, but that topic will be explained in a new document ... when I find time to write it ...

 


Assign Element - append option

If you have ever asked yourself what the append parameter in the Assign Element is used for, here is a simple example that may explain it. It works the same in the Output or Input messages inside service configurations.

 
BPML, 3 assigns without append option:
 
<process name="default">
 <sequence>
    <assign to="AAA">aaa</assign>
    <assign to="BBB">bbb</assign>
    <assign to="AAA">a new value of AAA element</assign>
 </sequence>
</process>
 
Process Data after the 1st assign:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <AAA>aaa</AAA>
</ProcessData>
 
Process Data after the 2nd assign:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <AAA>aaa</AAA>
 <BBB>bbb</BBB>
</ProcessData>
 
Process Data after the 3rd assign (AAA is overwritten):
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <AAA>a new value of AAA element</AAA>
 <BBB>bbb</BBB>
</ProcessData>
 
BPML, 3 assigns, the last one with append option:
 
<process name="default">
 <sequence>
    <assign to="AAA">aaa</assign>
    <assign to="BBB">bbb</assign>
    <assign to="AAA" append="true">a new value of AAA element</assign>
 </sequence>
</process>
 
Process Data after the 1st assign:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <AAA>aaa</AAA>
</ProcessData>
 
Process Data after the 2nd assign:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <AAA>aaa</AAA>
 <BBB>bbb</BBB>
</ProcessData>
 
Process Data after the 3rd assign (AAA is appended):
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
 <AAA>aaa</AAA>
 <BBB>bbb</BBB>
 <AAA>a new value of AAA element</AAA>
</ProcessData>
 
If we assign an XML element in the Process Data with the append="true" option and such an element already exists, a new one will be added. If append is set to false, the existing element in Process Data will be overwritten by the new one.
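
The same attribute can be used inside a service's Output (or Input) message. A minimal sketch, reusing the placeholder element AAA from above (the message name is a placeholder too):

```xml
<output message="SomeOutputMessage">
  <!-- adds another AAA element instead of overwriting the existing one -->
  <assign to="AAA" append="true">a new value of AAA element</assign>
  <assign to="." from="*"></assign>
</output>
```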
 

 


Transaction Register

Transaction Register Theory
 
Transaction Register Definition (taken from Sterling Commerce documentation)
 
Transaction Register - the translator invokes the check against the specified data (held in memory in Field1 through Field6) to determine if the data is duplicate.
 
The transaction register enables you to specify a field that you want to check for duplicate data (using the Update standard rule Transaction Register function) and then invoke the check (using the Select standard rule Transaction Register function) to verify whether the field contains duplicate data. Using the Update standard rule you can load document data into up to six fields in memory (Field1 through Field6).
 
Note: You must define the data for which the translator will check by adding a Transaction Register Update standard rule prior to the point in the map where you invoke the Transaction Register Select standard rule to check for duplicate data. The updates do not go directly to the database; they are kept in memory until the eventual select, and then they are checked against the database and inserted if necessary.
 
  • Then, using the Select standard rule Transaction Register function, you can invoke the check against the Transaction Register database table. If the data is validated as duplicate data, the translator notes the error in the translator report.
  • If there is no matching data already in the Transaction Register table, the content of Field1 through Field6 is inserted as a row in the table.
Note: If the document translation does not succeed, the field information (for which you have specified the translator should verify whether it is duplicate) is not added to the Transaction Register database table.
 
Purging the Transaction Register Database Table
 
By default, data in the Transaction Register table is deleted after thirty days (from the date the data was added to the table), so that the table does not continue to grow unchecked. If you wish, you can change the number of days after which data is deleted or remove this control altogether.
To change the number of days that data will be kept in the Transaction Register database table before being purged, or to remove the purge control altogether:
  • Open the customer_overrides.properties.in file (in <install dir>/properties)
  • Specify the number of days you want the Transaction Register table to retain data in the mapper.maximumTransactionRegisterAge value
  • To remove the purge control, delete this entire line from the customer_overrides.properties.in file:
mapper.maximumTransactionRegisterAge=30
  • After you complete the edit, execute <install_dir>/bin/setupfiles.sh and then stop and restart the application to verify the changes take effect.
Field     Size Limit
Field1    150 characters
Field2    35 characters
Field3    35 characters
Field4    35 characters
Field5    35 characters
Field6    30 characters

 
Transaction Register Test Results
 
Select can be used without Update.
Update cannot be used without Select.
If they are together, Update must be used prior to Select.
 
UPDATE & SELECT
 
If we use Update and Select in the same map, and the updated field is one that already exists in the TRANSACTION REGISTER (transact_register table), we get the following error in the Translation Report:
 
Report Entry:
       Section: OUTPUT       Severity: ERROR
       SyntaxSpecific: false       Syntax: -1        Code: 144 Standard Rule Duplicate Transaction Register Error
 

UPDATE (without SELECT)

If we use Update without a Select later in the map, we get the following warning:

 
Report Entry:
       Section:        Severity: WARNING
       SyntaxSpecific: false       Syntax: -1        Code: 160 Unwritten Transaction Register. Use SELECT rule.
 

SELECT (without UPDATE & compliance check turned ON)

If we do a Select without a previous Update, and the option 'Raise a compliance error if matching data is not found' is checked (ticked), then the error we get is:

Report Entry:
       Section: OUTPUT       Severity: ERROR
       SyntaxSpecific: false       Syntax: -1        Code: 141 Standard Rule Select Data Missing
 

SELECT (without UPDATE & compliance check turned OFF)

If we do a Select without a previous Update, and the option 'Raise a compliance error if matching data is not found' is NOT checked, then there is no error.

We cannot use only a Select and expect the field on which the select is applied to be checked against the TRANSACTION REGISTER. Only the value that we used in the Update is checked against the TRANSACTION REGISTER.
 
My conclusion is that SELECT alone can be used, but it will do nothing if an UPDATE does not exist in the same map (prior to the select).
 
Data written in a DB table with an UPDATE statement:
 
Transaction Register table used for EDI Control Number History
 
The transaction register table is also used for saving the history of EDI Control Numbers.
 
Example 1 (EDIFACT):
 
UNB+UNOA:3+3830035970021+3838551999996+050615:1514+100135++++++1'
UNH+5+ORDERS:D:96A:UN:EAN008'
BGM+80E+3830035970001-00000178+9'
DTM+137:20050615:203'
DTM+2:20050615:203'
FTX+PUR+++67921'
...
UNT+33+5'
UNH+6+ORDERS:D:96A:UN:EAN008'
BGM+80E+3830035970001-00000179+9'
DTM+137:20050615:203'
DTM+2:20050617:203'
FTX+PUR+++67934'
RFF+ON:3830035970001-00000179'
...
CNT+2:1'
UNT+18+6'
UNZ+2+100135'
 
  • If we have the UNB Control Number check set to Yes, where the CN is 100135, the following record is written into the table:
 
  • If the UNH Control Number check is set to Yes, then each one is written into the database table:
 
 
Example 2 (X12):
ISA{00{          {00{          {14{012371711XXX   {14{3210800340100 {101105{0842{^{00406{200517507{1{P{>~
GS{TX{123271711Y{321080034{20101105{0842{500883{X{004060~
ST{864{0905~
...
 

 
