Positional Flat File Handling in B2B

Problem:
We have a problem handling positional flat files in Oracle B2B: we are trying to process a positional flat file encoded in UTF-8 with a BOM, and after creating the document definition and setting up the agreement, we received:
B2B-51507 Error: The data starting at position 0 is not recognized as a valid data transmission.
What's more, we remembered to put the parser file into the config/schema location, which usually fixes this problem.
Full description:
We have a sample flat file with UNIX (LF) line endings, encoded in UTF-8 with a BOM:
11B6Kamil Mazur     Lazurowa       8 12 CityABCD       PL
Junior Consultant        MyCompanyX
StreetAAAAAA       00-000 CityABCD      
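Since the error complains about position 0, it helps to look at the raw bytes of the test file first. A minimal Java check along these lines (the file name is just an example) prints the leading bytes; a UTF-8 BOM shows up as EF BB BF in front of the first record:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Minimal sketch: dump the first bytes of the flat file in hex.
// A UTF-8 BOM appears as EF BB BF before the first positional record.
public class InspectFlatFile {
    public static void main(String[] args) throws IOException {
        byte[] data = Files.readAllBytes(Paths.get(args.length > 0 ? args[0] : "test_4value.txt"));
        for (int i = 0; i < Math.min(16, data.length); i++) {
            System.out.printf("%02X ", data[i]);
        }
        System.out.println();
        boolean hasBom = data.length >= 3
                && (data[0] & 0xFF) == 0xEF && (data[1] & 0xFF) == 0xBB && (data[2] & 0xFF) == 0xBF;
        System.out.println("UTF-8 BOM present: " + hasBom);
    }
}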
First of all, we created a guideline (*.ecs) and an XSD schema, as shown below:
[screenshot]
During XSD creation we remembered to confirm that our line-ending character follows the UNIX standard (LF).
After that, we created the parser schema file and placed it in the schema location on our server:
/oracle/Middleware/MyDomain/soa/thirdparty/edifecs/XEngine/config/schema
and then we added the schema file location to the XERegistry file:
[screenshot]
After that, we restarted the SOA server (which had been running during these changes).
When the SOA server was alive again, we created a document definition:
[screenshot]
The start and end positions are 2-5 because of the BOM character. Then we created a trading partner and an inbound agreement based on this document definition:
[screenshot]
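For reference, a rough, untested sketch of stripping the three BOM bytes before the file reaches the listening channel (the file names are only examples); the same logic could presumably also live in a Java callout, so that the guideline positions could start from 1 instead of 2:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;

// Hypothetical pre-processor: copy the payload without a leading UTF-8 BOM (EF BB BF),
// leaving the positional content itself unchanged.
public class StripBom {
    public static void main(String[] args) throws IOException {
        Path in = Paths.get(args[0]);
        Path out = Paths.get(args[1]);
        byte[] data = Files.readAllBytes(in);
        int offset = (data.length >= 3
                && (data[0] & 0xFF) == 0xEF
                && (data[1] & 0xFF) == 0xBB
                && (data[2] & 0xFF) == 0xBF) ? 3 : 0;
        Files.write(out, Arrays.copyOfRange(data, offset, data.length));
    }
}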
Then we posted our test file via a Generic File listening channel, with no Java callout and no other properties set. Unfortunately, we received the following error:
Business Message:
Id: AC18017B146145ADDCD00000178AA4F9
Message Id: AC18017B146145ADDCB00000178AA4F8-1
Refer To Message: Refer To Message
Sender Type: Name
Sender Value: Szkolenie
Receiver Type: Name
Receiver Value: MyCompany
Sender: Szkolenie
Receiver: MyCompany
Agreement Id: PersonIn
Agreement: PersonIn
Document Type: Person
Document Protocol: PositionalFlatFile
Document Version: Logistics2013
Message Type: REQ
Direction: INBOUND
State: MSG_ERROR
Acknowledgement Mode: NONE
Response Mode: ASYNC
Send Time Stamp: 2014-05-19 14:00
Receive Time Stamp: 2014-05-19 14:00
Document Retry Interval(Channel): 0
Document Remaining Retry(Channel): 0
Document Retry Interval(Agreement):
Document Remaining Retry(Agreement):
Native Message Size: 141
Translated Message Size:
Business Action Name:
Business Transaction Name:
Xpath Name1:
Xpath Value1:
Xpath Expression1:
Xpath Name2:
Xpath Value2:
Xpath Expression2:
Xpath Name3:
Xpath Value3:
Xpath Expression3:
Correlation From XPath Name:
Correlation From XPath Value:
Correlation From XPath Expression:
Correlation To XPath Name:
Correlation To XPath Value:
Correlation To XPath Expression:
Wire Message: Wire Message
Application Message: Application Message
Payload Storage: Payload Storage
Attachment: Attachment
Label: soa_b2b_ - Thu May 15 15:04:40 CEST 2014 - 1
Collaboration Id: AC18017B146145ADD9200000178AA4F7
Collaboration Name:
Collaboration Version:
Business Action Name:
Exchange Protocol Name: Generic File
Exchange Protocol Version: 1.0
Interchange Control Number:
Group Control Number:
Transaction Set Control Number:
Error Code: B2B-51507
Error Description: Machine Info: (soatest.corp.prv) The data starting at position 0 is not recognized as a valid data transmission.
Error Level: ERROR_LEVEL_COLLABORATION
Error Severity: ERROR
Error Text: Payload validation error.
Wire Message:
Id: AC18017B146145ADAB400000178AA4F1
Message Id: AC18017B146145ADAB400000178AA4F1
Business Message: AC18017B146145ADDCD00000178AA4F9
Packed Message: Packed Message
Payload: Payload
Protocol Message Id: test_4value.txt@AC18017B146145ADAE500000178AA4F5
Refer To Protocol Message Id:
Protocol Collaboration Id:
Protocol Transport Binding: filename=test_4value.txt filesize=143 ChannelName=SzkolenieChannel file_ext=txt fullpath=/home/oracle/POC/positional/Krzysiek/test_4value.txt timestamp=2014-05-07T10:23:30.000+01:00 tp_profile_name=Szkolenie MSG_RECEIVED_TIME=Mon May 19 14:00:37 CEST 2014
Message Digest: Message Digest
Digest Algorithm:
Transport Protocol: File
Transport Protocol Version: 1.0
Url: file://localhost//home/oracle/POC/positional/Krzysiek
security:
Transport Headers: filename=test_4value.txt filesize=143 ChannelName=SzkolenieChannel file_ext=txt fullpath=/home/oracle/POC/positional/Krzysiek/test_4value.txt timestamp=2014-05-07T10:23:30.000+01:00 tp_profile_name=Szkolenie MSG_RECEIVED_TIME=Mon May 19 14:00:37 CEST 2014
certificates: certificates
State: ERROR
Reattempt Count:
Error Code: B2B-51507
Error Description: Machine Info: (soatest.corp.prv) The data starting at position 0 is not recognized as a valid data transmission.
Error Text: Payload validation error.
exchange Retry Interval:
exchange Remaining Retry:
Message Size: 141
The payload seems to be unchanged. The problem appears when our B2B runs in a cluster (even though both configuration parameters, b2b.HAInstance and b2b.HAInstanceName, are set properly for both SOA servers):
[screenshot]
There is no additional information in the logs:
[2014-05-19T14:07:47.994+02:00] [soa_server1] [NOTIFICATION] [] [oracle.soa.b2b.engine] [tid: DaemonWorkThread: '20' of WorkManager: 'wm/SOAWorkManager'] [userId: <anonymous>] [ecid: a8bc74c6eb84aa5b:452af1da:146046c88de:-8000-00000000000598a0,0] [APP: soa-infra] Engine: processIncomingMessageImpl: Message id = AC18017B14614616E1A00000178AA510-1 FromParty = Szkolenie Doctype = Person version = Logistics2013
That is unfortunately all we have. Is there any configuration that we missed? How can we fix this problem?

Hi Anuj,
I have added a transformation in the mediator component and I am now able to transfer the opaque data from Oracle B2B to the composite and then write the contents of the flat file to another specified file through the file adapter.
However I still have few more issues in different scenarios:
I have tried the exercise that was mentioned in your blog:
http://www.anuj-dwivedi.blogspot.com/2011/10/handling-positionaldelimited-flat-files.html
I tried to create a flat file with two records (lines) as mentioned in the blog. The first line contains header information whose length is 57 characters, with 5 fields in it. The second line contains the actual data, also 57 characters with 5 fields in it. In the agreement that was created to handle this flat file, I have checked the Validate & Translate check box.
During validation, B2B fails to treat the second line as a new record; it considers both lines as one record, and validation fails because nothing is specified from the 58th character onwards in the ECS and XSD files.
Following is the error:
Error -: B2B-51507: Payload validation error.: Machine Info: (corpdevsoa10) Extra Field was found in the data file as part of Record HEADER. Record HEADER is defined in the guideline at position 1.{br}{br}This error was detected at:{br}{tab}Record Count: 1{br}{tab}Field Count: 6{br}{tab}Characters: 57 through 116
Please advise me on resolving this issue.
Thanks,
Krishna.
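When two fixed-length lines get read as a single record like this, a common cause is that the record terminator in the data (CRLF vs LF) does not match the delimiter defined in the guideline (.ecs/XSD). A rough Java check (the file name is just an example) to see which terminators the file actually contains:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Rough sketch: count CRLF, bare LF and bare CR terminators in a flat file,
// to compare against the record delimiter defined in the guideline.
public class CountTerminators {
    public static void main(String[] args) throws IOException {
        byte[] data = Files.readAllBytes(Paths.get(args.length > 0 ? args[0] : "header_detail.txt"));
        int crlf = 0, lf = 0, cr = 0;
        for (int i = 0; i < data.length; i++) {
            if (data[i] == '\r') {
                if (i + 1 < data.length && data[i + 1] == '\n') { crlf++; i++; } else { cr++; }
            } else if (data[i] == '\n') {
                lf++;
            }
        }
        System.out.println("CRLF=" + crlf + "  LF=" + lf + "  CR=" + cr);
    }
}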

Similar Messages

  • B2B integration with EBiz for a custom positional flat file

    I have installed SOA 11g with the latest patch. I am confused about my setup since I could not find any good example for my use case. I get a positional flat file from my trading partner (VDA4905). I have done the following so far:
    Created a document definition: Custom, version 1.0, type PO, with identification type "flat". I specified that the first 3 characters of the file contain "511" as the identification value.
    I have created trading partners, a host listening channel (folder /tmp/b2b), and an agreement, and validated and deployed the agreement.
    I created a dummy file with the first 3 characters as 511 and put it in /tmp/b2b. The file is not getting consumed, and I see the fairly standard error "Agreement not found from TP is null".
    My questions are:
    1. When I created the document definition, do I need to specify an XSD or ECS file? Since I am using a positional flat file, is this mandatory?
    2. I changed the name of the dummy file to the "FROM TP_HOST_seq.txt" format, but I still see the same issue.
    Any help is appreciated.
    Shanthi

    When I changed the file name, the file got consumed. Regarding the file name, I have the following questions:
    I get files from different plants, and the file names are different for each plant; however, the processing is the same. If the file name needs to be in the %From TP%_%SEQ%.txt format, do I need to create multiple TPs so that the file name maps to the TP name? Can I use the same agreement for all these dummy TPs that I create?
    Is there anything I can do to consume all messages in a specific directory irrespective of the file name?
    Thanks
    Shanthi

  • Getting error while reading the positional flat files

    Hi,
    I have a requirement to read a positional flat file and convert it into XML through Oracle B2B. I have created the positional flat file .ecs, the XSD, and the parser .ecs file using the blog http://anuj-dwivedi.blogspot.in/2011/10/handling-positionaldelimited-flat-files.html and updated them on the server.
    I have created an agreement for the inbound scenario, but while testing the inbound scenario I am getting the following error in B2B:
    Error Code: B2B-50083
    Error Description: Machine Info: Description: Document protocol identification error.
    Error Level: ERROR_LEVEL_COLLABORATION
    Error Severity: ERROR
    Error Text: Document protocol identification error.
    I have tried all the possible ways but keep getting the same error.
    Please guide me to overcome this.
    Thanks,
    Nishanth

    Nishanth,
    Have you followed step #2 correctly?
    Copying the parser .ecs into the installation directory and adding an entry in XERegistry.xml (on all machines, if the domain spans multiple machines).
    Also make sure that you have provided the Identification Value, Identification Start Position, and Identification End Position correctly in the document definition. Please remember that the start position starts from 1 (not 0).
    Regards,
    Anuj
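    As a quick sanity check of those settings, one can print the 1-based identification range straight from the file and compare it with the Identification Value in the document definition. A rough sketch (the file name and positions are just examples):

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Rough sketch: show the characters covered by a 1-based, inclusive start/end position
    // of the first line, i.e. the range the Identification Value is checked against.
    public class ShowIdentificationRange {
        public static void main(String[] args) throws IOException {
            String file = args[0];
            int start = Integer.parseInt(args[1]);  // e.g. 1
            int end = Integer.parseInt(args[2]);    // e.g. 3
            String firstLine = Files.readAllLines(Paths.get(file), StandardCharsets.UTF_8).get(0);
            System.out.println("[" + firstLine.substring(start - 1, end) + "]");
        }
    }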

  • How can I include loops in positional flat files?

    Hi,
    I got a simple record-structure flat file working in B2B, but my data file is more complex and I want to know how I can define the .ecs file for it.
    My data has orders, lines underneath them, and finally a trailer with the total number of lines in the file. When this goes to my target system, I would like to create each order with its lines underneath, so I expect the schema generated from the ECS to represent that loop structure.
    In the Document Editor, when I right-click on the message I see Insert Child Node --> Record, but under the record I see both Insert Node and Insert Child Node again, and I am lost as I cannot find any documentation on what these options mean.
    Can someone please throw some light on this?
    Thanks
    Venkat

    Hi Venkat,
    You will find the Min Use and Max Use record properties on the right-hand side when you click on a record. Set Min Use to 0 (or 1 if the record is not optional) and Max Use to 999999 or any other high value, and you are good to go.
    Inside the record, you can insert fields which will eventually hold the line data.
    Regards,
    Anuj

  • Positional flat file schemas for input and output files to be generated with the required usecases

    Hello all,
    I need some help with a positional flat file schema that contains multiple headers, bodies, and trailers.
    I have attached the sample input file below. This is a batched input, and we have to generate the output that I have given below.
    We are unable to create a flat file schema that replicates the input file below. Please suggest a better approach to achieve this.
    Sample Input FIle:
    010320140211ABC XYZ XYZ ABCD201402110 FSPAFSPA005560080441.02000006557.FSPA.IN AB-CD ABCD/AB-CD BETALKORT
    1020140210AN194107123459.FSPA.IN
    [email protected]
    1020140210AN196202123453.FSPA.IN
    [email protected]
    1020140210AN198103123435.FSPA.IN
    [email protected]
    1020140210AN195907123496.FSPA.IN
    [email protected]
    1020140210AN195903123437.FSPA.IN
    [email protected]
    1020140210AN193909123434.FSPA.IN
    [email protected]
    1020140210AN195607123413.FSPA.IN
    [email protected]
    1020140210AN199205123408.FSPA.IN
    [email protected]
    1020140210AN196206123499.FSPA.IN
    [email protected]
    1020140210AN196709123410.FSPA.IN
    [email protected]
    1020140210AN194708123455.FSPA.IN
    [email protected]
    1020140210AN195710123443.FSPA.IN
    [email protected]
    1020140210AN198311123436.FSPA.IN
    [email protected]
    1020140210AN196712123471.FSPA.IN
    [email protected]
    1020140210AV197005123403.FSPA.IN
    98000000000000014000000000000001000000000000015
    010320140211ABC XYZ XYZ PEDB201402111 FSPAICA 005560080441.02000006557.FSPA.IN AB-CDABCD/ABCDBETALKORT
    1020140210AN195111123491.ICA.IN
    [email protected]
    1020140210AV195309123434.ICA.IN
    98000000000000001000000000000001000000000000002
    Output FIle:
    1020140210AN195607123413.FSPA.IN
    [email protected]
    1020140210AN199205123408.FSPA.IN
    [email protected]
    1020140210AN196206123499.FSPA.IN
    [email protected]
    1020140210AN196709123410.FSPA.IN
    [email protected]
    1020140210AN194708123455.FSPA.IN
    [email protected]
    1020140210AN195710123443.FSPA.IN
    [email protected]
    1020140210AN198311123436.FSPA.IN
    [email protected]
    1020140210AN196712123471.FSPA.IN
    [email protected]
    1020140210AN195111123491.ICA.IN
    110019EPS [email protected]
    1020140210AV197005123403.FSPA.IN
    1020140210AV195309123434.ICA.IN
    98000000000000001000000000000001000000000000002
    99000000000000001
    Note: the header is a single line up to BETALKORT, and there is also a space before the email ID. That is not getting pasted properly in the files.
    Thanks and Regards,
    Veena Handadi

    Hi all,
    The header was missing from the output file in the above post. Please find the output file below:
    010320140211ABC XYZ XYZ ABCD201402110 FSPAFSPA005560080441.02000006557.FSPA.IN AB-CD ABCD/AB-CD BETALKORT
    1020140210AN195607123413.FSPA.IN
    [email protected]
    1020140210AN199205123408.FSPA.IN
    [email protected]
    1020140210AN196206123499.FSPA.IN
    [email protected]
    1020140210AN196709123410.FSPA.IN
    [email protected]
    1020140210AN194708123455.FSPA.IN
    [email protected]
    1020140210AN195710123443.FSPA.IN
    [email protected]
    1020140210AN198311123436.FSPA.IN
    [email protected]
    1020140210AN196712123471.FSPA.IN
    [email protected]
    1020140210AN195111123491.ICA.IN
    110019EPS [email protected]
    1020140210AV197005123403.FSPA.IN
    1020140210AV195309123434.ICA.IN
    98000000000000001000000000000001000000000000002
    99000000000000001

  • Is exporting to Positional Flat File possible?

    Post Author: wilsonricharda
    CA Forum: Publishing
    I would like to know if it is possible to export data generated from a scheduled WebI report to a positional flat file? I know it is possible to export to a CSV, but I am looking for fixed-length positional. Any help/suggestions would be appreciated. Thanks, Richard

    Hi Richard,
    Could you please let me know how you scheduled a report to be sent as a CSV attachment. There are currently only 3 output formats available, namely:
    1) WebIntelligence Document
    2) Excel
    3) PDF
    It is possible to save a report in CSV format. But there is no CSV format available when a scheduled report needs to be exported. Only the above 3 options are available.
    Do we need to configure the CMC for the CSV option to appear when specifying the output format of the report during scheduling, or is anything else required for the CSV option to come up? Could you please provide your input on this.
    Thanks,
    Meghana

  • Flat File Handling

    Currently I am dealing with a flat file (txt) that is originally a report, so it contains some information that needs to be skipped, such as page numbers.
    The flat file data is therefore a mix of structured and unstructured data.
    I need an idea of how to handle this type of file.

    Hi
    I think you are facing a situation where the text file is actually a multipage report with headers and footers. If this is the case, then I suggest you first extract all the data into a temporary table. Then, in a subsequent dataflow, use this temporary table as the source for a query transform and apply a condition in the "Where" filter to eliminate the header and footer patterns of the document. If you need more info, please let me know.
    Raghu

  • Update XERegistry.xml for my custom positional flat file

    Hi,
    I believe we need to add an entry to XERegistry.xml for the new parser and upload the .ecs file; can anybody confirm? If so, what information do we need to enter? In the document definition, what do we need to put in "Identification Start Position" and "Identification End Position"?
    Thanks
    John

    Anuj,
    What you said makes sense, but somehow I could not make it work, even with the invoice example shipped with Oracle B2B (INVOIC01.ecs and Invoice01.dat under the samples/positional directory).
    I am using a Java program to read the data file and drop it on a JMS queue that Oracle B2B is listening on. I am seeing the following trace in the log file; it seems that Oracle B2B knows it is a non-XML file.
    On the other hand, I have already made an X12 example work.
    May I send the log and other files to you to review?
    Thanks
    John
    BTW, I am still trying to understand how B2B identifies the document. The parser uses InitialFileDetect to get the initial information; what is the GroupNode for? Do I need to match that information (DocControlNumber, DocTypeID), as well as the version and document type, when I create my document in the B2B console?
    <msg time="2011-09-22T09:55:45.949-04:00" comp_id="soa_server1" type="TRACE" level="1" host_id="uxpnwg001a0015.cbp.dhs.gov" host_addr="10.159.25.110" module="oracle.soa.b2b.engine" tid="weblogic.work.j2ee.J2EEWorkManager$WorkWithListener@15d19258" user="weblogic" ecid="72a52121dd40d94a:afeea16:1323f8126ee:-8000-00000000001c3f99" rid="0">
    <attr name="SRC_CLASS" value="oracle.tip.b2b.system.DiagnosticService" />
    <attr name="APP" value="soa-infra" />
    <attr name="SRC_METHOD" value="synchedLog_J" />
    <txt>oracle.tip.b2b.document.pff.PFFDocumentPlugin:identifyIncomingDocument Enter</txt>
    </msg>
    - <msg time="2011-09-22T09:55:45.949-04:00" comp_id="soa_server1" type="TRACE" level="1" host_id="uxpnwg001a0015.cbp.dhs.gov" host_addr="10.159.25.110" module="oracle.soa.b2b.engine" tid="weblogic.work.j2ee.J2EEWorkManager$WorkWithListener@15d19258" user="weblogic" ecid="72a52121dd40d94a:afeea16:1323f8126ee:-8000-00000000001c3f99" rid="0">
    <attr name="SRC_CLASS" value="oracle.tip.b2b.system.DiagnosticService" />
    <attr name="APP" value="soa-infra" />
    <attr name="SRC_METHOD" value="synchedLog_J" />
    <txt>oracle.tip.b2b.document.pff.PFFDocumentPlugin:identifyIncomingDocument iDoc ECS = /soa/b2b/PositionalFlatFile/SAP IDoc/SAP IDoc type/INVOICEDoc/INVOIC01.ecs</txt>
    </msg>
    - <msg time="2011-09-22T09:55:45.949-04:00" comp_id="soa_server1" type="TRACE" level="1" host_id="uxpnwg001a0015.cbp.dhs.gov" host_addr="10.159.25.110" module="oracle.soa.b2b.engine" tid="weblogic.work.j2ee.J2EEWorkManager$WorkWithListener@15d19258" user="weblogic" ecid="72a52121dd40d94a:afeea16:1323f8126ee:-8000-00000000001c3f99" rid="0">
    <attr name="SRC_CLASS" value="oracle.tip.b2b.system.DiagnosticService" />
    <attr name="APP" value="soa-infra" />
    <attr name="SRC_METHOD" value="synchedLog_J" />
    <txt>oracle.tip.b2b.document.pff.PFFDocumentPlugin:identifyIncomingDocument non-XML Payload</txt>
    </msg>
    - <msg time="2011-09-22T09:55:45.949-04:00" comp_id="soa_server1" type="TRACE" level="1" host_id="uxpnwg001a0015.cbp.dhs.gov" host_addr="10.159.25.110" module="oracle.soa.b2b.engine" tid="weblogic.work.j2ee.J2EEWorkManager$WorkWithListener@15d19258" user="weblogic" ecid="72a52121dd40d94a:afeea16:1323f8126ee:-8000-00000000001c3f99" rid="0">
    <attr name="SRC_CLASS" value="oracle.tip.b2b.system.DiagnosticService" />
    <attr name="APP" value="soa-infra" />
    <attr name="SRC_METHOD" value="synchedLog_J" />
    <txt>oracle.tip.b2b.document.pff.PFFDocumentPlugin:identifyIncomingDocument start pos = -1</txt>
    </msg>
    - <msg time="2011-09-22T09:55:45.949-04:00" comp_id="soa_server1" type="TRACE" level="1" host_id="uxpnwg001a0015.cbp.dhs.gov" host_addr="10.159.25.110" module="oracle.soa.b2b.engine" tid="weblogic.work.j2ee.J2EEWorkManager$WorkWithListener@15d19258" user="weblogic" ecid="72a52121dd40d94a:afeea16:1323f8126ee:-8000-00000000001c3f99" rid="0">
    <attr name="SRC_CLASS" value="oracle.tip.b2b.system.DiagnosticService" />
    <attr name="APP" value="soa-infra" />
    <attr name="SRC_METHOD" value="synchedLog_J" />
    <txt>oracle.tip.b2b.document.pff.PFFDocumentPlugin:identifyIncomingDocument values = {IdentificationStartPosition=0, IdentificationExpressionValue=EDI_DC , TransactionECSFileBlob=/soa/b2b/PositionalFlatFile/SAP IDoc/SAP IDoc type/INVOICEDoc/INVOIC01.ecs, IdentificationEndPosition=10}</txt>
    </msg>
    - <msg time="2011-09-22T09:55:45.949-04:00" comp_id="soa_server1" type="TRACE" level="1" host_id="uxpnwg001a0015.cbp.dhs.gov" host_addr="10.159.25.110" module="oracle.soa.b2b.engine" tid="weblogic.work.j2ee.J2EEWorkManager$WorkWithListener@15d19258" user="weblogic" ecid="72a52121dd40d94a:afeea16:1323f8126ee:-8000-00000000001c3f99" rid="0">
    <attr name="SRC_CLASS" value="oracle.tip.b2b.system.DiagnosticService" />
    <attr name="APP" value="soa-infra" />
    <attr name="SRC_METHOD" value="synchedLog_J" />
    <txt>oracle.tip.b2b.document.pff.PFFDocumentPlugin:identifyIncomingDocument xpath = null</txt>
    </msg>
    Edited by: zchen on Sep 22, 2011 1:11 PM
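    For completeness, a bare-bones sketch of the kind of JMS sender described above (the JNDI names, URL, and file name are placeholders, and any B2B-specific message properties your listening channel may expect are not shown):

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Hashtable;
    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.Context;
    import javax.naming.InitialContext;

    // Bare-bones sketch: read a flat file and drop it on a JMS queue as a TextMessage.
    // JNDI names and URL are placeholders, not the actual values from this thread.
    public class FlatFileJmsSender {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
            env.put(Context.PROVIDER_URL, "t3://soahost:8001");
            InitialContext ctx = new InitialContext(env);

            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/TestConnectionFactory"); // placeholder
            Queue queue = (Queue) ctx.lookup("jms/B2BInQueue");                                 // placeholder

            String payload = new String(Files.readAllBytes(Paths.get("Invoice01.dat")), "UTF-8");

            Connection conn = cf.createConnection();
            try {
                Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                TextMessage msg = session.createTextMessage(payload);
                producer.send(msg);
            } finally {
                conn.close();
            }
        }
    }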

  • Handling Positional EDI file

    In 10.1.2.0.2, a positional EDI document can be processed using the Custom Document Plugin; the steps are as follows.
    1. Apply the latest B2B patch on 10.1.2.0.2 Installation.
    2. Add the parser into registryValidator.xml:
    <Item Name="SchemaFile">D:/Oracle/midtier/ip/oem/edifecs/XEngine/config/schema/IDoc_parser_40_2_char_delim.ecs</Item>
    3. Design the .ecs file in SpecBuilder according to the positional flat file.
    4. As part of the document protocol revision, override the document protocol parameter by specifying the ImplementationClass as oracle.tip.adapter.b2b.document.custom.CustomDocumentPlugin.
    5. Provide the IDoc .ecs file as part of the document definition parameters, along with the start position and end position.
    6. Run the end-to-end test.

    Hi Chandra,
    Instead of using File Content Conversion to convert the file structure, try using the standard module StrictXML2PlainBean. The problem is that File Content Conversion doesn't identify an empty payload; even if you don't pass anything from the proxy, it doesn't return an empty message to the file adapter.
    StrictXML2PlainBean or MessageTransformBean will do the trick for you.
    Don't forget to set the empty message handling to Write Empty File.
    http://help.sap.com/saphelp_nwpi711/helpdata/en/44/748d595dab6fb5e10000000a155369/frameset.htm
    HTH,
    Erwin

  • Flat file mandatory field mapping problem

    Hi Guys,
    I am mapping XML to a positional flat file.
    There is only one field in the output flat file, 'provider_state', which is not being mapped.
    As a change, I added a new field after provider_state.
    My problem is that provider_state, with length 2, is not produced in the output as blank spaces to match the positional flat file.
    I have tried creating the record using XSLT, tried adding a default of two spaces in the schema, and tried mapping two spaces in the output.
    When I try to add the two spaces, I get a CRLF in the output at the field position.
    In short, the last field '%' should be at position 733 but it is at 731.
    Thanks in advance...
    //Dhiraj Bhavsar

    Hi Dhiraj,
    This is by design, and you may want to change provider_id to an attribute if it is an element.
    To resolve this please check this..
    http://msdn.microsoft.com/en-us/library/ee250694(v=bts.10).aspx#bts:source_specification
    Regards
    Ritu Raj
    When you see answers and helpful posts,
    please click Vote As Helpful, Propose As Answer, and/or Mark As Answer

  • Flat file over sftp

    Hi
    I have a requirement to transfer a flat file from FTP to a remote trading partner's FTP site. Can anybody tell me the best way to do it? I can use the FTP adapter, but my client wants to have the report. I created the .ecs file and XSD for the positional flat file, but I am getting an error while placing the file on the remote FTP site. I don't want to translate the flat file to XML and back to a flat file; this is flat file to flat file.
    Thanks

    You can also use the Custom document protocol, with 'flat file' as the identification type.
    Please post the error which you are getting while sending the message.

  • FLAT file to Database load

    Hi All,
    We have been using OWB version 10.2.0.3 for the past 6 months; until now we have been loading database to database.
    Database version: 10.2
    OWB version: 10.2.0.3
    Server operating system: Sun Solaris (10)
    Client operating system: Windows XP
    I would like to share some information with you all.
    I have installed OWB 10gR2 on a personal PC running Windows XP. I have one simple mapping for a flat file.
    Steps I performed to populate data from the flat file to the database:
    I created one folder on the C drive, then created a directory object and granted privileges to access that folder.
    I created an external table for the flat file manually.
    After that, I imported the external table and mapped it to the target table. It is working fine and the data is loading.
    I am still confused about flat file handling on the Sun Solaris operating system:
    How do we handle a flat file in a UNIX environment?
    How do we create the directory and grant access privileges?
    How do we create an external table with OWB?
    Could you please help me?
    Regards.
    venkat.

    Hi,
    I logged in as SYS, created a directory, and then granted privileges on 'c:\test_file'. test_file is the folder I created on the Windows PC; the server is Sun Solaris.
    In OWB I created a table based on that file, delimited. The column names and values are below:
    no     name     sal     coom
    100     aa     10000     1000
    200     bb     20000     2000
    300     cc     30000     3000
    400     dd     40000     4000
    After that I created an external table under the source module and deployed it. The table was created successfully in the database, but a select statement against it shows an error.
    For example, select * from the table gives the error:
    ORA-29913 error in executing ODCIEXTTABLEOPEN callout
    ORA-29400 data cartridge error
    KUP-04040 file abc.csv in test_v_s_loc not found
    ORA-06512 at "sys.ORACLE_LOADER",line 19
    I created the directory and granted privileges as below:
    create or replace directory sample_owb as 'C:\test_file'
    grant read,write on directory sample_owb to ods_inb_stg
    What is the cause, and how do I handle and resolve it? Could you please help me?
    regards,
    venkat.

  • I can't build an xsd for a flat file (txt) to handle repeating records

    Hi - I have looked at many posts about flat file schemas and they don't seem to address my question.
    I have a flat file that is \n delimited.
    The pattern of the data is simple:
    record 1 - 90 characters
    record 2 - 20 characters
    records 3 to n - 248 characters each; each of these records is parsed into children by the positional method
    record n+1 - 10 characters
    record n+2 - 20 characters
    So I used the flat file schema generator to generate the schema and built a map from the flat file schema to another XML schema. The schema looks OK: record 1, record 2, record n+1, and record n+2 are child elements of the root, and the repeating record section shows up as a node with the parsed children.
    The transform only maps the children of the repeating records. When I test the map, only the first repeating record gets parsed; no repeating happens (the actual flat file has 400+ repeating records). When I run the map in debug mode, the input XML shows that record 1 is read in correctly, record 2 is read in correctly, record 3 is read in and parsed, but record 4 is treated like record n+1 and record 5 is treated like record n+2, and the map thinks it is all finished.
    The repeating section of the schema is below; you can see that I set minOccurs=1 and maxOccurs=unbounded for the node (INVOICE) and for the complexType, but this is not effective syntax. I have looked at how the EDI X12 schemas handle looping, and it is a lot different from what the Flat File Schema Wizard is doing. Is there a good set of published rules that would guide me through this? Otherwise I will basically have to read in the lines from the file and parse them out with functoids, which seems so inelegant. Thanks in advance.
    <xs:element minOccurs="1" maxOccurs="unbounded" name="INVOICE">
      <xs:annotation>
        <xs:appinfo>
          <b:recordInfo structure="positional" sequence_number="3" preserve_delimiter_for_empty_data="true" suppress_trailing_delimiters="false" />
        </xs:appinfo>
      </xs:annotation>
      <xs:complexType>
        <xs:sequence minOccurs="1" maxOccurs="unbounded">
          <xs:annotation>
            <xs:appinfo>
              <groupInfo sequence_number="0" xmlns="http://schemas.microsoft.com/BizTalk/2003" />
            </xs:appinfo>
          </xs:annotation>
          <xs:element name="SegmentType" type="xs:string">
            <xs:annotation>
              <xs:appinfo>
                <b:fieldInfo justification="left" pos_offset="0" pos_length="2" sequence_number="1" />
              </xs:appinfo>
            </xs:annotation>
          </xs:element>
          ... more child elements
    Harold Rosenkrans

    Thanks for responding
    I gave up trying to parse the repeating record into fields. Instead I just loop through the repeating record section with an <xs:for-each> block in the XSL and use functoids to grab the fields.
    So that works for having the two shorter header records (structure is positional) before the section of repeating records. Now I just have to figure out how to get the schema to handle the two shorter trailer (or footer, whichever you prefer) records after the section of repeating records.
    The error I get in VS when I test the map is below. [BTW, I changed the element names in the schema, which is why you don't see INVOICE in the error.]
    When I declare the last element as being positional with a character length of 10 I get the error:
    Error 18 Native Parsing Error: Unexpected end of stream while looking for:
    '\r\n'
    The current definition being parsed is SAPARData. The stream offset where the error occured is 1359. The line number where the error occured is 9. The column where the error occured is 0. 
    So the first record is 77 characters in length, the second is 16 characters, the repeating records (5 in the file) are 248 characters each, and the last record is 10 characters.
    So an offset of 1359 puts it 16 characters beyond the last record - the stream reader is looking for the next repeating record.
    if I try to declare the last element as delimited I get the error:
    Error 14 Native Parsing Error: Unexpected data found while looking for:
    '\r\n'
    The current definition being parsed is SAPARData. The stream offset where the error occured is 597. The line number where the error occured is 5. The column where the error occured is 0. 
    So the first record is 77 characters in length, the second is 16 characters, and then the repeating records are 248 characters each.
    A stream offset of 597 puts me 8 characters into the third repeating record - at this point I have only declared one trailer record in the schema, 10 characters long.
    Why is the stream reader stopping at such a weird spot?
    The bottom line is I still haven't discovered the correct schema to handle the trailer records. Even if I set maxOccurs="4" for the repeat record declaration, it still gets the first error. How does it find an unexpected end of stream looking for \r\n when the maxOccurs for the repeat record declaration should leave the stream pointer in the 5th repeat record?
    I unfortunately don't have any options concerning the file structure.
    I have read a lot of posts concerning the trailer issue. I have seen a couple that looked interesting. I guess I'll just have to give them a try. The other option is to create a custom pipeline that will only take file lines of 248 characters.
    That's just disgusting !
    Harold Rosenkrans

  • How to handle flat file with variable delimiters in the file sender adapter

    Hi friends,
    I have some flat files on the FTP server and want to poll them into XI, but before processing in XI I want to do some content conversion in the file sender adapter. According to the general solution, I just need to specify the field names, field separator, end separator, etc. But the question is:
    The fields in the test data may have a different number of delimiters (,), for example:
    ORD01,,,Z4XS,6100001746,,,,,2,1
    OBJ01,,,,,,,,,,4,3     
    Some fields only have one ',' as the delimiter, but some of them have multiple ','.
    How can I handle it in the content conversion?
    Regards,
    Bean

    Hi Bing,
    Please refer to the following blogs; they will give you an idea.
    File content conversion blogs:
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
    Regards,
    Vinod.

  • Validation Event Handler Not working with Flat File GTC Trusted Recon

    We created an event handler for checking special characters in the Telephone field. It works fine when the user is created through the Admin Console, but the event handler is not triggered while doing GTC flat file trusted recon.
    Version: OIM 11.1.1.5.0
    Can someone please help me out with this.
    Thanks
    Edited by: 790561 on 17-Feb-2013 09:01
    Edited by: 790561 on Feb 17, 2013 9:35 PM
    Edited by: 790561 on Feb 18, 2013 12:38 AM

    Validation event handlers will not work with your trusted recon. You can use the GTC validation provider; it is nothing but a plugin which you can insert within your source field in GTC. I think you can easily find the steps for how to create a custom GTC provider.
    Edited by: iam37 on Feb 16, 2013 4:09 PM
