Processing a huge number of inbound XML files

Dear All,
I have a query related to performance in a FILE-XI-IDOC scenario:
If I have 1200 inbound XML files in my inbound folder and I have to process all 1200 within a span of 2 hours,
can anybody suggest how I should configure my sender File adapter to process such a huge number of XML files?
Regards
Prabhat

Hi Prabhat,
>>>Can anybody suggest how I should configure in my Sender File adapter to
there's not a lot you can do in your sender File adapter.
Try using connection mode "Permanently" instead of "Per File Transfer", because with the latter:
"A new connection to the FTP server is established for each file transfer."
BTW
the most important thing is how big your files are -
that's what tells you whether you can process 1200 in 2 hours or not (1200 files in 2 hours works out to 10 files per minute, i.e. roughly one file every 6 seconds on average).
Regards,
michal

Similar Messages

  • Inbound XML file into ECC6 .. is middleware required?

    Hi
    We are running SAP ECC6 and do not have SAP XI.
    An external system needs us to send an outbound WMTORD in XML format. We have configured the XML port and successfully got an XML file out of the system.
    The external system will return a WMSUMO IDoc in XML format ... but I am unclear whether SAP can process this directly without some middleware or having to run special function modules (somehow?!).
    Can anyone help clarify what needs to be done for the inbound XML file ... do I create a file port, and does SAP simply convert the XML into the normal format?
    I see some posts about running function modules ... I'm not clear on what program is used to run these.
    Any help appreciated
    Thx

    Hi,
    To answer your question specifically: no, there is no need for any middleware in your case.
    You can just create an inbound port of type XML file port and do the necessary IDoc inbound configuration, and that should be enough.
    http://help.sap.com/saphelp_nw70/helpdata/en/3f/faa288bb7911d2897f0000e8216438/content.htm
    Best Regards,
    Ravi

  • Receiving an inbound XML file in Business Connector

    Hi All,
    The requirement is that we receive an inbound XML file at the BC end from a third-party application.
    We have configured the URL at the third-party end as follows -
    http://ip address of BC:port address/invoke/Folder/Service
    When the file is posted, the XML file is normally routed to the service in Developer.
    In Developer we are using the load document service to load the file from the URL location.
    But at present, when an XML file is triggered from the third-party application it is routed to the service, and on checking the URL location http://ip address of BC:port address/invoke/Folder/Service the server gradually slows down and we are not able to access either Developer or Administrator after this.
    Are we following the correct steps, or is there any other way of receiving the inbound file in BC?
    Any suggestions on this would be of great help.
    Please let me know if we are going wrong anywhere.
    Regards,
    Priya

    Hi Priya,
    I agree with your approach, and this is what everyone does. :) So that is not the problem; maybe something else is the cause.
    >>>>In the developer we are using the service load document to load the file from the url location
    I don't think the load document step is necessary, because with the URL post you are submitting data directly to the flow service, so there is no need to do a document load or anything. I am also against using the save-data-in-pipeline (savePipeline) service, because when you transport this to production you need to comment that step out, otherwise it will cause other problems at runtime.
    If this approach is giving you a performance problem, then try putting your XML file in your package directory, write a file poller, and then parse the XML for your use. This is one of the simpler solutions; a rough sketch of the idea follows below.
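    Just to illustrate the file-poller idea in plain Java (this is only a sketch, not a BC built-in service; the directory name, the ".done" suffix and the polling interval are made-up values):
    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;

    public class SimpleXmlPoller {
        public static void main(String[] args) throws Exception {
            File inbox = new File("C:/bc/packages/MyPackage/inbox"); // hypothetical drop directory
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            while (true) {
                File[] files = inbox.listFiles((dir, name) -> name.toLowerCase().endsWith(".xml"));
                if (files != null) {
                    for (File f : files) {
                        Document doc = factory.newDocumentBuilder().parse(f); // parse the dropped XML file
                        System.out.println("Root element: " + doc.getDocumentElement().getNodeName());
                        // hand the parsed document to your own processing here, then archive the file
                        f.renameTo(new File(inbox, f.getName() + ".done"));
                    }
                }
                Thread.sleep(10000); // poll every 10 seconds (arbitrary interval)
            }
        }
    }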
    hope this will help you.
    Regards
    Aashish Sinha
    PS : reward points if helpful

  • XML files containing huge string

    Hi,
    I'm trying to read in a huge string from an XML file (the string is an encoded file, e.g. a video; ideally I want to read in a file over 100MB if possible). It seems fine for small files, but large ones give me an 'out of memory' error, apparently when creating the XML document.
    I gather it's best to use SAX parser rather than DOM for large files to avoid this; however I'm already using SAX. Does anyone have a suggestion as to where I might be going wrong? Maybe there's a better way to store the string in the XML? At the moment it's just an attribute of an object.
    I'm very new to XML and would greatly appreciate anyone's advice here!
    Thanks a lot,
    Kat

    Create a content handler by overriding DefaultHandler and append the content you require to a StringBuffer; that way you can recreate the chunks you require. Like this:
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.SAXException;
    import org.xml.sax.helpers.DefaultHandler;

    class MyContentHandler extends DefaultHandler {
        private final StringBuffer sb = new StringBuffer();
        public void startElement(String uri, String localName, String qName, Attributes attributes) throws SAXException {
            sb.append("<").append(qName).append(">");
        }
        public void endElement(String uri, String localName, String qName) throws SAXException {
            sb.append("</").append(qName).append(">");
            if (qName.equalsIgnoreCase("endtag")) { // emit and reset the buffer at each chunk boundary
                System.out.println(sb);
                sb.setLength(0);
            }
        }
        public void characters(char[] ch, int start, int length) throws SAXException {
            sb.append(ch, start, length); // re-escape XML special characters here if you re-emit this as XML
        }
    }
    Call the SAX parser ==>
    SAXParserFactory parserFactory = SAXParserFactory.newInstance();
    SAXParser parser = parserFactory.newSAXParser();
    parser.parse("myfile.xml", new MyContentHandler());

  • Adapter type for inbound xml message XI3.0 SP10

    Hi,
    We are using XI 3.0 SP10 and are about to set up a scenario where we will receive an XML file to process. Up until now we have only received simple flat files to process.
    My question is: can I use adapter type "File" to process this inbound XML file? And are there any special considerations I should be aware of that are different from processing a text file?
    At the moment I have started to set it up as follows:
    Adapter Type: File
    Transport Protocol: FTP
    Message Protocol: File Content Conversion
    Adapter Engine: Integration Server
    Transfer Mode: Binary
    File type: Binary
    Thanks,
    Fredrik

    Hello,
    Yes, you can, but you don't need "File Content Conversion".
    Choose "File" and you are done.
    Regards,
    Chris

  • Reading an XML file and sending that XML data as input to a service

    Hi All,
    I have a requirement to read an XML file (I am using a File adapter to read it) and map the data in that XML to a service (schema) input variable.
    An example of the XML file that I have to read, and its content, is below:
      <StudentList>
        <student>
           <Name> ravi</Name>
           <branch>EEE</branch>
          <fathername> raghu</fathername>
        </student>
      <student>
           <Name> raju</Name>
           <branch>ECE</branch>
          <fathername> ravi</fathername>
        </student>
    </StudentList>
    I have to pass the data (ravi, EEE, raghu, etc.) to a service input variable. The invoked service's input variable (schema) contains a schema similar to the one above.
    My flow is like below:
      ReadFile file adapter -------------------> BPEL process -----> Target Service. I am using a transform activity in the BPEL process to map the data from the XML file to the service.
    I am using the above XML file as a sample for the native data format (to create the XSD schema file).
    After I built the process, I checked that the file adapter polls and receives the file (I can see "View XML document" in the EM console flow).
    But the transform activity does not contain anything and it is not mapping the data. I am getting blank data in the transform activity, with only element names, like below.
    EM console audit trail (I am including this so you can clearly see what is happening):
       -ReceiveFile
            -some datedetails      received file
              View XML document  (This xml contains data and structure like above  xml )
        - transformData:
            <payload>
              <InvokeService_inputvariable>
                  <part name="body">
                     <StudentList>
                         <student>
                           <name/>
                            <branch/>
                            <fathername/>
                         </student>
                   </StudentList>
              </part>
             </InvokeService_inputvariable>
    Why am I getting this? Is there any problem with the native data format configuration?
    Please help me out with this issue, as I am running out of time.

    Hi syam,
    Thank you very much for your replies so far; I have made some progress in my task.
    As you said, I could have put the default directory in composite.xml, but what happens is that every day a new final subdirectory gets created in the 'soafolder' folder. What I mean is that in the c:/soafolder/1234_xmlfiles folder, the '1234_xmlfiles' part is not created manually. It is created automatically by executing some jar.
    Basically, we cannot know the subfolder name until it is created by the jar with its own logic, whereas the main folder (soafolder) is always the same.
    I will give you an example with our folder names so that it is easier to understand.
    1) Yesterday's folder structure: 'c:/soafolder/130731_LS'. The '130731_LS' folder was created automatically by executing some jar file (it has its own logic to control and create the subdirectories, which is not in our control).
    2) Today's folder structure: 'c:/soafolder/130804_LS'. The folder is created automatically at a particular time (the number part changes every time - 130731, 130804; I think the number indicates a date such as 2013 July 31st, but I have to check this), and the XML files are then loaded into that folder.
    Our challenge: we cannot just put a fixed path in composite.xml and poll with the file adapter, because we would have to change the path in composite.xml every time. The process should somehow know the folder path (I don't know whether this is possible or not), so that the file adapter polls the files in the newly created subfolder each day.
    I hope you can understand my requirement. Please help me out in this regard.
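    Just as a rough illustration of working out the folder name outside the adapter (and assuming the prefix really is a yyMMdd date, which you still need to confirm), a small Java helper could either build today's expected folder or simply pick the newest '_LS' subfolder:
    import java.io.File;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class TodaysFolder {
        public static void main(String[] args) {
            File base = new File("C:/soafolder");

            // Option 1: build the name from today's date, e.g. 130804_LS (assumes a yyMMdd prefix)
            String expected = new SimpleDateFormat("yyMMdd").format(new Date()) + "_LS";
            System.out.println("Expected folder: " + new File(base, expected));

            // Option 2: take the most recently modified "_LS" subfolder, whatever its name
            File newest = null;
            File[] subdirs = base.listFiles();
            if (subdirs != null) {
                for (File d : subdirs) {
                    if (d.isDirectory() && d.getName().endsWith("_LS")
                            && (newest == null || d.lastModified() > newest.lastModified())) {
                        newest = d;
                    }
                }
            }
            System.out.println("Newest folder: " + newest);
        }
    }
    Whether the resolved path can then be handed to the inbound file adapter at runtime depends on your SOA Suite version and configuration, so treat this only as a way of deriving the directory name, not as a statement about adapter capabilities.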

  • How to find the existence of a tag in an XML file through XSLT mapping?

    Hello Friends,
    Working on an SAP XI interface, I have come across a situation where I need to map values only when a particular tag exists in the inbound XML file. I need to use XSLT mapping for this.
    Could you advise how I can validate the existence of a tag through XSLT mapping?
    Thanks.

    Hello Friends,
    After some research, I found another way to check for the existence of a node: we can use xsl:choose.
    <xsl:choose>
          <xsl:when test="(/mynode)">
              your action if the mynode is found
          </xsl:when>
          <xsl:otherwise>
                    action if mynode is not found
          </xsl:otherwise>
    </xsl:choose>
    Thanks.
    Wishes
    Richa

  • The nicest way to generate XML files in Java???

    I have to generate huge and quite complex XML files in Java, fetching the data from an Oracle database. What I really don't know is a proper and reliable way to do this. I could of course create a String and concatenate all the tags, attributes and data, but that doesn't feel right. I guess this is a quite common task and there are many established ways to do it in Java. My question is: what is the best way? What do you suggest?
    Thank you for any clues...

    Hi,
    Since you have access to the database, I think one of the best ways is to create stored procedures that generate those files for you (preferably using SQL/XML functions) and return the content to the Java caller.
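    If the XML does have to be produced on the Java side instead, a streaming writer (StAX) avoids building one giant String and keeps memory flat for large result sets. A minimal sketch - the connection details, query and element names are all placeholders:
    import java.io.FileOutputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.XMLStreamWriter;

    public class ExportToXml {
        public static void main(String[] args) throws Exception {
            // placeholder connection details and query
            Connection con = DriverManager.getConnection("jdbc:oracle:thin:@host:1521:SID", "user", "pwd");
            Statement stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT id, name FROM my_table");

            XMLStreamWriter xml = XMLOutputFactory.newInstance()
                    .createXMLStreamWriter(new FileOutputStream("export.xml"), "UTF-8");
            xml.writeStartDocument("UTF-8", "1.0");
            xml.writeStartElement("rows");
            while (rs.next()) {                       // one <row> element per database row
                xml.writeStartElement("row");
                xml.writeStartElement("id");
                xml.writeCharacters(rs.getString(1)); // the writer escapes <, > and & for us
                xml.writeEndElement();
                xml.writeStartElement("name");
                xml.writeCharacters(rs.getString(2)); // add a null check for nullable columns
                xml.writeEndElement();
                xml.writeEndElement();
            }
            xml.writeEndElement();
            xml.writeEndDocument();
            xml.close();
            rs.close();
            stmt.close();
            con.close();
        }
    }
    For really large tables the database-side SQL/XML route suggested above is usually still preferable, since the data never has to leave the database just to be re-serialized in Java.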

  • GUI to create all xml file combinations allowed by a Schema

    Suppose I have an XML Schema and according to this Schema a total of six xml file configurations are legitimately possible.
    1. Is there an easy way to create a Swing GUI that can be used to easily generate all six of these possible xml files?
    ...or...
    2. Can anyone recommend specific tools that would be helpful in creating such a GUI.
    I am familiar with tools to validate an xml file against a Schema but what tools are available to programmatically read a schema and extract all the possible legitimate combinations of the fields?

    Hi,
    What I need is the following:
    I have some selects which generate several XML files, but those files are plain, like this:
    <?xml version = '1.0' encoding = 'ISO-8859-1'?>
    <ROWSETTAG>
    <ROWTAG>
    <TX_COAPI>PES</TX_COAPI>
    <TX_DNAPI>Consultas Pesetas</TX_DNAPI>
    <TX_DEEUS>Datuak Pezetetan</TX_DEEUS>
    <TX_DIREC>C:\TEST\</TX_DIREC>
    <TX_ORDEN>11</TX_ORDEN>
    </ROWTAG>
    </ROWSETTAG>
    Well, I need to apply a schema to each of those XML files, and the output of that process should be a new XML file, "formatted" or "transformed" (I don't know what the right word is), with the attributes that appear in a schema file.
    I don't know if this is possible, or how to do it, or maybe I should use an XSLT file...
    I am a newbie at this technology.
    Thanks for your time!!!
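    If you do go the XSLT route mentioned at the end, applying a stylesheet from Java takes only a few lines with the standard javax.xml.transform API. A minimal sketch with placeholder file names:
    import java.io.File;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class ApplyStylesheet {
        public static void main(String[] args) throws Exception {
            // placeholder file names: the stylesheet, the flat input file and the transformed output
            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(new File("transform.xsl")));
            t.transform(new StreamSource(new File("flat.xml")),
                        new StreamResult(new File("formatted.xml")));
        }
    }
    Keep in mind that an XML Schema on its own only validates a document; to actually reshape the flat ROWSET/ROW output into a different structure you need the XSLT (or an equivalent transformation).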

  • ISABuild: missing sda_build.xml file

    Dear all,
    I have downloaded the ISABuild tool from note 594370 and followed the instructions given in "mySAP CRM SAP Internet Sales: Building and Updating Modified SAP Internet Sales Web Applications".
    The documentation wants me to copy "sda_build.xml" to <ISA build tool>/sap_ear, but it is nowhere mentioned where I get this file from ...
    I have taken a look at the <ISA build tool>/bin/build.xml file, finding this comment:
    5. Copy the sda_build.xml from your sap distribution to  the sap_ear folder
    I did a search of J2EE_ENGINE_ROOT without any hit. Any idea where it could be? Could it be inside b2b.ear (the basis web app to be copied)? How can I take a look at the content of this file (I am using Windows here)?
    "ant build" interrupts the build process when there is no sda_build.xml file.
    Any help is highly appreciated.
    Thanks,
    Martin Muellenberg

    Hi Martin,
    I assume you are talking about the ISA 4.0 version.
    For any 4.0 version you would get the SAR file from the Service Marketplace (something like ISAWAC40SP11P_3-20000529.SAR). When you extract this using SAPCAR you will find the sda_build.xml, which you can use in the ISA build tool.
    Hope this helps.
    Thanks and Rgds,
    Satya

  • Memory overflow problems when processing huge XML files

    Hi All,
    We need to process very large XML files (more than 100 MB).
    We ran this job in the background and it resulted in runtime errors.
    Is there any way of processing this file as a whole?
    Edited by: Thomas Zloch on Nov 17, 2010 4:16 PM - subject adjusted

    Normally such memory problems can be avoided by using block processing and clearing temporary data in between the blocks. However, all XML techniques that I know of (DOM, XSLT, ST) require all data to reside in an internal table at once. I will be facing a similar problem soon, so I'm quite interested in a solution.
    One way would be to upgrade the hardware and allow more memory to be allocated to the work process (system administration). Some background information:
    http://help.sap.com/saphelp_nw70ehp1/helpdata/en/49/32eb1fe92e3504e10000000a421937/frameset.htm
    I wonder if there are other workarounds; let's see if there will be additional replies.
    Thomas

  • Issue while Processing the Huge File in BPEL

    Hi,
    We are facing an issue while processing a huge file in a BPEL process (more than 1 MB). When I test with files containing more than 1500 transactions (more than 1 MB), the BPEL process automatically goes into OFF mode or the message goes to the manual recovery queue.
    We are facing this issue in production as well, so we are using a UNIX script to split the file before placing it in the BPEL input directory. Any pointers to resolve this issue would be helpful.
    Thanks,
    Saravana

    Hi,
    Please find the answers:
    1. Currently we are using SOA 10.1.2 and JDeveloper 10g
    2. We are using the File Adapter
    3. Yes, we used debatching.
    4. Yes, I am able to recover from the manual recovery queue
    5. Please find the error message below:
    <2009-05-21 04:32:38,461> <DEBUG> <ESIBT.collaxa.cube.engine.dispatch> <Dispatcher::adjustThreadPool> Allocating 1 thread(s); pending threads: 1, active threads: 0, total: 83
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B is ready to be processed.
    <2009-05-21 04:32:44,077> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Processing file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchBegin: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) starting...
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> Inside TranslatorFactory
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> using version attribute = NXSD
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> loading xlator class...oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> class loaded
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Created translator : oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl@46908ae8
    <2009-05-21 04:32:44,098> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting up Control dir for debatching error recovery
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Control dir for debatching error recovery : /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/j2ee/home/fileftp/controlFiles/localhost_ESIBT_BPELProcess_810~1.0_/inbound
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Invoking inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Starting translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Done with translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Completed inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> isTextFile : true
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translated inbound batch index 1 of file {Input5162009.B2B} with corrupted message count = 1
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Error Reader created using charset :ASCII
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Sending message to Adapter Framework for rejection to user-configured rejection handlers : {
    fileName=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, startLine=1, startColumn=1, endLine=-1, endCol=-1, Exception=ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting batchId in NativeRecord to bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000
    <2009-05-21 04:32:44,139> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: The resource adapter 'File Adapter' requested handling of a malformed inbound message. However, the following bpel.xml activation property has not been defined: 'rejectedMessageHandlers'. Please define it and redeploy the business process. Will use the default Rejection Directory file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages for now.
    <2009-05-21 04:32:44,140> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: Sending invalid inbound message to Exception Handler:
    <2009-05-21 04:32:44,140> <INFO> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Handing rejected message to DEFAULT rejection handler: file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages since none of the configured rejection handlers [] succeeded.
    <2009-05-21 04:32:44,140> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Finished persisting rejected message to file system under the name: /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages/INVALID_MSG_BPELProcess_810_Read_20090521_043244_0140.dat
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting last error record to : -1
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translator has failed to translate any message from batch number: 1
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Message not published as translation failed: {
    File=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, batchIndex=1, PublishSize=1
    <2009-05-21 04:32:44,141> <ERROR> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchFailure: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) has failed due to: ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B after processing.
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : Input5162009.B2B
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleted file : true
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Removing file /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B from files to be processed Map.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Done processing File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:33:09,698> <DEBUG> <ESIBT.collaxa.cube.engine.data> <ConnectionFactory::getConnection> GOT CONNECTION 4 Autocommit = false
    This error message shows that the file is rejected because of a missing \t, but I am facing the same issue for all files where the load is huge.
    Thanks,
    Saravana

  • URGENT -- Performance issue while creating a huge XML file

    All XML experts, please help... thanks a lot in advance.
    We are trying to create an XML file for a huge table (5 million rows) and the performance is very, very bad. Can somebody help by giving me an idea of what my best approach could be, or what I am doing wrong in the code below?
    CREATE OR REPLACE PROCEDURE Sales_1_Generate_Xml IS
    temp_clob CLOB;
    temp_buffer VARCHAR2(1);
    amount BINARY_INTEGER := 1;
    position INTEGER := 1;
    filehandle utl_file.file_type;
    error_number NUMBER;
    error_message VARCHAR2(100);
    length_count INTEGER;
    qryctx dbms_xmlgen.ctxhandle;
    BEGIN
    qryctx := dbms_xmlgen.newcontext('select /* INDEX UF_SALES(UF_SALES_IX16) */
    TRANSACTION_NUMBER     "Transaction_Number",
    TRANSACTION_TYPE_ID     "Transaction_Type_ID",
    PROCESS_FISCAL_DATE_ID     "Process_Fiscal_Date_ID",
    INVOICE_FISCAL_DATE_ID     "Invoice_Fiscal_Date_ID",
    ORDER_FISCAL_DATE_ID     "Order_Fiscal_Date_ID",
    PROCESS_CALENDAR_DATE_ID     "Process_Calendar_Date_ID",
    INVOICE_CALENDAR_DATE_ID     "Invoice_Calendar_Date_ID",
    ORDER_CALENDAR_DATE_ID     "Order_Calendar_Date_ID",
    CURRENT_TM_ID     "Current_TM_ID",
    CUSTOMER_ID     "Customer_ID",
    CUSTOMER_TYPE_ID     "Customer_Type_ID",
    CUSTOMER_LEVEL_ID     "Customer_Level_ID",
    ACCOUNT_TYPE_ID     "Account_Type_ID",
    TRADE_CLASS_ID     "Trade_Class_ID",
    DISTRIBUTOR_ID     "Distributor_ID",
    PRODUCT_ID     "Product_ID",
    ORDERED_PRODUCT_ID     "Ordered_Product_ID",
    BRAND_TYPE_ID     "Brand_Type_ID",
    LABEL_TYPE_ID     "Label_Type_ID",
    BRAND_LABEL_ID     "Brand_Label_ID",
    PRICED_BY_ID     "Priced_By_ID",
    SALES_UOM_ID     "Sales_UOM_ID",
    PURCHASING_UOM_ID     "Purchasing_UOM_ID",
    PRICING_UOM_ID     "Pricing_UOM_ID",
    NET_COST     "Net_Cost",
    NPA_S     "NPA_S",
    CMA_S     "CMA_S",
    NOT_S     "NOT_S",
    TOTAL_NATIONAL_ALLOWANCE_S     "Total_National_Allowance_S",
    LPA_S     "LPA_S",
    LMA_S     "LMA_S",
    LOT_S     "LOT_S",
    TOTAL_LOCAL_ALLOWANCE_S     "Total_Local_Allowance_S",
    TOTAL_ALLOWANCES_S     "Total_Allowances_S",
    LPC     "LPC",
    LPC_EXTENDED     "LPC_Extended",
    LPF     "LPF",
    LPF_EXTENDED     "LPF_Extended",
    TRUE_COST     "True_Cost",
    CDE     "CDE",
    LPP     "LPP",
    SURCHARGE     "Surcharge",
    COMBINED_SURCHARGE     "Combined_Surcharge",
    TOTAL_SURCHARGES     "Total_Surcharges",
    MARKET_COST     "Market_Cost",
    INSIDE_PAD     "Inside_Pad",
    SALES_REP_COST     "Sales_Rep_Cost",
    SALES_REP_MARGIN     "Sales_Rep_Margin",
    SALES_PRICE     "Sales_Price",
    SALES_TRUE_MARGIN     "Sales_True_Margin",
    NVD     "NVD",
    LVD     "LVD",
    NID     "NID",
    LID     "LID",
    TOTAL_VD     "Total_VD",
    TOTAL_ID     "Total_ID",
    TOTAL_DEVIATIONS     "Total_Deviations",
    GP1     "GP1",
    GP2     "GP2",
    DEVIATED_COST     "Deviated_Cost",
    ACTUAL_COST     "Actual_Cost",
    SALES_TAX     "Sales_Tax",
    QUANTITY_ORDERED     "Quantity_Ordered",
    QUANTITY_SHIPPED     "Quantity_Shipped",
    QUANTITY_DEVIATED     "Quantity_Deviated",
    QUANTITY_SUBBED     "Quantity_Subbed",
    UNITS_ORDERED     "Units_Ordered",
    EACHES_ORDERED     "Eaches_Ordered",
    EACH_CONVERSION_FACTOR     "Each_Conversion_Factor",
    UNITS_SHIPPED     "Units_Shipped",
    EACHES_SHIPPED     "Eaches_Shipped",
    SHIP_WEIGHT     "Ship_Weight",
    ACTUAL_GP_DLR     "Actual_GP_Dlr",
    TRUE_GP_DLR     "True_GP_Dlr",
    LANDED_GP_DLR     "Landed_GP_Dlr",
    LANDED_ACTUAL_GP_DLR     "Landed_Actual_GP_Dlr",
    INVOICE_GP_DLR     "Invoice_GP_Dlr",
    INVOICE_ACTUAL_GP_DLR     "Invoice_Actual_GP_Dlr",
    ADJUSTED_ACTUAL_GP_DLR     "Adjusted_Actual_GP_Dlr",
    EB_S     "EB_S",
    MB_S     "MB_S",
    ACTUAL_TM_ID     "Actual_TM_ID",
    ACTUAL_TM_NAME     "Actual_TM_Name",
    ACTUAL_DSM_ID     "Actual_DSM_ID",
    ACTUAL_DSM_NAME     "Actual_DSM_Name",
    INVOICE_NUMBER      "Invoice_Number ",
    CONTRACT_NUMBER     "Contract_Number",
    CUSTOMER_NUMBER     "Customer_Number",
    CUSTOMER     "Customer",
    PRODUCT_NUMBER     "Product_Number",
    MASTER_DISTRIBUTOR_ID     "Master_Distributor_ID",
    ORDERED_PRODUCT_NUMBER     "Ordered_Product_Number",
    NATIVE_PRODUCT_STATUS     "Native_Product_Status",
    NATIVE_PRICED_BY_INDICATOR     "Native_Priced_By_Indicator",
    EXTRACTION_TIME     "Extraction_Time"
    from uf_sales where distributor_id in (''5139'',
    ''5140'',
    ''5145'',
    ''5150'',
    ''5160'',
    ''5175'',
    ''5180'',
    ''5210'',
    ''5220'',
    ''5230'')');
    DBMS_XMLGen.setRowTag(qryctx,'Sales_Record');
    DBMS_XMLGen.setRowSetTag(qryctx,'Sales_Set');
    temp_clob:=dbms_xmlgen.getxml(qryctx);
    length_count := dbms_lob.getlength(temp_clob);
    dbms_output.put_line('Internal LOB size is: ' || length_count);
    filehandle := utl_file.fopen('DATA_EXTRACT','Sales_1.xml','Wb',32767);
    WHILE length_count <> 0 LOOP
    dbms_lob.read (temp_clob, amount, position, temp_buffer);
    --utl_file.put (filehandle, temp_buffer);
    utl_file.put_raw(filehandle, utl_raw.cast_to_raw(temp_buffer));
    position := position + 1;
    length_count := length_count - 1;
    temp_buffer := null;
    END LOOP;
    dbms_output.put_line('Exit the loop');
    utl_file.fclose(filehandle);
    DBMS_XMLGen.closeContext(qryctx);
    dbms_output.put_line('Close the file');
    EXCEPTION
    WHEN OTHERS THEN
    BEGIN
    error_number := sqlcode;
    error_message := substr(sqlerrm ,1 ,100);
    dbms_output.put_line('Error #: ' || error_number);
    dbms_output.put_line('Error Message: ' || error_message);
    utl_file.fclose_all;
    END;
    END;
    /

    OK, so you are writing the file with UTL_FILE. How long is the whole process taking? Have you timed how long it takes to generate temp_clob with the result vs. the time to write the output to a file? Note also that the write loop reads and writes the CLOB one character at a time (amount = 1 with a VARCHAR2(1) buffer), which means millions of dbms_lob.read and utl_file calls for a large document; reading and writing in much larger chunks (up to 32K at a time) is usually the first thing to try.

  • Reading  huge xml files in OSB11gR1(11.1.1.6.0)

    Hi,
    I want to read a huge XML file of about 1 GB in OSB (11.1.1.6.0).
    I will be creating a (JCA) file adapter in JDeveloper and importing the artifacts into OSB.
    Please let me know the maximum file size that can be handled in OSB.
    Thanks in advance.
    Regards,
    Suresh

    It depends on what you intend to do after reading the file.
    Do you want to parse the file contents and maybe do some transformation? Or do you just have to move the file from one place to another, for example reading from the local system and moving it to a remote system using FTP?
    If you just have to move the file, I would suggest using the JCA File/FTP adapter's Move operation.
    If you have to parse and process the file contents within OSB, then it may be possible depending on the file type and on what logic you need to implement. For example, for very large CSV files you can use JCA File Adapter batching to read a few records at a time.

  • How to create an inbound IDoc from an XML file - need help urgently

    Hi,
    Can anyone tell me how to create an inbound IDoc from an XML file?
    We have an XML file on the application server (e.g. /usr/INT/SMS/PAYTEXT.xml) and we want to generate an inbound IDoc from this file. We are able to generate an outbound XML file from an outbound IDoc by using the XML port, but we are not able to generate an IDoc from the XML file using WE19 or WE16.
    Please let me know the process to trigger an inbound IDoc without using XI or any other components.
    Thanks in advance
    Dora Reddy

    Hi ... did either of you get a result on this?
    My question is really the same: I am testing with WE19 and it seems SAP cannot accept an inbound XML file as standard.
    I see lots of mentions of using a function module.
    Am I correct in saying, therefore, that ABAP development is required to create a program that runs the FM and processes the IDoc?
    Or is there something that can be done with standard SAP?
    Thanks
    Lee
