Suggestion needed for processing Big Files in Oracle B2B

Hi,
We are doing a feasibility study on using Oracle AS Integration B2B instead of TIBCO. We are presently using TIBCO for our B2B transactions. Since my client's company is planning to implement Fusion Middleware (Oracle ESB and Oracle BPEL), we are also looking at Oracle AS Integration B2B for B2B transactions (in other words, we are planning to replace TIBCO with Oracle Integration B2B if possible).
I am really concerned about one thing: receiving and processing any "big file" (around 15 MB in size) from a trading partner.
Present scenario: One of our trading partners sends invoice documents in a single file, and that file can grow up to 15 MB. In our existing setup, when we receive such big files from the trading partner (through TIBCO Business Connect - BC), TIBCO BC works fine for one or two files but crashes once it has received multiple files of that size. What exactly happens is that the memory TIBCO BC consumes to receive one such big file is not released after processing, and as a result TIBCO BC throws an "OUT OF MEMORY" error after processing a few files.
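For reference, the failure mode described above comes from buffering whole payloads in memory; the pattern that avoids it is to move the data in fixed-size chunks. A minimal Java sketch of that chunked pattern (file names are placeholders; this only illustrates the idea, not how either product works internally):
import java.io.*;

public class ChunkedCopy {
    public static void main(String[] args) throws IOException {
        InputStream in = new FileInputStream("invoices_15mb.edi");            // placeholder
        OutputStream out = new FileOutputStream("staging/invoices_15mb.edi"); // placeholder
        byte[] buf = new byte[64 * 1024]; // memory held per file stays at 64 KB
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        in.close();
        out.close();
    }
}
With this pattern the heap cost is the buffer size, not the file size, which is why a well-behaved gateway should not accumulate 15 MB of heap per inbound file.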
My questions:
     1. How robust is Oracle AS Integration B2B in terms of processing such big files?
     2. Is there any upper limit on the size of data that Oracle AS Integration B2B can receive and process?
     3. What is the average time required to receive and process such a big file (let's say 15 MB)?
     4. Is there any documentation available that discusses processing such big files through Oracle B2B?
Please let me know if you need more information.
Thanks in advance.
Regards,
--Kaushik

Hi Ramesh,
Thanks for your comment. We will try to do a POC ASAP. I will definitely keep in touch with you during this.
Thanks a bunch.
Regards,
--Kaushik

Similar Messages

  • Approach needed for processing huge file using file-jdbc

    Hi,
    My scenario is file-jdbc. I need to update the records in a database table. The size of the file would be 500 MB.
    1) Will "recordsets per message" on the sender-side FCC help me in processing the file? Is there any other better solution, or any configuration that needs to be checked on the PI system, since processing should finish within 3 hours?
    2) I need to update another table with the number of records processed per cycle, along with the time. How do I achieve this?
    TIA

    I would suggest you go with stored procedures, as follows:
    1) Use recordsets per message on the sender-side FCC (already mentioned by you).
    2) Create a SP and pass the source payload as XML input to the SP.
    Your target structure will be something like this:
    http://help.sap.com/saphelp_nw04/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm
    <StatementName>
    <storedProcedureName action="EXECUTE">
    <table>realStoredProcedureName</table>
    <param1 type="SQLDatatype">val1</param1>
    </storedProcedureName>
    </StatementName>
    Pass your source data as XML input to param1.
    If you are using PI 7.1 then check this:
    http://www.sappi.sapag.co.in/flat-file-to-file-senario/convert-the-input-xml-to-string-in-pi-7-1-using-standard-graphical-mapping-2/
    On the database side, parse the string (the XML document) and then insert the data into the table. Check with your DB team regarding the same.
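    For what it's worth, the call issued for action="EXECUTE" behaves like an ordinary JDBC stored-procedure call, so the SP can be tested in isolation before wiring it into PI. A sketch of that equivalent call (procedure name, connect string, credentials, and payload are all hypothetical):
    import java.sql.*;

    public class CallSpWithXml {
        public static void main(String[] args) throws SQLException {
            String xmlPayload = "<Records><Row>...</Row></Records>"; // hypothetical payload
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:orcl", "user", "pass"); // placeholders
            CallableStatement cs = conn.prepareCall("{call LOAD_RECORDS(?)}"); // hypothetical SP
            cs.setString(1, xmlPayload); // the whole source payload goes in as one XML string
            cs.execute();
            cs.close();
            conn.close();
        }
    }
    If the SP parses the XML and inserts rows itself, the database does the heavy lifting and PI only ships one message per recordset, which is what makes the 3-hour window plausible for a 500 MB file.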

  • What do I need to process raw files in Photoshop CS5 for a Canon Rebel EOS t4i?

    What do I need to process raw files in Photoshop CS5 for a Canon Rebel EOS t4i?

    You would need ACR 7.x, which is available in CS6. Or, as ertho said, you can get the free DNG 7.3 converter and convert the raw files to DNG.

  • BPM design for trigger-based file from Oracle

    Hi
    We have one requirement, as follows:
    1. We need to receive a trigger file from Oracle.
    2. As soon as we receive the trigger file, it needs to activate all the other 10 JDBC adapter channels.
    3. Once all 10 RFC channels have completed, it needs to pass that trigger file on.
    Please suggest a design.
    Thanks
    Siva

    Hi Siva,
    1. We need to receive a trigger file from Oracle.
    Define a sender CC (FILE) which polls a directory every N seconds.
    The receiver of this message should be your BPM.
    2. As soon as we receive the trigger file, it needs to activate all the other 10 JDBC adapter channels.
    In your BPM, trigger your 10 sender CCs (JDBC). See SAP Help and the blogs to learn how to trigger a CC externally. Easy to do.
    3. Once all 10 RFC channels have completed, it needs to pass that trigger file on.
    Then, still in your BPM, you have to do a correlation on the 10 SQL responses (that's crazy!). Several blogs and threads cover this subject.
    And after that, what do you do in BPM with these 10 SQL responses (that's crazy!)? Do you have to merge the data?
    Welcome to the birth of a future monster... for development and maintenance... Sincerely, simplify your flow!
    Question: do you really need 10 SQL calls? For your needs, is it not possible to create a stored procedure in the database which will do the 10 SQL statements (with table joins)? If yes, do it; this way you will have only one sender CC (JDBC) to trigger, and no correlation in a BPM. That will greatly simplify your flow.
    Regards
    Mickael

  • Calculating hash values for really big files

    I am using the following code to calculate the hash values of files:
    public static String hash(File f, String algorithm)
                throws IOException, NoSuchAlgorithmException {
            if (!f.isFile()) {
                throw new IOException("Not a file");
            }
            RandomAccessFile raf = new RandomAccessFile(f, "r");
            byte b[] = new byte[(int) raf.length()];
            raf.readFully(b);
            raf.close();
            MessageDigest messageDigest = MessageDigest.getInstance(algorithm);
            messageDigest.update(b);
            return toHexString(messageDigest.digest());
        }
    Now the problem is, for really big files, 100 MB or over, I get an OutOfMemoryError.
    I have used the -Xms and -Xmx options to increase the JVM heap size, and ultimately made it work. However, I think this is lame, and there is also a limit to the -Xmx option.
    Is there any other way I can calculate the hash values of these really big files?
    Thanks a lot in advance.

    Why do you open the file the way you do? Why do you load the WHOLE file into memory AT ONCE?
    I would do it like this:
    FileInputStream fis = new FileInputStream(f);
    MessageDigest messageDigest = MessageDigest.getInstance(algorithm);
    byte[] buffer = new byte[8192];
    int read;
    while ((read = fis.read(buffer)) != -1) {
        messageDigest.update(buffer, 0, read);
    }
    fis.close();
    return toHexString(messageDigest.digest());
    Only one small buffer is in memory at a time, no matter how big the file is, so there is no OutOfMemoryError and no need to touch -Xmx.
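    A slightly tidier variant of the same streaming idea, for reference: java.security.DigestInputStream updates the digest as a side effect of reading, so the loop body stays empty. A sketch, assuming the toHexString helper from the original post:
    import java.io.*;
    import java.security.*;

    public static String hash(File f, String algorithm)
            throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance(algorithm);
        InputStream in = new DigestInputStream(
                new BufferedInputStream(new FileInputStream(f)), md);
        byte[] buf = new byte[8192];
        while (in.read(buf) != -1) {
            // nothing to do: the stream feeds every byte it reads into md
        }
        in.close();
        return toHexString(md.digest());
    }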

  • EDIFECS Mapping not found in the X12 Output file in Oracle B2B 11g

    Hi,
    We are using Oracle SOA Suite 11g, and we have created an outbound process for EDI 855, which is a PO Acknowledgement.
    We are using XML Gateway for the outbound message and consuming the message in BPEL. We map the OAG XML to EDIFECS XML to be consumed by Oracle B2B. We created the EDIFECS xsd and ecs files using the Oracle B2B document editor and got them validated.
    Now the issue is that we have mapped a few elements in BPEL, and the same data is found in the payload message in Oracle B2B, but we are unable to see the data in the X12 output file.
    Any thoughts will be highly appreciated.
    Thanks
    Sathish

    Hi,
    Thanks for your update.
    The following elements were missing from the native X12 flat file. I verified that these elements are present in the ecs using the Oracle B2B document editor.
    <ns0:Segment-PO1>
      <ns0:Element-355>EA</ns0:Element-355>
      <ns0:Element-235>UI</ns0:Element-235>
      <ns0:Element-234>UPCXREF</ns0:Element-234>
    </ns0:Segment-PO1>
    <ns0:Segment-ACK>
      <ns0:Element-355 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:nil="">EA</ns0:Element-355>
      <ns0:Element-373>20100318</ns0:Element-373>
    </ns0:Segment-ACK>
    Thanks
    Sathish

  • ........ Expert suggestion needed for Loading Data in DB

    Hi all
    The current requirement we have is to load data from an Excel file into an Oracle database.
    The Excel file has around 30 sheets, each corresponding to a table (meaning 30 tables in the database).
    Currently we are using sqlldr commands to load data from CSV files into the Oracle database. The only problem is that sometimes the DBA has to go out or is busy working on something else, so may I kindly get some expert suggestions on how to automate this data-loading task.
    Somebody has suggested to my manager (who is not aware of Oracle or IT at all) that the data can be loaded via ODBC. It was suggested to him that all we need to do is place the CSV files in a particular folder on the server and Oracle will do the rest.
    I am not that proficient in data-loading methods, so may I know of any technique which will simplify/automate the task?
    I mean, how can we automate it so that the SQL*Loader scripts run at the command prompt every time? I think the database (Oracle) has nothing to do with the command prompt, isn't that so?
    Kindly share your expert/experienced suggestions.
    I would be highly grateful to all.
    Regards

    To automate sqlldr scripts, you would usually write an OS script file that periodically looks for files in a given directory and runs the sqlldr commands:
    - on Windows: .bat or .wsh files, scheduled with the "at" command or the Windows Scheduled Tasks tool;
    - on Unix: a shell script in the crontab.
    Much more complicated, but without any OS script:
    - write PL/SQL code that reads and parses the file to be loaded using the UTL_FILE package;
    - this PL/SQL code must also generate INSERT statements and process errors;
    - this PL/SQL code can be scheduled with the DBMS_JOB package to run at periodic intervals.
    (A Java-based variant of the polling approach is sketched below as well.)
    Message was edited by:
    Pierre Forstmann
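    If neither an OS scheduler nor a PL/SQL job is an option, a small Java poller can play the same role as the OS script above. A minimal sketch (directory, credentials, and the control-file naming convention are all placeholders, and it assumes sqlldr is on the PATH):
    import java.io.File;

    public class SqlldrPoller {
        public static void main(String[] args) throws Exception {
            File dir = new File("/data/incoming"); // placeholder directory
            while (true) {
                File[] files = dir.listFiles();
                if (files != null) {
                    for (File csv : files) {
                        if (!csv.getName().endsWith(".csv")) continue;
                        // assumes one control file per sheet/table, named after the CSV
                        ProcessBuilder pb = new ProcessBuilder(
                                "sqlldr", "scott/tiger@orcl", // placeholder credentials
                                "control=" + csv.getName().replace(".csv", ".ctl"),
                                "data=" + csv.getAbsolutePath());
                        pb.inheritIO();
                        int rc = pb.start().waitFor();
                        if (rc == 0) {
                            // mark the file done so it is not loaded twice
                            csv.renameTo(new File(csv.getAbsolutePath() + ".done"));
                        }
                    }
                }
                Thread.sleep(60 * 1000); // poll once a minute
            }
        }
    }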

  • Problem in JAXB for processing XML files

    Hello,
    I have been working on a project where I need to process data in XML format. The flow goes thus:
    I have 28 data elements that I need to represent as XML, so I compile the schema files and generate the class files for each of the tags; thus I can use the get and set methods to read from and write to an XML file (for example, getName and setName).
    Now the problem is that my coding is done. If I change my XML file and add, say, 2 more tags, how do I handle it in my code?
    1) Do I have to recompile the schema file and generate new class files every time the XML structure changes? Can I avoid this recompiling process and use the one-time generated class files even if the XML structure changes?
    2) I have hard-coded the get and set methods for processing the XML file. If I add new tags to my XML, I won't have the get/set methods for the new tags in my code (say I add a new tag Phone; then getPhone and setPhone would not exist in my code, since the tag was added after the coding was done). How do I handle this situation? Is it possible to get and set data without these methods, using some sort of dynamic way of getting and setting data?
    3) Is any other approach available that meets the above requirements, other than JAXB?
    Please help with the above problem.
    Thank you

    Hi,
    I had written an XML file and a schema to validate it.
    My XML would be:
    <output>
      <table>
        <row>
          <column></column>
          <column></column>
        </row>
      </table>
      <document>
        <properties>
        </properties>
        <contents>
        </contents>
      </document>
      <table>
        <row>
          <column></column>
          <column></column>
        </row>
      </table>
      <document>
        <properties>
        </properties>
        <contents>
        </contents>
      </document>
      <table>
        <row>
          <column></column>
          <column></column>
        </row>
      </table>
      <document>
        <properties>
        </properties>
        <contents>
        </contents>
      </document>
    </output>
    The schema should validate that each table contains at least one row element and each row element has at least one column. Similarly, each document should have at least one properties element and one contents element.
    If any of these violations occur, I need to act on them. For example, if there is no row element in a table, I need to delete the table tag. Similarly, if there is no properties or contents element (or both) in a document, it should delete the corresponding document from the XML.
    I tried this for table: if there is no row element, I get the line number of the </table> tag, and based on that I delete the table element. But when there is no properties tag and a contents tag is present, I only get the line number of the <contents> start tag, with which I cannot delete the whole document.
    Can anybody please help me out with this requirement?
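    For what it's worth, pruning on the parsed DOM tree instead of by line numbers sidesteps this problem entirely. A minimal sketch against the document shape above (file names are hypothetical; the row-needs-a-column rule would follow the same pattern):
    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.*;

    public class PruneEmpty {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(new File("output.xml"));
            List<Node> doomed = new ArrayList<Node>();
            // a <table> with no <row> child has to go
            NodeList tables = doc.getElementsByTagName("table");
            for (int i = 0; i < tables.getLength(); i++) {
                Element t = (Element) tables.item(i);
                if (t.getElementsByTagName("row").getLength() == 0) doomed.add(t);
            }
            // a <document> missing <properties> or <contents> has to go
            NodeList docs = doc.getElementsByTagName("document");
            for (int i = 0; i < docs.getLength(); i++) {
                Element d = (Element) docs.item(i);
                if (d.getElementsByTagName("properties").getLength() == 0
                        || d.getElementsByTagName("contents").getLength() == 0) doomed.add(d);
            }
            // delete after scanning, so the live NodeLists are not shifting under the loops
            for (Node n : doomed) n.getParentNode().removeChild(n);
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(new File("pruned.xml")));
        }
    }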

  • Need help processing XML files

    I'm fairly new to Java and have never worked with XML. I need to process several XML files and display them in a matrix for comparison, and I'm not sure if I need to understand SAX, DOM, or some other API. I'm not creating the files; I'm just parsing them to put into a table so they can be displayed for comparison. I could be processing up to several hundred files, so performance would also be an issue.
    I'm just looking for options and possible areas where I can begin to look for help. Can anyone tell me which APIs would help me achieve my goal?
    Any help would be greatly appreciated.
    Thanks

    Thanks for the response, Kev. I guess a better description of the problem would be to say that I want to display the attributes and values of the different files in a matrix.
    For example, here are 2 sample XML files with the same tags and attributes.
    XML FILE A
    <tag1 name="Fred" country="ca">
      <tag2 group-name="Group 1">
        <tag3 color="blue">
        </tag3>
      </tag2>
    </tag1>
    XML FILE B
    <tag1 name="Sue" country="us">
      <tag2 group-name="Group 2">
        <tag3 color="red">
        </tag3>
      </tag2>
    </tag1>
    I would like to have them displayed as follows:
                         XML FILE A    XML FILE B
    tag1 name            Fred          Sue
    tag1 country         ca            us
    tag2 group-name      Group 1       Group 2
    tag3 color           blue          red
    HTML tags didn't work, so the matrix was bunched together, but I think you get the idea. My question is: which would be better to use, SAX or DOM? I could be running this on as many as 50 or 60 files, so the matrix could get very large, and performance could be an issue, as multiple users could be doing this comparison at the same time.
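    Either parser can feed such a matrix; a DOM-based sketch of the extraction step, for illustration (file names are hypothetical):
    import java.io.File;
    import java.util.LinkedHashMap;
    import java.util.Map;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.*;

    public class AttributeMatrix {
        // collect "tagName attrName" -> value for every element in the file
        static Map<String, String> attributes(File f) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(f);
            Map<String, String> out = new LinkedHashMap<String, String>();
            collect(doc.getDocumentElement(), out);
            return out;
        }

        static void collect(Element e, Map<String, String> out) {
            NamedNodeMap attrs = e.getAttributes();
            for (int i = 0; i < attrs.getLength(); i++) {
                Attr a = (Attr) attrs.item(i);
                out.put(e.getTagName() + " " + a.getName(), a.getValue());
            }
            NodeList kids = e.getChildNodes();
            for (int i = 0; i < kids.getLength(); i++) {
                if (kids.item(i) instanceof Element) collect((Element) kids.item(i), out);
            }
        }

        public static void main(String[] args) throws Exception {
            Map<String, String> a = attributes(new File("fileA.xml")); // hypothetical
            Map<String, String> b = attributes(new File("fileB.xml")); // hypothetical
            for (String key : a.keySet()) {
                System.out.printf("%-20s %-10s %-10s%n", key, a.get(key), b.get(key));
            }
        }
    }
    DOM is the simpler fit here: each file is parsed, flattened into a map, and discarded, so memory scales with one document at a time. SAX would only start to pay off if the individual files were very large.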

  • How to Process flat File in Oracle Apps through Concurrent Program

    Hello Everyone,
    My client has a request to process a bank file (Lockbox), which is a flat file that will be copied onto a UNIX box, and I will have to create a new concurrent request that will process this flat file and update receipt information in the Oracle Apps database tables.
    Could you please suggest whether there are any standard Oracle Apps functions (for example, FND ones) available which can be used through a concurrent program to open a file in a particular directory, read from the flat file, and close the file after processing.
    Please let me know if you have a small example; that would help me a lot.
    Thanks

    There are standard concurrent programs in Accounts Receivable that consume lockbox flat files. Please see the AR Setup/User Guides at
    http://download.oracle.com/docs/cd/B40089_10/current/html/docset.html
    Srini

  • Urgent Need for creating 100 tables in Oracle.

    Hello All,
    I need to create 100 tables in Oracle using a loop.
    Please suggest. Thanks in advance for your efforts.
    Anto

    I am getting the following error at run time when executing the procedure:
    ERROR at line 1:
    ORA-01031: insufficient privileges
    ORA-06512: at "ORAUSER.CREATE_100_TABLE", line 9
    ORA-06512: at line 1
    The script goes here:
    create or replace procedure create_100_table
    is
        v_sql_string varchar2(200);
    begin
        for i in 1..100 loop
            v_sql_string := 'create table ajames' || i || ' as select * from emp';
            execute immediate v_sql_string;
        end loop;
    end;
    /

  • Validation program needed for the XML file generated from Oracle Apps

    Hi,
    I generated an XML file from a concurrent program. I need to perform validation on that XML file. Is there any program which can validate it?
    It is very urgent.
    Thanks & Regards,
    Sireesha

    $ORACLE_HOME/bin/schema
    Usage: schema [flags] <instance> [schema] [working dir]
    Where:
        <instance>    is the XML instance document to validate (required)
        [schema]      is the default schema (optional)
        [working dir] is the working directory for processing (optional)
    Flags:
        -0                Always exit with code 0 (success)
        -c                Extra tests to improve code coverage
        -e <encoding>     Specify default input file encoding
        -E <encoding>     Specify output/data/presentation encoding
        -i                Ignore provided schema file
        -o <num>          Validation options
        -p                Print instance document to stdout on success
        -v                Show version number
        -u                forced to Unicode path
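    If you would rather validate from code than shell out to that utility, the standard JAXP validation API does the same job. A minimal sketch (file names are hypothetical):
    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class ValidateInstance {
        public static void main(String[] args) throws Exception {
            SchemaFactory sf =
                    SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = sf.newSchema(new File("report.xsd"));     // hypothetical schema
            Validator validator = schema.newValidator();
            // throws SAXException with line/column details if the instance is invalid
            validator.validate(new StreamSource(new File("report.xml")));
            System.out.println("Valid.");
        }
    }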

  • Tips for downloading big files(World Of Warcraft)

    I still need to download 8 GB more of the game. I'm downloading directly from Blizzard and it's going at about 150 KB/s. I'm just wondering what tips you all have for downloading big files like this. Also, if my MacBook Pro goes to sleep, will it slow down the download? That's what seems to happen when I leave it on overnight.

    Do it in the evening and overnight, and set the MBP to not sleep: System Preferences > Energy Saver > Computer Sleep = Never.

  • Urgent Suggestions needed for best way to solve the problem

    Hi Everybody,
    I am working on an application which has to graphically show the data in a database. We are using JSP for the front end (the view for the time being will be simple, with text boxes, frames and DHTML) and Tomcat as the server. The data is huge and there isn't much business logic involved (when the user clicks on a URL, I get the data pertaining to that particular table or column). Now my question is: can Tomcat handle such huge data? What do you suggest, should I cache data on my server and serve it to the clients (instead of connecting to the DB for each and every request)? If yes, can you please suggest a good way to cache the data, i.e., what TableModel or HashMap could store data from the different tables? Note also that the DB goes without updates for long stretches every day. Finally, let me know if any of you still think a 3-tier approach is advisable instead of the 2-tier approach.
    Thanks to the guys who made it this far

    Well, of course a 3-tier approach is advisable. Also, yes, Tomcat can handle what you are doing. As far as caching data on the server goes, that is going to depend on your database and other requirements. Now, about the graphical part: you aren't going to be able to use JSP to draw graphics. The only way you can do it is to draw the graphics on the server side and write out JPG files, but then the JSP won't know anything about where the image is, unless you always write the same number of images, in the same place, every time. Otherwise, you need an applet. Hope that helps.
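    On the caching point: since the data changes rarely, even a simple time-based cache in the web tier avoids hitting the DB on every request. A sketch (the row representation and the loadFromDb helper are hypothetical):
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class TableCache {
        private static final long TTL_MS = 60L * 60 * 1000; // refresh at most hourly
        private final Map<String, List<String[]>> cache = new HashMap<String, List<String[]>>();
        private long loadedAt;

        public synchronized List<String[]> rows(String table) {
            if (System.currentTimeMillis() - loadedAt > TTL_MS) {
                cache.clear(); // stale: drop everything and reload lazily
                loadedAt = System.currentTimeMillis();
            }
            List<String[]> rows = cache.get(table);
            if (rows == null) {
                rows = loadFromDb(table); // hypothetical JDBC query
                cache.put(table, rows);
            }
            return rows;
        }

        private List<String[]> loadFromDb(String table) {
            // replace with a real JDBC "SELECT * FROM <table>" that copies rows into memory
            return new ArrayList<String[]>();
        }
    }
    One shared instance (for example, stored in the ServletContext) would serve all requests; the synchronized accessor keeps concurrent users from loading the same table twice.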

  • Examples needed for IDoc to file and IDoc to web services

    Hi,
    Could any one of you give me some examples which take me through, step by step, building IDOC-to-FILE and IDOC-to-WEB-SERVICE scenarios?
    Regards,
    XI Developer.

    Hi,
    For an IDoc scenario you need to first do the required configuration:
    ALE configuration for pushing idocs from SAP to XI
    /people/swaroopa.vishwanath/blog/2007/01/22/ale-configuration-for-pushing-idocs-from-sap-to-xi
    For testing purposes you can use the method below:
    IDOC testing using WE19
    /people/sameer.shadab/blog/2005/07/25/reposting-idocs-instead-of-recreating--for-testing-purpose-xi
    /people/prateek.shah/blog/2005/06/08/introduction-to-idoc-xi-file-scenario-and-complete-walk-through-for-starters --> for the IDoc sender side: IDoc-to-file
    IDOC configuration:
    Please follow the below process for configuration:
    Prerequisites for an inbound IDoc to R/3 from PI:
    Configuration required on the XI side:
    Go to IDX1: configure the port.
    Go to IDX2: load the IDoc metadata.
    Go to SM59: create an RFC destination which points to the R/3 system; this is required when your IDoc is sent to the R/3 system.
    Configuration required on the R/3 side:
    Maintain a Logical System for PI (SALE transaction).
    Maintain a Partner Profile for the XI system (WE20).
    Prerequisites for an outbound IDoc from R/3 to PI:
    Configuration required in R/3:
    Maintain Logical System (SALE)
    Define RFC Destination (SM59) which points to the PI system
    Maintain Port (WE21)
    Maintain Partner Profile (WE20)
    Maintain Distribution Model (BD64)
    File To IDOC - Part1 (SLD and Design):
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/profile/2007/05/11/fileToIDOC&
    File To IDOC - Part2 (Configuration):
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/profile/2007/05/11/fileToIDOC-Part2+(Configuration)&
    File To IDOC - Part3 (Steps required in XI and R3):
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/profile/2007/05/11/fileToIDOC-Part3(StepsrequiredinXIandR3)&
    SOAP scenario:
    You have to first create the WSDL through ID and import that WSDL into the IR as an external definition.
    Refer the below thread and pdf:
    How to use SOAP adapter:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40611dd6-e66e-2910-f383-e80fb44f9cd4
    SAP AII - How to consume and expose web services?
    Thanks
    Chirag
    reward points if it helps.
