Fast data unload

I am awfully sorry, guys, but I can't find the right place to ask my question. I need to unload some data from my table very fast. Say I have a table with 3-15M records, and I need to unload the records which have not been unloaded yet (I have an attribute for that). I wrote a PL/SQL stored procedure for this (using dynamic SQL, since the table name is not constant) and I see about 800 KB/s write speed. I understand that's not a sensible number yet, because when I move data inside the database I can see speeds of up to 200 MB/s (I am looking at iostat -x on RHEL 4). Today I wrote a Pro*C program to unload the data and I am able to achieve almost 1500 KB/s, almost twice as fast as PL/SQL! Yes, I saw http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:459020243348
but I think I have a bit more efficient program. So, what do you think, am I able to achieve more speed?
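For context, here is a minimal sketch of the kind of PL/SQL unload loop described above. Everything in it is hypothetical (the table T, its ID/PAYLOAD columns, the UNLOADED flag and the DUMP_DIR directory object), and the real procedure builds its statement with dynamic SQL over a variable table name, which this sketch leaves out:

-- Sketch only: assumes an Oracle DIRECTORY object DUMP_DIR and a table T
-- with a flag column UNLOADED marking rows that still need to be exported.
DECLARE
  TYPE t_tab IS TABLE OF t%ROWTYPE;
  l_rows t_tab;
  l_file UTL_FILE.FILE_TYPE;
  CURSOR c IS SELECT * FROM t WHERE unloaded = 'N';
BEGIN
  l_file := UTL_FILE.FOPEN('DUMP_DIR', 't.dat', 'w', 32767);
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_rows LIMIT 1000;  -- fetch in batches, not row by row
    EXIT WHEN l_rows.COUNT = 0;
    FOR i IN 1 .. l_rows.COUNT LOOP
      UTL_FILE.PUT_LINE(l_file, l_rows(i).id || '|' || l_rows(i).payload);
    END LOOP;
  END LOOP;
  CLOSE c;
  UTL_FILE.FCLOSE(l_file);
  -- mark the exported rows; a real procedure would track exactly which rows it wrote
  UPDATE t SET unloaded = 'Y' WHERE unloaded = 'N';
  COMMIT;
END;
/

Batching the fetches and writing through UTL_FILE is usually the main lever inside PL/SQL; a Pro*C or OCI program presumably gains the rest by array-fetching and writing the file outside the PL/SQL engine.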

> but I think I have a bit more efficient program.
Can you share the program (or the changes that give you that extra efficiency) so others in these forums can benefit from that?
> It provides 4 times better performance than the scripts suggested by Kyte.
I would not necessarily want to compare a free program available as a courtesy to a commercial product that needs to be paid for.

Similar Messages

  • Upload data from excel file to "Fast Data Entry" in FB60

    Hi,
    I have a requirement from users: they want to load G/L account line item data into FB60 using "Fast Data Entry" from an Excel file.
    I did some research and found that the framework for transferring clipboard data into an SAP table (http://www.synactive.com/examples/example0016.html) seems to be a suitable solution.
    However, I still cannot make it work.
    Do you have any suggestions for my case? How can I load data from an Excel file into transaction FB60?
    Thanks in advance,
    Hung

    Dear Hung,
    You can upload the data through LSMW (Batch Input).
    But you need to convert your Excel sheet into CSV or TXT format first.
    Please ask your ABAP team for help if required.
    This should cover your requirement.
    Regards
    Saurabh

  • I bought a new iMac today. I'm using migration assistant to move all my software, but the time just keeps getting longer. It says connect an Ethernet cable for faster data transfer. I did, but that doesn't seem to help. Any ideas?


    m1doc,
    Are you migrating from a Mac or an MS Windows machine? Either way, you should probably be in touch with AppleCare; you have 90 days of free AppleCare telephone support. They can usually help with issues like this. If you don't know the phone number, please use http://support.apple.com/kb/HE57 to help find the number for your country.

  • Solaris 8 07/01 installation : Fast Data Access MMU Miss

    hello,
    Because of a hardware crash of my 9.1Gb hard drive, I plugged a brand new 40Gb drive in my Ultra 10 workstation.
    I tried to re-install my release of Solaris 8 (06/00), but that release detects only 5 GB of the drive (the installation works, though).
    I downloaded and burned Solaris 8 release 07/01 to get past the 32 GB-per-drive limit, but when I type 'boot cdrom' or 'boot cdrom install' at the boot prompt, I get:
    Fast Data Access MMU Miss.
    I upgraded the PROM to the latest version and tried re-burning the CD.
    Does anyone have an idea that might help me?
    Greg.

    974009 wrote:
    > Hello all,
    > I have been working on a Sun Netra T1 105 for the past two weeks.
    That system is too old for Solaris 11 and Solaris 10. That generation of UltraSPARC module isn't going to be recognized by those newer OSes.
    I suggest you go to your preferred online auction web site (for example, eBay) and get any Solaris 8 (SPARC) or Solaris 9 (SPARC) media for that little 1U box.
    > I have disabled auto boot so when it starts it is going to the ok prompt.
    Smart. Good move for troubleshooting.
    > The CD of Solaris 11 is a new one that I burned just two weeks ago, and I burned it at 24x speed.
    > And it is the SPARC version that I downloaded.
    > I also tried installing a DVD-ROM with the Solaris 10 DVD but it gave the same error.
    I hope that means you replaced the original CD reader with a DVD-capable drive.
    > When I type probe-ide it shows that the secondary master is a Removable ATAPI Model: TOSHIBA CDROM XM-7002Bc
    Yes. Since that model of computer was discontinued in 2002, there was only a CD-capable drive ever qualified to be used in it.
    > When I type probe-scsi it shows:
    > Target 0
    > Unit 0 Disk Seagate ST39103LCSUN9.0G034A
    A 9 GB disk isn't going to be large enough for much more than just a core install of Solaris. You may again need to get larger drive(s) at that same online auction web site. The chassis can hold two internal disks.
    > Thank you very much.
    > Hendrik
    When you get the chance, glance at this link to a copy of the Netra T105's resource page as it appeared in the old Sun System Handbook (SSH). The link is to a non-Oracle site that mirrors the old, now-discontinued free version of that SSH.
    Next, while there's not much system documentation for that box offered by Oracle, you can glance through what there happens to be here:
    http://docs.oracle.com/cd/E19628-01/index.html

  • Can I rename the heading of the Date1 field in the VA01 transaction under the fast data tab?

    Can I rename the heading of the Date1 field in the VA01 transaction, under the fast data tab, to "Confirmed date"?

    Hi,
    try the following user exit:
    enhancement V45A0002, function exit EXIT_SAPMV45A_002
    Regards,
    Renjith Michael.

  • Saving slow and fast data on the time stamp..

    Hi! I'm retrieving fast data from a DAQ device (dt = 0.01 s) and slower data from a GPS receiver (dt = 1 s). I need to save all the data on a common time axis.
    Is there a way to put a zero in the slow-data column when a reading is missing?
    (Sorry for the stupid question, I'm a LabVIEW beginner.)
    thanks
    Attachments:
    GPS_and_fast_data.vi ‏232 KB

    I tried to make it work all day. The attached VI demonstrates what looks to be the resolution of my problem.
    Attachments:
    GPS_and_fast_data_rev1.vi ‏201 KB

  • VC Characteristic Values Missing in Fast Data Entry of Sales Order

    Hi Experts
    We are working on ECC 6.0 with Variant Configuration. We are facing a problem with sales orders entered through Fast Data Entry: one VC characteristic is missing after release of the sales order, and we are not able to find the characteristic in the instance that is finally saved. What is the resolution for such a problem?

    Hi ,
    Check the requirement type and requirement class with item category assignments ( requirement type KEK to 45 )

  • Need faster data loading (using sql-loader)

    I am trying to load approx. 230 million records (around 60 bytes per record) from a flat file into a single table. I have tried SQL*Loader (conventional load) and I'm seeing performance degrade as the file is being processed. I am avoiding direct-path loading because I need to maintain uniqueness using my primary key index during the load, so the degradation of the load performance doesn't shock me. The source data file contains duplicate records and may contain records that are duplicates of those already in the table (I am appending during the SQL*Loader run).
    My other option is to unload the entire table to a flat file, concatenate the new file onto it, run it through a unique sort, and then direct-path load it.
    Has anyone had a similar experience? Any cool solutions available that are quick?
    thanks,
    jeff

    It would be faster to suck the data into an Oracle staging table (call it a temporary table) and then make a final move into the final table.
    This way you could direct load into an Oracle table, and then you could
    INSERT /*+ APPEND */ INTO Final_Table
        SELECT DISTINCT *
        FROM   Temp_Table
        ORDER BY ID;
    This would do a 'direct load' type move from your temp table into the final table, which automatically merges the duplicate records.
    So
    1) Direct Load from SQL*Loader into temp table.
    2) Place index (non-unique) on temp table column ID.
    3) Direct load INSERT into the final table.
    Step 2 may make this process faster or slower, only testing will tell.
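    For step 1, a direct-path control file could look something like the sketch below. This is only an illustration; the data file, table, and column names are made up:
        -- sketch.ctl (hypothetical names)
        LOAD DATA
        INFILE 'records.dat'
        APPEND
        INTO TABLE temp_table
        FIELDS TERMINATED BY '|'
        (id, payload)
    It would be invoked with the direct path option enabled (sqlldr userid=... control=sketch.ctl direct=true) before building the non-unique index (step 2) and running the INSERT /*+ APPEND */ shown above (step 3).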
    Good Luck,
    Eric Kamradt

  • Very slow network directory listing - but fast data transfer speed once listed?

    Hello,
    I have really tried to sort this myself before opening up to the community, however I have run out of ideas, and hope someone can offer the magic solution I have missed.
    I am currently using the 3.4 GHz i7 iMac on a gigabit LAN, running OS X 10.7.2, connecting to a Windows Server 2008 R2 machine over Ethernet.
    If I go to a network directory that I haven't recently accessed, it can take up to 60 seconds to show the contents of that directory. Once I have accessed that folder, if I come out of it and go back in it will be instant again, but the first time it lists the directory it looks like I have opened an empty folder, which after anything from 10 seconds to 1 minute will suddenly show the files that are there.
    Internet connectivity is fast through the network, and file transfers across the LAN are fast (showing as approx 300mb per second). I can play and edit HD content across the network with no slowdown, so I am confident that this issue is not related to the network speed itself and is more to do with a setting on this Mac.
    Symptoms are very similar to this post: https://discussions.apple.com/message/12245148?messageID=12245148#12245148 - however I understand that in OS X Lion SMB was removed, so I cannot find this file to edit.
    I have tried bypassing additional hubs in the network by wiring direct cables to the switch that is connected to the file server, this made no difference.
    I have also tried disconnecting the ethernet cable, and running over wifi. This fixes the listing problem, but when editing HD content over a network drive, this connection is not fast enough to carry the data without interruption (some projects are linked to up to 900gb of hd video content!)
    Using ethernet, I have tried DHCP, DHCP with manual address, and manual mode. All reproduce this problem. i have tried using the windows workgroup, and tried without it.
    I have also followed this suggestion: https://discussions.apple.com/thread/2134936?threadID=2134936&tstart=45 and used OpenDNS. this did not fix the issue.
    For argument sake, I have also just tested a Macbook Pro running Snow Leopard to see if it was OS related. This reproduces the exact same problem, near instant directory listing on the wifi, a long and arduous wait on ethernet.
    I cannot work out why directory listing is instant over wifi, but not over ethernet on 2 different macs, running 2 different versions of OSX. I also do not understand why if the network is having trouble listing the directories - the data transfer speed is 300mbps when i copy files across the wired network from the file server to the mac.
    Does anyone have any other ideas as to what could be the problem here? We are about to start work on a very large project, where the content we are editing is spread out across around 200 different network folders (different shoots captured over the past 2 years). We really don't have the time to wait 60 seconds each time we need to access one of those directories to look for a file, and I am very close to pulling all my hair out!
    I really look forward to hearing from anyone who can offer any insight.

    If you suspect that a Windows update had something to do with your LAN going slow, then try the following:
    1.  Look for updates for your client's LAN NIC driver; or
    2.  Uninstall the updates.

  • Fast data retrieval from table BSAD

    Dear Friends,
    I want to retrieve data from table BSAD, which right now has more than 1,300,000 records.
    For the WHERE condition I have the customer number, the document number, the document type fixed to RV, and only the debit side (S).
    What should I do for fast retrieval? Is there any function module for this? Right now data fetching is too slow with a simple SELECT.
    regards
    Malik

    Hi Malik,
    Pass all the key fields present in the database table; in your current selection you are using only two of them. The key fields of table BSAD are:
    Field   Data element   Type   Length   Description
    BUKRS   BUKRS          CHAR      4     Company Code
    KUNNR   KUNNR          CHAR     10     Customer Number 1
    UMSKS   UMSKS          CHAR      1     Special G/L Transaction Type
    UMSKZ   UMSKZ          CHAR      1     Special G/L Indicator
    AUGDT   AUGDT          DATS      8     Clearing Date
    AUGBL   AUGBL          CHAR     10     Document Number of the Clearing Document
    ZUONR   DZUONR         CHAR     18     Assignment Number
    GJAHR   GJAHR          NUMC      4     Fiscal Year
    BELNR   BELNR_D        CHAR     10     Accounting Document Number
    BUZEI   BUZEI          NUMC      3     Number of Line Item Within Accounting Document
    Therefore pass as many key fields as possible to improve the performance of the select query.
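    To illustrate the shape of such a selection (written here as plain SQL for brevity; the actual call would be an ABAP Open SQL SELECT, and the field list and bind values are only examples):
        SELECT belnr, budat, wrbtr
          FROM bsad
         WHERE bukrs = :p_bukrs      -- company code
           AND kunnr = :p_kunnr      -- customer number
           AND gjahr = :p_gjahr      -- fiscal year, if known
           AND belnr = :p_belnr      -- document number
           AND blart = 'RV'          -- document type
           AND shkzg = 'S'           -- debit side
    The more of the leading key fields (BUKRS, KUNNR, ...) you can supply, the better the table's primary key index can be used.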
    Regards,
    Md Ziauddin.

  • Fast Data Transfer

    I want to make an online multiplayer pong game. I want the
    user to be able to play against a different user on a different
    network. How could I transfer the data of one player to the other?
    When one player moves his paddle left, I need somehow to let the
    other user's computer know that pretty fast so they have the same
    game play. I thought about using a text file. When a player moves
    left, it will write left on a text file and upload it to my server.
    Then the other computer will download the text file. If it says
    left, he will move the opponent left. If the text file says right
    it will move the opponent right. But I think this is a very slow
    process and their game will pretty much not be the same. How could
    I do this so that they get the data fast?
    Thanks,
    Dor

    DorKauf wrote:
    > does anyone know the answer?
    I did, but the public news servers don't seem to relay answers back to Macromedia's news server, so here is the answer again:
    You may want the players' Flash movies to do direct IP communication, so your server will only have to relay the opponent's IP once.
    Best regards,
    Martin

  • Fast data transfer out of PXIe-8133?

    Hi,
    What is the fastest way to transfer data out from the PXIe-8133? I am currently using two UDP ports for the two Ethernet ports, which gives a theoretical transfer rate of 2 Gbit/s.
    How would I transfer data out from an FPGA or PXIe to a host computer faster than that?
    Cheers.

    Hi u436,
    The only faster way to transfer data to a computer would be with a x4 or x16 MXI-Express Remote Controller, which would provide up to 838 MB/s or 5.6 GB/s respectively.  These would need to be installed in place of your PXIe-8133  and be controlled by the MXI card on your PC.  Also note that the x16 MXI-Express card is not supported by all PCs; it requires full PCI Express 2.0 clocking specs. 
    NI PXIe-PCIe8375:  http://sine.ni.com/nips/cds/view/p/lang/en/nid/207823
    NI PXIe-PCIe8388 and NI PXIe-PCIe8389:  http://sine.ni.com/nips/cds/view/p/lang/en/nid/209710
    Other than that I am not aware of a faster way of exporting data to a PC(s) in real time.  Hope this helps!
    Brian 
    Brian G.

  • Fast Table Unload

    Hello, We are running 8i. I need to perform a one-time extract of data from a dozen or so moderate sized tables (approx 3 million rows/table) for transmission to another platform.
    Is there a "bulk" unload program/utility/command I can use to accomplish this task quickly (something similar to SQL Server BCP or DB2 UNLOAD utility)?
    Any help would be greatly appreciated.

    I usually use SQL*Plus, SPOOL and good old SQL! Be sure to set the following SQL*Plus options so that raw data is extracted.
    SET WRAP OFF;
    SET HEADING OFF;
    SET SPACE 0;
    SET FEEDBACK OFF;
    SET PAGESIZE 0;
    SET TERMOUT OFF;
    SET ECHO OFF;
    SET LINESIZE 166; <- change this to how wide you want your rows to be structured.
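    Putting it together, a minimal spool script adds a SPOOL around the query, roughly like this (the table, column, and file names are placeholders only):
    SPOOL /tmp/my_table.dat
    SELECT col1 || '|' || col2 || '|' || col3 FROM my_table;
    SPOOL OFF
    EXIT
    Run it with the settings above already in place, e.g. sqlplus -s user/password @unload.sql; for a dozen 3-million-row tables it can be worth giving each table its own script so the spools can run in parallel.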

  • Data unload issues

    I am using SQL*Plus to unload data from a script in a pipe-delimited format, based on other messages I have found on the forums. I have a couple of questions on issues I am having. (I am using 10g.)
    1. Some of the fields in the table I am unloading have newlines in them. I want to unload them onto one line; is there a way to replace the newline characters while unloading? I tried REPLACE but it didn't seem to work.
    2. What is the maximum that can be selected from one column? I am concatenating the columns with pipes to get the format I want, and one table has several columns that are each VARCHAR2(4000). I have read that in the past 4000 was the limit for a column. If it is larger than that now, what is it, and can I use the line size to make it unload all that I want?
    Thanks

    1. There are a few different types of "new lines". Windows, for example, uses a CHR(13) followed by a CHR(10), while UNIX uses just a CHR(10). You might also be referring to some sort of escape sequence (i.e. the string "\n") that a client app interprets as a line feed. In any case, you'll probably want to use the REPLACE function, though exactly what you want to replace may change. If you have Windows new lines, for example, you would want something like
    SELECT REPLACE( <<column name>>,
                    CHR(13) || CHR(10),
                    NULL )
      FROM <<table name>>
    2. If you are concatenating the columns together, you can't have more than 4000 characters to a line.
      1  select lpad('a',4000,'*') ||
      2         lpad('b',4000,'*')
      3*   from dual
    SCOTT @ hp92> /
      from dual
    ERROR at line 3:
    ORA-01489: result of string concatenation is too long
    If, on the other hand, you are letting SQL*Plus fill in the column separator, i.e.
    SCOTT @ hp92> set colsep '|'
    SCOTT @ hp92> select ename,empno from emp;
    ENAME     |     EMPNO
    ----------|----------
    SMITH     |      7370
    ALLEN     |      7500
    WARD      |      7522
    JONES     |      7567
    MARTIN    |      7655
    BLAKE     |      7699
    CLARK     |      7783
    SCOTT     |      7789
    KING      |      7840
    TURNER    |      7845
    ADAMS     |      7877
    JAMES     |      7901
    FORD      |      7903
    MILLER    |      7935
    Justin    |      1444
    john      |      2000
    16 rows selected.
    you won't have that sort of issue.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Faster date conversion

    I want to convert a string in day.month.year format, e.g. 16.05.99, to the format 19990516T120000.000Z (year month day). For that I wrote the following piece of code, but it is taking a lot of time, about 300 milliseconds, to convert each date. Is there some other way to achieve the same result faster?
    public static String convertDate(String oldDate) throws Exception {
        SimpleDateFormat oldDateFormat = new SimpleDateFormat("dd.MM.yy'T'HHmmss.SSS'Z'");
        oldDate = oldDate + "T120000.000Z";
        java.util.Date dt = oldDateFormat.parse(oldDate);
        SimpleDateFormat newDateFormat = new SimpleDateFormat("yyyyMMdd'T'HHmmss.SSS'Z'");
        return newDateFormat.format(dt);
    }
    thanks

    SimpleDateFormat is typically the best class to use for date formats if you are most concerned about supporting the widest array of possible date formats and wish to apply error-checking logic to the supplied dates (i.e. determine whether the dates are valid), but if speed is your principal concern and you are only working with a single format, you can achieve quicker performance by coding your own class that just converts a String from dd.MM.yy to yyyyMMdd'T'HHmmss.SSS'Z'. Like,
    /**
     * Returns the specified date as a String in yyyyMMdd'T'HHmmss.SSS'Z' format.
     * @param date a date String in dd.MM.yy format, e.g. "16.05.99" -> "19990516T120000.000Z"
     */
    public static String convertDate(String date) {
      char[] d = date.toCharArray();
      String century = (d[6] >= '7') ? "19" : "20"; // crude two-digit-year pivot
      return century + d[6] + d[7] + d[3] + d[4] + d[0] + d[1] + "T120000.000Z";
    }
    ... the issue reduces to a balance between development costs (i.e. taking the time to code your own conversion logic and error checking) versus performance: you can get better performance through your own class, but you have to spend more time worrying about different formats, nonsensical dates, etc., etc.
