Transfer a large amount of data back to an external system

Hi!
Here is the scenario:
There is a table with 4 million rows. I want to send all of them to an external caller, which calls the RFC function get_all_rows.
What is the best way to transfer 4 million rows? Should I divide them into separate internal tables with, say, a couple of thousand rows each, return one table with all 4 million rows, or how is this best handled?
regards
Baran

Hi Baran!
As long as you have only one function call (via RFC) to deliver all lines, it doesn't really matter whether you use several tables with fewer lines each or one table with all lines - all the data has to be forwarded through the network either way.
There are some size limits for fields, but I don't know of any for an internal table (other than physical memory and other session limits). Of course, the total size might get too big (e.g. more than 1 GB), and then you can't read everything in one step.
In that case you have to use selection criteria and fetch the data in several accesses.
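A minimal sketch of such a paged access (Z_GET_ROWS_PAGED, ZBIGTABLE and KEY_FIELD are hypothetical names standing in for the real table and its key): the caller passes the last key it already received plus a package size, and calls again until fewer rows than requested come back.

function z_get_rows_paged.
*"  importing
*"     value(iv_from_key)  type char10
*"     value(iv_pack_size) type i
*"  tables
*"     et_rows structure zbigtable
  " Return the next package of rows whose key lies after iv_from_key.
  select * from zbigtable
    into table et_rows
    up to iv_pack_size rows
    where key_field > iv_from_key
    order by key_field.
endfunction.

This keeps each call (and each network transfer) at a bounded size, at the cost of one SELECT per package.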
Regards,
Christian

Similar Messages

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas, based on best practices, for transferring large amounts of data in and out of a NetWeaver-based application.
    We have a new system we are developing in NetWeaver that will utilize both the Java and ABAP stacks, and it will require integration with other SAP and 3rd-party systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support tens of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC as our primary interface in and out of the system. As it turns out, RFC is not good at dealing with such a large amount of data being pushed through a single call.
    We have considered a number of possible ideas; however, we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this, as well as how SAP currently solves it in other applications like XI, BI, ERP, etc.

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also, I've noticed that if I move/copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also, run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.

  • Storeagent started to transfer large amounts of data, eating all my bandwidth. It seemed to just start 5-6 days ago.

    About 5-6 days ago, storeagent started to run continuously, sending and receiving large amounts of data. This eats up all my bandwidth quickly, essentially rendering my internet access worthless, since I have to use satellite internet. I have tried stopping it in Activity Monitor, but it restarts again. I thought I might have a virus or something. I downloaded Trend Micro for Mac, but found that its core services essentially did the same thing. I uninstalled it, but found that storeagent is still running non-stop. Ideas?

    The storeagent process is a normal part of Mac OS X, not a virus. Remove Trend Micro, which is quite a poor choice for protecting yourself against malware in the first place (see the results of my Mac anti-virus testing 2014), and which isn't really necessary anyway (see my Mac Malware Guide).
    As for what it might be doing, as babowa points out, it should be present when the App Store app is open, and at that time, it might be occupied with downloading updates or something similar. If you keep force-quitting it in Activity Monitor, that probably ruins whatever download it was working on, so it has to start all over again, perpetuating the cycle. In general, it is a very bad idea to force-quit processes that are part of Mac OS X without a very good reason and an understanding of what they are.
    Go to System Preferences -> App Store:
    You will probably want to turn off automatic download of newly available updates, as well as automatic download of apps purchased on other Macs (if you have other Macs). I do not advise turning off the master "Automatically check for updates" box, or the one for installing security updates, as disabling those will reduce the security of your system. These security updates are typically small, so they should have very little impact on your total internet usage.

  • Transfer Large Amounts of Data From External to External Drives

    I have four 1.5 TB USB 2.0 drives that contain music and movies for use in iTunes or Apple TV. I purchased two 3 TB Thunderbolt drives and would like to transfer the data from the four 1.5 TB drives to the new 3 TB Thunderbolt drives. How do I do this?
    I have tried copy/paste and drag/drop, but only so much moves and then it stops.
    All drives are formatted Mac OS Extended (Journaled).

    First, I just want to say thank you for the suggestion.
    There were no errors shown; it just stopped after several hours, and sleep is turned off. What I've started doing is taking roughly 6 GB at a time, maybe 4 or 5 batches in a row; after about 20 to 30 minutes it's done and I start all over again.
    I will run Disk Utility like you suggested and try again. Give or take, there are probably 2,500 files that are 2+ GB in size, and all of the music files we have collected come to around 90 GB total, if not slightly more.

  • Unable to transfer large files from MB to External HDs

    Hi,
    This may be an unusual one. My 1st-edition MacBook won't let me transfer large files to my external hard drives, whether by USB, Ethernet or wirelessly through my home Wi-Fi network.
    I've started video editing, so I'm importing DVCAM tapes through FireWire from the camera. A full DV tape translates to about 8-12 GB, I think.
    Using iMovie and saving the import to my Freecom 3.5-inch network hard drive via USB or Ethernet, it fails saying 'Unexpected error, error code 1309'.
    When I try QuickTime Pro instead to import the tape to the external HD, I get 'Operation could not be completed. An attempt to add a resource to the file failed.' This happens too with my WD 2.5-inch external drive.
    However, there is no such problem when I import to the MacBook's internal hard drive. But when I then try to shift the large file off the laptop to an external drive - via USB, Ethernet or wirelessly - for editing and safekeeping, it fails halfway through.
    I have the same problem when I try to back up the 17 GB virtual Windows machine I use with VMware Fusion from the laptop to an external drive.
    I am currently burning an 18 GB iMovie project from the MacBook's internal HD over 5 DVDs using Toast's disc-spanning, and will then reimport it to the external drive, which is really time-consuming.
    Also of note: Final Cut Express doesn't have these problems, as it seems to break up what would be an 18 GB movie file into several smaller files, which my MacBook is quite happy to let go onto an external drive. However, the problem re-emerges in Final Cut Express when I try to import an HDV tape. Possibly FCE isn't breaking those up into small enough pieces for my MacBook to handle?
    Any ideas why this is happening and what I can do to fix it? Or does this mean a new laptop?

    Irish Apple Fan wrote:
    This may be an unusual one. My 1st-edition MacBook won't let me transfer large files to my external hard drives [...] Using iMovie and saving the import to my Freecom 3.5-inch network hard drive via USB or Ethernet, it fails saying 'Unexpected error, error code 1309'.
    I'm a bit late to the party, but specifically there's a 4GB file size limit in the FAT32 format that is standard on many external hard drives. I've run across this problem before, and usually it copies over most of the file and then quits when it reaches the limit.

  • Integration of Financials with external systems

    Hi,
    I am struggling with an implementation where the client is not sure whether Oracle Financials suits his business processes.
    Overview of the situation at hand:
    We do not have the product installed at the client site yet.
    Products of Oracle Financials to be used: General Ledger, Accounts Receivable, Accounts Payable, Fixed Assets, India Localization Patch.
    Products of Oracle Applications NOT available for use:
    Purchasing, Inventory, Order Management
    (All these areas are being covered by a customized bespoke system that is being developed.)
    1. Is it possible to use Oracle India Localizations (with regard to the excise functionality, e.g. claiming of MODVAT, and the various excise registers that have to be maintained, e.g. RG23A, RG23C, etc.) in the above situation (without implementing Purchasing, Inventory, and Order Management)?
    2. Further, while passing the vendor's bills (in Oracle Payables), one of the criteria for PO matching is to check whether the MODVAT has been claimed. Is this functionality available in Payables with the India Localization patch?
    3. Does Oracle India Localization cater for VAT requirements?
    4. Is an open interface available to transfer purchase order data from external systems into the Oracle Purchasing tables (which are shared by Oracle Payables), namely PO_HEADERS, PO_LINES, PO_LINE_LOCATIONS, PO_DISTRIBUTIONS, PO_DISTRIBUTIONS_AP_V (a view of PO_DISTRIBUTIONS), PO_RELEASES (blanket purchase orders) and PO_LOOKUP_CODES, at transactional frequency? And if the Oracle Purchasing module is not being used, can the Purchasing interface tables still be used?
    5. An open interface (the Payables Open Interface) is available in Oracle Payables to import invoices from external systems. While importing these invoices, does the system expect the purchase order data to be present in the PO tables mentioned in the point above?
    6. Is an open interface available to transfer quantity received/accepted data from external systems into the PO_LINE_LOCATIONS table, to enable 2/3/4-way matching of purchase orders with invoices? Can 4-way matching be carried out in AP just by importing purchase order data?
    7. Can the Credit Card Transaction Interface be used for uploading employee expense/advance settlements (not carried out via credit card) directly from a feeder system?
    8. Is it possible to use the Open Item interface (including the import concurrent program) even though the Inventory module is not being installed? If yes, we would like to use this interface for updating the item master from the bespoke system.
    9. Can the AutoInvoice API be used to import invoices from a feeder/legacy system (via the RA interface tables) into the Oracle Receivables invoice tables? Is an order number column a prerequisite for successful completion of AutoInvoice?
    10. Where should the masters be kept - OF or bespoke?
    E.g. the employee master, inventory item master, etc.
    11. What is the best strategy for keeping the master data in the bespoke system and OF in sync?
    I have got various answers to these questions, but some seem to contradict each other.
    PLEASE HELP!!
    Thanks,
    Kamana

    Dear Kamana,
    Can you send me the replies given by our other forum friends? Let me analyze the entire lot and get back to you with a single consolidated bible for all your questions.
    Gopal

  • WM interface to external system

    Has anybody ever configured the WM interface to an external system (t-code OMKY)?
    I have a question here and hope you can help:
    What's the naming rule when defining the logical system, RFC destination, port and partner profile? Should they all be defined under one name? In my test, if I define only the logical system and partner profile under the same name (for example "ABC") and maintain the port and RFC destination under another name, say "XYZ", then I can only send IDocs; I can't receive the IDocs sent from the external system (for example, the transfer order confirmation).
    Can anybody shed some light on this question? Your input is highly appreciated.

    Hi,
    Why should the port and RFC destination also use the same name as the logical system and partner profile? Would you please kindly show me the rule? I've checked the online help and searched extensively on websites without finding any clear instruction.
    In my experience, I usually set everything up under the same names to make it easy for other developers to identify interface problems in production, as well as for future enhancements. Just imagine dealing with hundreds of interfaces (inbound and outbound) where the same interface has different names in different places.
    That said, you can set things up with different names for the logical system, RFC destination and so on.
    In the case of different names for the logical system and port, would you please tell me why we can't receive the IDocs sent back by the external system? Can't it find the correct partner and port to send the IDocs anymore?
    Well... you need to tell the external system which names to use for the partner profile and port as part of the EDIDC control record data; see the sketch below.
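    For illustration, a minimal outbound sketch (hedged: ZWMMSG and ZWMIDOC01 are hypothetical message/basic types, and 'ABC'/'XYZ' reuse the example names from the question; the function module MASTER_IDOC_DISTRIBUTE and the EDIDC fields themselves are standard):
    data: ls_control type edidc,
          lt_comm    type table of edidc,
          lt_data    type table of edidd.   " IDoc data records omitted in this sketch
    " The control record tells the IDoc layer which partner and port names
    " apply - they must match the partner profile on the receiving side.
    ls_control-mestyp = 'ZWMMSG'.      " hypothetical message type
    ls_control-idoctp = 'ZWMIDOC01'.   " hypothetical basic type
    ls_control-rcvprt = 'LS'.          " receiver partner type: logical system
    ls_control-rcvprn = 'ABC'.         " receiver partner number = logical system name
    ls_control-rcvpor = 'XYZ'.         " receiver port
    call function 'MASTER_IDOC_DISTRIBUTE'
      exporting
        master_idoc_control            = ls_control
      tables
        communication_idoc_control     = lt_comm
        master_idoc_data               = lt_data
      exceptions
        error_in_idoc_control          = 1
        error_writing_idoc_status      = 2
        error_in_idoc_data             = 3
        sending_logical_system_unknown = 4
        others                         = 5.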
    Regards,
    Ferry Lianto

  • Streaming large amounts of data over a socket causes corruption?

    I'm writing an app to transfer large amounts of data via a simple client/server architecture between two machines.
    Problem: if I send the data too 'fast', the data arrives corrupted:
    - Calls to read() return wrong data (wrong 'CRC').
    - Subsequent calls to read() do not return -1 but allow me to read e.g. another 60 or 80 KB.
    - available() always returns '0'; but I'll get rid of that method anyway (as recommended in other forum entries).
    The behaviour is somewhat difficult to reproduce, but it fails reliably for me when transferring the data between two separate machines and when setting the number of packets (Sender.TM) to 1000 or larger.
    Workaround: reduce the number of packets sent to e.g. 1, or introduce the 'sleep' on the sender side. Another workaround: changing to java.nio.* alone did not help, but when I got rid of the Streams and used solely ByteBuffers, the problem disappeared. Unfortunately the Streams are required by other parts of my application.
    I'm running the code on two dual-CPU machines connected via
    Below is the code of the sender and the listener. Please excuse the style; this is only to demonstrate the problem.
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.channels.Channels;
    import java.nio.channels.SocketChannel;
    import java.util.Arrays;

    public class SenderBugStreams {
        static final int TM = 10000;     // number of packets to send
        static final int TM_SIZE = 1000; // bytes per packet
        static final int CRC = 2;        // trailing pseudo-checksum byte
        static int k = 5;

        public static void main(String[] args) throws IOException {
            InetSocketAddress targetAdr = new InetSocketAddress(args[0], ListenerBugStreams.DEFAULT_PORT);
            System.out.println("connecting to: " + targetAdr);
            SocketChannel socket = SocketChannel.open(targetAdr);
            sendData(socket);
            socket.close();
            System.out.println("Finished.");
        }

        private static void sendData(SocketChannel socket) throws IOException {
            OutputStream out = Channels.newOutputStream(socket);
            byte[] ba = new byte[TM_SIZE];
            Arrays.fill(ba, (byte) (k++ % 127));
            System.out.println("Sending..." + k);
            for (int i = 0; i < TM; i++) {
                out.write(ba);
    //            try {
    //                Thread.sleep(10); // workaround: slowing down the sender hides the bug
    //            } catch (InterruptedException e) {
    //                e.printStackTrace();
    //                throw new RuntimeException(e);
    //            }
            }
            out.write(CRC);
            out.flush();
            out.close();
        }
    }
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.InetSocketAddress;
    import java.nio.channels.Channels;
    import java.nio.channels.ServerSocketChannel;
    import java.nio.channels.SocketChannel;

    public class ListenerBugStreams {
        static int DEFAULT_PORT = 44521;

        /**
         * @param args
         * @throws IOException
         */
        public static void main(String[] args) throws IOException {
            ServerSocketChannel serverChannel = ServerSocketChannel.open();
            serverChannel.socket().bind(new InetSocketAddress(DEFAULT_PORT));
            System.out.print("Waiting...");
            SocketChannel clientSocket = serverChannel.accept();
            System.out.println(" starting, IP=" + clientSocket.socket().getInetAddress() +
                ", Port=" + clientSocket.socket().getLocalPort());
            // read data from socket
            readData(clientSocket);
            clientSocket.close();
            serverChannel.close();
            System.out.println("Closed.");
        }

        private static void readData(SocketChannel clientSocket) throws IOException {
            InputStream in = Channels.newInputStream(clientSocket);
            // read and "ingest" the packets
            byte[] ba = null;
            for (int i = 0; i < SenderBugStreams.TM; i++) {
                ba = new byte[SenderBugStreams.TM_SIZE];
                in.read(ba); // BUG: may read fewer than TM_SIZE bytes - see the reply below
                System.out.print("*");
            }
            // verify the trailing pseudo-checksum byte
            int crcIn = in.read();
            if (SenderBugStreams.CRC != crcIn) {
                System.out.println("ERROR: Invalid checksum: " + SenderBugStreams.CRC + "/" + crcIn);
            }
            System.out.println(ba[0]);
            // drain whatever is left on the stream and count it
            int x = in.read();
            int remaining = 0;
            while (x != -1) {
                remaining++;
                x = in.read();
            }
            System.out.println("Remaining:" + in.available() + "/" + remaining);
            System.out.println(" " + SenderBugStreams.TM + " objects ingested.");
            in.close();
        }
    }

    Here is your trouble:
    in.read(ba);
    read(byte[]) does not read N bytes, it reads up to N bytes. If one byte has arrived, then it reads and returns that one byte. You always need to check the return value of read(byte[]) to see how much you got (and also check for EOF). TCP chops up the written data into whatever packets it feels like, and that makes read(byte[]) pretty random.
    You can use DataInputStream, which has a readFully() method; it loops calling read() until it gets the full buffer's worth. Or you can write a little static utility readFully() like so:
    // Returns false if it hits EOF immediately. Otherwise reads the full
    // buffer's worth. If it encounters EOF mid-packet, throws an IOException.
    // (Requires java.io.InputStream, java.io.IOException, java.io.EOFException.)
    public static boolean readFully(InputStream in, byte buf[])
        throws IOException
    {
        return readFully(in, buf, 0, buf.length);
    }

    public static boolean readFully(InputStream in, byte buf[], int pos, int len)
        throws IOException
    {
        int got_total = 0;
        while (got_total < len) {
            int got = in.read(buf, pos + got_total, len - got_total);
            if (got == -1) {
                if (got_total == 0)
                    return false;
                throw new EOFException("readFully: end of file; expected " +
                                       len + " bytes, got only " + got_total);
            }
            got_total += got;
        }
        return true;
    }

  • Printing SAP data in an external system

    Hi Friends,
    I have one requirement.
    The user is using an external system to create POs/SOs through BAPIs from SAP R/3. The actual requirement is that when the user presses the print button in the external system, the corresponding PO/SO should be printed by the external system. How can I achieve this?
    Kindly advise...
    Regards,
    Abdul
    Message was edited by: Abdul Hakim

    Hi,
    Here are some points I can think of:
    1. Get the PO number from the external system.
    2. For that PO, write an RFC function module which will give you the corresponding PO details from SAP (see the sketch after this list).
    3. Call it to get the data from the SAP system.
    4. Back in the external system, extract the PO data which you have just received from SAP.
    5. Using the print command in the external system, you can finish your work.
    All of this depends on the external system. If the external system is SAP as well, things are much simpler than this.
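    For step 2, a minimal sketch of such a remote-enabled function module (hedged: Z_GET_PO_DETAILS is a hypothetical name; alternatively, the standard BAPI BAPI_PO_GETDETAIL can be called from the external system via RFC):
    function z_get_po_details.
    *"  importing
    *"     value(iv_ebeln) type ebeln
    *"  tables
    *"     et_items structure ekpo
      " Read all items of the given purchase order.
      select * from ekpo
        into table et_items
        where ebeln = iv_ebeln.
    endfunction.
    The module has to be flagged as remote-enabled in SE37 so that the external system can call it via RFC.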
    Regards,
    Amey

  • Finder issues when copying large amounts of files to an external drive

    When copying a large amount of data over FireWire 800, the Finder gives me an error that a file is in use and locks the drive up. I have to force-eject. When I reopen the drive, there are a bunch of 0 KB files sitting in the directory that did not get copied over. This happens on multiple drives. I've attached a screenshot of what things look like when I reopen the drive after forcing an eject. Sometimes I have to relaunch the Finder to get back up and running correctly. I've repaired permissions, for what it's worth.
    10.6.8, by the way; 2.93 GHz 12-core, 48 GB of RAM, fully up to date. This has been happening for a long time; I'm just now trying to find a solution.

    Scott Oliphant wrote:
    Iomega, LaCie, 500 GB, 1 TB, etc. - it seems to be drive-independent. I've formatted and started over with several of the drives, with the same result. If I copy the files over in smaller chunks (say, 70 GB) as opposed to 600 GB, the problem does not happen. It's like the Finder is holding on to some of the info when it puts its "ghost" on the destination drive before the data is copied over, and keeping the file locked when it tries to write over it.
    This may be a stretch, since I have no experience with Iomega and no recent experience with LaCie drives, but the different results depending on whether transfers are large or small may be a tip-off.
    I ran into something similar with Seagate GoFlex drives, and the problem was heat. Virtually none of these drives are ventilated properly (i.e., no fans and not much, if any, airflow), and with extended use they get really hot and start to generate errors. Seagate's solution is to shut the drive down when not actually in use, which doesn't always play nice with Macs. Your drives may use a different technique for temperature control, or maybe none at all. Relatively small data transfers will allow the drives to recover; very large transfers won't, and to make things worse, as the drive heats up, the transfer rate will often slow down because of the errors. That can be seen if you leave Activity Monitor open and watch the transfer rate over time (a method which Seagate tech support said was worthless because Activity Monitor was unreliable and GoFlex drives had no heat problem).
    If that's what's wrong, there really isn't any solution except using the smaller chunks of data which you've found works.

  • I just installed a larger hard drive and used my Time Machine backup to transfer my info back to the new hard drive. When I open iPhoto, my thumbnails are there, but they aren't linked back to the actual photos. I see the photos on my HD. What should I do?

    I just installed a larger hard drive and used my Time Machine backup to transfer my files back to the new hard drive. When I open iPhoto, my thumbnails are there, but they aren't linked back to the actual photos. I see the original photo files on my HD. Is there a way to link the iPhoto thumbnails back to the original files?

    Use a FireWire cable to target-disk-mode boot the old Mac, mount its hard drive on the new Mac's desktop, and transfer your entire iPhoto folder.
    Use Disk Utility to erase/format your Time Machine drive as HFS+ Journaled, then use the free Carbon Copy Cloner to clone your new boot drive to the external; the clone is hold-Option bootable.

  • Backing Up Your System & Setting Configurations Using an External Hard Drive

    We just purchased an external hard drive to back up our system; I believe the product name is "LaCie". Anyway, the question is: what is the best way to set things up with an external hard drive so that my computer runs at its best speed? I recently found out that I had only about 5 GB of hard drive space left on my computer, which is why we made the purchase. Now, is it wise to install Logic on the new external drive, or to keep it on my internal drive? Basically, for the best possible performance, how should I set everything up and transfer data so that I can keep a great workflow?

    What's your goal? Do you want to back up your data or make a clone of your HD? These take two different approaches.
    A clone of your HD is an exact copy which can be used to restore your main HD in case of drive failure/data corruption/severe data loss. Do a search for "Carbon Copy Cloner" to do this.
    If you want to back up your data (Logic songs, etc.), use the HD just for this purpose and no other. And NEVER work on those backup copies. They're "safety" copies, not to be altered ever. If you need to work on a backup copy of a song, make a copy of the copy and work on that.
    To make backup of Logic songs easier, always save your songs as projects and simply backup the project folders. Makes life easy.

  • I'm trying to import a large amount of pictures into iPhoto. I have plenty of space on my HD, but halfway through the transfer an error message comes up saying there is not enough disk space. How can I fix this?

    I'm trying to import a large amount of pictures into iPhoto. I have plenty of space on my HD, but halfway through the transfer an error message comes up saying there is not enough disk space. How can I fix this?

    1. I want to import 58.64 GB of photos from an external HD.
    2. I have 278.28 GB of space left on my HD.
    3. The exact error message goes like this:
    Insufficient Disk Space
    iPhoto cannot import your photos because
    there is not enough free space on the
    volume containing your iPhoto library
    I have researched this, and most advice is to empty the iPhoto Trash. I have already done that, and it did not help.
    Thank you for helping

  • HUGE amount of data in a flat file every day to an external system

    Hello,
    I have to develop several programs to export all table data into a flat file for an external system (e.g. the web).
    One of my worries is whether it is possible for SAP to export all KNA1 data - a table which contains a lot of records - into a flat file using an extraction like:
    SELECT * FROM KNA1 INTO TABLE TB_KNA1.
    I need some advice about these kinds of huge extractions.
    I also have to extract from some tables only the data changes: changed records, new records, and deleted records. To do this, I thought of developing a program that every day extracts all data from MARA and saves the extraction in a custom table like MARA, e.g. ZMARA; the next day, when the program runs, it compares the new extraction with the old extraction in the ZMARA table to detect the data changes, new records, and deleted records. Is this the right approach? Could it have performance problems? Do you know of other methods?
    Thanks a lot!
    Bye

    You should not have a problem with this simple approach, transferring each row to the output file rather than reading all the data into an internal table first:
    open dataset <file> ...
    select * from kna1 into wa_kna1.
      transfer wa_kna1 to <file>.
    endselect.
    close dataset <file>.
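    As for the delta detection described in the question, a minimal sketch of the daily snapshot comparison, assuming ZMARA holds the previous day's extract with the same structure as MARA (MATNR being the key):
    data: lt_new type sorted table of mara with unique key matnr,
          lt_old type sorted table of mara with unique key matnr,
          ls_new type mara,
          ls_old type mara.
    select * from mara  into table lt_new.
    select * from zmara into table lt_old.   " yesterday's snapshot
    loop at lt_new into ls_new.
      read table lt_old into ls_old with table key matnr = ls_new-matnr.
      if sy-subrc <> 0.
        " new record: write to the delta file
      elseif ls_old <> ls_new.
        " changed record: write to the delta file
      endif.
    endloop.
    loop at lt_old into ls_old.
      read table lt_new with table key matnr = ls_old-matnr transporting no fields.
      if sy-subrc <> 0.
        " deleted record: write to the delta file
      endif.
    endloop.
    The sorted tables keep the key lookups fast; for really large tables you would again process in packages rather than holding both extracts in memory at once.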
    Thomas

  • Is there any way to connect a Time Capsule to a MacBook Pro directly via USB? I have a large amount of data that I want to back up, and it is taking a very long time (35 GB is taking 3 hrs; I have 2 TB of files in total).

    Perhaps via USB? I have a large amount of data that I want to back up, and it is taking a very long time (35 GB is taking 3 hrs; I have 2 TB of files in total). I want to use the Time Capsule as a backup for an archive which is currently stored on a 2 TB WESC HD.

    No, you cannot back up via a direct USB connection.
    But gigabit Ethernet is much faster anyway. Are you connected directly by Ethernet?
    Is the drive you are backing up from plugged into the TC? That will slow it down something chronic; plug that drive in by its fastest connection method. WESC - sorry, I have no idea what that is. If Ethernet, use that; otherwise USB direct to the computer. Always think about which way the files come and go: since you are copying from the computer, everything has to go through it, and it makes things slower if the data goes over the same cable in both directions, if you catch the drift.
