Large Amount of External Transactions - FI-BL

Hi gurus,
I need to create a large number of external transactions, around 60,000, and assign them to the respective posting rules as usual for the Electronic Bank Statement, but I would like to know whether there is any way to make this work easier.
Best Regards,
Paulo.

Execute transaction code SE16N, enter T028G as the table and execute (F8). On the next screen, you can use the Insert in New Row option (it should be the sixth icon from the left) to paste the entries you copy from the Excel sheet where you have the external transactions, posting rules, etc. Once you save, make sure you choose Table Entry -> Transport from the menu at the top to push your entries into a transport request.
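For illustration only (the columns and sample values below are assumptions - check the exact field order SE16N shows for T028G before pasting), each Excel row would carry one external transaction with its sign and posting rule, for example:
    NTRF    +    Z001
    NCHK    -    Z002
    NSTO    -    Z003
With the sheet laid out in the table's own column order, all 60,000 rows can be pasted in one go, saved, and added to the transport request as described above.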

Similar Messages

  • Finder issues when copying large amount of files to external drive

    When copying a large amount of data over FireWire 800, Finder gives me an error that a file is in use and locks the drive up. I have to force eject. When I reopen the drive, there are a bunch of 0KB files sitting in the directory that did not get copied over. This happens on multiple drives. I've attached a screen shot of what things look like when I reopen the drive after forcing an eject. Sometimes I have to relaunch Finder to get back up and running correctly. I've repaired permissions, for what it's worth.
    10.6.8, by the way, on a 2.93GHz 12-core with 48GB of RAM, fully up to date. This has been happening for a long time; I'm just now trying to find a solution.

    Scott Oliphant wrote:
    iomega, lacie, 500GB, 1TB, etc. - it seems to be drive independent. I've formatted and started over with several of the drives and the same thing happens. If I copy the files over in smaller chunks (say, 70GB) as opposed to 600GB, the problem does not happen. It's like Finder is holding on to some of the info when it puts its "ghost" on the destination drive before it's copied over, and keeping the file locked when it tries to write over it.
    This may be a stretch since I have no experience with iomega and no recent experience with LaCie drives, but the different results if transfers are large or small may be a tip-off.
    I ran into something similar with Seagate GoFlex drives and the problem was heat. Virtually none of these drives are ventilated properly (i.e., no fans and not much, if any, air flow) and with extended use, they get really hot and start to generate errors. Seagate's solution is to shut the drive down when not actually in use, which doesn't always play nice with Macs. Your drives may use a different technique for temperature control, or maybe none at all. Relatively small data transfers will allow the drives to recover; very large transfers won't, and to make things worse, as the drive heats up, the transfer rate will often slow down because of the errors. That can be seen if you leave Activity Monitor open and watch the transfer rate over time (a method which Seagate tech support said was worthless because Activity Monitor was unreliable and GoFlex drives had no heat problem).
    If that's what's wrong, there really isn't any solution except using the smaller chunks of data which you've found works.

  • Advice needed on how to keep large amounts of data

    Hi guys,
    I'm not sure what the best way is to make large amounts of data available to my Android app on the local device.
    For example, records of food ingredients, in the hundreds?
    I have read and successfully created .db's using this tutorial:
    http://help.adobe.com/en_US/AIR/1.5/devappsflex/WS5b3ccc516d4fbf351e63e3d118666ade46-7d49.html
    However, to populate the database I use Flash, which kind of defeats the purpose: there's no point in me shifting a massive array of data from Flash to a SQL database when I could access the data directly from the AS3 array.
    So maybe I could create the .db with an external program? But then how would I include that .db in the APK file and deploy it to users' Android devices?
    Or maybe I create an AS3 class with an XML object in it and use that as a means of data storage?
    Any advice would be appreciated.

    You can use any means you like to populate your SQLite database, including using external programs, (temporarily) embedding a text file with SQL statements, executing some SQL from AS3 code, etc.
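    For instance, the seed SQL you embed or run once could be as simple as this (a sketch only - the ingredients table and its values are invented for illustration):
    create table ingredients (
      id   integer primary key,  -- SQLite assigns the id automatically
      name text not null,
      kcal real                  -- e.g. energy per 100 g
    );
    insert into ingredients (name, kcal) values ('Flour', 364);
    insert into ingredients (name, kcal) values ('Butter', 717);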
    Once you have populated your db, deploy it with your project:
    http://chrisgriffith.wordpress.com/2011/01/11/understanding-bundled-sqlite-databases-in-air-for-mobile/
    Cheers, - Jon -

  • Is there a way to put a large amount of music on your iPod without having to keep all the files in iTunes on your computer as well? I want to put my entire music collection (including CDs) on my iPod but don't want to take up the space on my computer.

    Is there a way to put a large amount of music on the iPod without having to keep all the files in iTunes as well? I want to use my iPod as an external drive and put all of my music on it without taking up the space on my computer. I also don't want to lose all my files every time I plug the iPod into my computer. Is this possible? Is there a way to avoid using iTunes and only use the iPod as an external drive?

    You cannot put music onto your iPod without using iTunes; that's what iTunes is for.
    It's also not a good idea to wipe the music from your computer and have it only on your iPod. We see countless posts here from people who have done just that - and then lost everything when the iPod needs a Restore. Even if you never need to Restore your iPod, what happens when you eventually replace the iPod? You'll be back here asking how to get the music from one iPod to another. That's not easy to do; we see countless posts about that too!
    A much better idea is to buy an external drive (a proper external drive, not simply an iPod) and put your large amount of music onto that drive. Then point your iTunes Library to that drive. However, you need to remember two things:
    You still need a backup of that Library.
    Using an external drive as your iTunes Library means that the drive must be connected and ready to read before starting your iTunes programme. If it isn't, then iTunes will look on the C: drive - and you will find no music in your Library. (Once again, lots of posts about that as well!)

  • Create new table, 1 column has large amount of data, column type?

    Hi, sure this is a noob question but...
    I exported a table from SQL Server and want to replicate it in Oracle.
    It has 3 columns; one of the columns has the configuration for reports, so each field in the column has a large amount of text in it, some upwards of 30k characters.
    What kind of column do I make this in Oracle? I believe VARCHAR2 has a 4k limit.
    creation of the table...
    CREATE TABLE "RPT_BACKUP" (
      "NAME" VARCHAR2(1000 BYTE),
      "DATA" ,  -- the column in question: what type should this be?
      "USER" VARCHAR2(1000 BYTE)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER DEFAULT DIRECTORY "XE_FTP"
      ACCESS PARAMETERS ( records delimited BY newline
        skip 1 fields terminated BY ',' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL )
      LOCATION ('REPORT_BACKUP.csv')
    );

    OK, tried that, but now when I do a select * from the table I get:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-30653: reject limit reached
    ORA-06512: at "SYS.ORACLE_LOADER", line 52
    29913. 00000 - "error in executing %s callout"
    *Cause:    The execution of the specified callout caused an error.
    *Action:   Examine the error messages and take appropriate action.
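    For what it's worth, ORA-30653 usually means rows are being rejected rather than anything being wrong with the column type itself: the ORACLE_LOADER driver treats every field as CHAR(255) by default, so any DATA value longer than 255 characters is rejected, and the default REJECT LIMIT of 0 then aborts the query. A hedged sketch of a fix, assuming the same file and directory (the CLOB choice answers the original question; the field sizes are illustrative):
    CREATE TABLE "RPT_BACKUP" (
      "NAME" VARCHAR2(1000 BYTE),
      "DATA" CLOB,  -- CLOB avoids the 4k VARCHAR2 limit
      "USER" VARCHAR2(1000 BYTE)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER DEFAULT DIRECTORY "XE_FTP"
      ACCESS PARAMETERS ( records delimited BY newline
        skip 1 fields terminated BY ',' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
        ( "NAME" CHAR(1000),
          "DATA" CHAR(100000),  -- raise the per-field limit above the CHAR(255) default
          "USER" CHAR(1000) ) )
      LOCATION ('REPORT_BACKUP.csv')
    )
    REJECT LIMIT UNLIMITED;  -- while testing, so a single bad row doesn't abort the whole query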

  • Problems copying large files to external disks

    I have a lot of media and so multiple USB and network based external hard disks.
    I'm having trouble with two in particular that are recent buys. I initially mounted them via a USB hub onto my Time Capsule, but when I had errors, I've now tried mounting them directly to my MacBook Air - and for comparison directly to a Windows laptop.
    And I'm only having problems when it's a Mac doing the copying (MBA to either USB mounted via Time Capsule or directly USB mounted on the MBA).
    The problem is that the drive appears to behave OK for initial copies - but I'm trying to put a set of old movies (captured from a VCR ages ago that I'd recorded off TV) onto one of the drives and (a) it takes ages to copy and (b) eventually I get a write failure. The specific error message is
    The Finder can't complete the operation because some data in "" can't be written. (Error code -36)
    I"ve tried a whole variety of setups - as I've said via Time Capsule and directly mounted. I also wondered if the file system on the drive would make a difference. Out of the box it was formatted with FAT32 and when that failed I've now reformatted with MacOS file system - which I would have thought would give better compatibility (and I've read that FAT32 has a large file size limit - although I think it's 4GB and while I do have one file at 4.04GB, it's not failing the copy on that file).
    I've also connected the drive (when formatted FAT32) to a Windows laptop and (a) it copies faster and (b) it copies successfully.
    Any suggestions for this? Is there some kind of large-file (all are >1GB) copy issue with USB-mounted devices in OS X?

    As I mentioned in my original post while the disks were originally formatted FAT32 I reformatted then and changed them to Mac OS Extended (Journaled) so that isn't the issue. I still have the problem with the disks using Apple format.
    I've noticed that if I do the copy in pieces, it seems to work. I.e. if I select the full set of 45GB in one copy and paste operation (dragging and dropping using Finder) it fails part way through.
    But if I copy 3/4 movies at a time going back until I have copied all of them then I can get all of them on the disk.
    It suggests to me that it's some kind of issue with copying and pasting a very large amount of data in Snow Leopard.

  • Large amounts of video data 2TB+

    Over the next several months the University of Michigan will be videotaping elementary school classes. We hope to tape at least 40 classrooms with two cameras in each classroom for about 90 minutes each. I'm estimating I will have 2-3TB of video files I need to safely store and edit. I'm considering buying external 1TB FireWire hard drives for this. In the past, using multiple external HDDs was not the best solution. When I have more than 1 or 2 FireWire devices hooked up, I start losing connections.
    Can someone help me locate information on best practices for storing and editing large amounts of video files with PPro?

    I'd recommend looking at solutions offered by one of these companies. If you contact them and describe what you are trying to accomplish, they can offer suggestions...
    http://www.caldigit.com/
    http://www.dulcesystems.com/
    http://www.sonnettech.com/product/fusiond800raid.html

  • Large amount of files

    I have a quite reliable AEB + airdisk set-up and can transfer large files. Wireless and wired speed is about 1Gb per 2-3 minutes.
    Everything is fine until I copy a large number of files to the airdisk.
    I've tried 8,000 files, which went OK.
    I've tried a package with 22,500 files and it fails. The connection drops and I have to restart the AEB.
    Are there any similar experiences that show that the number of files is a critical factor?

    It was no TM backup; it's just a Lightroom catalog file that contains 22,500 items that I drag and drop to the airdisk.
    From my experience, that's likely to fail because the airdisk will most likely crash when dealing with a large number of items in one single operation.
    It does look like a memory allocation problem in the firmware.
    I got around the problem by just plugging the external USB hard drive directly into my Mac. I hope the TC doesn't have a similar problem; otherwise, it would be much harder to get the HD out of the TC.

  • I'm trying to import a large amount of pictures to iPhoto. I have plenty of space on my HD. But halfway through the transfer an error message comes up and says not enough disc space. How can I fix this?

    I'm trying to import a large amount of pictures to iPhoto. I have plenty of space on my HD. But halfway through the transfer an error message comes up and says not enough disc space. How can I fix this?

    1. I want to import 58.64 GB of photos from an external HD.
    2. I have 278.28 GB of space left on my HD.
    3. The exact error message goes like this:
    Insufficient Disk Space
    iPhoto cannot import your photos because
    there is not enough free space on the
    volume containing your iPhoto library
    I have researched this, and most advice is to empty the iPhoto Trash. I have already done this and it did not help.
    Thank you for helping

  • Converting large amounts of points - 76 million lat/lons to spatial objects...

    Hello, I need help.
    Platform - Oracle 11g 64-bit on Windows Enterprise Server 2008 64-bit. 64 GB of RAM with 2 CPUs totalling 24 cores.
    Does anyone know of a fast way to convert large amounts of points to a spatial object? I need to convert 76 million lat/lons to ESRI st_geometry or Oracle sdo_geometry.
    Currently, I have set up code using pipelined parallel functions and multiple jobs that run concurrently. It still takes over 2.5 hours to process all of the points.
    Any pointers would be GREATLY appreciated!
    Thanks
    John

    Hi,
    Where is the lat/lon data at the moment?  In an external text file or in an existing database table as number attributes?
    If they're in an external text file, then I'd probably use an external table to load them in as quickly as possible.
    If they're in an existing database table, then you can just update the sdo_geometry column using:
    update <table>
       set <geometry column> = sdo_geometry(2001, <your srid>,
             sdo_point_type(<lon column>, <lat column>, null), null, null)
     where <lon column> is not null
       and <lat column> is not null;
    That should run very quickly for you. If you want to avoid the overhead of creating redo, you could use "create table ... as select ...". This example of creating 1,000,000 points runs in 9 seconds for me:
    create table sample_points (geometry) nologging as
      (select sdo_geometry(2001, null,
      sdo_point_type(
      trunc(100000 * dbms_random.value()),
      trunc(100000 * dbms_random.value()),
      null), null, null)
      from dual connect by level <= 1000000);
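    Putting those two ideas together for the 76 million rows - create table as select to skip the redo, plus parallelism for the 24 cores - would look something like the sketch below (the latlon_points table, its lon/lat columns and the SRID 4326 are assumed names for illustration, not taken from your post):
    create table points_geom parallel 8 nologging as
      (select sdo_geometry(2001, 4326,
         sdo_point_type(lon, lat, null),
         null, null) geometry
       from latlon_points
       where lon is not null
       and lat is not null);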
    I have set up code using pipelined parallel functions and multiple jobs that run concurrently
    You shouldn't need to use pl/sql for this task.  If you find you do, then provide some sample code and we'll take a look.
    Regards,
    John O'Toole

  • Firefox is using large amounts of CPU time and disk access, and I need to know how to shut down most of this so I can actually use the browser.

    Firefox is a very busy piece of software. It's using large amounts of CPU time and disk access. It puts my usage at low priority, so I have to wait for some time to be able to use my pointer or keyboard. I don't know what it uses all that CPU and disk access time for, but it's of no use to me. It often takes off with massive use of resources when I'm not doing anything, and I may not have use of my pointer for several minutes. How can I shut down most of this so I can use the browser to get my work done? I just want to use the web site access part of the software and drop all the extra. I don't want Firefox to be able to recover after a crash. I just want to browse with a minimum of interference from Firefox. I would think that this is the most commonly asked question.

    Firefox consumes a lot of CPU resources
    * https://support.mozilla.com/en-US/kb/Firefox%20consumes%20a%20lot%20of%20CPU%20resources
    High memory usage
    * https://support.mozilla.com/en-US/kb/High%20memory%20usage
    Check and tell if it's working.

  • Creation of data packages due to large amount of datasets leads to problems

    Hi Experts,
    We have built our own generic extractor.
    When data packages are created (due to the large number of datasets), different problems occur.
    For example:
    Datasets are now doubled and appear twice, once in package one and a second time in package two. Since those datasets are not identical, information is lost while uploading them to an ODS or cube.
    What can I do? SAP will not help because it is a generic DataSource.
    Any suggestions?
    BR,
    Thorsten

    Hi All,
    Thanks a million for your help.
    My conclusions from your answers are the following:
    a) Since the ODS is standard, within the transformation no datasets are deleted, but aggregated.
    b) Uploading a huge number of datasets is possible in two ways:
       b1) with selection criteria in the InfoPackage and several uploads
       b2) without selection criteria in the InfoPackage and therefore an automatic split of the datasets into data packages
    c) Both ways should have the same result within the ODS.
    OK. Thanks for that.
    So far I have only checked the data within the PSA. In the PSA the number of datasets is not equal for variants b1 and b2.
    I guess this is normal technical behaviour of BI.
    I am fine as long as the results in the ODS are the same for b1 and b2.
    Have a nice day.
    BR,
    Thorsten

  • Large Amount of Data in JSF

    Hello,
    I am using the Table Group component for displaying data in my application designed in Java Studio Creator.
    I have enabled paging on the component. I use CachedRowSet on the bean for the page for getting the data. This works very well at the moment in my development environment. At the moment I am testing on a small amount of data.
    I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. So I was wondering, apart from that instance, when viewing in paged mode does the component get all the results from the database every time?
    Which component would be best suited for displaying large amounts of data in a table format?
    Thanks In Advance!!

    Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it. It still takes time to load the data initially.
    I wonder if it has to do with the logic of paging. How do you specify which set of 20 records to extract in SQL?
    Thanks for your help!!
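    For what it's worth, the classic way to pull just one page in Oracle-style SQL is to wrap the ordered query in ROWNUM filters (a sketch only - the orders table and order_id column are invented for illustration):
    select *
      from (select t.*, rownum rn
              from (select * from orders order by order_id) t
             where rownum <= 40)  -- upper bound: end of page 2 (2 pages x 20 rows)
     where rn > 20;               -- lower bound: skip page 1
    Bounded that way, each request transfers only the 20 rows of the current page instead of all 75,000.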

  • Open Large amount of data

    Hi
    I have a file on the application server in .dat format. It contains a large amount of data, maybe 2 million records or more. I need to open the file to check the record count. Is there any software or any option to open the file? I have tried opening it with Notepad, Excel, etc., and it gives an error.
    please let me know
    Thanks

    Hi,
    Try this:
    Go to AL11.
    Go to the file directory. For the file there will be a field called length, which is the total length of the file in characters.
    If you know the length of a single line, divide the length of the file by the length of a single line; I believe you will get the number of records.
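    For example (made-up numbers): if the length shown is 400,000,000 characters and each record is a fixed 200 characters including the line end, the file holds 400,000,000 / 200 = 2,000,000 records.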
    Thanks,
    Naren

  • Bex Report Designer - Large amount of data issue

    Hi Experts,
    I am trying to execute (on the Portal) a report made in BEx Report Designer with about 30,000 pages, and the only thing I am getting is a blank page. Everything works fine at about 3,000 pages. Do I need to set something to allow processing of such a large amount of data?
    Regards
    Vladimir

    Hi Sauro,
    I have not seen this behavior, but it has been a while since I tried to send an input schedule that large. I think the last time was on a BPC NW 7.0 SP06 system and it worked OK. If you are on a recent support package, then you should search for relevant notes (none come to mind for me, but searching yourself is always a good idea) and if you don't find one then you should open a support message with SAP, with very specific instructions for recreating the problem from a clean input-schedule.
    Good luck,
    Ethan
