Using data loading software rather than SM35 batch input

Hi All,
Can anyone tell me the pros and cons of using SM35 batch input in SAP compared with commercial data loading software like Winshuttle, Data Loader, etc.?
Thanks in advance.

Our data loading software runs in the background, with each run set up as a separate batch. It doesn't matter whether the data comes from a program generated on site or via a data feed interfaced from another system. With transaction SM35, I can look at each run and know immediately if any entry within that run did not post correctly, as well as those that did. I can view each transaction and get a detailed log of everything that happened. If an entry failed to post, I can see exactly where within the posting process the transaction failed and why. When we have a problem with a new, or recently changed, interface, SM35 is the tool I use to debug the problem fast. Another benefit of the batch input session is when a user tells me his or her interface failed. Often it didn't; someone just ran off with their interface report. In those cases, I don't have to restore and re-run anything. I just reprint the run, to the relief, and gratitude, of the user.

Similar Messages

  • Using SQL Loader in more than one table

    Hi all,
    I have a new question for those who have used SQL*Loader. I
    have never used it, and I just know that I need a control file to
    tell SQL*Loader what my flat file layout is and what table the
    information goes to. My problem is: my flat file has information
    that goes to two tables in my schema. Those files are very big
    (approx. 280 MB) and I would like to read them just once. Can I do
    this with SQL*Loader?
    Another question: is that the fastest way to import data from
    flat files? Because I am using Perl, it takes approx. 9 hours
    to import 10 of those files. I could use UTL_FILE to read it, but
    I heard that SQL*Loader was better.
    Thanks for your cooperation
    (Robocop)
    Marcelo Lopes
    Rio de Janeiro - Brazil

    SQL*Loader is the fastest way to load, particularly in direct parallel mode, and can certainly load to multiple tables.
    My advice would be to have a look at the examples given in the Oracle Utilities guide; there is one for loading to multiple
    tables, which I have pasted below.
    -- Loads EMP records from first 23 characters
    -- Creates and loads PROJ records for each PROJNO listed
    -- for each employee
    LOAD DATA
    INFILE 'ulcase5.dat'
    BADFILE 'ulcase5.bad'
    DISCARDFILE 'ulcase5.dsc'
    REPLACE
    INTO TABLE emp
    (empno POSITION(1:4) INTEGER EXTERNAL,
    ename POSITION(6:15) CHAR,
    deptno POSITION(17:18) CHAR,
    mgr POSITION(20:23) INTEGER EXTERNAL)
    INTO TABLE proj
    -- PROJ has two columns, both not null: EMPNO and PROJNO
    WHEN projno != ' '
    (empno POSITION(1:4) INTEGER EXTERNAL,
    projno POSITION(25:27) INTEGER EXTERNAL) -- 1st proj
    INTO TABLE proj
    WHEN projno != ' '
    (empno POSITION(1:4) INTEGER EXTERNAL,
    projno POSITION(29:31) INTEGER EXTERNAL) -- 2nd proj
    INTO TABLE proj
    WHEN projno != ' '
    (empno POSITION(1:4) INTEGER EXTERNAL,
    projno POSITION(33:35) INTEGER EXTERNAL) -- 3rd proj
    See the documentation for a complete explanation of the configuration.
    Thanks, I will read it.

  • How to automate the data load process using data load file & task Scheduler

    Hi,
    I am doing an automated process to load data into a Hyperion Planning application with the help of a Data_Load.bat file & Task Scheduler.
    I have created the Data_Load.bat file, but the rest of the process I am unable to complete.
    So could you help me: how do I automate the data load process using the Data_Load.bat file & Task Scheduler, and what other files are required to achieve this?
    Thanks

    To follow up on your question: are you using MaxL scripts for the data load?
    If so, I have seen an issue where, if the batch file (e.g. load_data.bat) does not use the full path to the MaxL script, running it through Task Scheduler appears to work but the log and/or error file is never created. Meaning the batch claims it ran from Task Scheduler although it didn't do what you needed it to.
    If you are using MaxL, use this as the batch:
    "essmsh C:\data\DataLoad.mxl" - or you can also use the full path for essmsh; either way works. The only reason the MaxL might then not work is if the batch has not been updated to reflect PATH changes, or if you need to update your environment variables so the essmsh command works in a command prompt.
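The pitfall described above is general to anything launched from Task Scheduler: relative paths resolve against an unexpected working directory and output silently disappears. A hedged sketch (the essmsh and .mxl paths below are illustrative, not from the original post) of a wrapper that uses absolute paths and always captures a log:

```python
import subprocess
from pathlib import Path

def run_load(cmd, log_path):
    """Run a load command and write its stdout/stderr to a log file,
    so a scheduled run that 'claims it ran' leaves evidence behind."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    Path(log_path).write_text(result.stdout + result.stderr)
    return result.returncode

# Hypothetical usage under Task Scheduler -- all paths absolute:
# run_load([r"C:\Hyperion\bin\essmsh.exe", r"C:\data\DataLoad.mxl"],
#          r"C:\logs\dataload.log")
```

A non-zero return code or an empty log file then tells you immediately whether the scheduled job actually did its work.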

  • I accidentally deleted my firefox profile on OS X. My bookmarks, usernames, passwords - all gone. Used data recovery software...

    Data recovery software called Data Rescue 3. It seemed to have brought up many deleted files. I sifted through as many as I could. I have removed the ones that are .pdf, etc. and obviously not related to firefox.
    The folder & files I recovered:
    1) SQLite - I have many files called SQ00001.db, SQ00002.db, etc. I don't know if these are the user names or passwords or what? They are in a folder called SQLite. If I click on this nothing happens.
    2) There are files under misc called login-0001.keychain - Are these passwords, usernames?
    3) Web Bookmarks - Files called Bookmarks00001.plist. If I click on this it opens up in TextEdit and I can see some bookmarks randomly throughout.
    4) Mozilla - Only one file called bookmarks-00001.html
    5) There are many other files one in text ending in htm. and plist.
    6) Some other files called DS_Store-00001, etc.
    7) Also some files called embed.default.js.gz (under Archives folder and ZIP). I don't know what this is.
    I tried to recover the files... I don't know what to do with them. Obviously some files might be useless. If I could get the usernames saved that would mean everything to me. Literally. I have all my work on there... :(
    I could look up other files specifically, please just let me know. Running OS X 10.6.8.
    A lot of questions said use data recovery, I've read and read and am truly stuck. What do I do with these files? How do I get my username and profile info back. There is no file called .ini or 12345678.default. (example).
    Please, please, please help. Thank you.

    For passwords, look for 2 files key3.db and signons.sqlite
    For usernames saved in form fields, look for formhistory.sqlite
    Bookmarks are in places.sqlite
    For descriptions of each profile file, read:
    [[Profiles - Where Firefox stores your bookmarks, passwords and other user data#w_what-information-is-stored-in-my-profile|What information is stored in my profile?]]

  • Data Load taking more than 24 hrs

    Hi All,
    We are in the process of loading the data from R/3 to BW. It's taking more than 24 hrs to load 1 year of data, because of the complexity of the ABAP code. We have tried all the possible ways to improve the performance of the code, but no luck.
    In case the same thing happens in the production system, how should I proceed with the data load, as it's taking more than 24 hrs for 1 year of data?
    I am planning to run an init without data transfer first and then a full load.
    Please correct me if I am wrong.
    Thanks,
    RS.

    Hi,
    where is your ABAP code complexity located? In R/3 or in BW?
    Are you talking about loading into an empty cube taking a long time? Extraction time?
    Analyze the different steps in your monitor and tell us where the bottleneck is.
    If you already know the above and you have performed all the tunings (e.g. number range buffering when filling an empty cube...) then you're correct: init without data and then full loads.
    As suggested, you could segment your full loads and even run them in parallel...
    hope this helps...
    Olivier.
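The "segment your full loads and run them in parallel" advice can be pictured with a generic sketch (not SAP-specific; the load step here is a placeholder, and in BW the segmentation would be done via selection conditions on the InfoPackages): split the one-year load into monthly ranges and run the segments concurrently.

```python
from concurrent.futures import ThreadPoolExecutor
from datetime import date

def month_ranges(year):
    # Yield (start, end) date pairs, one per month of the year.
    for m in range(1, 13):
        start = date(year, m, 1)
        end = date(year + (m == 12), m % 12 + 1, 1)
        yield start, end

def load_segment(seg):
    start, end = seg
    # Placeholder for the real per-segment extraction/load.
    return f"loaded {start:%Y-%m}"

# Run a few segments at a time instead of one 24-hour monolith.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(load_segment, month_ranges(2024)))
```

Besides finishing sooner, segmented loads mean a failure only forces a re-run of one month, not the whole year.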

  • How to update existing table using Data Load from spreadsheet option?

    Hi there,
    I need to update an existing table, but in the Data Load application, when you select a csv file to upload, it inserts all the data, replacing the existing data. How can I change this?
    Let me know,
    Thank you.
    A.B.A.

    And how do you expect your database server to access a local file on your machine?
    Is the file accessible from outside your machine, say inside a webserver folder, so that some DB process can poll the file?
    Or is your DB server on the same machine where you have the text file?
    You will have to figure out the file access part before automating user interaction or even auto-refreshing.

  • USING DATA LOADING WIZARD

    All,
    I am writing a data load process to copy & paste or upload files; all is good. But now I want to bypass the step for column mapping (run it in the background / move that function to step 1) so a user doesn't see it while loading the data. By default the stages are:
    Data Load Source
    Data / Table Mapping
    Data Validation
    Data Load Results
    So I want to run the 2nd step in the background (or, if I can, move that function and combine it with step 1)... any help?
    APEX 4.2
    thanks.

    Maybe consider page branches on the relevant page, or the plugin
    - Process Type Plugin - EXCEL2COLLECTIONS

  • Any download software better than Folx?

    Does anybody know software that is better and faster than Folx for Mac? I downloaded about 2 GB with Folx a few days ago and now it doesn't allow me any more, which means I have to download with Safari... at a very slow speed.
    I tried the Speed Download trial, but it seems my network can't go any faster.
    Thanks first!

    Try iGetter. Used it for years. Always worked a treat for me.
    http://www.igetter.net/

  • Need to find PowerPC Data Recovery Software

    So, I was working one day inside my Power Mac G5 Quad, putting in some additional memory. Suddenly, my arm accidentally pushed over my external hard drive with lots of my backed-up software. The drive fell to the carpeted floor. Next, I hooked it up to my Mac to see if there was a problem with it. I found no problem other than that all my programs were gone. The drive does mount on the desktop and no clicking or bad hard drive noises come on. I then hooked it up to my Windows laptop and Windows chkdsk refused to run, saying that the drive is not formatted.
    Can someone recommend a good data recovery program, other than Prosoft Data Rescue 3? I have had nothing but problems with that software, as when it gets to stage 3 of 3, it hangs at 71.93 percent and stays there.
    I really want my data back, as I also have a backup of my Gateway's Windows image on there. I looked up a program called Boomerang Data Recovery, but they want like 140 dollars for a basic licence, not to mention I have heard bad things about that company, such as scamming people, no customer service, etc.
    I am looking for something affordable, but which can do the job.
    Thanks.

    I've never run into any trouble with Data Rescue (granted, I'm still using 2.0), but it sounds like what BDAqua said. It's probably hit a bad spot on the drive, and yes, it will sit there reading and reading and reading until it can get around it. Last time I had to recover some stuff due to a lovely power outage, it took a few hours to get everything off of my 500 GB storage drive.
    I'd go with DiskWarrior for my 'second opinion'. DW is pretty good about it, and in the event it finds a problem, I like its reports a lot better than Data Rescue's. Data Rescue won't really tell you anything about the drive's problems other than that it can't read data, but DW will let you know what's going on.
    If you need a general diagnostic tool for the whole system, TechTool Pro also has some nice disc tests in it amongst all the other memory/cpu/etc. tools.

  • Data Loader - Only imports first record; remaining records fail

    I'm trying to use Data Loader to import a group of opportunities. Every time I run the Data Loader it only imports the first record. All the other records fail with the message "An unexpected error occurred during the import of the following row: 'External Unique Id: xxxxxxx'". After running the Data Loader, I can modify the data file and remove the first record that was imported. By running the Data Loader again, the first row (previously the second row) will import successfully.
    Any idea what could be causing this behavior?

    We need a LOT more information, starting with the OS, and the version of ID, including any applied patches.
    Next we need to know if you are doing a single record per page or multiple records, whether the placeholders are on the master page, how many pages are in the document, and if they all have fields on them (some screen captures might be useful -- embed them using the camera icon on the editing toolbar on the webpage rather than attaching, if it works [there seem to be some issues at the moment, though only for some people]).
    What else is on the page? Are you really telling it to merge all the records, or just one?
    You get the idea... Full description of what you have, what you are doing, and what you get instead of what you expect.

  • Data Load Best Practice.

    Hi,
    I needed to know the best way to load data from a source. Is SQL load the best way, or is using data files better? What are the inherent advantages and disadvantages of the two approaches?
    Thanks for any help.

    I have faced a scenario that I am explaining here.
    I had an ASO cube, and data was being loaded from a txt file on a daily basis; the data was huge. There were problems in the data file as well as the master file (the file used for dimension building).
    The data and master files had special characters like ' , : ~ ` # $ % blank spaces and tab spaces; even the ETL process could not remove these things because they come within the data.
    Sometimes comments or database errors were also present in the data file.
    I faced problems making the rule file with different delimiters; most of the time I found the same character within the data that was used as the delimiter. So it increases the number of data fields and Essbase gives an error.
    So I used a SQL table for the data load. A launch table is created and data is populated into this table. All errors are removed there before the data is loaded into Essbase.
    This was my scenario (in this case I find SQL load, the second option, better).
    Thanks
    Dhanjit G.
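The cleanup step described above can be sketched generically: scrub the characters that collide with the rule-file delimiter before the data reaches Essbase, whether that happens in the staging table or in a pre-processing script. The character set below is illustrative, taken from the list in the post:

```python
import re

# Characters from the scenario above that collided with delimiters.
BAD_CHARS = "',:~`#$%\t"

def clean_field(field):
    """Replace problem characters with spaces, then collapse runs of
    whitespace, so the field can no longer split on the delimiter."""
    for ch in BAD_CHARS:
        field = field.replace(ch, " ")
    return re.sub(r"\s+", " ", field).strip()
```

In the SQL-staging variant, the same effect comes from REPLACE/TRANSLATE expressions applied to the launch table columns before the load.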

  • I bought a new iMac running Lion and Office 2011. The server we use to access emails runs Exchange 2003 and my IT person says the iMac is not compatible with the server. The server does not want to let me access it using any email software.

    I bought a new iMac running Lion with Office 2011. I am the only Mac in the office, and the server we use to access emails runs Exchange 2003; my IT person says the iMac is not compatible with the server. The server does not want to let me access it using any email software other than webmail access through our website. It looks as though 2011 is not compatible with 2003, per some searches online, but what are some options I have to gain full access again? I purchased Parallels in hopes that this may help, and I am able to use Remote Desktop Connection to log on to the server, but I cannot drag and drop the files I need. There is a shared drive on that server we all use to exchange files.
    The two main issues are gaining access to my email again and the ability to drag and drop files from Mac to PC. I hope this is enough info to get some solutions.
    PS - The IT person says my iMac's IP is what his server does not understand, so that is why I cannot log in.

    Office 2011 is not compatible with Exchange 2003.
    I suggest you post further Office related questions on Microsoft's own forums for their Mac software:
    http://answers.microsoft.com/en-us/mac

  • Oracle Data Loader On Demand : Account Owner Field mapping

    I was trying to import account records using Data Loader. When the data file contains 'User ID' for the 'Account Owner' field, the import fails. When I use the "Company Sign In ID/User ID" value in the data file, it is successful.
    Is there any way to use the 'User ID' value in the data file for the 'Account Owner' field and run Data Loader successfully?

    The answer is no. It is my understanding that you need to map Account Owner to the User Sign In ID, which has the format of:
    <Company Sign In ID>/<User ID>
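If the source file only carries the bare User ID, the practical fix is to derive the Account Owner column from the two pieces before the load. A minimal sketch of building the User Sign In ID value in that format:

```python
def sign_in_id(company_id, user_id):
    """Build the <Company Sign In ID>/<User ID> value the Account
    Owner mapping expects."""
    return f"{company_id}/{user_id}"
```

Since the company sign-in ID is the same for every row, this is a one-off transformation on the data file rather than a change to the Data Loader mapping.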

  • Data Recovery Software for Mac.

    I have 2 questions regarding data recovery from an external drive (WD My Book Essential).
    1. During a Boot Camp install, the backup taken using WD SmartWare software was replaced by a Windows folder, and I cannot read the drive after reinstalling SL on my Mac mini to retrieve my backup. If I erase the drive (as the software is asking me to do), can I then recover the lost backup files using data recovery software, or will the file system on the external drive cause a problem? (Format is NTFS.)
    2. Which data recovery software is best for use on a Mac? Or do I recover using Windows software instead?
    Thanks.

    If you want to recover data from a Windows partition, you probably need to use Windows recovery software. I'm not aware of any Mac data recovery software that can work with an NTFS partition, though there might be something out there I've missed. Do not erase the Bootcamp partition before attempting to use any recovery software if you can possibly avoid doing so (use Windows volume repair software if the volume won't mount); that will just make the recovery far more difficult.
    Good luck.

  • Loading from text file using Sql Loader

    I need to load data from a text file into Oracle table. The file has long strings of text in the following format:
    12342||||||Lots and lots of text with all kinds of characters including
    ^&*!#%#^@ etc.xxxxxxxxxxxxxxxxxxxxxxx
    yyyyyyyyyyyyyyyyyyyyyyyyytrrrrrrrrrrrrrrrrrrr
    uuuuuuuuuuuuuuuuuuurtgggggggggggggggg.||||||||
    45356|||||||||||again lots and lots of text.uuuuuudccccccccccccccccccccd
    gyhjjjjjjjjjjjjjjjjjjjjjjjjkkkkkkkkkkkkklllllllllllnmmmmmmmmmmmmnaaa|||||||.
    There are pipes within the text as well. On the above example, the line starting with 12342 is an entire record that needs to be loaded into a CLOB column. The next record would be the 45356 one. Therefore, all records have a bunch of pipes at the end, so the only way to know where a new record starts is to see where the next number is after all the ending pipes. The only other thing I know is that there are a fixed number of pipes in each record.
    Does anyone have any ideas on how I can load the data into the table either using sql loader or any other utility? Any input would be greatly appreciated. Thanks.

    STFF [url http://forums.oracle.com/forums/thread.jspa?messageID=1773678&#1764219]Sqlldr processing of records with embedded newline and delimiter
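Besides the sqlldr options covered in the linked thread, another workable approach is to pre-process the file so each logical record becomes one physical line before handing it to SQL*Loader. This sketch assumes, per the description above, that a new record always begins with a number followed by a pipe and that continuation lines never do:

```python
import re

# A record starts with a leading number immediately followed by '|'.
RECORD_START = re.compile(r"^\d+\|")

def join_records(lines):
    """Join continuation lines so each multi-line record becomes a
    single string, ready for a simple CLOB load."""
    records, current = [], []
    for line in lines:
        if RECORD_START.match(line) and current:
            records.append("".join(current))
            current = []
        current.append(line.rstrip("\n"))
    if current:
        records.append("".join(current))
    return records
```

With the file flattened this way, the control file no longer has to reason about embedded newlines at all, only about the pipes.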
