Data upload problem

Hi all,
   I am now facing the following problems:
   1. The delta upload does not finish automatically. I can see some stuck entries in <b>SM58</b>. How can I solve this problem? Can you give me some help?
   2. When I run the delta upload, the system message looks like this:
<i>Processing in Warehouse timed out; processing steps missing
Diagnosis
Processing the request in the BW system is taking a long time and the processing step Second step in the update has still not been executed.
System response
Caller is still missing.
Procedure
Check in the process overview in the BW system whether processes are running in the BW system under the background user.
If this is not the case, check the short dump overview in the BW system.</i>
    What is wrong with the data upload and how can I solve it? Can you give me some advice?
Regards & thanks!
zagory

Hi Zagory,
I think that at the time this particular job starts, there are other jobs running under the background user, or other background jobs running with priority A. So try to reschedule the job to a later point in time when the system is less busy. Another option is to increase the system dialog runtime via the profile parameters in transaction RZ10; ask your Basis team, they will do it for you.
But try the first option first.
Assign points if it helps.
Regards,
Vijay.
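
A side note on those two options (an assumption on my part, so please verify with your Basis team): the entries in SM58 are unprocessed tRFC calls from the source system, and they can usually be pushed again by selecting the LUW in SM58 and choosing Edit > Execute LUW, or by running the standard report RSARFCEX in SE38. The dialog runtime mentioned for RZ10 is normally governed by the profile parameter rdisp/max_wprun_time, for example:

   # instance profile maintained in RZ10 - the value 3600 seconds is only an illustration
   rdisp/max_wprun_time = 3600

Profile changes need Basis involvement and possibly an instance restart, so rescheduling the load to a quieter time is still the easier first step.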

Similar Messages

  • Data upload problem in delta update from 1st ODS to 2nd ODS

    Dear Friends,
    I am loading data from one ODS to another. The update mode was full upload. Some time back an error occurred during activation of the first ODS. The error was: "Full updates already available in ODS, cannot update init./delta". So currently the daily records are pulled but not added, i.e. transferred records = 4000 but added records = 0.
    When I looked for a solution on SDN, I found that running program RSSM_SET_REPAIR_FULL_FLAG for the 2nd ODS resets all full uploads to repair full requests, which I have already done for the 2nd ODS, with the idea of then initializing once and pulling deltas.
    The problem is that I cannot simply set the update mode to delta: I am pulling some 80,000 records into the 2nd ODS from the 1st ODS, which holds around 1.4 million records, daily, based on data-selection filters in the InfoPackage, and I do not see any parameters for data selection in delta mode.
    Please suggest.
    Regards,
    Amit Srivastava

    Dear Sirs,
    Due to this error in activation of the 2nd ODS, the daily data upload is failing for the 1st ODS.
    To correct this I converted all full upload requests in the 2nd ODS to repair full requests.
    But now, when I scheduled the InfoPackage today with full upload, the data was again transferred but not added.
    I know I cannot have init./delta, so what can possibly be done in this scenario? Please help.
    Regards,
    Amit Srivastava

  • House Bank data upload problem

    Hi,
    I am trying to upload data for banks through transaction FI12. My problem is that when I go through this transaction with a recording, the dialog structure table on the left-hand side disappears.
    Because of this, I am unable to create bank accounts for a house bank. Through the recording I am able to create only house banks, but not bank accounts.
    Generally the house bank is sufficient for bank creation, but I also need the bank accounts created with respect to the company code.
    Can anyone help with this?


  • CSV file data upload problem

    Hi all,
    I am new to Oracle technology.
    While uploading data from a CSV file I got this error:
    "SQL*Loader-350: Syntax error at line 4.
    Expecting keyword TABLE, found "xxgfs_gen_text_lookups".
    APPEND INTO xxgfs_gen_text_lookups"
    My CSV file data is:
    Invoice Match Options,,,Invoice,,
    Invoice Match Options,,,Receipt,,
    Invoice Match Options,,,Purchase Order,,X
    Invoice Type,A 00,Advance,Standard,Standard invoice,
    Invoice Type,B 00,Expense,Standard,Standard invoice,
    Invoice Type,2 00,Debit Memo,Credit Memo,Credit Memo,
    Invoice Type,2 20,EDI Debit Memo,Credit Memo,Credit Memo,
    Invoice Type,1 00,Invoice,Standard,Standard invoice,
    Invoice Type,1 01,Arrow Credit,Standard,Standard invoice,,
    Invoice Type,1 10,Recurring Payments,Standard,Standard invoice,,
    Invoice Type,1 20,EDI Invoices,Standard,Standard invoice,,
    Invoice Type,1 21,Imaged Invoice,Standard,Standard invoice,X,Default when entering from form
    Invoice Tax Code,XMT,DO NOT USE,,,,
    Invoice Tax Code,STE,DO NOT USE,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code,,,,,,
    Invoice Tax Code
    Even if I remove the blank rows, it gives the same problem.
    If anybody has faced the same problem, please help me out.
    Thanks in advance to you all for your help.
    -Rajnish

    Running SQL*Loader as:
      sqlload userid=... control=... data=... log=...
    The HOSTSTR logical has been set to the same value as your connect string, but without the domain name.
    When you specify a connect string (i.e. SCOTT/TIGER@DATABASE) but no domain, you receive these errors:
      SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
      ORA-12154: TNS:could not resolve service name
    When you do not specify a connect string (i.e. SCOTT/TIGER), you receive these errors:
      SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
      ORA-12162: TNS:service name is incorrectly specified
    Your sqlnet.ora has:
      names.default_domain = world
    The syntax of your tnsnames.ora entry is correct, but the entry does not include the .WORLD extension (the default domain from sqlnet.ora).
    Solution: specify the .WORLD extension in your tnsnames.ora and also in your connect string. This will remove the error.
    Also, ensure you are not hitting Bug 893290. Can you connect to the database from that server using SQL*Plus?
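    Regarding the original syntax error itself: SQL*Loader expects the keyword TABLE after APPEND INTO. A minimal control file sketch for this kind of load could look like the one below; the table name is taken from the post, but the file name and the column list are only placeholders to replace with your real definitions:
      LOAD DATA
      INFILE 'xxgfs_gen_text_lookups.csv'
      APPEND INTO TABLE xxgfs_gen_text_lookups     -- the keyword TABLE was missing
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      TRAILING NULLCOLS
      (lookup_type, lookup_code, description, col4, col5, col6)
    TRAILING NULLCOLS also helps here, because many of the rows end with empty fields.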

  • Problems in master data upload

    Hi all,
    What are the problems we might face during a master data upload?
    Please explain; this is very urgent.
    Thanks in advance
    Venkat

    Hi Venkat,
    My suggestion is: before posting a new thread, please close your previous threads if you are satisfied with the answers.
    Then everyone will be interested in answering your queries.
    Regards,
    Sridhar

  • Basic Data Upload to MATERIAL  Using LSMW is not working

    Hi all,
      We are using the LSMW program /SAPDMC/SAP_LSMW_IMPORT_TEXTS to upload the basic data text of the material. All steps are executed correctly and show that the records are transferred correctly, but in MM03 the text does not appear.
    EPROC_PILOT - MASTER - TEXT_UPLOAD Basic long text 1line
    Field Mapping and Rule
            /SAPDMC/LTXTH                  Long Texts: Header
                Fields
                    OBJECT                       Texts: Application Object
                                        Rule :   Constant
                                        Code:    /SAPDMC/LTXTH-OBJECT = 'MATERIAL'.
                    NAME                         Name
                                        Source:  LONGTEXT-NAME (Name)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-NAME = LONGTEXT-NAME.
                    ID                           Text ID
                                        Source:  LONGTEXT-ID (Text ID)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-ID = LONGTEXT-ID.
                    SPRAS                        Language Key
                                        Source:  LONGTEXT-SPRAS (Language Key)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-SPRAS = LONGTEXT-SPRAS.
                                                 * Caution: Source field is longer than target field
                /SAPDMC/LTXTL                  Long Texts: Row
                    Fields
                        TEXTFORMAT                   Tag column
                                            Rule :   Constant
                                            Code:    /SAPDMC/LTXTL-TEXTFORMAT = 'L'.
                        TEXTLINE                     Text Line
                                            Source:  LONGTEXT-TEXTLINE (Text Line)
                                            Rule :   Transfer (MOVE)
                                            Code:    /SAPDMC/LTXTL-TEXTLINE = LONGTEXT-TEXTLINE.
    At the end it displays the following:
    LSM Workbench: Convert Data For EPROC_PILOT, MASTER, TEXT_UPLOAD
    2010/02/01 - 10:14:25
    File Read:          EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.read
    File Written:       EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
    Transactions Read:                    1
    Records Read:                         1
    Transactions Written:                 1
    Records Written:                      2
    Can anyone tell us what the problem could be?
    Regards
    Channappa Sajjanar

    Hi, thanks for your reply.
      I ran all the steps.
      When I run the program, it gives the following message:
    Legacy System Migration Workbench
    Project:                              EPROC_PILOT     eProcurement Pilot
    Subproject:                           MASTER          Master data Upload / Change
    Object:                               TEXT_UPLOAD     Basic long text 1line
    File :                                EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
    Long Texts in Total:                  1
    Successfully Transferred Long Texts:  1
    Non-Transferred Long Texts:           0
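    One way to narrow this down (a diagnostic sketch only; the text ID 'GRUN' for the basic data text and the padded material number are assumptions you should check against your own conversion output): the text must arrive under OBJECT = 'MATERIAL', the correct text ID, and NAME equal to the material number, normally padded with leading zeros to 18 characters. A quick READ_TEXT call in SE38 shows whether the text really exists:
      REPORT z_check_material_text.

      DATA: lt_lines TYPE STANDARD TABLE OF tline,
            lv_count TYPE i.

      " Hypothetical key - use the OBJECT/ID/NAME values from your LSMW conversion
      CALL FUNCTION 'READ_TEXT'
        EXPORTING
          object    = 'MATERIAL'
          id        = 'GRUN'                  " basic data text ID (assumption)
          language  = sy-langu
          name      = '000000000000012345'    " material number padded to 18 characters
        TABLES
          lines     = lt_lines
        EXCEPTIONS
          not_found = 1
          OTHERS    = 2.

      IF sy-subrc <> 0.
        WRITE: / 'Text not found - check the OBJECT/ID/NAME used in the conversion.'.
      ELSE.
        DESCRIBE TABLE lt_lines LINES lv_count.
        WRITE: / 'Text exists with', lv_count, 'line(s).'.
      ENDIF.
    If READ_TEXT finds the text but MM03 does not show it, the NAME is probably not padded the way MM03 expects.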

  • BW upgrade EHP1, data uploads should stop?

    Dear experts,
    we have planned a system upgrade. The current system is BW 7.0 SP 17, and we now plan to go to EHP1 and SP9.
    I know there are some post-upgrade activities, which include consistency checks for objects (InfoObjects, transfer rules, cubes, DSOs, etc.).
    Could someone please confirm whether we need to stop the data uploads / process chains during the system upgrade?
    Thanks in advance!
    Best Regards,
    Mannu

    Hi Ingo,
    RSRT was giving proper results. We have now implemented a few SAP Notes and the issues got resolved.
    The following are the notes:
    1499233 - MDX:bXML flattening, unbalanced hierarchy, empty columns
    1485648 - MDX: bXML flattening and hierarchies and displaced columns
    1446245 - MDX: Error when RSR_MDX_BXML_GET_GZIP_DATA is called
    1441767 - MDX: No data for bXML if only ALL member is requested
    1438091 - MDX: basXML: Object MEASURE 0 not found
    1435844 - MDX:Wrong no. decimal places for basXML flattening interface
    1432162 - MDX: Flattening problems when using hierarchies
    1420169 - MDX: bXML flattening: Subsequent note
    1411491 - MDX: bXML flattening in transac. MDXTEST: Selecting packages
    1404328 - MDX: bXML flattening and PROPERTIES: Columns overwritten
    Thanks for your inputs.
    Regards,
    shesha.

  • Tables used in MM Master Data Upload using LSMW or BDC?

    Hi,
    Can anyone provide me with the following information about uploading MM master data using LSMW or BDC:
    1. Which tables are used for uploading the material master, vendor master, info records, open POs, open PRs, RFQs, and open contracts/agreements?
    2. What problems are faced during data upload?
    3. What errors appear / are encountered during the upload?
    4. What is the difference between LSMW and BDC? Both can be used for data upload, so how do they differ?
    5. Is there anything else to remember/know during a data upload?
    Thanks,
    Lucky

    Hi Lucky,
    Don't get angry when you see my response.
    Each and every question you posted is already available in this forum; you just need to search for the individual posts. Whatever you posted, you can definitely get the solutions from others; remember, most of these answers can be fetched from existing threads (60-70%), and only a few are written from scratch.
    So it is better to search the forum first, and only post if you are not satisfied with what you find.
    This is just friendly advice.
    And reward points for helpful answers.
    Thanks
    Naveen khan

  • Best data upload protocol.

    Hi everyone,
    We are managing a JSP-based application in which we upload data using the HTTP protocol. Daily data uploads are in the GB range. The problems we are facing are:
    Memory heap size.
    Request timeout errors on our web server.
    No resume support in case of a network error.
    Now we want to upgrade our application to get the following benefits:
    Get rid of the above-mentioned problems.
    Reduce data upload time.
    Resume support in case of network errors.
    I have the following questions in this regard:
    What should the data upload protocol be: HTTP or FTP?
    What should the application interface for the data upload module be, e.g. a Java applet, JSP, or a Java desktop application?
    To resolve the memory heap size errors we are using a cron job to restart the web server; is that OK, or is there a better solution?
    Your valuable advice is required to make the correct decision.
    Regards.

    There is no "best". What CAN you use? If FTP is an option (and make it SFTP), then of course go for that; then you don't have to be bothered with the clunky HTTP file upload.
    > What should the application interface for the data upload module be?
    How about a proper existing SFTP client? There is absolutely no reason at all why that should be anything Java-related.
    > To resolve the memory heap size errors we are using a cron job to restart the web server; is that OK, or is there a better solution?
    How about finding and fixing the source of your memory leaks?

  • Data upload error during uploading data in catalogs ( CCM 2.0 )

    Hi ,
    I am having a problem uploading data into the catalogs of CCM 2.0. I think there is some problem with my file. When I click "Upload Schema" it uploads only the schema, not the data; but if I remove the schema from the file, it gives me an error.
    Could anyone send me a sample file for data upload in CCM 2.0, or the steps to upload data correctly?
    Thanks & Regards,
    Kamal

    Hi,
    I don't understand very well; the standard characteristic for images is /CCM/PICTURE.
    It is a complex characteristic composed of
    /CCM/DESCRIPTION,
    /CCM/URL and
    /CCM/MIME_TYPE (for example image/jpeg).
    If your problem still exists, you can send me your file.
    Regards

  • BOM Data upload

    Hi,
      We plan to create new fields in the BOM header through the INCLUDE structure CI_STKO and upload data into these fields through LSMW. However, the issue is that the CI_STKO structure is not included in the batch input structure BICSK used by LSMW, so the fields added to the BOM header through CI_STKO cannot be loaded through LSMW.
    This works fine at the item level, since CI_STPO is included in BICSP.
    Can anyone let me know if they have faced a similar problem and are aware of a solution?
    Thanks,
    Vikas

    Hi,
    Instead of LSMW, go for BDC, or otherwise for a BAPI. There is also an enhancement concept: using it you can enhance the standard BAPI and then use it for the upload:
    BAPI_MATERIAL_BOM_GROUP_CREATE

  • Data upload without XI/PI

    Dear Experts,
    I am stuck with a problem and I hope you will be able to help me.
    Scenario:
    The client doesn't have web services, i.e. XI/PI, and they want to sync data from another application to SAP. The application extracts an XML file to a file server, which I then need to pick up and upload into the SAP system.
    Problems:
    Is it possible to pick up the file from the file server without XI/PI? (If yes, then how?)
    How do I convert the file from XML to a flat format once the file is picked up?
    Once the data upload succeeds or fails, there should be some log file for confirmation.
    This is the overview of the problem; once I get past it I will probably have some other questions to ask.
    Any light on this is highly appreciated; thanks in advance.
    Waiting for your replies.
    Cheers!
    Moderator message: duplicate post locked.
    Edited by: Thomas Zloch on Nov 6, 2011 3:55 PM

    Hi Neeru,
    Is it possible to pick up the file from the file server without XI/PI? (If yes, then how?)
    Yes, it is possible.
    How to convert the file from XML to a flat format once the file is picked up?
    There are some standard function modules available.
    Once there is success/failure of the data upload there should be some log file for confirmation.
    Schedule a job and collect the log.
    Regards,
    Madhu.
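
    As a rough sketch of the first two points (assumptions: the file share is mounted so the XML is reachable under an application server path, and the path used below is purely hypothetical), the file can be picked up with OPEN DATASET and the XML string then handed to a parser. SMUM_XML_PARSE is one function module often mentioned for flattening XML into a name/value table, but check its availability and interface in your release before relying on it:
      REPORT z_read_xml_from_server.

      DATA: lv_file TYPE string VALUE '/usr/sap/trans/data/inbound.xml', " hypothetical path
            lv_line TYPE string,
            lv_xml  TYPE string,
            lv_len  TYPE i.

      " 1. Pick the file up from the file server / application server - no XI/PI needed
      OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF sy-subrc <> 0.
        WRITE: / 'File could not be opened:', lv_file.
        RETURN.
      ENDIF.

      DO.
        READ DATASET lv_file INTO lv_line.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        CONCATENATE lv_xml lv_line INTO lv_xml.
      ENDDO.
      CLOSE DATASET lv_file.

      " 2. Convert the XML to a flat structure here (e.g. SMUM_XML_PARSE or manual mapping),
      "    then post the data and write a log entry for point 3.
      lv_len = strlen( lv_xml ).
      WRITE: / 'Read', lv_len, 'characters of XML.'.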

  • BW Archiving run Locks data upload

    Hello BW Experts.
    We are working with archiving, but we have one problem: when we load data into an InfoProvider, the data load fails due to archiving run locks.
    If we run the delete step of the archiving run, then we can upload data to the InfoProvider without any issues.
    But we do not want to perform the delete step immediately; we want to do it after a few days. This, however, causes a big issue for data loading in the production system.
    Is there any program to unlock an archiving session?
    Is there any other process to unlock the tables?
    I know that we can unlock an archiving session through table RSARCHREQ, but I feel this is not the right way to do it.
    Any help regarding this is appreciated.
    Thanks and Regards
    Venkat

    Hi Harsh,
    The archiving process has two steps: the archiving step, which writes the archive file, and the second step, which deletes the data. The data target is locked until the second step is completed.
    If you do not wish to delete the data yet and want to release the locks, then change the archiving process status to invalidate the archiving run (transaction SARA).
    I hope this helps,
    Mike.

  • Data upload fails because of line breaks in data

    Hi experts,
    I'm facing an upload problem from R/3 to BW because of line breaks and tab characters in the R/3 data. The data can be loaded into the PSA, but it is not possible to load it into the ODS.
    Do you have any ideas how to permit the line breaks, or how to remove them during the upload?
    Any hints are welcomed,
    Regards
    René

    For the InfoObject in question you can create a global transfer routine. When you open the InfoObject, on the General tab you'll find a section "Transfer routine". You can create a generic transfer routine there in which you filter out the unwanted characters.
    You may want to use the ABAP command CONDENSE.
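    A minimal sketch of such a routine body (assuming the field is character-like and the standard RESULT variable of the transfer routine is used; adjust the variable name to your routine's interface):
      " inside the global transfer routine of the InfoObject
      REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>cr_lf          IN result WITH ' '.
      REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>newline        IN result WITH ' '.
      REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>horizontal_tab IN result WITH ' '.
      CONDENSE result.   " squeeze the remaining multiple blanks into single blanks
    CONDENSE on its own removes leading blanks and compresses blank sequences, but it does not touch tabs or line-break characters, which is why the REPLACE statements come first.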

  • Query for monitor data upload

    Hi, Experts
        Normally in a cube we only have the request ID, which contains nothing but a number (no request date, time, selection, type of data upload, ...).
        Can I build a BEx query that shows the same information as the cube's Manage screen? We have to check whether a duplicated selection request has not been deleted, or whether a request is missing, in the case of multiple DataSources feeding one cube.
        I cannot find any useful query among the BW statistics queries.
    Thanks in advance.

    I also cannot find enough information in table RSMONICDP.
    In our case, cube 0COOM_C02 has lots of InfoSources; some are full uploads and some are delta uploads. All of the InfoPackages are scheduled in one process chain.
    When I went through the log of this process chain, I found that errors occurred on some days, so sometimes the process chain did not finish. That means cube 0COOM_C02 has missing requests and duplicated requests.
    It is hard to find all of the problem requests via the cube's Manage screen, because there are so many requests and the window is so small. So my question is: is there any BEx query or BW table that shows information similar to the cube's Manage > Requests tab,
    so that I can analyse it in Excel? That would be much easier for me.
    Thank you all
