Full upload between ODSs

Hi,
I have two identical ODS objects (ODS1 and ODS2). I am doing a full upload from ODS1 to ODS2. ODS1 has approximately 2.5M records. The load ran well for a while and then froze; I couldn't even open a new session or execute a transaction.
Can someone send me a solution for this ASAP?
Can we do a full upload for such a huge set of records?
Best Regards,
Ravi.

Hi,
I suggest you check a few places where you can see the status:
1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): give the request name and it should show you the details of the request. If it's active, make sure the job log is being updated at frequent intervals.
Also see if there is a 'sysfail' for any data packet in SM37.
2) SM66: get the job details (server name, PID, etc. from SM37) and see in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it's a datamart load). See whether it is accessing/updating some tables or not doing anything at all.
3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
4) ST22: check whether any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
5) SM58 and BD87: check for pending tRFCs.
Once you identify the problem you can rectify the error.
If all the records are in the PSA you can pull them from the PSA to the target; otherwise you may have to pull them again from the source InfoProvider.
If it's running and you can see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, such as reading from or inserting into tables.
If you feel it's active and running, you can verify this by checking whether the number of records in the data tables has increased.
SM21 (the system log) can also be helpful.
Thanks,
JituK

Similar Messages

  • ODS option - Is it going to be full upload or init delta?

    Hi friends,
    I have a query where the scenario goes like this:
    data is loading from an ODS to an InfoCube. Suppose that in the ODS settings I have selected the option 'Update data targets from ODS object automatically'. My question is whether this will be an init delta or a full upload. Please explain the scenario.
    thanks in advance
    regards,
    dubbu.

    Hi Vinit,
    This depends on the load into the ODS:
    if the load into the ODS is an init, the further upload will be init;
    if the load into the ODS is a delta, the further upload will be delta.
    Reward if useful
    Manish

  • Differences between delta and full uploads of transactional data

    Hi All,
    This is with reference to loading transactional data into the cube.
    I understand that delta and full uploads of transactional data into the cube are done for the following reasons:
    1. A delta upload allows changes in transactional data (add/delete/modify) for the particular period selected, but this is not the case for a full upload.
    2. A full upload takes less time compared to a delta upload for the particular period.
    Please let me know whether my understanding is correct. Otherwise, please explain these two concepts with examples of loading transactional data into the cube.
    Regards,
    Ningaraju

    hi,
    A full upload is done to save loading time: if you use initialization, it takes more time to load the same amount of data compared to a full load.
    Your statement about delta is correct, however.
    You can perform a delta only after a successful initialization.
    So:
    1. First a full load (to save time).
    2. An initialization without data transfer (zero records).
    3. Delta.
    regards

  • Inconsistencies data between ODS and RSA3 in SAP system (delta problem??)

    Hi everyone,
    I have a problem in my SAP BW production system. There is inconsistent data between the ODS (using tcode LISTCUBE) and the DataSource in the SAP system (tcode RSA3).
    I'm using Datasource: 0FI_GL_10 and ODS: 0FIGL_O10.
    In SAP system (using RSA3):
    GL Account: 1202012091
    Comp Code : 5400
    Bus Area: 5401
    Fiscal Year :2008
    Period: 07
    Accum Balance : $0
    Credit Total: $350
    Debit Total: $350
    Sales : $0 
    And in ODS (using ListCube):
    GL Account: 1202012091
    Comp Code : 5400
    Bus Area: 5401
    Fiscal Year :2008
    Period: 07
    0BALANCE : $350  (it should be $0,shouldn't it? )
    0CREDIT: $0 (it should be $350,shouldn't it? )
    0DEBIT: $350
    0SALES: $350 (it should be $0,shouldn't it?)
    I have tried these :
    1. Check if there is a start routine / routine or not in transfer or update rules....and...i get nothing
    2. Check the data using tcode FBL3N..but the data is ok
    Then I tried to trace the process chain and found errors on the delta InfoPackage which pulled these data:
    Date
    07.30.2008 : 231141 records --> error (ODS activation) --> request deleted permanently
    07.31.2008 : 2848 records   --> error (ODS activation) --> request deleted permanently
    08.01.2008 : 135679 records --> error (ODS activation) --> request deleted permanently
    08.02.2008 : 135679 records --> successful repeat delta action
    From 30 July until 1 August there were errors on the delta InfoPackage because
    ODS activation failed (No SID found for value 'RIM' of characteristic 0UNIT). On 2 August someone deleted the red requests, repeated the delta, and it succeeded. Is there any possibility that this problem arose because of this? If this is the cause, how can I bring back the 2848 and 231141 records which were deleted?
    Thank you and Regards,
    -Satria-

    Hi everyone,
    I have solved my problem. I executed my InfoPackage again (with a selection condition on the GL accounts which had wrong amounts) with update mode 'Full update', and chose Scheduler > Repair Full Request and ticked the checkbox. Thank you.
    Regards,
    -Satria-

  • Delta upload full upload

    Hi Gurus,
    A delta is already loading into the ODS every half hour, but I don't see any data for one field in the ODS. In that case I want to load the data again to see if I can get the data. How can I do a full upload while delta is enabled? Do I need to stop the delta and enable a full upload, or just run a full upload?
    any help appreciated.
    thanks

    hi S.K.B,
    I guess you enhanced a Business Content DataSource. Yes, you need to populate the newly added field in R/3
    via the user exit ZXRSAU01 (transaction data). Take a look at the
    sample code for the enhancement / user exit here:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/59069d90-0201-0010-fd81-d5e11994d8b5
    Also see these weblogs on enhancement:
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/ajay.das/blog/2005/03/28/custom-fields-in-standard-extractors--making-a-mixed-marriage-work--part-12
    (http://www.ko-india.com/content/weblogs/weblog_custom_fields_1.pdf)
    hope this helps.

  • Init Data Upload from ODS to Info cube

    Hi,
    When I am trying to upload the data from the ODS to the InfoCube using the initial upload option in the InfoPackage, I am not able to see the selection option fields.
    Whereas, using the same ODS and InfoCube, if I change the option to full upload in the InfoPackage, I get all the selection fields.
    What could be the reason for this?
    please help
    Rajiv

    Somnath,
    Let me explain the complete scenario:
    1) We have an existing ODS which populates an InfoCube on a daily basis (delta upload).
    2) We have created a new InfoCube which fetches data from the same ODS; that means two InfoCubes are populated from one ODS.
    3) When we tried to initialize the new InfoCube from the existing ODS, it asked us to delete the initialization options (Scheduler > Initialization Options for Source System), and we did so. But by doing this, the delta upload option was removed from the old InfoCube's InfoPackage, and the new InfoCube's initial upload is not displaying any of the selection fields.
    4) Now I have a query:
    - How can I activate delta on the old InfoCube (without losing any data)?
    - How can I upload data into the new InfoCube with selection fields and start the delta?
    Regards
    Rajiv

  • Full Upload

    Hi Experts,
    Would like to seek assistance on this...
    In the case of a full upload to InfoProviders, what is the normal loading strategy?
    Is it always to delete the existing requests before loading the records again?
    I know delta is very straightforward: after initialization, the subsequent delta updates take care of it.
    Pls advise.
    Alok

    Hi,
    The strategy depends on your data model and the type of data you are extracting.
    If you are using an ODS as a staging layer in BW, you need not delete the previous request.
    If you are using selections in the InfoPackages, you need not delete the previous full load request.
    If you are not using any selections in the InfoPackage and are updating InfoCubes, and the data contains some previously loaded records, then you need to delete the previous request.
    Hope this helps.
    Praveen

  • Error while making joins between ODS in a Infoset

    I am getting these two errors while making a join between ODSs in an InfoSet:
    1) Incomplete ON conditions in an outer join
    2) Invalid ON condition for left outer join (ZSDO015)
    Please help; what should I do?

    Hi,
    Check whether both ODSs have common InfoObjects, i.e. the same name, data type and length.
    http://help.sap.com/saphelp_nw04/helpdata/en/44/b21f98be886a45a4bb7e123e86e4c2/frameset.htm
    Thanks
    Reddy

  • Error occured while uploading to ODS.

    Hi,
    An error occurred while uploading to the ODS from R/3:
    Error message when processing in the Business Warehouse
    Diagnosis
    An error occurred in the SAP BW when processing the data. The error is documented in an error message.
    System response
    A caller 01, 02 or equal to or greater than 20 contains an error message.
    Further analysis:
    The error message(s) was (were) sent by:
    Second step in the update
    Procedure
    Check the error message (pushbutton below the text).
    Select the message in the message dialog box, and look at the long text for further information.

    Hi Varun,
    Thanks for the reply.
    I already checked it in transaction ST22; it's showing a short dump.
    ShrtText
        Exception condition "NOT_EXIST" raised.
    What happened?
        The current ABAP/4 program encountered an unexpected
        situation.
    What can you do?
        Print out the error message (using the "Print" function)
        and make a note of the actions and input that caused the
        error.
        To resolve the problem, contact your SAP system administrator.
        You can use transaction ST22 (ABAP Dump Analysis) to view and administer
         termination messages, especially those beyond their normal deletion
        date.
        This is especially useful if you want to keep a particular message.
    Error analysis
        A RAISE statement in the program "SAPLRSSM" raised the exception
        condition "NOT_EXIST".
        Since the exception was not intercepted by a superior program
        in the hierarchy, processing was terminated.
        Short description of exception condition:
        For detailed documentation of the exception condition, use
    And one more thing: I don't know about the locks. Can you please explain those locks?
    When I enter SM12 it asks for a
    lock argument and table name.
    What is the purpose of these things?
    thanks in advance
    regards
    venkat

  • Generic Delta: Init, Full Upload and Deltas...

    Dear All,
    Last week, on 10.04.2014, I created a generic DataSource with a generic delta, using CalDay as the delta-specific field, with the lower limit kept blank and the upper limit set to 1. On 11.04.2014 I initialized the delta with data transfer using a process chain, at 1:15 p.m. (Note: posting was on.)
    See the screenshot below:
    Result: the process chain was scheduled to run immediately; it executed, initialized the delta and brought records as well. See the RSA7 screenshot below:
    Q: 1: Why is the generic delta's current status 10.04.2014 when I scheduled the delta through the process chain on 11.04.2014 at 20:50, and why is the Total column showing 0 records? I even checked 'Display Data Entries' with delta repetition but could not find any records.
    The following is the InfoPackage which I created for the delta loads and used in another process chain for delta, scheduled daily at 20:50.
    The following is the DTP used in the delta process chain:
    On 12/04/2014, when I checked my InfoCube, I found that the same number of records had been loaded again despite the InfoPackage and DTP being created as delta; see screenshot:
    On 12/04/2014 See below PSA table:
    On 14/04/2014 when I checked InfoCube and I saw deltas being loaded successfully: see below screen shot:
    On 14/04/2014 see below PSA table:
    On 14/04/2014 see below RSA7 BW Delta Queue Maintenance screen shot:
    Q: 2: Why was I unable to load the delta records on the day (11/04/2014) when I initialized with data, and why was the same number of records loaded again on 11/04/2014?
    Q: 3: On 11/04/2014, around 1:15 p.m., I initialized the delta with data successfully, and the same day (11/04/2014) was my first delta load (time: 20:50). Could this be the problem, given that the posting period was open when I initialized and I tried to load the delta on the same day?
    Q: 4: Why is the pointer showing me Current Status: 12/04/2014 today, and why can I now find the last delta records from the 'Display Data' option with delta repetition?
    Q: 5: Have I missed something in the design or data flow?
    Q: 6: How can I verify the accuracy of the delta records on a daily basis; is it from RSA3?
    Or how can I verify the daily deltas against the ECC delta records?
    Q: 7: What is the best practice when you load full data, initialize delta, and schedule deltas for generic DataSources with CalDay as the delta-specific field?
    I will appreciate your replies.
    Many Thanks!!!
    Tariq Ashraf

    I clearly got your point, but suppose I have a situation like this:
    I loaded the LO data from R/3 with an init delta, then loaded five deltas as well. Due to some error in the data, I am now trying to reload all the data. In this case, if I do a full upload it is going to bring all the data up to the fifth delta load. If I do an init delta again, it is going to bring the same data as the full upload. In LO, while loading either a full load or a delta init load, we need to lock the R/3 users. I believe we can load deltas even after a full upload by using "Init delta without data transfer", so that won't be the barrier. Then what is the difference, or how should I figure out which load I should do? If I am wrong, please correct me.
    Thanks
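    The CALDAY pointer behaviour asked about in Q1/Q2 can be sketched as a toy model. This is only an illustration using the dates and the one-day safety upper limit from the post, not the real RSA7 implementation:

```python
from datetime import date, timedelta

def caldaydelta_selection(last_pointer: date, today: date, upper_limit_days: int = 1):
    """Simplified model of a generic delta on a CALDAY field.

    With a safety upper limit of 1 day, a delta run only selects records
    posted up to (today - 1). Records from 'today' are left for the next
    run, so a delta executed on the same day as the init can legitimately
    return 0 records, and the pointer can lag one day behind.
    """
    low = last_pointer + timedelta(days=1)            # day after the last pointer
    high = today - timedelta(days=upper_limit_days)   # safety-interval cut-off
    new_pointer = high if high >= low else last_pointer
    return low, high, new_pointer

# Init on 11.04.2014 leaves the pointer at 10.04.2014 (today minus the upper limit).
low, high, ptr = caldaydelta_selection(date(2014, 4, 10), date(2014, 4, 11))
# The range 11.04..10.04 is empty, so a delta that same evening brings 0 records.

# A few days later the delta selects 11.04..13.04 and moves the pointer to 13.04.
low2, high2, ptr2 = caldaydelta_selection(date(2014, 4, 10), date(2014, 4, 14))
```

    In this model the "missing" day is simply deferred to the next run, which matches the observation that the records showed up in a later delta rather than being lost.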

  • Full upload misses records while delta brings them

    Hi experts,
    I'm using the 2LIS_12_VCHDR extractor, and with delta extractions everything is working correctly. The problem comes when I want to do a full upload: it does not pick up the latest records (while delta did). It has not brought records into BW since the exact day I reloaded the 12 LIS setup tables.
    If I create a sales order, and this order is delivered, my delta infopackage brings it into BW, but my full infopackage is not bringing it.
    Any clues?
    Points will be given,
    Thanks

    My scenario for 2LIS_12_VCHDR is the following:
    - Full upload on Nov 13th (6 months ago); it fetched 170103 records.
    - I do another full upload today and I obtain 170103 records (the same as 6 months ago, although there are a lot more by now).
    - I run an init request > I create a delivery in ECC > I run a delta upload and it picks it up.
    - Then I do a full upload and it does not pick it up; I still have 170103 records.
    Do I have to refill the 2LIS_12 setup tables in ECC (OLI8BW)?
    On the other hand, I have 2LIS_12_VAITM working with a daily delta. How would refilling all of 2LIS_12 affect that extractor? Am I going to miss data? Am I going to duplicate data?
    Thanks

  • Deltas and full upload

    Hi all,
    Let's say we have a scenario like this.
    For document number 1001 and document item number 10,
    the order quantity was initially 200. This was transferred to the InfoCube. Now the order quantity has changed to 180. I assume that in a delta upload, -20 will be transferred to the cube. But for a full upload, what value gets transferred to BW? Is it 180? If so, how do these values get tallied? I mean, 200+180=380. What really happens here?

    hi Pooja,
    Initial load into the InfoCube:
    doc no  doc item  order quantity
    1001    10        200
    After the change to 180, a delta upload adds before and after images, so the records in the InfoCube will be:
    doc no  doc item  order quantity
    1001    10        200
    1001    10        -200
    1001    10        180   (value in report = 180)
    With a full upload instead:
    1001    10        200
    1001    10        180   (value in report = 380)
    hope this helps.
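    The arithmetic above can be sketched in a few lines. This is a toy model of my own (not SAP code): the InfoCube is treated as additive, and a report simply sums the key figure over all records for a document line:

```python
# Toy model of an additive InfoCube: records are (doc_no, doc_item, quantity)
# tuples, and a report sums the key figure for a given document line.

def report_value(records, doc_no, doc_item):
    return sum(qty for d, i, qty in records if (d, i) == (doc_no, doc_item))

cube = [(1001, 10, 200)]                        # initial load: quantity 200

# A delta sends a before image (-200) and an after image (+180):
cube_delta = cube + [(1001, 10, -200), (1001, 10, 180)]
print(report_value(cube_delta, 1001, 10))       # prints 180 -> correct

# A repeated full upload just adds the current value on top:
cube_full = cube + [(1001, 10, 180)]
print(report_value(cube_full, 1001, 10))        # prints 380 -> doubled
```

    This is why overlapping full requests have to be deleted from a cube before reloading, while the before/after images of a delta keep the additive total correct.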

  • Datamart full Upload

    Hi all,
    The FIGL and Inventory cubes load with delta records in BI, but whenever I load the same data into SEM it does not load with delta; every time it does a full upload only. Those cubes hold a huge amount of data, so this impacts the load. I even did an init without data transfer after the full upload, but it still does not load delta; every time it loads the whole data from both cubes. Please suggest how I can get delta working.
    Thanks
    Pinky
    Edited by: pinky reddy on Apr 25, 2009 9:31 AM

    Hi Pinky,
    Could you check the Update tab of your InfoPackages to see whether they are set to delta or to init?
    Thanks,
    Vijay.

  • I am running full upload daily now i want to load delta records how can i

    Hi Gurus,
    I am running a full upload daily; now I want to load delta records. How can I do that? Could anybody help regarding this?
    Regards
    Kiran Kumar

    Hello Kiran,
    First make sure that your DataSource is delta-enabled. If so, create a new InfoPackage for 'Initialize without data transfer' and execute it. Then create an InfoPackage for delta.
    Thanks
    Chandran

  • Delete and full load into ODS versus Full load into ODS

    Can someone please advise: in the context of a DSO being populated with the overwrite function, would the FULL LOAD in the following two options (1 and 2) take the same amount of time to execute, or would one take less than the other? Please also explain the rationale behind your answer.
    1. Delete of the data target contents of ODS A, followed by a FULL LOAD into ODS A, followed by activation of the ODS.
    2. A plain FULL LOAD into ODS A (i.e. without the preceding delete-data-target-contents step), followed by activation of the ODS.
    Repeating the question: will the FULL LOAD in 1 take the same time as the FULL LOAD in 2? If one takes less than the other, why?
    Context: a normal 3.5 ODS with separate change log, new and active record tables.

    Hi,
    I think the main difference is that with the process "delete and full load" you will get only the data from the full load inside the DSO. With a plain full load, the system uses the modify logic: overwrite for all data sets already in the DSO, and insert for all new data sets which are not yet in the DSO.
    Regards
    Juergen
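    To make Juergen's point concrete, here is a small sketch of my own (a simplification, not the real activation code) that models the DSO active table as a dict keyed by the DSO key:

```python
# Active table modeled as {key: value}. A full load into a DSO uses
# modify logic: overwrite existing keys, insert new ones.

def full_load(active, request):
    updated = dict(active)
    updated.update(request)   # overwrite-or-insert per key
    return updated

def delete_and_full_load(active, request):
    return dict(request)      # target wiped first: only the new request survives

active = {"doc1": 100, "doc2": 200}    # doc2 no longer exists in the source
request = {"doc1": 150, "doc3": 300}

print(full_load(active, request))             # {'doc1': 150, 'doc2': 200, 'doc3': 300}
print(delete_and_full_load(active, request))  # {'doc1': 150, 'doc3': 300}
```

    Note the functional difference: without the delete step, records that have disappeared from the source (doc2) survive in the DSO. Any runtime difference is a separate question from this correctness difference.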
