Delta INIT problems in production system

Hi,
How do you do? I hope all is fine. I have a problem in the production system: I executed the delta init and the data is in BW, but when I want to activate the data the system gives me a processing error. Can somebody help me, please?
BR

Hi
Please check with your Basis team to confirm there are enough resources.
Also check the maximum work process runtime: transaction RZ11, parameter rdisp/max_wprun_time.
Check this thread, which mentions a lot of related notes:
Link: Re: DSO-data activation issue
Regards
Sanjyot

Similar Messages

  • Data Mismatch problem in Production system

    Dear Experts,
    We have been encountering serious data mismatch problems with our source system since 29th Dec 2010.
    We only realized this mismatch on 11th Jan 2011, so we deleted all data from 29th Dec 2010 to 11th Jan 2011 and then tried to load the delta again as per our process chain sequence. Since then our production system has become very inconsistent: the entire data from 29th Dec 2010 to date has started showing mismatches.
    Our data has been damaged drastically. If I delete any bad requests from our cubes and DataStore objects, the next time we try to load data the cube and DataStore objects report that the deleted request number has not been updated properly. Because of this, our process chain fails daily and produces many technical errors. We are at a loss as to how to restore our data from 29th Dec 2010 to date.
    Can anyone please help us restore our data fully and get the process chain running successfully again?
    Thanks in Advance,
    Suman

    Hi Suman,
    I understand that you are using the 3.x version of BW.
    Such issues occur if you disturb the delta sequence by deleting requests without first setting them to red.
    You can resolve this in two ways:
    1) Go to the reconstruction tab of the DSO and the cube, select all the requests from 29th Dec 2010 through 11th Jan 2011 and reconstruct them. Once they appear in the request tab, set all the requests to red (set the QM status to RED) one by one and delete them from the cube and DSO.
    This should set the delta status back to the 29th, and your next delta will run as a repeat delta from that point.
    2) Go to RSRQ and enter, one by one, the SIDs or request IDs the system reports (the request numbers that were "not updated properly").
    Set the QM status to RED for all of them.
    Since you have already deleted them from the targets, just changing the QM status to RED will be sufficient.
    You have to set the QM status to RED in order to inform the system that the request is incorrect and has been deleted.
    Once it is set to RED, the system will no longer prompt a request and say "request no. # not updated properly".
    I suggest the second option.
    Once you resume the delta load, it should run successfully. If you feel some data is still missing from 29th Dec 2010 through 11th Jan 2011, just run a full repair load to get the data back.
    If your DataSource uses LO extraction, you may need to fill the setup tables to perform the full repair.
    Please let me know whether the DataSource uses LO extraction, and also let me know once the issue is resolved.
    Regards,
    Sudheer.
    Edited by: Sudheer Kumar Kurra on Jan 22, 2011 7:58 AM
    Edited by: Sudheer Kumar Kurra on Jan 22, 2011 8:03 AM

  • Early Delta Initialization option in Production system

    Hi Experts,
    I have seen some threads related to Early Delta initialization in this forum. I have a question regarding this option.
    To minimize the downtime/blocking of users in the R/3 production system during the setup table fill-up, I would like to use the early delta option. Please give your suggestions on the following procedure:
    For Sales Orders, billing and Deliveries
    1. Initialize Early delta in the BI system.
    2. Go and fill setup tables
    3. Do a full load to BI
    4. Run delta job in R/3 (LBWE)
    5. Run delta job in BI Info package
    Do we miss any documents in this process, e.g. changes to existing documents or newly created sales orders during the setup table fill-up?
    I feel we won't miss anything: all changes will be reflected in the delta queue, and newly created orders will also come through the delta queue.
    Please let me know if I am wrong.
    Thank you.

    I need clarification on this thread.
    According to my understanding, if a user posts anything while the setup tables are being filled, the data available in BW will be incorrect.
    Am I right?
    Regarding the 4th step in your sequence,
    "4. Run delta job in R/3 (LBWE)": this means you are using the queued or unserialized V3 update.
    Early delta initialization is then essentially of no advantage because, as with conventional initialization procedures, you can readmit document postings after successfully filling the setup tables.
    Can anybody clear my doubt: is it possible for a user to post a document while the setup tables are being filled?
    I know the documents will be collected in the queue, but the data loaded to BW will be incorrect, right?
    Regards,
    Anand.
    Edited by: araj123 on May 26, 2010 11:41 AM

  • Proxy generation Problem in Production system

    Hi Experts,
    I have consumed a WSDL file and created a client proxy in the development system, which is working fine.
    After testing, it was transported to both the quality system and the production system.
    In the quality system the proxy was generated and is functioning properly.
    But in the production system the proxy was not generated, and it is not even displayed in transaction SE80.
    How to solve this problem?
    Please suggest.
    Thanks in advance.
    Regards
    Jagesh

    Hello Experts,
    After transporting the request to the production system, all Dictionary objects were created, but only the proxy was not created (it does not appear in SE80 under the respective package name).
    Please help me in this.
    Regards
    Jagesh

  • RE: Workflow binding problem in Production system (urgent)

    Hi Experts,
    We recently transported our workflows to the production server. When one of the users tested the workflow it failed. On analysis I found that the data is not passed from the method container to the task container, although the method works fine on its own. I also tried executing transaction SWU_OBUF, but it is still not working.
    Please let me know the solution.
    Thanks,
    Bharath

    Check the linkages: is the binding activated between the workflow and the task container?
    Check whether values appear in the task container.
    Then check the binding between the task container and the method container. Is there an event linked with the task, and is the event linkage activated?
    The problem should be resolved once all the linkages are correct.

  • ZPCA_C03, delta reset problem

    Hi All,
    I have a problem with the delta loads to ZPCA_C03.
    We did monthly full loads up to March and set the delta init flag during a system downtime.
    Later we happened to reload the December data into the cube. Since the cube then contained the December data twice, we did a selective deletion to remove the doubled December data, which was wrong. Now the delta is not working.
    How can we resolve this issue, ideally without downtime?
    We need the December data back and the ongoing delta to be reset.
    Regards
    Raveena

    Hi,
    I need a clarification on this.
    Was the uploaded data successfully loaded to the external system or not?
    Are you uploading the data to the cube or to something else?
    Regards,
    Srini

  • Using COMPRESSION: any problem in Production?

    Hi All,
    We are working on the SD application in BI 7.0.
    We plan to use compression in the SD application area for the Sales Cube (VBAK and VBAP), the Delivery Cube (LIKP and LIPS), the Billing Cube (VBRK and VBRP), and the Shipping Cube (VTTK and VTTS).
    The InfoProviders are used in an InfoSet. The scenario below occurs in the data loads of the InfoProviders, which is why we decided to use compression:
    Sales Cube                          Delivery Cube
    Sales Doc   Sales Qty               Delivery Doc   Del Qty   Sales Doc
    1           11                      1              11        1
    1           -10
    1           10
    InfoSet (before compression):
    Sales Doc   Delivery Doc   Sales Qty   Del Qty
    1           1              11          11
    1           1              -10         11
    1           1              10          11
    The report developed on this InfoSet will show Sales Qty = 11 but Delivery Qty = 33, which is incorrect.
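    To make the fan-out explicit, here is an illustrative sketch in plain SQL. The table and column names (sales_cube, delivery_cube, sales_qty, del_qty) are hypothetical stand-ins for the cubes, not how BI actually stores or reads them; it only shows why the single delivery row gets counted once per uncompressed sales row.
    -- Hypothetical tables standing in for the two cubes:
    --   sales_cube(sales_doc, sales_qty)                 rows: (1, 11), (1, -10), (1, 10)
    --   delivery_cube(delivery_doc, sales_doc, del_qty)  row:  (1, 1, 11)
    -- The join matches the one delivery row to each of the three sales rows,
    -- so SUM(sales_qty) = 11 but SUM(del_qty) = 3 * 11 = 33.
    SELECT s.sales_doc,
           d.delivery_doc,
           SUM(s.sales_qty) AS sales_qty,
           SUM(d.del_qty)   AS del_qty
    FROM   sales_cube s
    JOIN   delivery_cube d
           ON d.sales_doc = s.sales_doc
    GROUP  BY s.sales_doc, d.delivery_doc;
    -- After compression the three sales rows collapse into a single row (1, 11),
    -- so the same join returns Del Qty = 11, as expected.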
    To avoid this incorrect data, we plan to use compression so that the InfoSet gets the data as shown below.
    InfoSet (after compression):
    Sales Doc   Delivery Doc   Sales Qty   Del Qty
    1           1              11          11
    The report developed on the compressed InfoProviders will show Sales Qty = 11 and Delivery Qty = 11, which is correct.
    Please advise whether using compression creates any problems in production, or whether there is another way.
    Thanks in advance.
    LNV

    Hi,
    Thank you for the reply above, but as a precaution I would like to know: in which scenarios does compression of an InfoProvider create problems in the production system?
    Thanks in advance.
    LNV

  • Problem with Delta IP on 0TCT_DS23 in Production system

    Hi all,
    I'm encountering quite a tricky problem. I'm currently working on the 0TCT_DS23 DataSource, with the technical content cube.
    I can run an init InfoPackage on all three of our environments (development, quality, and production), and the same goes for full packages. But when I try to run a delta InfoPackage, it fails on the production system with a dump of the extractor.
    Looking at ST22, it says there is a syntax error in program "/BI0/SAPLQI0TCT_DS23".
    Causes
        Error in the ABAP Application Program
        The current ABAP program "SAPMSSY1" had to be terminated because it has
        come across a statement that unfortunately cannot be executed.
        The following syntax error occurred in program "/BI0/SAPLQI0TCT_DS23 " in
         include "/BI0/LQI0TCT_DS23$02 " in
        line 7:
        "A FUNCTION already exists with the name "/BI0/QI0TCT_DS23"."
        The include has been created and last changed by:
        Created by: "SAP* "
        Last changed by: "******"
    When browsing the program, this is really the case: in the include /BI0/LQI0TCT_DS23UXX there are three includes,
    *   THIS FILE IS GENERATED BY THE FUNCTION LIBRARY.             *
    *   NEVER CHANGE IT MANUALLY, PLEASE!                           *
    INCLUDE /BI0/LQI0TCT_DS23U01.
                        "/BI0/QI0TCT_DS23
    INCLUDE /BI0/LQI0TCT_DS23U02.
                        "/BI0/QI0TCT_DS23
    INCLUDE /BI0/LQI0TCT_DS23U03.
                        "/BI0/QI0TCT_DS230001
    U01 and U02 contain the same function module, so it fails.
    Have you ever encountered the same problem, and if so, how did you solve it? I already tried reactivating the DataSource, but with no luck.
    Thanks in advance
    Frederic

    Hi Ralf,
    Thanks! It worked! I had to repair the function module. I have assigned the points.
    Edited by: Frédéric GAUTHIER on Jan 29, 2010 11:12 AM

  • How to make JOB CONTROL settings for delta extraction in Production system

    Hello All,
    We are in the process of transporting our development to the PD (production) server in BI. We have transported the DataSources to ECC production and are now filling the setup tables for "SD Billing BW". After loading the setup table data in full mode through an InfoPackage, we have to load data in delta mode.
    How can we set the update mode (LBWE setting) on the PD server?
    Is authorization for transaction LBWE required in the ECC PD server to set the update mode, or do we have to set all job control parameters in the development system at the time we transport the DataSource request to ECC production? Will the time set in job control in ECC development be reflected in ECC production?
    Can anybody tell me how to make the settings for delta extraction in the production system?
    Thanks ...

    Hi,
    How do you load the data in the development system?
    - You set the update mode in the InfoPackage; there you must choose "Initialization".
    - Did you fill the setup tables in the production system without loading the data into BW? Then this step was for nothing. Filling the setup tables and running the init in BW must be done together, during a period in which no changes are made in your R/3 SD.
    Sven

  • Problem during import of 121 transport requests to productive system

    Hello
    We have a problem during the import of transport requests into the productive system. The import of 121 transport requests stopped very early in phase "N" (in TRBAT I have only one entry, and likewise in TRJOB).
    In SM50 there is a BGD process running under user DDIC in client 000, by now for 14,453 seconds (program SAPLSDB2). This should be the import.
    In SM37 I can see it as job "RDDGEN0L" with report "RDDGENBB". Based on some literature, it should perform "Converting all structure changes generated by the import and recorded in table TBATG, other than the structure changes to matchcode objects." Interestingly, TBATG has only four entries: two for indexes on table "DFKKOPK", one for table "DFKKREP06", and one "ENQU" for "EFKKEXC" (only this last one does not have status error).
    For the first two indexes I know they are not present as objects "LIMU" "INDX" in any of the transport requests being imported.
    Also, on the productive system there are no "VOL" and "ZOL" indexes for table "DFKKOPK" (they were instead created on the test system, i.e. not transported from development to the test system).
    Last command for that process is: CREATE INDEX "DFKKOPK~HKO" ON "DFKKOPK" ("MANDT", "HKONT", "OPBEL") PCTFREE 10 INITRANS 002 TABLESPACE PSAPTR3 STORAGE (INITIAL 0000000016 K NEXT 0000000016 K MINEXTENTS 0000000001 MAXEXTENTS UNLIMITED PCTINCREAS"
    There is enough space on disk and in the tablespaces (it is an Oracle/HP-UX server).
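    For reference, one way to check from the Oracle side whether the index build is still making progress: a sketch only, using standard dictionary views (DBA privileges are required, and this is purely diagnostic, not a fix).
    -- Which indexes already exist on DFKKOPK, and is any of them UNUSABLE?
    SELECT owner, index_name, status
    FROM   dba_indexes
    WHERE  table_name = 'DFKKOPK';

    -- Is the index build still doing work? Long-running operations such as
    -- CREATE INDEX are reported here with the work done so far.
    SELECT sid, opname, target, sofar, totalwork, time_remaining
    FROM   v$session_longops
    WHERE  totalwork > sofar;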
    Does anyone know a workaround to get past this on the production system?

    Are you importing these transport requests simultaneously into production?
    I would suggest you try importing them in smaller groups of 5 or 10 and then see whether you are able to import the requests.
    Rohit

  • Problem of unit of measurement in production system

    Hi Friends,
    I'm facing a peculiar unit of measurement problem: there are errors in all quantity and currency fields in the report output after transporting a query from the quality system to the production system. The query is generated from a MultiProvider. What might be the problem?
    Regards,
    Srinivas.

    Hi Dave,
    Thanks for your reply, but when tracing in the monitor it does not show any error message.
    My mail ID is [email protected]; please send the document.
    Regards,
    bharat

  • Problem when transporting form from DEV system to PRODUCTION system

    Hi Experts,
    We are developing forms in ABAP. For example, there is a form developed in the DEV system that has JavaScript coding in some UI element events such as initialize, on change, and on exit; it also has one script object defined as a variable. This script object has functions defined for some common validations as well as field-specific ones.
    The form works fine in DEV: all the function calls, validations, and events work properly as expected. But there is a problem after we moved the changes from the development system into the production system. Specifically, the script object has the problem: even though the coding is the same in both systems, in production we get a script error, "Body.CATALOGPARAMS has no properties", as if Body.CATALOGPARAMS was never instantiated or is not defined.
    The code that produces this error is the following:
    var itemCount = 0;
    itemCount = Body.CATALOGPARAMS.DATA.instanceManager.count;
    CATALOGPARAMS is a table defined in the form context, coming from an ABAP function module where it is filled and passed into the form.
    In order to fix this problem I changed that part to the following:
    var itemCount = 0;
    var catalogTable = null;
    catalogTable = xfa.resolveNode("Body.CATALOGPARAMS.DATA");
    itemCount = catalogTable.instanceManager.count;
    This works just as well as the other version in the DEV system. But my question is: will I have the same problem when we transport the changes to the production system? Bear in mind that a transport is not something you can do every day, so I am taking precautions before the transport. Which of the two versions of the coding is better for this?
    Any observations, comments, or questions to clarify these points are welcome.
    In advance, thanks a lot.
    Mauricio.-
    Edited by: Mauricio Poblete on May 11, 2010 4:20 PM

    As always, you are the first one to reply... thanks for that!
    Before anything else I activated the form, then I added it to a new transport using transaction SE80: I navigated through the form objects and added the form to a new transport by right-clicking on the form -> Other Functions -> Write Transport Entry. Is this the correct way to assign a transport with the entire form (including script objects, layouts, and everything you mentioned in your last reply)?
    Can you give me a guide on how to add the specific parts of a form to the same transport?
    as always, thanks in advance.
    Mauricio.-

  • If the delta is broken in my R/3 production system

    Hello All,
    How do I initialize the delta if the delta is broken in my R/3 production system? The delta feeds records into more than one ODS. Once it is set up again, how can I check that not a single record has been missed?
    Is there any procedure to take care of the existing records in BW production so that we get all the records from R/3 to BW?
    Thanks,

    Hello Sandeep,
    What do you mean by "Is there any procedure to take care of the existing records in BW production so that we get all the records from R/3 to BW"?
    Regards,
    Jeannie

  • Deleting objects from production system

    Hello All,
    I need to delete some InfoObjects, ODS objects, and InfoCubes from my production system.
    For example:
    We have inserted InfoObject ZXX as a data target, so it is available under InfoProviders.
    For this, we have update rules ZYYY.
    In the InfoProvider tab I right-clicked on the InfoObject ZXX and selected the "Remove InfoObject as data target" option, so the update rules and the InfoObject were removed from the InfoProvider tab.
    I also deleted the InfoSource ZYYY.
    When deleting the InfoObject ZXX from the InfoObject tab, the system says "the update rules ZYYY still exist and hence the data cannot be deleted".
    Please let me know how I can delete the InfoObject and the update rules.
    Any help will be appreciated.

    Hi,
    Make sure you have the proper authorizations to delete the objects. If you do, try logging on again. Also check for data flow objects upward from this object.
    Edited by: P. Saravana Kumar on Apr 1, 2009 6:01 PM

  • Best practice to reduce downtime for full load in Production system

    Hi Guys,
    We have options like "Initialization without data transfer" and "Initialization with data transfer".
    To reduce the downtime of the production system for loading the setup tables, I will first trigger an InfoPackage for initialization without data transfer so that the delta pointer is set on the table; from that point onwards any new record is added as a delta record. I will then trigger a delta InfoPackage to get the delta records into BW. Once the delta is successful, I will trigger an InfoPackage for the repair full request to get all the historical data from the setup tables into BW, so that the downtime of the production system is reduced.
    Please let me know your thoughts and correct me if I am wrong.
    Please also let me know about the "Early Delta Initialization" option.
    Kind regards.
    hari

    Hi,
    You have some wrong information.
    InfoPackage: it just loads data from the setup tables into the PSA.
    Setup tables: they need to be filled manually using the related transaction codes.
    I am assuming you are using an LO DataSource.
    In this case a source system lock is mandatory; otherwise you need to go with the early delta init option.
    Early delta init: it is useful for loading data into BW without downtime at the source.
    That means it sets the delta pointer and at the same time loads data according to your settings (init with or without data transfer).
    If the source system cannot be locked as the client requires, it is better to go with the early delta init option.
    Thanks
