Selective load to ODS

Hi Friends,
I have a small query.
Scenario: We have an ODS getting delta data from R/3, and this ODS feeds a cube in delta mode. Some of the records are missing in BW.
We have identified the records that need to be loaded again.
How can we load this missing data from R/3 to the ODS and on to the cube?
Regards,
Shylaja.

hi Shylaja,
you can try a full repair:
http://help.sap.com/saphelp_nw70/helpdata/en/1b/df673c86d19b35e10000000a11402f/frameset.htm
You can use a request that was flagged as a repair request via Scheduler -> Repair Full Request to carry out a full update into any data target. This also applies to data targets that already contain data from an initialization run or deltas for this DataSource/source system combination, and that have overlapping selection criteria.
Note 739863 - Repairing data in BW
Symptom
Some data is incorrect or missing in the PSA table or in the ODS object (Enterprise Data Warehouse layer).
Other terms
Restore data, repair data
Reason and Prerequisites
There may be a number of reasons for this problem: errors in the relevant application, errors in the user exit, errors in the DeltaQueue, handling errors in the customer's posting procedure (for example, a change to the extract structure during production operation while the DeltaQueue was not yet empty, or postings before the delta init was completed), extractor errors, unplanned system terminations in BW and in R/3, and so on.
Solution
Read this note in full BEFORE you start actions that may repair your data in BW. Contact SAP Support for help with troubleshooting before you start to repair data.
BW offers you the option of a full upload in the form of a repair request (as of BW 3.0B). If you want to use this function, we recommend that you use the ODS object layer.
Note that you should only use this procedure if you have a small number of incorrect or missing records. Otherwise, we always recommend a reinitialization (possibly after a previous selective deletion, followed by a restriction of the Delta-Init selection to exclude areas that were not changed in the meantime).
1. Repair request: Definition
If you flag a request as a repair request with full update as the update mode, it can be updated into all data targets, even if they already contain data from delta initialization runs for this DataSource/source system combination. This means that a repair request can be updated into any ODS object at any time, without any check on overlapping data or on the sequence of requests. It may therefore result in duplicate data and must be prepared very carefully.
The repair request (of the "Full Upload" type) can be loaded into the same ODS object in which the 'normal' delta requests run. You will find this option under "Repair Full Request" in the Scheduler menu of the InfoPackage maintenance.
2. Prerequisites for using the "Repair Request" function
2.1. Troubleshooting
Before you start the repair action, you should carry out a thorough analysis of the possible cause of the error, to make sure that the error cannot recur when you execute the repair. For example, if a key figure has already been updated incorrectly in the OLTP system, it will not change after a reload into BW. Use transaction RSA3 (Extractor Checker) in the source system for help with troubleshooting. Another possible source of the problem is your user exit. To check that the user exit is correct, first load a test full request ("Probe-Full") into the PSA table and check whether the data is correct there. If it is not correct, search for the error in the user exit. If you do not find it, we recommend that you deactivate the user exit for testing purposes and request a new full upload; if the data then arrives correctly, it is highly probable that the error is indeed in the user exit.
We always recommend that you load the data into the PSA table in the first step and check the result there.
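For illustration only (this is not part of the note): one way to make the user exit easy to deactivate for such a test is a switch read from TVARVC in the customer include. The include ZXRSAU01 belongs to the transaction-data exit EXIT_SAPLRSAP_001; the variable name Z_BW_EXIT_OFF and the DataSource below are hypothetical.

* Hypothetical customer include ZXRSAU01 (transaction data user exit
* EXIT_SAPLRSAP_001). A TVARVC flag lets you switch the exit logic off
* for a test load without a transport.
  DATA: l_exit_off TYPE tvarvc-low.

  SELECT SINGLE low FROM tvarvc INTO l_exit_off
    WHERE name = 'Z_BW_EXIT_OFF'
      AND type = 'P'.
  IF l_exit_off = 'X'.
    EXIT.  "exit logic deactivated for the probe-full test
  ENDIF.

  CASE i_datasource.
    WHEN '2LIS_11_VAITM'.  "example DataSource
*     ... enhancement logic under suspicion goes here ...
  ENDCASE.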
2.2. Analyze the effects on the downstream targets
Before you load the repair request into the ODS object, make sure that the incorrect data records are selectively deleted from the ODS object. However, before you decide on selective deletion, you should read the help for the "Selective Deletion" function, which you can access via the Info button on the relevant dialog box. The activation queue and the ChangeLog remain unchanged during selective deletion of data from the ODS object, which means that the incorrect data is still in the ChangeLog afterwards. After the selective deletion, you therefore must not reconstruct the ODS object if it would be reconstructed from the ChangeLog. (Reconstruction is usually from the PSA table, but if the DataSource is the ODS object itself, the ODS object is reconstructed from its ChangeLog.) You MUST read the recommendations and warnings about this (press the "Info" button).
You MUST also take into account the fact that the delta for the downstream data targets is created from the changelog. If you perform selective deletion and then reload data into the deleted area, this may result in data inconsistencies in the downstream data targets.
If you only use MOVE and do not use ADD for updates in the ODS object, selective deletion may not be required in some cases (for example, if incorrect records only have to be changed, rather than deleted). In this case, the DataMart delta also remains intact.
2.3. Analysis of the selections
You must be very precise when you perform selective deletion: Some applications do not provide the option of selecting individual documents for the load process. Therefore, you must first ensure that you can load the same range of documents into BW as you would delete from the ODS object. This note provides some application-specific recommendations to help you "repair" the incorrect data records.
If you updated the data from the ODS object into an InfoCube, you can also delete it there using the "Selective Deletion" function. However, if the InfoCube is compressed and deletion at document level is no longer possible, you must delete the InfoCube content and fill it from the ODS object again after the repair.
You can only perform this action after a thorough analysis of all effects of selective data deletion. We naturally recommend that you test this first in the test system.
The procedure generally applies to all SAP applications/extractors. The application determines the possible selections. For example, if you cannot select by document number but you can select documents for an entire period, then you are forced to delete and then reload the documents for the entire period in the data target. It is therefore important to look carefully at the selections in the InfoPackage before you delete data from the data target.
Some applications have additional special features:
Logistics cockpit: As preparation for the repair request, delete the setup table (if you have not already done so) and fill it selectively with concrete document numbers (or other groups of documents determined by the selection). Then execute the repair request.
Caution: The transactions that fill the setup tables with reconstruction data currently let you select individual documents or entire ranges of documents (at present, it is not possible to select several individual documents if they are not numbered consecutively).
FI: The repair request with full upload is not required here; there are efficient alternatives. In the FI area, you can select the documents that must be reloaded into BW, make a small change to them (for example, insert a period into the assignment text) and save them. As a result, each document is placed in the delta queue again, and the previously loaded document with the same number is overwritten in the BW ODS object. FI also has an option for sending documents selectively from the OLTP system to the BW system using correction programs (see Note 616331).
3. Repair request execution
How do you proceed if you want to load a repair request into the data target? Go to the maintenance screen of the InfoPackage (Scheduler), set the update mode to "Full", and choose Scheduler -> Repair Full Request -> Flag request as repair request -> Confirm. Update the data into the PSA first and check there that it is correct. If the data is correct, continue updating into the data targets.
hope this helps.

Similar Messages

  • Load from ODS into InfoCube gives TIME-OUT runtime error after 10 minutes ?

    Hi all,
    We have a full load from ODS into InfoCube, and it was working fine until last week with up to 50,000 records. Now we have around 70,000+ records, and it started failing with a TIME_OUT runtime error.
       The following is from the Short Dump (ST22):
    The system profile parameter "rdisp/max_wprun_time" contains the maximum runtime of a program; the current setting is 600 seconds. Once this time limit has been exceeded, the system tries to terminate any SQL statements that are currently being executed and tells the ABAP processor to terminate the current program.
      The following are from ROIDOCPRMS table:
       MAXSIZE (in KB) : 20,000
       Frequency       :  10
       Max Processes : 3
    When I check the Data Packages under the 'Details' tab in the Monitor, there are four Data Packages, and the first three have 24,450 records each. I right-click each Data Package and select 'Manual Update' to load it from the PSA. When this Manual Update takes more than 10 minutes, it fails with TIME_OUT again.
    How can I fix this problem, PLEASE??
    Thanks,
    Venkat.

    Hello A.H.P,
    The following is the Start Routine:
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line  -
    TABLES: /BIC/AZCPR_O0400, /BIC/AZCPR_O0100, /BIC/AZCPR_O0200.
    DATA: material(18), plant(4).
    DATA: role_assignment LIKE /BIC/AZCPR_O0100-CPR_ROLE,
          resource        LIKE /BIC/AZCPR_O0200-CPR_BPARTN.
    *$*$ end of global - insert your declaration only before this line   -
    * The following definition is new in BW 3.x
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS8ZCPR_O03.
    TYPES:
         RECNO LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR        "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS "monitoring with record no.
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        -
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO"
    * to make monitor entries
      CLEAR DATA_PACKAGE.
      LOOP AT DATA_PACKAGE.
        SELECT SINGLE /BIC/ZMATERIAL PLANT
               INTO (material, plant)
               FROM /BIC/AZCPR_O0400
               WHERE CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
                 AND ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
        IF sy-subrc = 0.
          DATA_PACKAGE-/BIC/ZMATERIAL = material.
          DATA_PACKAGE-plant = plant.
          MODIFY DATA_PACKAGE.
          COMMIT WORK.
        ENDIF.
        SELECT SINGLE CPR_ROLE INTO (role_assignment)
               FROM /BIC/AZCPR_O0100
               WHERE CPR_GUID = DATA_PACKAGE-CPR_GUID.
        IF sy-subrc = 0.
          SELECT SINGLE CPR_BPARTN INTO (resource)
                 FROM /BIC/AZCPR_O0200
                 WHERE CPR_ROLE = role_assignment
                   AND CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID.
          IF sy-subrc = 0.
            DATA_PACKAGE-CPR_ROLE = role_assignment.
            DATA_PACKAGE-/BIC/ZRESOURCE = resource.
            MODIFY DATA_PACKAGE.
            COMMIT WORK.
          ENDIF.
        ENDIF.
        CLEAR DATA_PACKAGE.
      ENDLOOP.
    * if ABORT is not equal to zero, the update process will be cancelled
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line         -
    Thanks,
    Venkat.
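    A side note on the routine above: the SELECT SINGLE statements and the COMMIT WORK inside the loop are the usual suspects for this kind of timeout (COMMIT WORK should not be used inside update routines at all). Below is an untested sketch of the buffered-lookup pattern for the first lookup; the two other lookups can be rewritten the same way. Table and field names are taken from the routine above.
    * Untested sketch: one SELECT per data package instead of one per record.
    TYPES: BEGIN OF ty_o04,
             cpr_ext_id TYPE /BIC/AZCPR_O0400-CPR_EXT_ID,
             material   TYPE /BIC/AZCPR_O0400-/BIC/ZMATERIAL,
             plant      TYPE /BIC/AZCPR_O0400-PLANT,
           END OF ty_o04.
    DATA: lt_o04 TYPE STANDARD TABLE OF ty_o04,
          ls_o04 TYPE ty_o04.

    IF NOT DATA_PACKAGE[] IS INITIAL.
      SELECT CPR_EXT_ID /BIC/ZMATERIAL PLANT
        FROM /BIC/AZCPR_O0400
        INTO TABLE lt_o04
        FOR ALL ENTRIES IN DATA_PACKAGE
        WHERE CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
          AND ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
      SORT lt_o04 BY cpr_ext_id.
    ENDIF.

    LOOP AT DATA_PACKAGE.
      READ TABLE lt_o04 INTO ls_o04
           WITH KEY cpr_ext_id = DATA_PACKAGE-CPR_EXT_ID
           BINARY SEARCH.
      IF sy-subrc = 0.
        DATA_PACKAGE-/BIC/ZMATERIAL = ls_o04-material.
        DATA_PACKAGE-PLANT          = ls_o04-plant.
        MODIFY DATA_PACKAGE.  "no COMMIT WORK in update routines
      ENDIF.
    ENDLOOP.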

  • How to select on an ODS with index to consume less internal memory?

    Hi all,
    I want to load ODS B from ODS A with an enhanced communication structure. The enhanced fields are populated through a lookup on a third ODS C, joined on certain fields of the data package (the structure of ODS A).
    The entire code is written in the start routine from A to B, and the lookup on C is done by a select statement on C for all entries in data_package. However, ODS C holds a huge amount of data, and when we load from A to B and hit this select in the start routine, it gives a short dump due to a shortage of extendable memory space.
    To solve this problem we built an index on ODS C with the fields used in the where clause of the select statement, but the problem persists. Can anybody tell me how we can take advantage of this index in the select statement on ODS C in the start routine? Since it is the internal memory, not the query size, that is causing the trouble: is there any way to partition ODS C and select from the partitions separately? Should the internal table typed on ODS C be declared as a sorted table with a unique key instead of a standard table?
    FYI, the index fields we added on the ODS maintenance screen are flagged as unique.

    I guess the short dump you are encountering is due to the size of the internal table you are filling from ODS C, not the index. Check the where clause of the select statement. You can reduce the data package size, use selections on the InfoPackages from A to B to limit the data, or split the load into multiple (serial) sessions based on the business rules and logic in place at your implementation.
    A default installation has a limit of 2 GB on the size of an internal session, based on a signed 32-bit variable on 32-bit OS systems. Most Unix-based systems are now 64-bit, and with unsigned 32-bit variables you can use 4 GB or more.
    Check Note 548845 for more info.
    Gopal
    Pls assign points if you find the response helpful !
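    A rough, untested sketch of the "split the select" idea from the reply above: read the big lookup ODS with a database cursor in blocks, so that only one block sits in internal memory at a time. The active-table name /BIC/AZODSC00, the field KEYFIELD1 and the range R_KEYS are placeholders for your ODS C and your indexed fields.
    DATA: l_cursor TYPE cursor,
          lt_block TYPE STANDARD TABLE OF /bic/azodsc00.

    OPEN CURSOR WITH HOLD l_cursor FOR
      SELECT * FROM /bic/azodsc00
        WHERE keyfield1 IN r_keys.  "restrict via the indexed fields

    DO.
      FETCH NEXT CURSOR l_cursor
        INTO TABLE lt_block PACKAGE SIZE 50000.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      "process/merge this block into the result, then release the memory
      REFRESH lt_block.
    ENDDO.
    CLOSE CURSOR l_cursor.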

  • Total number of records loaded into ODS and in case of Infocube

    Hi,
    I loaded some data records from an Oracle source system into an ODS and an InfoCube.
    My source system guy gave me some data records based on his selection on the Oracle source system side.
    How can I see how many data records were loaded into the ODS and the InfoCube?
    I can check in the monitor, but that is not correct (because I loaded a second and third time with 'ignore duplicate records'), so I don't think the monitor gives the correct number of data records loaded for the ODS and the InfoCube.
    So is there a transaction code or something similar to find the number of records loaded into the ODS and the InfoCube?
    Please tell me; I'll assign the points.
    bye
    rizwan

    Hi,
    I went into the ODS Manage screen and looked at the 'transferred' and 'added' data records; both are the same.
    When I total the added data records, it comes to 147,737.
    But when I check the active table (/BIC/A<odsname>00), the total number of entries is 137,738.
    Why is there this difference?
    And in the case of the InfoCube: how can I find the total number of records loaded into it, i.e. in the underlying fact and dimension tables (not via the InfoCube display)?
    pls tell me
    txs
    rizwan
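    For the raw counts, you can also look directly at the generated tables (SE16), or count them in a quick program. This is an untested sketch under the standard naming convention; the table names are placeholders for your objects. Note that a cube's fact-table count is not the same as "records loaded", because records are aggregated during update and compression.
    DATA: l_ods  TYPE i,
          l_f    TYPE i,
          l_e    TYPE i,
          l_cube TYPE i.

    SELECT COUNT(*) INTO l_ods FROM /bic/azodsxx00. "ODS active table (placeholder)
    SELECT COUNT(*) INTO l_f   FROM /bic/fzcube.    "F fact table (placeholder)
    SELECT COUNT(*) INTO l_e   FROM /bic/ezcube.    "E fact table (placeholder)
    l_cube = l_f + l_e.
    WRITE: / 'ODS active table :', l_ods,
           / 'InfoCube (F + E) :', l_cube.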

  • Re-Loading an ODS from a flat file.

    Hi,
    I was loading an ODS, and due to a system problem the load was cancelled after 2 hours. Now I want to load the same data again. Do I just schedule the InfoPackage again, or do I have to delete a request first?

    HI,
    In RSA1 > Modeling > InfoProvider, find your ODS > right-click > Manage > go to the Requests tab. Here you see all the requests loaded into the ODS; each has a green, yellow or red light against it. Since your previous load failed, that request should be red. Select the row and delete it from the same screen.
    Hope this helps...

  • Problem when loading from ODS to the CUBE

    Hi Experts,
    I am facing an unusual problem when loading data from ODS to the cube.
    I have a first level ODS where the delta postings are updated. I have checked the active table of the ODS and the data is accurate.
    I have deleted the entire data from the Cube and trying to do a full load from ODS to the CUBE.
    I am sure when I run a full load the data goes from Active table of the ODS.
    After the full load, the key figure values are 0. I have tried testing by loading a couple of sales documents, and the key figure values are still 0.
    I would expect the full load to pick up the data exactly as it is in the active table of the ODS.
    I also don't have any fancy routines in the update rules.
    Please help me in this regard.
    Regards
    Raghu

    Hi,
    Check whether you followed the procedure exactly. Just follow the layman's steps here and let me know your issue:
    o     First prepare flat files in Microsoft Excel for the master data and the transaction data, save them in .CSV format and close them.
    o     Under InfoObjects, create an InfoArea, then an InfoObject catalog, then the characteristics and key figures: create 'id' as a characteristic with name as an attribute and activate it; under key figures create 'no' and activate it.
    o     Under InfoSources, create an application component, then an InfoSource with direct update for the master data and one with flexible update for the transaction data. For the master data flat file, create an InfoPackage and execute it.
    o     For the transaction data, go to InfoProvider, right-click and choose Create ODS Object. Give the ODS a name, and on the next screen drag the characteristics to the key fields.
    o     Activate the ODS. The update rules will show an error at first; go to the communication structure, add 0RECORDMODE and activate it, then activate the update rules.
    o     Go to the InfoSource, create an InfoPackage, maintain the External Data and Processing settings, the data targets and the scheduling, then click Start and open the Monitor.
    o     If you cannot see the records, go back to the InfoProvider, right-click the ODS and choose Activate Data in ODS.
    o     In the QM status column (yellow), double-click, set the status to green, click Continue, then save and close.
    o     Go back to the InfoSource and the InfoPackage, right-click it, schedule it, and open the Monitor tab. Refresh until the request turns green.
    o     Once it is green, click the data target and check the loaded data.
    o     Now go to the InfoProvider, right-click and choose Create InfoCube. Give it a name, then drag the key figures, time characteristics and characteristics to their sections.
    o     Create and assign the dimensions and activate the cube. Then right-click the cube, choose Create Update Rules, select the ODS option, enter the ODS name and activate.
    o     Go back to the main screen and refresh. When you see the eight generated objects (the data mart export DataSource), you know the data mart interface is in place.
    o     Right-click the ODS and choose Update ODS Data in Data Target. A pop-up offers two options, full update and initial update; select initial update.
    o     In the InfoPackage that opens, check the external data and data target settings, and in the scheduler choose Start Later in Background, set Immediate, save (the screen closes automatically) and click Start.
    o     Open the Monitor, then check the contents of the data target.
    regards
    ashwin

  • Load fail - ODS to Cube - INIT req

    Hi experts..
    I'm loading an init request from an ODS to my cube. The request contains 12 million records in the ODS. I got the following error message in RSMO:
    Job termination in source system
    Diagnosis
    The background job for data selection in the source system has been terminated. It is very likely that a short dump has been logged in the source system
    Procedure
    Read the job log in the source system. Additional information is displayed here.
    To access the job log, use the monitor wizard (step-by-step analysis) or the menu path Environment -> Job Overview -> In Source System.
    Error correction:
    Follow the instructions in the job log messages.
    This load is from ODS to cube, so how is the source system involved here? Because the message says the error is in the source system.
    How do I correct this error?
    How to correct this error????
    thanks & regards
    ragu

    Hi Ragu
    When you load from ODS to cube, the BW 'Myself' source system (your own BW system) acts as the source system.
    As given in the Procedure section, to access the job log use the monitor wizard (step-by-step analysis) or the menu path Environment -> Job Overview -> In Source System, and look at the job log there.
    Also check in ST22 for short dump and let us know the error description in detail.
    Regards
    Pradip

  • Can I do Parallel Full loads from ODS to Cube.

    Hi,
    Usually I do a single full update load from ODS to cube. To speed up the process, can I run parallel full update loads from ODS to cube?
    Please advise.
    Thanks, Vijay,

    Assuming that the only connection we are talking about is between a single ODS and a single cube:
    I think the only time you could speed anything up is in a full drop-and-reload scenario. You could create multiple InfoPackages based on selections and execute them simultaneously.
    If the update is a delta, there is really no way to do it.
    How many records are we talking about? Is there logic in the update rule?

  • Index for loads from ODS To Cube

    The load from ODS to cube is taking a long time. In the start routine another ODS is looked up; the keys for the lookup are, say, X and Y.
    There is already an index on keys X, Y and Z.
    Will this index be used for the select on that ODS, or do I need to create a new index with only the X and Y keys?
    Thnx

    When you run the start routine, run an SQL trace (transaction ST05); it will tell you whether the index is being used.
    Arun

  • Delta load from ODS to cube failed - Data mismatch

    Hi all
    We have a scenario where the data flow is:
    R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
    The cube has an additional field called "monthly version", and since it is a history cube it is supposed to hold monthly snapshots of all the data in the current cube.
    We are facing the problem that the data for the current month is in the history ODS but not in the cube. In the ODS -> Manage -> Requests tab I can see only one red request, and that with 0 records.
    However, in the Cube -> Manage -> Reconstruction tab I can see two red requests with the current month's date. Could these red requests be the reason for the data mismatch between the ODS and the cube?
    Please guide me on how to solve this problem.
    thanks all
    annie

    Hi
    Thanks for the reply.
    The load to the cube is a delta and runs daily.
    The load to the ODS is a full load on a daily basis.
    Can you help me sort out this issue? I have to work directly in the production environment, so it has to be safe and foolproof.
    Thanks
    annie

  • Error Message when loading an ODS from an ODS

    Within a process chain we are loading an ODS from another ODS.  This fails with the following message:
    Job REQU_49HIKNN81WXVPRI80DC2TVCUF removed from periodical scheduling and terminated
    Message no. RSM1170
    Diagnosis
    An end-date for the job is specified when the job is scheduled (no longer "start after date/time").
    When the job is started for the first time, this end-date is applied to ALL subsequent periodical jobs.
    This is different to the standard R/3 batch schedulings.
    The end-date is checked each time the job is restarted, and if the end-date has passed, the job is removed from the schedule.
    This is what has happened in this case.
    Procedure
    Reschedule the job if you want to keep it running.
    Does anyone know where I can see/change the end date for this job or how I reschedule it?

    Dear Barry,
    1) Set the request to red, both in the data target and in the process chain.
    2) Delete the error request.
    3) Get the InfoPackage from the PC header.
    4) Reschedule it again.
    Hope it solves your problem
    Assign points if helpful
    Regards
    Bala

  • Error when loading from ODS to Cube

    Hello Friends,
    I am having trouble loading data from an ODS to an InfoCube in my 2004s system. When loading data I get these messages:
    07/03/2007     13:10:25     Data target 'ODSXYZ' removed from list of loadable targets; not loadable.
    07/03/2007     13:28:42     Data target 'ODSXYZ' is not active or is incorrect; no loading allowed
    I checked for ODSXYZ among my data targets but there is nothing by that name; even the InfoPackage doesn't have it. What needs to be done? Please help.
    Thanks.

    It's expected behavior. When you migrate your DataSource, the InfoPackages associated with it grey out all the data targets they were feeding before; that applies to any InfoPackage you create even after the migration. You won't be able to delete it.
    Having said this, it shouldn't impact your loads from ODS to cube, as those should be handled by your DTPs rather than your InfoPackages.
    A few questions:
    How are you loading your cube?
    Did the data get through fine to the PSA with the InfoPackage in question?
    How did you load your DSO (assuming the load was successful)?
    Message was edited by:
            voodi

  • Text file manipulation before load to ODS

    Hello Experts,
    I have a text file with 3 comma-separated fields: GLCode (number), Desc1 (char) and Desc2 (char), and I need to load it into BW.
    My Text file looks like:
    1011.00,"Mejor PC Infrastructure","This line is ok."
    1012.00,"Telephone Equipment $","This line ends in next line.    Need to add next line,
    1)Need to change the equipment immediately.
    2)Take immediate action"
    1013.00,"V1 Major Computer Server Infrastructure # Equip","For purchases
    of components that make up the company's network, such as servers, hubs, routers etc."
    1014.00,"Flash Drive","Need to provide all IT Developer"
    This is how the file looks. Now I need the following:
    1. Remove the line breaks and join each split record into one line; here record 2 is split across 3 lines and needs to be joined into 1 line.
    2. In line 5 (record 3) the data is split across 2 lines and needs to be made into 1 line.
    3. Remove bad characters.
    4. Load into the ODS after cleanup.
    I need guidance and an example.
    Thank you.
    Regards,
    Bhakta
    Could someone help me please how to proceed ?
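    As a starting point, here is an untested sketch of the merging logic: a record of this file is complete only when it contains an even number of double quotes, so lines can be concatenated until the quote count is even, and the stray characters from the sample ('$' and '#') can then be stripped. The file path is only an example.
    REPORT zmerge_csv_lines.

    DATA: lt_raw   TYPE TABLE OF string,
          lt_clean TYPE TABLE OF string,
          l_line   TYPE string,
          l_record TYPE string,
          l_quotes TYPE i.

    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\temp\glcodes.txt'  "example path
        filetype = 'ASC'
      TABLES
        data_tab = lt_raw.

    LOOP AT lt_raw INTO l_line.
      IF l_record IS INITIAL.
        l_record = l_line.
      ELSE.
        CONCATENATE l_record l_line INTO l_record SEPARATED BY space.
      ENDIF.
      "an odd number of quotes means the record continues on the next line
      FIND ALL OCCURRENCES OF '"' IN l_record MATCH COUNT l_quotes.
      IF l_quotes MOD 2 = 0 AND l_record IS NOT INITIAL.
        "record complete: strip the 'bad' characters seen in the sample
        REPLACE ALL OCCURRENCES OF '$' IN l_record WITH ``.
        REPLACE ALL OCCURRENCES OF '#' IN l_record WITH ``.
        APPEND l_record TO lt_clean.
        CLEAR l_record.
      ENDIF.
    ENDLOOP.
    "LT_CLEAN can now be saved with GUI_DOWNLOAD and loaded into the ODS
    "via a flat-file InfoSource.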

    Hi,
    Go to the Details tab in the monitor.
    Check whether any packages are yellow.
    If a package is yellow, right-click it and take the 'Manual Update' option for that package only.
    Otherwise, delete the request from the target, set the request to red in the monitor under the Status tab,
    and then try to load the data again.
    This happens because of IDocs that were not processed successfully.
    Regards,
    Venkat

  • How to automate selective deletion of ods in process chain

    Hi,
    How can we automate selective deletion of an ODS in a process chain? I tried the following procedure: using transaction DELETE_FACTS I entered selection parameters and the name of the generated program, saved this as a variant, and put the same program and variant into the process chain. But it is not deleting data, so please tell me whether I have missed anything in this procedure.

    Hi Purushottam,
    you have to supply two program variants in the process chain for the data deletion to take place. In the first program variant, enter RSDRD_DELETE_FACTS as the program name and give it a variant with your InfoProvider selection.
    Then add a second program variant with the name of the program that is generated when you enter the InfoProvider name, choose the 'Generate selection program' option on the "Program for deleting selected entries from the data target" screen and press Execute.
    Give this second program its own variant with your field selection.
    Now try executing the process chain.
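    As an alternative to the generated program, some sites schedule a small custom ABAP in the chain that calls the standard deletion function module directly. This is a hedged sketch only: RSDRD_SEL_DELETION is the standard function module behind selective deletion, but its exact signature and the RSDRD* types should be verified in SE37/SE11 on your release; the InfoProvider name and the selection are placeholders.
    DATA: l_thx_sel TYPE rsdrd_thx_sel,
          l_sx_sel  TYPE rsdrd_sx_sel,
          l_s_range TYPE rsdrd_s_range,
          l_t_msg   TYPE rs_t_msg.

    "placeholder selection: one calendar month
    l_s_range-sign   = 'I'.
    l_s_range-option = 'EQ'.
    l_s_range-low    = '200701'.
    APPEND l_s_range TO l_sx_sel-t_range.
    l_sx_sel-iobjnm = '0CALMONTH'.
    INSERT l_sx_sel INTO TABLE l_thx_sel.

    CALL FUNCTION 'RSDRD_SEL_DELETION'
      EXPORTING
        i_datatarget = 'ZODSXYZ'  "placeholder InfoProvider
        i_thx_sel    = l_thx_sel
      CHANGING
        c_t_msg      = l_t_msg.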

  • [3003 + R3299 + RSM340] error messages loading an ODS

    Hello everybody,
    I'm trying to load an ODS which is normally loaded without any issues via a full InfoPackage.
    I tried the previously created InfoPackages and the same errors were raised; I also tried to create a new InfoPackage with the same settings...
    Update Mode     Full update
    Filter: 0STAT_DATE     From 20110305     to 20110314
    Processing     PSA and then into Data Targets (Package by package)
    and the system returns the following errors:
    Message no. R3003 ("Error 7 when sending an IDoc")
    Message no. R3299 ("Exception condition "NOT_EXIST" raised.")
    Message no. R3299 ("Exception condition "NOT_EXIST" raised. 3")
    Message no. RSM340 ("Errors in source system")
    Additional comments:
    I checked IDocs and tRFCs and there's nothing wrong.
    The loads are from BW to BW.
    An additional thing I noticed is that the extraction itself is performed OK (records are extracted and passed OK); then a request is added to the ODS requests and hangs in yellow status, never ending.
    I had previously deleted bad entries in the PSA, and I also checked the ODS with the analysis and repair options in RSRV, but nothing was found.
    I don't think it is an authorization issue.
    I hope you can help. With best regards,
    Bernardo

    Never mind, I fixed it by applying the suggestion for case #3 in OSS note [Note 1229437 - 70SP19: DM pointer calculated incorrectly|https://service.sap.com/sap/support/notes/1229437].
    I hope this is useful for anyone who experiences the same.
    Regards,
    Bernardo
