Full Re-Load of 8 years of FI Data

I recently posted a question regarding load failures that were happening at my new job (now 3 days in).
Load Error 1st day on the Job - help please
After some troubleshooting I found that a bad request in the ODS (0FIGL_O06) on 8/20 was causing subsequent loads to fail. I deleted the bad request in the ODS, so the ODS is now current except for one day (8/20), and the cube (0FIGL_C01) is missing data between 8/20 and 8/29.
The PSA only holds data for 10 days and so no longer has the data for 8/20.
I guess I need to do a new full update load from R/3 to the ODS and then from the ODS to the cube. If there is another way, please let me know.
I have never worked with data extraction before.  Can someone please provide me with detailed steps to extract (from R/3) and re-load to BW?

Hi,
The datasource you mention is the ODS; what is the R/3 datasource feeding your ODS?
you have two options:
- Delete all the data from your ODS and InfoCube. Delete your ODS delta init and your R/3 datasource delta init. Reload everything from R/3 into your ODS by executing a delta init (with data transfer), and load your cube by executing a delta init (with data transfer) on your ODS.
This option is obviously only valid if you have a reasonable number of records. In my case, the FI_GL ODS has more than 100 million records for 4 years, so that wouldn't be an option (my users would have no data during this process).
- I guess that your ODS always overwrites the loaded data from R/3 and passes the differences to your cube. You are missing the 8/20 delta request.
Assuming that this request only has data for FISCPER 2007.008 (2007.007 should be closed by now, right?), you just need to reload this period into your ODS (full load with selection criteria), activate it, and load the delta as usual into your cube. Further deltas can then continue as usual.
We do this very often when users want an update within the day. However, I am talking about R/3 datasource 0FI_GL_4; it should be the same for yours.
let us know...
Olivier.
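Olivier's second option amounts to a repair full restricted to the fiscal period the lost request belonged to. A minimal sketch of deriving that selection value (Python, purely illustrative; `fiscper` is a made-up helper, not a BW API, and it assumes a fiscal year variant where posting period = calendar month):

```python
def fiscper(year: int, month: int) -> str:
    """Format a BW fiscal period (FISCPER, internal format YYYYPPP).

    Assumes a fiscal year variant where posting period = calendar month.
    """
    return f"{year:04d}{month:03d}"

# The bad request was loaded on 8/20/2007, so the repair full
# would use period 2007.008 as its selection:
selection = fiscper(2007, 8)
print(selection)  # 2007008
```

The same value is what you would type into the FISCPER selection of the full-load InfoPackage.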

Similar Messages

  • Using Repair Full to load Data

    Hi Guys
    I have to load several years of data, which is loaded into an ODS and cubes in delta mode. I want to break the load into small chunks so it runs in a controlled fashion. This is what I have planned to do:
    1 - Initialize without data transfer
    2 - Run a repair full for each year from 2001 to 2006
    3 - Then start running the deltas.
    I am just curious whether you know if this method is going to work. Will the delta that I run after the last repair full bring all the data again, or just the changed records?
    I also want to go back historically with repair fulls to make the current data available. Is this going to work or not? If not, what should I do to load the data periodically?

    Hi,
    If you want to do this year-by-year load into the cube, you can do a full load specifying the year in the InfoPackage.
    You need not run the program I specified earlier for the cube. But if you want to do the year-by-year full loads into the ODS while a delta load is running on that ODS, you will have to first do the full loads for the years you want, then execute the program, and then carry on with the delta loads.
    I hope I'm clear.
    In case you are not able to see the radio buttons then you can drop a mail to [email protected] and tell them that you are not able to give points for this post.
    Regards,
    Rohini

  • 9237 with full-bridge load cell: load cell_null_off_shuntcal.vi throws error 200077

    Hi,
    I'm trying to use the example load_cell_null_off_shuntcal.vi with a full-bridge load cell (Honeywell Model 31, unamplified). I am using LabVIEW 8.6, a cDAQ-9172 and an NI 9237. The load cell is connected to pins 2, 3, 6 and 7.
    The inputs on the VI front panel are: internal excitation 10 V; 2.1492 mV/V (calibration sheet); max weight 10 lbs; bridge resistance 350 ohms (Honeywell specs); 9237 internal shunt resistance 100 kohms; shunt location R4 (default setting). I have selected "Do offset null" and "Do shunt cal".
    This is the error I receive:
    Error -200077 occurred at DAQmx Perform Shunt Calibration (Bridge).vi:1
    Possible reason(s): Measurements: Requested value is not a supported value for this property.
    Property: AI.Bridge.ShuntCal.GainAdjust
    You Have Requested: -61.980405e3
    Valid Values Begin with: 500.0e-3
    Valid Values End with: 1.500000
    If the "Do shunt cal" green button is not selected, there is no error. I understand that the gain adjust value should be approximately 1, whereas the one I get is much larger. The subVI DAQmx Perform Shunt Calibration (Bridge).vi contains a "Call Library Function Node" which I don't know how to interrogate.
    Has anyone else had experience with this error? Do you have any advice on:
    1) How to "see" the calculations being performed inside the "Call Library Function Node"?
    2) What the correct shunt element location for a full-bridge load cell is? (Changing this location does not eliminate the error, and I can't find this info.)
    3) Anything I may be doing wrong with my inputs that could cause this error?
    Thanks,
    Claire.

    Hi Claire,
    You have to physically connect the SC terminals to one arm of the bridge (normally R3). The terminal is not provided for connecting external resistors.
    See example 
    C:\Program Files\National Instruments\LabVIEW 8.6\examples\DAQmx\Analog In\Measure Strain.llb\Cont Acq Strain Samples (with Calibration) - NI 9237.vi

  • What's the actual use of the Setup table apart from the Full/Init loads

    Hi gurus,
    Could you explain what the actual use of the setup table is, apart from the full/init loads, in LO extraction?

    Hello Guru,
    The setup table is used mainly for storing the historical data, whereas the new and changed records are maintained in the update table.
    By storing the historical data in the setup table, you don't need to disturb the application table anymore.
    Please go through the following blogs
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/vikash.agrawal/blog/2006/12/18/one-stop-shop-for-all-your-lo-cockpit-needs
    https://websmp201.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700002719062002E
    Hope it helps
    Thanks,
    Chandran

  • Init, full, delta load

    Hi friends,
    Can you tell me the difference between init, full and delta updates, and in what situations we use each? Are there any update methods other than these?

    Manohar,
    Init - used to initialize the data load; it will be followed by delta loads thereafter.
    Delta - incremental data loads; can be done only after an init.
    Full - you load all the data at once and do not init. You cannot run a delta load after a full load; you will always have to init first and then do the delta loads.
    please refer back to previous postings for more information on the same.
    Arun
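Arun's three update modes can be pictured with a toy model (a Python sketch of the semantics only; `ToyExtractor` is invented for illustration and has nothing to do with real SAP extractors):

```python
class ToyExtractor:
    """Toy model of full / init / delta update semantics."""

    def __init__(self, source_rows):
        self.source = source_rows      # all rows in the source system
        self.delta_queue = []          # rows changed since the last extraction
        self.initialized = False

    def full(self):
        # Full load: everything, every time; no delta pointer is kept.
        return list(self.source)

    def init(self):
        # Init: load everything once and start recording deltas.
        self.initialized = True
        self.delta_queue = []
        return list(self.source)

    def change(self, row):
        # A change in the source system; recorded only after an init.
        self.source.append(row)
        if self.initialized:
            self.delta_queue.append(row)

    def delta(self):
        # Delta: only changes since the last init/delta; invalid without init.
        if not self.initialized:
            raise RuntimeError("delta requires a prior init")
        rows, self.delta_queue = self.delta_queue, []
        return rows
```

For example, `init()` returns everything and arms the delta queue, after which `delta()` returns only the rows changed since the previous extraction, while `full()` always returns everything without touching the queue.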

  • Pre-requisite for full incremental load

    Hi Friends,
    I have installed and set up the BI Apps environment with OBIEE, BI Apps, DAC and Informatica. What are the immediate next steps in order to do a full incremental load for EBS R12 for Financials and SCM?
    Please guide me, as it is critical for me to accomplish the full load process.
    Thanks
    Cooper

    You can do that by changing the incremental workflows/sessions to include something like update_date < $$TO_DATE and specifying that as a DAC parameter. You will have to do this manually. Unfortunately there is no built-in "upper limit" date. There is a snapshot date that can extend to a future date, but not for the regular fact tables.
    However, this is not a good test of the incremental changes. Just because you manually limit what you extract does not mean you have thoroughly unit tested your system for incremental changes. My advice is to have a source-system business user enter the changes. They also need to run any "batch processes" on the source system that can make incremental changes. You cannot count the approach you outlined as a proper unit test for incremental loads.
    Is there any reason why you cannot have a business user enter transactions in a DEV source system environment and then run the full and incremental loads against that system? I don't mean a new refresh; I mean manual entries in your DEV source system.

  • 9237 + full-bridge load cell: load cell_null_off_shuntcal.vi - error 200077

    I'm trying to use load_cell_null_off_shuntcal.vi with a load cell (Honeywell Model 31, unamplified). I am using LabVIEW 8.6, a cDAQ-9172 and an NI 9237. Inputs: internal excitation 10 V; 2.1492 mV/V (calib. sheet); max weight 10 lbs; bridge resistance 350 ohms (Honeywell specs); 9237 internal shunt resistance 100 kohms; shunt location R4 (default setting). I have selected "Do offset null" and "Do shunt cal".
    Error -200077 occurred at DAQmx Perform Shunt Calibration (Bridge).vi:1
    Possible reason(s): Measurements: Requested value is not a supported value for this property.
    Property: AI.Bridge.ShuntCal.GainAdjust
    You Have Requested: -61.980405e3
    Valid Values Begin with: 500.0e-3
    Valid Values End with: 1.500000
    If the "Do shunt cal" green button is not selected, there is no error. The gain adjust should be approximately 1. The subVI DAQmx Perform Shunt Calibration (Bridge).vi contains a "Call Library Function Node" which is locked (?).
    Any ideas?
    What is the correct shunt element location for a full-bridge load cell? Changing this location does not eliminate the error.

    Hello CFJ,
    The problem is most likely in your external connections of the NI 9237 and the load cell.  As referenced in the NI 9237 Operating Instructions and Specifications, page 9, the SC+ and SC- pins should be connected across the resistor specified in the DAQmx Perform Shunt Calibration (Bridge).vi (in the case of a full bridge it would be R3).
    Let me know if you are still having issues with your calibration.
    Regards,
    Dan King

  • 2LIS_13_VDITM - Full Repair Load

    I will be creating an InfoPackage to carry out full repair loads. Due to memory restrictions (which are currently being looked at), I will be using selections based on created-on dates to extract the data.
    How can I avoid duplication of data in the ODS, since it already contains data, especially if I have to rerun the package? For example, I get a short dump if the selection is too large and memory runs out. Thanks

    As the automatic deletion of similar requests can only be applied when updating InfoCubes, you have to avoid duplicate records by using the correct primary key for your ODS and the correct update method for the key figures (overwrite rather than addition).
    It might be easier to delete the ODS data first.
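Why the correct ODS key plus overwrite mode makes a rerun harmless can be seen in miniature: with key-based overwrite, loading the same request twice leaves the target unchanged. A Python sketch (not SAP code; the key field `doc_no` is a made-up example):

```python
def load_overwrite(ods, request, key_fields=("doc_no",)):
    """Apply a request to an ODS-like dict using overwrite semantics:
    records with the same primary key replace, never duplicate."""
    for rec in request:
        key = tuple(rec[f] for f in key_fields)
        ods[key] = rec
    return ods

ods = {}
request = [{"doc_no": 1, "amount": 100}, {"doc_no": 2, "amount": 50}]
load_overwrite(ods, request)
load_overwrite(ods, request)   # rerun of the same repair full request
print(len(ods))                # still 2 records, no duplicates
```

With an additive key figure instead, the second run would double the amounts, which is exactly the duplication the answer warns about.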

  • DTP Routine to load only last year data

    Hi Gurus,
    I have a requirement to load only last year's data (0CALMONTH, i.e. MM/YYYY), based on the current year, in the transformation. Could you please correct the code below?
    DATA: l_idx like sy-tabix.
      DATA: DATE_H TYPE D.
      DATE_H = SY-DATUM(4) - 1.
      l_t_range-SIGN = 'I'.
      l_t_range-OPTION = 'EQ'.
      l_t_range-HIGH = DATE_H.
      if l_idx <> 0.
        modify l_t_range index l_idx.
      else.
        append l_t_range.
      endif.
      p_subrc = 0.
    Right now it is bringing all data up to last year. Could you please guide me on what changes need to be made?
    Thanks
    Ganesh Reddy.

    DATA: DATE_H(6) TYPE c,
          DATE_L(6) TYPE c,
          L_YEAR(4) TYPE n.
    L_YEAR = SY-DATUM(4) - 1.               " previous calendar year
    CONCATENATE L_YEAR '01' INTO DATE_L.    " January of last year
    CONCATENATE L_YEAR '12' INTO DATE_H.    " December of last year
    l_t_range-SIGN   = 'I'.
    l_t_range-OPTION = 'BT'.
    l_t_range-LOW    = DATE_L.
    l_t_range-HIGH   = DATE_H.
    APPEND l_t_range.                       " add the interval to the range table
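As a cross-check of the boundary values the ABAP above should produce, here is the same previous-year 0CALMONTH range computed in Python (illustrative only; `prev_year_calmonth_range` is a made-up helper, and YYYYMM matches 0CALMONTH's internal format):

```python
from datetime import date

def prev_year_calmonth_range(today: date) -> tuple:
    """Return the (low, high) 0CALMONTH values for the last calendar year."""
    year = today.year - 1
    return f"{year}01", f"{year}12"

low, high = prev_year_calmonth_range(date(2009, 6, 15))
print(low, high)  # 200801 200812
```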

  • How to Run ebs adaptor data load for 1 year using DAC

    Hi,
    I am trying to run the EBS adapter data load for 1 year for Procurement and Spend. Please let me know the parameters to set in DAC.
    Last Extract Date is set as Custom Format (@DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP).
    Thanks

    You need to set $$INITIAL_EXTRACT_DATE to a year ago. LAST EXTRACT DATE is used for incremental loads, and you do not set it manually.
    if this helps, mark as correct or helpful.

  • Loading a subset of BW Transactional data into BPC

    Hello Experts
    Hopefully someone can help me with a problem I have been facing.
    I am running: EPM 10 SP13
    Currently, we load a full set of transactional data from a staging BW cube into the BPC model.  In terms of time, this is from 1990 to date.
    Each evening, before the above-mentioned load, we clear data in the BPC model against a specific dimension category, to handle deleted data which does not come in on the flat-file full load each night.
    I would now like to run this clear-and-load process for only the last 2 years' worth of data.
    The 'Clear' package via a BW process chain using UJD_TEST_PACKAGE and an answer prompt setting as below works a treat:
    %SELECTION%     DIMENSION:ANALYSIS||DIMENSION:BRAND||DIMENSION:CATEGORY|BW_ACTUAL|DIMENSION:CUSTOMER||DIMENSION:DATASRC|DS_ACT_BW_INPUT|DIMENSION:INDUSTRY||DIMENSION:METRIC||DIMENSION:PRODUCT||DIMENSION:RPTCURRENCY||DIMENSION:SUPPLY_SITE||DIMENSION:TIME|2012.01,2012.02,2012.03,2012.04,2012.05,2012.06,2012.07,2012.08,2012.09,2012.10,2012.11,2012.12,2013.01,2013.02,2013.03,2013.04,2013.05,2013.06,2013.07,2013.08,2013.09,2013.10,2013.11,2013.12
    %ENABLETASK%     1
    %CHECKLCK%     0
    However, I cannot get the load run package (Load Transaction Data from BW InfoProvider UI) to pick up only 2 years worth of data.
    %InforProvide%     ZBPC_11
    %SELECTION%     <?xml version="1.0" encoding="utf-16"?><Selections xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><Selection Type="Selection" /><Selection Type="FieldList"><FieldID>0CALMONTH</FieldID><FieldID>0CALYEAR</FieldID><FieldID>0CHNGID</FieldID><FieldID>0RECORDTP</FieldID><FieldID>0REQUID</FieldID><FieldID>ZBPC_1</FieldID><FieldID>ZBPC_10</FieldID><FieldID>ZBPC_6</FieldID><FieldID>ZJ_DEALER</FieldID><FieldID>ZJ_DEFIND</FieldID><FieldID>ZJ_MODEL</FieldID><FieldID>ZJ_MRPSTE</FieldID><FieldID>ZJ_TYPE</FieldID></Selection></Selections>
    %TRANSFORMATION%     \ROOT\WEBFOLDERS\COMPANY\S_AND_OP\DATAMANAGER\TRANSFORMATIONFILES\BW Actuals.xls
    %DATATRANSFERMODE%     10
    %CLEARDATA%     No
    %RUNLOGIC%     No
    %CHECKLCK%     No
    I have tried setting the option in the EPM add-in when I run the Load Transaction Data from BW InfoProvider UI package, but this results in the chain running for ages and never completing, despite there being activity in SM50. Almost like an infinite loop!
    I then tried using the code in the UJD_TEST_PACKAGE script pointing to the above:
    <Selection Type="Selection"><Attribute><ID>0CALMONTH</ID><Operator>1</Operator><LowValue>2020</LowValue><HighValue /></Attribute></Selection>
    Does anyone know how I can set the import package to only include the last 2 years? I have found that pre-BPC 10 you could add something to the transformation file, but I do not believe that is supported now.
    Thanks for any guidance.
    Ian

    Hi Ian,
    Q: Could you elaborate some more on automating this process? What do you mean by a macro to create the XML files?
    This process is for running the package from Excel. This document explains it: https://scn.sap.com/docs/DOC-32636
    Q: Using your alternative suggestion of creating a start routine, do you mean as part of the transformation in the BW staging area, or the BPC transformation file sourcing the data manager package?
    I mean the BPC transformation file. http://scn.sap.com/docs/DOC-4230
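As an aside, the long hand-typed TIME member list in the clear-package prompt above is easy to get wrong; generating it programmatically avoids that. A Python sketch (`time_members` is a made-up helper; BPC does not run Python, this only builds the comma-separated string to paste into the answer prompt):

```python
def time_members(start_year: int, years: int = 2) -> str:
    """Build a comma-separated list of BPC TIME members (YYYY.MM)
    covering a window of whole calendar years."""
    members = [f"{y}.{m:02d}"
               for y in range(start_year, start_year + years)
               for m in range(1, 13)]
    return ",".join(members)

print(time_members(2012))
# 2012.01,2012.02,...,2013.12 (24 members in total)
```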

  • Master data loads Request not updated to any data target using delta

    When checking the PSA the request updated ICON has red triangle and the mouse over says
    "Request not updated to any data target using delta".
    I am doing full loads for text data, and both full and delta loads for attributes. This is a BI 7.0 system but the loads are still 3.x; DTPs have not been implemented yet. The system was just upgraded in July. I am unable to schedule deletes from the PSA for successful loads. However, I think the data is updating to the InfoObjects. My InfoPackage has the selection to load to the PSA and then into the InfoObject package by package.
    How do I schedule the deletes from the PSA, and why does "Request updated" show red while the InfoPackage monitor shows green?

    Hi shikha,
    The load has not failed, but I am unable to delete the load from the PSA.
    If you do a Manage on the DataSource, or go to the PSA from RSA1, the first column has the green gear icon; instead of a green check mark I have a red triangle, and the mouse-over says "Request not updated to any data target using delta". The data has loaded to the InfoObject. I am trying to schedule deletes from the PSA using the option to "delete only successfully booked/updated requests". So how do I get the "Request updated" column to show a green check mark so that my deletes will process? This is for master data only; my transaction loads run fine and delete properly according to my settings.
    Thanks for the reply.
    Regards,
    JoeM

  • How to parse the year from a date.

    Hi,
    I have the following date format in database.
    05-SEP-07
    18-OCT-07
    18-OCT-07
    25-JUL-07
    18-OCT-07
    What I am trying to do is get only the full year from the date, for example:
    2007
    2008
    2009
    I tried to use the extract function, but it doesn't work; I get ORA-01843: not a valid month.
    Thanks in advance.
    select extract(year from date '18-OCT-07') from dual;

    OK. You got an error using a date literal. Did you at least look in the documentation on date literals? I guess not; otherwise you'd have seen that date literal syntax is DATE 'YYYY-MM-DD':
    SQL> select extract(year from date '18-OCT-07') from dual;
    select extract(year from date '18-OCT-07') from dual
    ERROR at line 1:
    ORA-01843: not a valid month
    SQL>
    SQL> select extract(year from date '2007-10-18') from dual;
    EXTRACT(YEARFROMDATE'2007-10-18')
                                 2007
    SQL>
    SY.
    P.S. "I have the following date format in database" is completely wrong: dates are always stored in one format, the internal date format that holds year, month, day, hour, minute and second. When you select a date using client tools like SQL*Plus, SQL Developer, Toad, etc., the date is converted to a string using either an explicit or implicit format in which you specify which parts of the date you want to see. To get the year from a date column, apply extract(year from ...) to the column itself rather than to a literal.

  • Delta loads - for DB Connect  and master data.

    Hi there,
    Would appreciate if any one clarify me on:
    1) Are delta loads possible for master data?
    2) What types of data loads (init/delta/full) are possible when extracting data via DB Connect?
    Thanks in anticipation.
    Rgds,
    A.

    Hi Ash,
    As far as I know, with DB Connect the only way to get delta functionality is to use ODS objects: do a full load from the source system into the ODS, then update the data targets from the ODS. During this update you'll have the option of an init/full/delta load.
    Best regards,
    Eugene

  • Data loading of 0vendor from PSA to Data Target(info object)

    Dear Friends
    I have a problem: I have loaded the 0VENDOR master data; the data comes up to the PSA but does not go on to the data target, i.e. the InfoObject.
    Here is the system message from RSMO (Status tab):
    <b>Processing in Warehouse timed out; processing steps missing
    Diagnosis
    Processing the request in the BW system is taking a long time and the processing step has still not been executed.
    System response
    <DS:DE.RSCALLER>Caller  is still missing.
    Procedure
    Check in the process overview in the BW system whether processes are running in the BW system under the background user.
    If this is not the case, check the short dump overview in the BW system.</b>
    I also got this error yesterday. Is this something for the Basis team, or does it have to be fixed in BW?
    I remind you this is a daily job.
    Thanks and Regards
    Rajasekar

    Hi Manfred
    Thanks for your response.
    I checked ST22 and didn't find a short dump. In RSMO, in the Details tab, I find <b>Transfer (IDocs and tRFC): errors occurred; Data Package 1: arrived in BW; Processing: data packet not yet processed</b>. I have activated the transfer rules, and the data is available in the PSA. In the InfoPackage I have selected full update.
    In the Processing tab I find:
    Data Package 1 ( 11442 Records ): Errors occurred
    The transfer rules are also shown with a red button.
    If you can offer any solution to the above problem, I will be highly thankful.
    Thanks and Regards
    Rajasekar
