Loading historical data in DBI

Hi
We are implementing DBI v7.x in a project. Any suggestions on how we can load three years of historical data (currently in a non-Oracle environment) into DBI?
Thanks
Senthil

You need to have Oracle Apps implemented to implement DBI.
However, the latest versions of DBI have the flexibility to pull data from other sources and build DBI dashboards on the legacy data.

Similar Messages

  • Loading Historical data to the new field without deleting the data in cube

    Dear BI Experts,
    I have enhanced the generic DataSource in BI 7.0 with a new field.
    I need to load historical data into the newly appended field.
    As we have a very large data volume, it is not possible to delete the data and run the init again.
    Is there any other possibility to load the historical data for the newly appended field without deleting the old requests?
    Thanks for your Kind  help.
    Kind Regards,
    Sunil

    Dear Sushant,
    Thanks for your reply.
    But I am just wondering if there is any possibility of loading historical data for the new field using the remodeling concept,
    without deleting the old requests.
    I do not know much about the remodeling concept, but I have heard that it is not recommended.
    Can you please suggest and help.
    Thanks and Regards,
    Sunil Kotne

  • Reg: Loading historic data for the enhanced field

    Hello All,
    We need to add a new field, 0VENDOR, to our DataSource 0FI_GL_4. This field is available in our BSEG table, hence we are planning to go ahead with a DataSource enhancement.
    Now, please advise on how to update the historical data for this newly added field. I have heard there is a BW functionality/program to do so without deleting the entire data. Kindly advise on the possible solutions.
    Thanks & Regards
    Sneha Santhanakrishnan

    HI Sneha,
    Using the remodeling option you will be able to do that, i.e. load historical data for new attributes without deleting existing data. But the limitation of remodeling is that you can only assign a constant, the value of another attribute, or values determined via a customer EXIT.
    When you load data from the source system and need historical data as well, the best practice is to delete the existing data and reload it from the source system.
    If you don't want to do that, I can offer one trick, though I am not sure whether it will work. The idea is to populate the historical values for 0VENDOR using the customer exit option of remodeling. To get the historical values inside the customer exit, you need that data in some BW table: you could create a generic extractor that stores all documents and their respective vendors; when you load it from the source system you will get the historical values as well.
    Then read that table in the customer exit and populate the vendor value. This is a one-time process to populate the historical values.
    Regards,
    Durgesh.
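The lookup-and-backfill trick described above can be sketched in plain Python. This is illustrative only, not SAP code: the staged document-to-vendor table stands in for the generic extractor's data, and all names and values are invented.

```python
# Sketch of the backfill idea: stage a document->vendor mapping (the
# "generic extractor" table), then use it as a lookup to fill the new
# field on historical rows. All names and values are illustrative.

def build_vendor_lookup(staged_rows):
    """staged_rows: iterable of (doc_number, vendor) pairs from the staging table."""
    return {doc: vendor for doc, vendor in staged_rows}

def backfill_vendor(historical_rows, lookup, default=""):
    """Return copies of historical rows with the new 'vendor' field filled."""
    filled = []
    for row in historical_rows:
        row = dict(row)  # copy so the original rows stay untouched
        row["vendor"] = lookup.get(row["doc_number"], default)
        filled.append(row)
    return filled

staged = [("100001", "V-ACME"), ("100002", "V-GLOBEX")]
history = [{"doc_number": "100001", "amount": 250.0},
           {"doc_number": "100002", "amount": 90.0},
           {"doc_number": "100003", "amount": 10.0}]  # no vendor known

result = backfill_vendor(history, build_vendor_lookup(staged))
```

Documents with no entry in the staging table get the default value, mirroring the one-time nature of the backfill: anything the extract missed stays blank.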

  • Init Load,Historical Data Load & Delta Load with 'No Marker Update'

    What is the difference between compression in the inventory cube for an init load, a historical data load, and a delta load with 'No Marker Update'?
    Please help with this query.
    Thanks
    Gaurav

    Hi Gaurav,
    I believe you will find the answers here...
    [http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0b8dfe6-fe1c-2a10-e8bd-c7acc921f366&overridelayout=true]
    regards
    Pavel

  • Loading historical data with 0CO_PC_ACT_02

    Hello,
        I need to load historical values with infosource 0CO_PC_ACT_02.  I can load the current period, but get no data from R/3 for historical periods.  When I run RSA3 on R/3 for a historical period, I get no data, so I don't believe that this is a BW Issue.
        My question:
       1.  Is there a job or something on R/3 I need to run to enable this data source to pull historical data?  If so, what is it and how do I run it?
       2.  Is this data source simply not able to pull historical data?
    Thanks.
    Dave

    Hi All,
    I have the same issue. Has anyone found a workaround to load history data with this extractor (0CO_PC_ACT_02)?

  • Loading historical data from excel

    Hi gurus. The best-practices document I have read says that a sample Excel .csv file exists. Where can I find it? I don't have a CD with the best practices, but I have seen them referenced in a .doc file. Thanks.
    Edited by: Andreolli2 on Mar 28, 2011 3:42 PM

    Sorry, I'm new; I'm talking about the demand planning best practices. When I try to start the job to load the data, I get the message "Error in conversion exit CONVERSION_EXIT_CUNIT_INPUT". If I go to the monitor, the status is red and I read:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    An error occurred in Extractor .
    Check the error message.
    Procedure
    The procedure for removing errors takes place depending on
    the error message:
    Note
    If the source system is a Client Workstation, then it is possible that the file that was to be loaded was being edited at the time of the data request. Make sure that the file is in the specified directory and is not being edited, and restart the request.
    In the details i read:
    -error in data request
    -error occurred in the data selection
    -processing (data packet): No data

  • Historical Data with it's delta loading from PSA to ODS in 3.x model

    Hello ,
    I want to load one year of historical data, along with its deltas, into an ODS object in the BW 3.x model.
    Can you please give the steps to load the historical data from the PSA table into the ODS object (not from the source system)?
    Thanks a lot
    Regards
    BI Beginner

    Hi
    Run a full-load InfoPackage from PSA to ODS with selections on 0CALDAY (give one-year selections).
    Make this full load a repair full request: in display mode of the InfoPackage, in the menu bar click Scheduler --> select Repair Full Request --> check the option and click OK.
    Now execute the InfoPackage.
    If you run it like this, the init between your PSA and ODS will not get disturbed.
    Regards,
    Venkatesh.
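The 0CALDAY selections above (and the month-at-a-time approach in the next thread) amount to splitting the history window into month-sized date ranges, one per repair-full request. A minimal Python sketch of generating those selections (illustrative only, not an SAP API; SAP dates are YYYYMMDD strings):

```python
from datetime import date, timedelta

def month_ranges(start, end):
    """Split [start, end] into per-month (from, to) selections as YYYYMMDD strings."""
    ranges = []
    cur = start
    while cur <= end:
        # first day of the following month (bool adds as 0/1 to the year)
        nxt = date(cur.year + (cur.month == 12), (cur.month % 12) + 1, 1)
        last = min(end, nxt - timedelta(days=1))
        ranges.append((cur.strftime("%Y%m%d"), last.strftime("%Y%m%d")))
        cur = nxt
    return ranges

# One repair-full request per month of a one-year history window:
selections = month_ranges(date(2011, 1, 1), date(2011, 12, 31))
```

Each tuple would become the from/to values of one InfoPackage's 0CALDAY selection, keeping each request small enough for a limited load window.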

  • Loading History Data

    Hi All.
    I have data loaded in a BW flow from Nov 13 to date; it is a daily load. Now the client wants the data from before Nov 13 as well. One month of data is over 2 crore (20 million) records, and it is an FIGL load. As it is a daily load, I have only a small window of time to load the history data before the daily load happens, so I plan to load one month at a time.
    How should I go about this? Do I need to do a repair full? I have not done one before; can anyone guide me? Is there any chance of the PSA getting corrupted? How will the delta work?
    Will it affect the daily load? Do I need to set the history request to red in the PSA and then load it into my flow manually when the daily chain is not running, given that the FIGL load feeds many other flows? This is in production, so if anything goes wrong, everything will be impacted.
    Thanks
    Aditya

    My understanding was this:
    You enhanced the BW cube with 2 fields and moved it to production; everything was fine.
    On the BPC side you loaded data from the enhanced BW cube using DM packages as usual, but your BPC cube data was doubled after this load.
    If so, check as suggested above.
    One more major point: for the added fields you presumably loaded historical data (a guess, since the enhancement was done for a BW business requirement), so your BW cube now holds historical data up to now. On the BPC side you probably did not delete the old data from the BPC cube before this load, because it did not seem necessary. But if BPC already had data from before this load, the load doubles it: the BW cube's reloaded historical data overlaps the data BPC already holds, so whatever option you used in the DM package, you get doubles.
    Even if you run a delta package on the BPC side, you will still get doubles, because the source BW cube now contains the old data in new requests.
    In short: for the BW enhancement, the BW data was deleted and reloaded; for BPC, nothing was deleted, so the data requested from the BW cube duplicates what BPC already had.
    Workaround: if the points above are right, just delete the BPC cube data and reload it all again.
    When enhancing BW cubes that are used in BPC models, you need to remember this data issue.
    Thanks.
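The doubling effect described above can be shown with a toy example. This is a minimal sketch, assuming the reply's scenario: BW was reloaded in full (old data now sits in new requests) while BPC still holds its old data; the periods and values are made up.

```python
# Sketch of the doubling effect: BPC keeps its old data while the source
# BW cube is reloaded in full, so an additive load counts history twice.
bpc_existing = {"2011.JAN": 100.0}           # old data already in the BPC cube
bw_reloaded = {"2011.JAN": 100.0,            # historical data, reloaded into BW
               "2011.FEB": 40.0}             # genuinely new data

# Loading everything from BW on top of what BPC already holds doubles JAN:
after_load = dict(bpc_existing)
for period, value in bw_reloaded.items():
    after_load[period] = after_load.get(period, 0.0) + value

# The workaround from the post: clear the BPC cube first, then reload all.
after_clear_and_reload = dict(bw_reloaded)
```

Only the overlapping period is doubled; the genuinely new period is correct either way, which is why the problem surfaces as "doubled" history rather than wholesale wrong numbers.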

  • Historical data and Column Chart

    Hello
    I am trying to retrive historical data from the repository and show it as a column chart through flex UI Implementation.
    My Metric is defined as below and it has two columns defined as KEY columns
    <Metric NAME="EnergyConsumptionCost" TYPE="TABLE" USAGE_TYPE="VIEW_COLLECT" FORCE_CACHE="TRUE">
      <Display>
        <Label NLSID="flex_energyCosumptionCost">Energy Consumption Cost Metric</Label>
      </Display>
      <TableDescriptor>
        <ColumnDescriptor NAME="DeviceID" TYPE="STRING" IS_KEY="FALSE">
          <Display>
            <Label NLSID="flex_DeviceID">Device ID</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="DeviceName" TYPE="STRING" IS_KEY="TRUE">
          <Display>
            <Label NLSID="flex_deviceName">Device Name</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="EnergyCost" TYPE="NUMBER" IS_KEY="FALSE">
          <Display>
            <Label NLSID="flex_EnergyCost">Energy Cost</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="CurrencySymbol" TYPE="STRING" IS_KEY="FALSE">
          <Display>
            <Label NLSID="flex_Currency">Currency</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="EnergyConsumption" TYPE="NUMBER" IS_KEY="FALSE">
          <Display>
            <Label NLSID="flex_energyConsumption">Energy Consumption</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="Unit" TYPE="STRING" IS_KEY="FALSE">
          <Display>
            <Label NLSID="flex_Unit">Unit</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="PerUnitRate" TYPE="NUMBER" IS_KEY="FALSE">
          <Display>
            <Label NLSID="flex_PerUnitRate">Unit Rate</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="DateCollected" TYPE="STRING" IS_KEY="TRUE">
          <Display>
            <Label NLSID="flex_DateCollected">Metric Collection Date</Label>
          </Display>
        </ColumnDescriptor>
        <ColumnDescriptor NAME="ExtraDetails" TYPE="STRING" IS_KEY="FALSE">
          <Display>
            <Label NLSID="flex_extraDetails">Extra Details</Label>
          </Display>
        </ColumnDescriptor>
      </TableDescriptor>
      <QueryDescriptor FETCHLET_ID="OSLineToken">
        <Property NAME="scriptsDir" SCOPE="SYSTEMGLOBAL">scriptsDir</Property>
        <Property NAME="classpath" SCOPE="GLOBAL">$CLASSPATH:%scriptsDir%/EnergyDashboardEjbBeanService-Client.jar</Property>
        <Property NAME="command" SCOPE="GLOBAL">java -cp %classpath% com.avocent.trellis.oem.plugin.energy.insight.proxy.EnergyDashboardEjbBeanServicePortClient GraphRequest %DeviceID% %StartDate% %EndDate%</Property>
        <Property NAME="startsWith" SCOPE="GLOBAL">ext_result=</Property>
        <Property NAME="delimiter" SCOPE="GLOBAL">|</Property>
      </QueryDescriptor>
    </Metric>
    I am implementing the UI now using Flex programming; below is the code of my page controller class:
    public class emersontrellisflexHomePageController extends ActivityController
    {
        private var page:emersontrellisflexHomePage;

        [Bindable]
        public var metricDataArrayCollection:ArrayCollection;

        public function emersontrellisflexHomePageController()
        {
        }

        override public function init(pg:IActivity):void
        {
            super.init(pg);
            page = pg as emersontrellisflexHomePage;
            // get EnergyConsumptionCost metric to get energy consumption cost information
            var procMetric:Metric = ApplicationContext.getTargetContext().getMetric("EnergyConsumptionCost");
            var procSelector:MetricSelector = procMetric.getSelector(['DeviceID', 'DeviceName', 'EnergyCost', 'CurrencySymbol', 'EnergyConsumption', 'Unit', 'PerUnitRate', 'DateCollected', 'ExtraDetails']);
            procSelector.getData(energyConsumptionCostHandler, MetricCollectionTimePeriod.CURRENT, page.getBatchRequest());
            // Structure of the data is as follows:
            // MetricResultSet -- contains an Array (results) of all the datapoints. If the
            //   time period is LAST_* then this will be multiple datapoints, one per timestamp.
            // results[] -- each row in results is a TimestampMetricData datapoint.
            // TimestampMetricData -- a datapoint that includes a timestamp and then a table (data)
            //   of data for all keys included in the sample. If the metric does not have any keys
            //   then the table will only have a single row.
            // data -- the data table; each row is a KeyMetricData object.
            // KeyMetricData -- the data row; it includes a MetricKeyValue (metricKey) that
            //   identifies the key associated with that row (if there is one). All other
            //   data is available via a direct reference by column name, e.g. data[n]['ColumnId']
        }

        public function energyConsumptionCostHandler(procResult:MetricResultSet, fault:ServiceFault):void
        {
            if (fault != null)
            {
                MpLog.logError(fault, "Get Processes");
                return;
            }
            if (procResult != null)
            {
                var dataLength:int = 0;
                var dp:TimestampMetricData = procResult.results[0];
                if (dp != null)
                {
                    dataLength = dp.data.length;
                }
                var metricData:EcecObject;
                var data:KeyMetricData;
                metricDataArrayCollection = new ArrayCollection();
                for (var i:int = 0; i < dataLength; i++)
                {
                    data = dp.data[i]; // each row of the data table is a KeyMetricData
                    Alert.show("Data[" + i + "]=" + dp.data[i].toString());
                    metricData = new EcecObject();
                    metricData.deviceID = dp.data[i]['DeviceID'];
                    metricData.deviceName = dp.data[i]['DeviceName'];
                    metricData.energyCost = dp.data[i]['EnergyCost'];
                    metricData.currencySymbol = dp.data[i]['CurrencySymbol'];
                    metricData.energyConsumption = dp.data[i]['EnergyConsumption'];
                    metricData.unit = dp.data[i]['Unit'];
                    metricData.perUnitRate = dp.data[i]['PerUnitRate'];
                    metricData.dateCollected = dp.data[i]['DateCollected'];
                    metricData.extraDetails = dp.data[i]['ExtraDetails'];
                    metricDataArrayCollection.addItem(metricData);
                }
                page.setModel("metricDataCollection", metricDataArrayCollection);
            }
        } // End of function energyConsumptionCostHandler
    } // End of class
    In the handler code I populate the ArrayCollection with objects of the class "EcecObject", which I defined as below:
    package mpcui
    {
        [Bindable]
        public class EcecObject
        {
            public function EcecObject()
            {
            }

            public var deviceID:String;
            public var deviceName:String;
            public var energyCost:Number;
            public var currencySymbol:String;
            public var energyConsumption:Number;
            public var unit:String;
            public var perUnitRate:Number;
            public var dateCollected:String;
            public var extraDetails:String;

            public function toString():String
            {
                var result:String = "DeviceId=" + deviceID + " DeviceName=" + deviceName + " EnergyCost=" + energyCost + " CurrencySymbol=" + currencySymbol + " EnergyConsumption=" + energyConsumption + " Unit=" + unit + " unitRate=" + this.perUnitRate + " dateCollected=" + dateCollected + " ExtraDetails=" + extraDetails;
                return result;
            }
        }
    }
    Sample below shows the data populated in the object of type "EcecObject"
    (mx.collections::ArrayCollection)#0
    filterFunction = (null)
    length = 15
    list = (mx.collections::ArrayList)#1
    length = 15
    source = (Array)#2
    [0] (mpcui::EcecObject)#3
    currencySymbol = ".25"
    dateCollected = (null)
    deviceID = "KWH"
    deviceName = (null)
    energyConsumption = NaN
    energyCost = NaN
    extraDetails = "d6ffcc59-ec5a-424d-a950-dcf3944059c9"
    perUnitRate = NaN
    unit = "Dollar"
    So if we look at the attribute "currencySymbol", the "perUnitRate" value is populated, even though my handler code has the following statement:
    metricData.currencySymbol = dp.data[i]['CurrencySymbol'];
    I want to know why the data is swapped in most of the cases. Is there any correction to be made in the handler code?
    Appreciate your help.
    Regards
    Anand.
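One plausible cause, offered as an assumption rather than a confirmed diagnosis: the OSLineToken fetchlet splits each output line on the '|' delimiter purely by position, so if the script emits a row with a missing or extra field, every later value lands in the wrong column, which would produce swapped attributes like those in the sample above. A small Python sketch of that failure mode (column names taken from the metric definition; the row values are invented):

```python
# Positional split on '|', as a delimiter-based fetchlet would do.
COLUMNS = ["DeviceID", "DeviceName", "EnergyCost", "CurrencySymbol",
           "EnergyConsumption", "Unit", "PerUnitRate", "DateCollected",
           "ExtraDetails"]

def parse_row(line):
    """Map delimited values onto column names strictly by position."""
    return dict(zip(COLUMNS, line.split("|")))

good = parse_row("dev-1|Rack PDU|12.5|Dollar|50|KWH|.25|2011-03-28|details")
# Drop one field (DeviceName) and every later value shifts one column left:
bad = parse_row("dev-1|12.5|Dollar|50|KWH|.25|2011-03-28|details")
```

Worth checking: run the command from the QueryDescriptor by hand and count the '|'-separated fields in each ext_result= line against the nine ColumnDescriptors.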

    The design depends on how much data you have. If you have a lot of data, it is better to keep them separate.
    When you create a cube, you load the historical data from the source systems through a full load, and later you run delta loads to process new records.
    As far as the data is concerned, a cube can contain both historical and transactional data.
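The full-load-then-delta pattern in the reply can be sketched as follows. This is an illustrative Python sketch, with a simple timestamp high-water mark standing in for the delta mechanism; the documents and dates are invented.

```python
# Full load first, then deltas: the first run takes everything, later
# runs take only records created after the recorded high-water mark.
source = [("doc1", "20110101"), ("doc2", "20110215"), ("doc3", "20110320")]

def extract(rows, delta_pointer=None):
    """Full load when no pointer exists yet; otherwise only newer rows."""
    if delta_pointer is None:
        return list(rows)
    return [r for r in rows if r[1] > delta_pointer]

full = extract(source)                  # initial full load: all history
pointer = max(d for _, d in full)       # remember the high-water mark
source.append(("doc4", "20110401"))     # a new posting arrives later
delta = extract(source, pointer)        # delta load picks up only doc4
```

The cube ends up holding both the historical records (from the full load) and the transactional ones (from deltas), which is the point made above.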

  • No PS9.0_SDE & SIL Mapping to load Historical PS E-Perfor data into OBIEE

    Hi Folks,
    we have the latest BI environment OBIEE 10.1.3.4.1 and using PeopleSoft 9.0 as the source.
    The Peoplesoft 9.0 has new set of E-Performance tables when compared to older Peoplesoft tables.
    so when the systems were upgraded from Peoplesoft 8.2 to 9.0, the historic E-Performance data is not there in the new PS 9.0 E-Performance table.
    When this PS 9.0 data is processed into OBIEE, the OBA Warehouse contains only new E-Performance data and no historic data.
    I went through all the mappings and did not find any mapping related to loading the historic data; the SDE_PSFT_90_ADOPTER folder contains mappings only for the new PeopleSoft tables. Is there an existing mapping for this, or does a new customized mapping have to be created?
    New Peoplesoft9.0 table: PS_EP_APPR and PS_REVW_RATING_TBL
    OLD PeopleSoft table: PS_Employee_review
    and in OBAW, the table that is populated using the new table is w_wrkfc_Evt_f (NRMLSD_PERF_RATING column) and Performance Band Code
    your suggestions are appreciated
    thank you

    Had a similar issue at my current client. If you check the PSoft upgrade docs, Oracle actually had no upgrade path for the PS_EMPLOYEE_REVIEW table as the effective date logic for this table is ambiguous. Here is an excerpt:
    The existing Employee Review family (PS_EMPLOYEE_REVIEW) stores both the Commercial and Federal reviews/appraisals. After analyzing the key structure and functionality of the Employee Review structures, it was realized that there is no way to upgrade this data into the new ePerformance (PS_EP_APPR) key structure with any certainty of functional accuracy. This is because the keys to PS_EMPLOYEE_REVIEW are:The fact that PS_EMPLOYEE_REVIEW is effective-dated, and its FROM/THRU dates attributes are both optional, make it almost impossible to translate into PS_EP_APPR's required Period dates. Even the fundamental question of “What does a new effective-date mean?” is ambiguous
    I also checked the OBIA 7.9.6 mappings; the PS_EMPLOYEE_REVIEW table is not brought in. Only the PS_EP_APPR table is used as a source for the SDEs. I even checked the PSoft 8.8 mappings and could not find it there either. We had to customize the existing ETLs to extract from the PS_EMPLOYEE_REVIEW table.
    Ahsan

  • Loading historical conversion data

    Hi,
    I am trying to load Claims master data, with the history of the status, from a legacy system as a flat file into the PSA and then a DSO (staging). From there I want to load the Claim_number master data, where the status is time-dependent. I have 166 status history records for 10 claims (I took a smaller subset), and they load fine into the PSA and DSO. But when they load into the master data, only the last status is updated and the history does not load. Can anyone help me here? I need to load 10 years of history into BI.

    The key fields in the DSO should be claim_no, date_from, and date_to,
    so you should have multiple records for the same claim number with different date_from and date_to values.
    Check whether all of these are handled properly; then, when you load into the master data, you will get the historic data.
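The point about key fields can be illustrated with a hypothetical claim. This is a minimal Python sketch (dicts standing in for DSO tables; the claim number, dates, and statuses are invented):

```python
# Three status intervals for one claim, as (claim_no, date_from, date_to, status).
records = [
    ("C100", "20010101", "20051231", "OPEN"),
    ("C100", "20060101", "20091231", "IN_REVIEW"),
    ("C100", "20100101", "99991231", "CLOSED"),
]

# Key = claim_no only: each record overwrites the previous one, so only
# the last status survives -- the symptom described in the question.
keyed_by_claim = {}
for claim, date_from, date_to, status in records:
    keyed_by_claim[claim] = status

# Key = (claim_no, date_from, date_to): every validity interval is kept.
keyed_by_interval = {}
for claim, date_from, date_to, status in records:
    keyed_by_interval[(claim, date_from, date_to)] = status
```

With the interval dates in the key, all 166 history records can coexist in the DSO instead of collapsing to the latest status per claim.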

  • Historical data from ACE20 load balancer modules

    Hello,
    I am trying to get some historical data from our ACE20 load balancer modules, which are housed in a 6504, but I cannot find where to get this information. I would like information such as memory used, CPU, active connections, throughput, HTTP requests etc. over the past month. Is there any way I can do this? I am using ANM 2.0.
    Thanks

    Hello,
    Below are the steps to load via DB Connect.
    Go to RSA1 -> Modeling -> Source Systems and double-click the source system. On the right-hand side, right-click the topmost node; this takes you to a screen where you can enter the table name (if you know it) or just execute, which displays all the tables you have in Oracle. Double-click the selected table, select the required fields, and generate the DataSource. Then go back to RSA1 -> Source Systems and replicate the DataSource. Assign an InfoSource as usual, create an InfoPackage, and start the load.
    Note that this does not support the delta process.
    This document could help:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2f0fea94-0501-0010-829c-d6b5c2ae5e40
    Regards,
    Dhanya

  • Historical data load for newly added fields

    Hi,
    We are using cube 0UCSA_C01 with delta update, and it has gone live. Now the requirement is to add 2 new fields to this cube, but the problem is how we can load the previous data for these fields, which has already been updated. Please let me know how this can be achieved and the procedure to do it.
    Thanks
    Help will be appreciated

    Hi,
    How to populate historical data into a newly added field in an InfoProvider:
    /people/dinesh.lalchand/blog/2006/02/07/loopback-process--follow-up
    http://sap.ittoolbox.com/documents/popular-q-and-a/adding-characteristics-to-infocubes-2138
    Thanks,
    JituK

  • Trading Partner field (VBUND) is left blank in EC-PCA historical data load

    Hello everyone,
    I have a question regarding trading partner field (VBUND) in EC-PCA. In my company we are turning on this Profit Center Accounting functionality and we need to know how to bring this trading partner field filled when we make the historical data load.
    We have tested the transactions 1KE8 (FI docs), 1KEC (MM docs) and 1KE9 (SD docs) that are used to transfer real data for GLPCA table. Unfortunately, VBUND field is left blank when using these transactions.
    This impacts the open journal entries for AR/AP (Accounts Receivable and Accounts Payable): entries made prior to the go-live of this EC-PCA functionality would not have the trading partner information, while entries made after go-live would carry it (i.e., the documents would not match).
    Does anyone know how to solve this problem?

    I worked on your issue, but it seems there is no standard option to update the VBUND field, apart from writing an ABAP program to update GLPCA from the FI documents.
    Regards
    Siva

  • Load Historic Sales Data into SEM

    Hi experts,
    I have a requirement in my project where, as part of the system readiness check, we have been asked to "Load Historic Sales Data into SEM". Does anyone know how to proceed with this requirement?
    Has anyone dealt with this kind of requirement before?
    If yes, can you tell me the steps please?
    Whatever information you have, please share it with me.
    Useful answers will be rewarded.
    Thanks in advance,
    Abhinav Mahul.

    Hi Abhinav,
    Have you integrated your system with a BW system?
    If yes, then you need to upload all previous sales information from the CRM system to the BW system.
    Secondly, copy all the CRM sales order data residing in an InfoCube into the sales planning sales order InfoCube of SEM.
    This is what is meant by loading historic sales data into SEM.
    To do this you would require a SEM/BW consultant.
    You can use a function named 'COPY' in SEM-BPS to copy one sales InfoCube to another.
    Best Regards,
    Pratik Patel
