Data Staging Doubt??

Hi Gurus,
I am learning BI. I created a couple of scenarios in BI where staging is done starting from DataSource -> PSA -> Transformation -> InfoCube. The data load is done using a DTP, and the data reaches the InfoCube directly from the DataSource (not using any InfoSource in between). It works fine for me without any hassles.
My question is: if it can be done this way, then what is the use of an InfoSource in sending data from DataSource -> InfoCube?
Can you please explain the use of the InfoSource in SAP BI scenarios, with the staging steps in BI 7.0?
Ritika

Hi,
In BI 7.0, an InfoSource is used if there is more than one target to be updated from the same DataSource.
DataSource -> (transfer rules) InfoSource -> (update rules) data targets
An InfoSource is not mandatory anymore. Scenarios for a flexible InfoSource:
• A flexible InfoSource is necessary in order to use currency or unit conversion from the source DataSource → define the InfoSource as an intermediate structure.
• You can use a flexible InfoSource as a uniform source for several targets; the InfoSource can also be the target of several different sources.
So we can use an InfoSource as an optional structure between one transformation and another. This is necessary, for example, to consolidate several sources into several targets or to use InfoObject-based information.
Note: for ‘direct’ InfoSources (used for master data updates), there is no difference between the ‘old’ and the ‘new’ InfoSource, i.e. you can define a transformation as well as transfer rules.
Regards,
Priya.
Edited by: Priyadarshini K A on Aug 7, 2009 8:57 AM

Similar Messages

  • Difference between Data Staging and Dimension Table?


    Data Staging:
    Data extraction and transformation are done here.
    Meaning that, if we have source data in a flat file, we extract it and load it into staging tables; there we take care of NULLs, change the datetime format, etc., and after such cleansing/transformation we finally load it into the Dim/Fact tables (a small sketch of this flow follows below).
    Pros: makes the process simpler and easier, and we can keep track of the data because it is persisted in staging.
    Cons: staging tables require additional storage space.
    Dimension Table:
    A table that describes/stores the attributes of a specific object.
    For example, a star schema has dimension tables storing information related to Product, Customer, etc.
    -Vaibhav Chaudhari
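    To make the flow above concrete, here is a minimal ABAP sketch of the extract-cleanse-load idea, assuming a hypothetical staging table ZSTG_SALES and a hypothetical dimension table ZDIM_PRODUCT (neither is a standard object; all field names are illustrations only):
    REPORT zstaging_demo.
    * Hypothetical record layouts for staging (raw) and dimension (clean).
    TYPES: BEGIN OF ty_stg,
             product_id  TYPE c LENGTH 10,
             product_txt TYPE c LENGTH 40,
             valid_from  TYPE c LENGTH 10,  " raw DD.MM.YYYY text from the flat file
           END OF ty_stg,
           BEGIN OF ty_dim,
             product_id  TYPE c LENGTH 10,
             product_txt TYPE c LENGTH 40,
             valid_from  TYPE d,            " proper internal date YYYYMMDD
           END OF ty_dim.
    DATA: lt_stg TYPE STANDARD TABLE OF ty_stg,
          ls_stg TYPE ty_stg,
          lt_dim TYPE STANDARD TABLE OF ty_dim,
          ls_dim TYPE ty_dim.
    * 1. Extract: the flat file has already been loaded raw into ZSTG_SALES.
    SELECT * FROM zstg_sales INTO CORRESPONDING FIELDS OF TABLE lt_stg.
    * 2. Transform: drop records without a key and convert the date format.
    LOOP AT lt_stg INTO ls_stg.
      CHECK ls_stg-product_id IS NOT INITIAL.
      CLEAR ls_dim.
      ls_dim-product_id  = ls_stg-product_id.
      ls_dim-product_txt = ls_stg-product_txt.
      CONCATENATE ls_stg-valid_from+6(4)   " YYYY
                  ls_stg-valid_from+3(2)   " MM
                  ls_stg-valid_from(2)     " DD
                  INTO ls_dim-valid_from.
      APPEND ls_dim TO lt_dim.
    ENDLOOP.
    * 3. Load: only cleansed records reach the dimension table.
    MODIFY zdim_product FROM TABLE lt_dim.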

  • Joining header and item records during data staging

    Hi experts,
    I've got the following scenario:
    I get data from two DataSources: one (myds_h) provides document headers and the other one (myds_i) provides document items.
    Unfortunately the myds_i DataSource does not contain the header fields (only foreign-key fields referencing the header).
    For reporting I'd like to provide item-level data containing the document-header information as well.
    At which point of the data staging in the BW system would you recommend doing this join?
    Some options:
    a) I could enhance the myds_i DataSource and do the join in the source system.
    b) I could enhance the item data in the transformation between the item PSA and an item ODS.
    c) I could enhance the item data in a transformation between an item ODS and an additional item/header ODS.
    d) I could enhance the item data in the transformation between the item ODS and the final InfoCube.
    e) I could use an analysis process and an InfoSet instead of the above-mentioned transformations.
    Thanks for your comments and input in advance!
    Best regards,
      Marco
    Edited by: Marco Simon on Feb 13, 2012 3:52 PM - inserted one option.

    Hello Marco,
    In your solutions a) to d) you will have some delta problems: if header data can change without any modification to the items, you will lose those header changes in your item DSO.
    The easiest solution will probably be to handle the header data as master data (the header fields as attributes of that master data object); the header data will then be available together with your item data. This solution may cause performance problems if you have a lot of headers.
    Another solution would be to build a transformation between your header DSO and your item DSO (every header modification then updates all items of that header). This will require some ABAP, as sketched below.
    Regards,
    Fred
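    A hedged sketch of the lookup Fred describes: an end routine in the item transformation that buffers the header DSO's active table once per data package and then enriches every item record. The active-table name /BIC/AZHDR_DS00 and the field names DOC_NUMBER/DOC_TYPE are hypothetical placeholders for your own objects.
    * End routine of the item transformation (BI 7.0).
    * _ty_s_TG_1 is the target structure type generated by BW.
    DATA: BEGIN OF ls_hdr,
            doc_number TYPE c LENGTH 10,
            doc_type   TYPE c LENGTH 4,
          END OF ls_hdr.
    DATA lt_hdr LIKE STANDARD TABLE OF ls_hdr.
    FIELD-SYMBOLS <ls_result> TYPE _ty_s_tg_1.
    IF result_package IS NOT INITIAL.
    * Buffer the needed header attributes once per data package.
      SELECT doc_number doc_type
        FROM /bic/azhdr_ds00
        INTO TABLE lt_hdr
        FOR ALL ENTRIES IN result_package
        WHERE doc_number = result_package-doc_number.
      SORT lt_hdr BY doc_number.
    ENDIF.
    * Enrich every item record with its header fields.
    LOOP AT result_package ASSIGNING <ls_result>.
      READ TABLE lt_hdr INTO ls_hdr
           WITH KEY doc_number = <ls_result>-doc_number
           BINARY SEARCH.
      IF sy-subrc = 0.
        <ls_result>-doc_type = ls_hdr-doc_type.
      ENDIF.
    ENDLOOP.
    Note that Fred's delta caveat still applies: this routine only enriches items that arrive in the package; a header-only change does not by itself reload its items.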

  • Difference between the data staging of generic and application-specific DataSources

    Hi,
    Can anyone tell me the difference between the data staging of generic and application-specific DataSources? We know that LO data stages in the extraction queue (queued delta), the update queue and the BW delta queue; I want to know where generic data actually stages before it is loaded into BW.
    Thanks.

    Generic DataSources are based on either a DB table/view, a function module, or an ABAP query. So normally the data stages in the corresponding DB tables, or you calculate it at extraction time (see the function-module sketch below). There is no update queue like in LO.
    Best regards
       Dirk
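    For the function-module case, here is a condensed, hedged sketch of a generic extractor modeled on the SAP template RSAX_BIW_GET_DATA_SIMPLE; the source table ZSALES is a hypothetical example, and the interface parameters shown in the comments come from the template:
    FUNCTION z_bw_get_data.
    * Interface as in the template RSAX_BIW_GET_DATA_SIMPLE:
    *   IMPORTING i_dsource, i_initflag, i_maxsize, ...
    *   TABLES i_t_select, i_t_fields, e_t_data
    *   EXCEPTIONS no_more_data ...
    * (TYPE-POOLS sbiwa is declared in the function group's top include.)
      STATICS: s_cursor  TYPE cursor,
               s_counter TYPE i.
      IF i_initflag = sbiwa_c_flag_on.
    *   Initialization call: check the DataSource and store the selections.
        CLEAR s_counter.
      ELSE.
        IF s_counter = 0.
    *     First data call: open a cursor on the source table.
          OPEN CURSOR WITH HOLD s_cursor FOR
            SELECT * FROM zsales.
        ENDIF.
    *   Return one data package per call until the cursor is exhausted;
    *   the data is read (or computed) only at extraction time - no queue.
        FETCH NEXT CURSOR s_cursor
              INTO CORRESPONDING FIELDS OF TABLE e_t_data
              PACKAGE SIZE i_maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR s_cursor.
          RAISE no_more_data.
        ENDIF.
        s_counter = s_counter + 1.
      ENDIF.
    ENDFUNCTION.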

  • Data staging in APO BW

    We are now running SCM 4.1, which includes BW 3.5, and we currently use an external BW system for data staging before pulling the data over into APO.
    Are there any consequences to doing the data staging on the APO BW side itself, without using an external BW system? We could still use the external BW system for reporting.

    If you stage directly in the APO BW and then use the external BW for reporting, the data in the two systems will not be in sync.

  • Currency Data Model doubt

    Dear experts,
    I have a doubt regarding data modelling in BPC.
    I have a dimension called Units, which is the currency dimension in the application; I don't use the standard RptCurrency dimension. This dimension has members which are currencies such as EUR, DOL and LC, and units such as BOX and KG.
    We have done this because, in the input schedules, we plan Sales Volumes in BOX and Prices in EUR. Now we want to calculate the Net Income, which should be BOX * EUR. We are not able to create the script logic which does this, because of the currency.
    Is this model correct? That is, does it make sense in BPC, and is it possible to multiply two accounts with different currencies? Or should we instead use another (user-defined) dimension for the units and keep the standard currency dimension (RptCurrency)?
    This is a very common thing to do when planning; how do you experts solve it?
    Thanks a lot for your time and your help.
    Best regards,
    Cecilia

    A CURR 13,2 field contains only 11 places before the decimals, so the program below dumps with
    'Overflow when converting from "1234567890123.11"':
    DATA v_curr TYPE pad_amt7s.   " CURR 13,2 = 11 digits + 2 decimals
    v_curr = '1234567890123.11'.  " 13 digits before the decimal -> overflow dump
    while it runs perfectly when we assign
    v_curr = '12345678901.11'.    " 11 + 2 digits fits

  • Data type doubt

    Hi Friends,
    I have a small doubt. I created a 'Z' table in which I defined a field as an integer, transported it to production, and it is working fine. Now the user wants to change the data type (he wants to enter negative values in that field). If I change the table accordingly, does it affect the data that has already been keyed in?
    If so, how do I do it?
    Regards,
    Line

    CURR fields can also have negative values by default in the database.  There is nothing special that you need to do to handle this when writing to the database.  If your issue is allowing the user to enter this value in a screen, then there is some formatting that you do require.
    Go to your screen in the Screen Painter.
    Click on the Element List tab.
    Click on Texts/I/O Templates.
    Find your field in the list.
    In the Text or I/O field you will see an underscore template, i.e. _______________
    Just add a "V" to it, so it looks like this: ______________V
    The "V" is a placeholder for the sign; now the user can enter negative values on your screen.
    Regards,
    Rich Heilman

  • Data loading doubts

    Hi Gurus
    Just trying to join the pieces to understand the system more accurately. Would you please help me clear up the following doubts?
    1. What is RSRV testing, where can we use it, and how is it done?
    2. From where can we schedule a process chain? Can I have the path to schedule it? And if we schedule the process chain, is there still a need to schedule the InfoPackage?
    3. What is the timestamp and how does it work?
    Thanks in advance
    KK

    Hi Krishna,
    It is not required to do this after every data load. But let's say you want to see the number of records in the cube tables (for performance tuning); then you can use an RSRV test: RSRV > Elementary Tests > Database > Database information about InfoProvider tables.
    Or sometimes the SIDs of InfoObject values are corrupted, and you can try to repair them with the RSRV tests for master data.
    If you open up each test in RSRV, you can click on the Help button (rightmost) and read a description.
    Hope this helps...

  • Master data Load doubt

    Hi All,
    We can load master data through an InfoSource with direct update or an InfoSource with flexible update.
    With an InfoSource with direct update, the communication structure itself acts as the data target. If we load via direct update, reporting on the master data is not possible.
    With flexible update, reporting on the master data is possible.
    BW is mainly focused on analyzing data and reporting, so why is the direct update method provided at all?
    What is the benefit of using an InfoSource with direct update?
    I need more clarification on this.
    Regards,
    Arun.M.D

    Dear,
    Master data can be stored in 2 places:
    1. in the InfoObject itself
    2. in a data target (ODS)
    We use a direct InfoSource when we want to store the master data in the InfoObject itself.
    We use a flexible InfoSource when we want to store the master data in a data target such as an ODS.
    Hence, when master data is stored in data targets, it is available for reporting.
    If your data is stored in the InfoObject through a direct InfoSource, you can still use the InfoObject as a data target by setting the corresponding option for that InfoObject (on the Master Data tab).
    Thanks

  • Data Element Doubt

    Generally we specify three types of description (long, medium, short) in a data element, along with the data element's own description.
    1. In which cases are those three texts used?
    2. I think the description of the data element becomes the field description in the table... right?

    hi manjunath,
    the reasons for specifying the 3 texts are:
    1) They are used in screen elements based on the data element.
    There is a choice between a program parameter and a dictionary parameter: when you use a dictionary parameter, the appropriate text is chosen based on the screen field size and shown on the screen, e.g. on push buttons or other elements.
    2) In ALV you may build a field catalog manually, where the output columns should have texts. For this you can use the dictionary texts (S, M, L), so the texts maintained against the data element are shown in the output.
    Or, when you use the REUSE_ALV_FIELDCATALOG_MERGE function module to create your field catalog based on a structure or internal table, the column texts are displayed based on the user interaction on the output columns.
    Second question: yes, the description you write for a data element will be used as your field description. As you know, whenever a field in a table is created, you can specify its type either through predefined direct types or through data elements.
    When you define it through a data element, the data element's description is taken as the field description;
    otherwise, when you use a direct type, you also need to write the field description manually.
    Hope this clarifies what you have been looking for; a short field-catalog sketch follows below.
    Encourage others to answer your queries by suitably rewarding them.
    Thanks
    Venugopal
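    To illustrate point 2), here is a hedged sketch of building a field catalog both ways (the SLIS types and REUSE_ALV_FIELDCATALOG_MERGE are standard objects; the use of SPFLI and the literal texts are just examples):
    TYPE-POOLS slis.
    DATA: lt_fcat TYPE slis_t_fieldcat_alv,
          ls_fcat TYPE slis_fieldcat_alv.
    * Manual entry - referencing the DDIC field pulls the short/medium/long
    * texts of the underlying data element (here S_CARR_ID) automatically.
    ls_fcat-fieldname     = 'CARRID'.
    ls_fcat-ref_tabname   = 'SPFLI'.
    ls_fcat-ref_fieldname = 'CARRID'.
    APPEND ls_fcat TO lt_fcat.
    * Manual entry with explicit texts instead of the dictionary texts.
    CLEAR ls_fcat.
    ls_fcat-fieldname = 'CONNID'.
    ls_fcat-seltext_s = 'Conn.'.
    ls_fcat-seltext_m = 'Connection'.
    ls_fcat-seltext_l = 'Flight Connection'.
    APPEND ls_fcat TO lt_fcat.
    * Or let the system derive a whole catalog from a DDIC structure.
    DATA lt_fcat_full TYPE slis_t_fieldcat_alv.
    CALL FUNCTION 'REUSE_ALV_FIELDCATALOG_MERGE'
      EXPORTING
        i_structure_name = 'SPFLI'
      CHANGING
        ct_fieldcat      = lt_fcat_full.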

  • Data Staging Training

    Dear all,
    I am a certified BW 2.0 consultant who also followed the delta courses for both 3.0 and 3.5. However, if I want to become a certified BW 3.5 consultant, I need to do the complete certification again. I have already received hardcopies of most of the documentation for 3.5, but I am missing BW340. Could somebody supply me with a softcopy?
    Best regards,
    Robert Boneschansker
    [email protected]

    Well, to be honest, there are some online institutes that offer training, but Ralph Kimball has also written some great books on the subject.
    Mohammad Farhan Alam

  • Can we use 0INFOPROV as a selection in Load from Data Stream

    Hi,
    We have implemented BW-SEM BPS and BCS (SEM-BW - 602 and BI 7 ) in our company.
    We have two BPS cubes for Cost Center and Revenue Planning, and we have an Actuals data staging cube; we use 0SEM_BCS_10 to load actuals.
    We created a MultiProvider on the BPS cubes and the staging cube as a Source Data Basis for BCS.
    Issue:
    When loading plan data or actuals data into the BCS cube (0BCS_C11) using the Load from Data Stream method, we have a performance issue. We automated the load process in a process chain; sometimes it takes about 20 hrs for the plan data load alone for 3 group currencies, followed by the elimination tasks.
    What I noticed is that, for example, when loading plan data the system also reads the Actuals cube, which is not required; there is no selection available on the Mapping or Selection tab where I could restrict the data load to a particular cube.
    I tried to add 0INFOPROV to the data basis, but then it doesn't show up as a selection option in the data collection tasks.
    Is there a way to restrict the data load into BCS using this load option, so that I can control which cube the data is read from?
    I know that there is a filter BAdI available, but I am not sure how it works.
    Thanks !!
    Naveen Rao Kattela

    Thanks Eugene,
    We do have other characteristics like Value Type (10 = Actual and 20 = Plan) and Version (100 = USD Actual and 200 = USD Plan), but when I load data into BCS using the Load from Data Stream method, the request goes to all the underlying cubes, which in my case are the planning cubes and the Actuals cube. I don't want the request to go to the Actuals cube when I am running only the plan load; I think it is causing a performance issue.
    For this reason I am considering whether I can use 0INFOPROV, as we do in BEx queries, to filter the InfoProvider so that the data load performance improves.
    I was able to bring 0INFOPROV into the data basis by adding 0INFOPROV to the characteristics folder used by the data basis.
    I can see this InfoObject on the Data Stream Fields tab. I check-marked it for use in the selection and regenerated the data basis.
    I was expecting this field to now be available for selection in the data collection method, but it is not.
    So if it is confirmed that there is no way to use 0INFOPROV as a selection, then I will suggest to my client a redesign of the data basis itself.
    Thanks,
    Naveen Rao Kattela

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also provides the opportunity to see the differences between two data loads.
    For normal operation, only the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
    We already tried the DTP functionality with "all new records", but that only loads the oldest data package and does not process the newer PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with changing PSA IDs: if you use the option to delete the content of a PSA table via a process chain, the step will fail once the DataSource is changed, because a new PSA table ID is generated.
    Regards,
    Harald

  • Issue in import data from OIM to OIA

    Hi All,
    While importing entitlements from OIM to OIA, we are getting the following exceptions in the OIA logs:
    19:01:29,839 INFO [OIMDataProviderImpl] Start : Get Status of Data Collection ...
    19:01:29,860 DEBUG[IamDataSyncMonitorImpl] OIMStatus FAILED
    19:01:29,860 DEBUG[DBIAMSolution] Data Staging Failed: DATA COLLECTION FAILED. Aborting. Let'sFinalize data collection.
      19:01:29,860 INFO [OIMDataProviderImpl] Start Finalize Data Collection ...
      19:01:30,063 INFO [OIMDataProviderImpl] End Finalize Data Collection ...
      19:01:30,063 ERROR[DBIAMSolution] ERROR: Data Staging Failed: DATA COLLECTION FAILED. Aborting.
      19:01:30,071 DEBUG[DBIAMSolution] publishing import completed event...
      19:01:30,072 DEBUG[DefaultIAMListener] Queuing IAM Event.com.vaau.rbacx.iam.IAMEvent[source=com.vaau.rbacx.iam.db.DBIAMSolution@75592f8b]
      19:01:30,072 ERROR[DBIAMSolution] Data Staging Failed: DATA COLLECTION FAILED. Aborting.
    19:01:30,899 DEBUG[DefaultIAMListener] ProcessingEvent:com.vaau.rbacx.iam.IAMEvent[source=com.vaau.rbacx.iam.db.DBIAMSolution@75592f8b],
    exception:null
    Please provide any helpful pointers to resolve this issue.
    Thanks,
    RPB


  • How to make use of Index of a table in report to fetch data?

    Hi,
    I need sample code for a SELECT statement that makes use of an INDEX of a table to fetch data.
    Doubt:
    Can I fetch all the fields of the table by passing certain key fields of the INDEX in the WHERE condition?

    Hi Raja,
    1) Mention only the fields that you need from the database table (in case you don't need all the fields).
    2) Don't use the INTO CORRESPONDING FIELDS OF TABLE clause.
    3) Instead use INTO TABLE (but take care that the fields of the internal table are declared in the same order as in the database table, so that the records are fetched in the right sequence).
    Please find the syntax and sample code below.
    SELECT * FROM <table>
      WHERE <condition>
      %_HINTS ORACLE 'INDEX("<table>" "<table>~<index id>")'.
    SELECT carrid
      INTO TABLE t_spfli
      FROM spfli
      WHERE carrid IN s_carrid AND
            connid IN s_connid
      %_HINTS ORACLE 'INDEX("SPFLI" "SPFLI~XXX")'.
    Hope this is helpful.
    Thanks
    kalyan
