Filtering data loaded into an InfoCube from an ODS

Hello people!
I have a question for you.
I have an ODS (let's name it ODS_1) that has, for example, information about sales, and InfoCubes (CUBE_A and CUBE_B) that use the ODS as their DataSource.
In ODS_1 I have the information for all sales, but I want to load the information for a sale into CUBE_A or CUBE_B depending on the value of a characteristic of the ODS.
Is it possible? How?
Thanks in advance; any clue would be appreciated and awarded.

Hi Sebastian,
You can use that particular characteristic in the data selection tab of the InfoPackage you create to load the data from the ODS to the cube. For InfoPackage_A, select characteristic value A and choose CUBE_A as the data target; in InfoPackage_B, with characteristic value B, select CUBE_B as the data target.
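If the selection tab is not flexible enough, a start routine in each cube's update rules can drop the rows that do not belong. A minimal sketch for CUBE_A, assuming 3.x-style update rules and a hypothetical characteristic field /BIC/ZSALETYPE whose value 'A' marks the sales that belong in CUBE_A:

" Start routine in the update rules ODS_1 -> CUBE_A.
" DATA_PACKAGE is the standard internal table of incoming records in a
" 3.x start routine; the field name and the value 'A' are assumptions.
DELETE DATA_PACKAGE WHERE /bic/zsaletype <> 'A'.

The mirror-image routine with value 'B' would go into the update rules for CUBE_B.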
Hope this helps.
Regards,
Manish

Similar Messages

  • Loading a second InfoCube from an ODS

    Hi SDN,
    I have a cube C1 with data already in it, loaded through the ODS. I have copied C1 to C2 and would like the same data to come into C2 from the ODS. After I create the update rules, how do I get the data into C2?
    Thanks

    Hi All,
    Thanks for the suggestions.
    Using 'Update Data Targets' I am unable to schedule, because although I see C2 in the data targets, the message says that the delta for the request in question has already been processed. I selected 'Initial Update' to set up the load, and I unchecked C1 in the list of data targets. I also cannot do anything with the Data Selection tab.
    If I use the method of creating an export DataSource from C1, how can I prevent it from being transported to QAS or PRD?
    Please help.
    Saf.

  • Query regarding the data type for fetching records from multiple ODS tables

    Hey guys,
    I have a query regarding the data type for fetching records from multiple ODS tables.
    If I have two tables with the same column name, then in the data type, under the parent row node, I can't add two nodes with the same name.
    Can anyone help with some suggestions?

    Hi Mudit,
    One option would be to go as mentioned by Padamja, prefixing the table name to the column name; another would be to use the AS keyword in your SQL statement.
    AS is used to rename a column when data is selected from your DB.
    So the query Select ename as empname from emptable will return the data with the column name empname.
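    A hedged sketch of the same idea in ABAP Open SQL; EMPTABLE and ENAME come from the example above, and the target structure is made up for illustration:

    " Alias the column so identically named source columns can land in
    " differently named target fields.
    DATA: BEGIN OF ls_emp,
            empname TYPE c LENGTH 40,
          END OF ls_emp,
          lt_emp LIKE TABLE OF ls_emp.

    SELECT ename AS empname
      FROM emptable
      INTO CORRESPONDING FIELDS OF TABLE lt_emp.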
    Regards,
    Bhavesh

  • Data load failed at InfoCube level

    Dear Experts,
    I have data loads from the ECC source system for DataSource 2LIS_11_VAITM to 3 different data targets in the BI system. The data load is successful up to the PSA; when it comes to loading the data targets, the load succeeds for 2 of them but fails at one data target, an InfoCube. I got the following error message:
    Error 18 in the update
    Diagnosis
        The update delivered the error code 18.
    Procedure
        You can find further information on this error in the error message of
        the update.
    Here I tried to activate the update rules once again by executing the program, and tried to reload using the reconstruction facility, but I get the same error message.
    Kindly help me analyze the issue.
    Thanks & Regards,
    Mannu

    Hi,
    Here I tried to trigger a repeat delta under the impression that the error would not recur, but then I encountered the following issues:
    1. The data load status in RSMO is red, whereas in the data target the status shows green.
    2. When I try to analyze the PSA from RSMO, the PSA display gives me a dump with the following.
    The following analysis is from transaction ST22:
    Runtime Errors         GETWA_NOT_ASSIGNED
    Short text
         Field symbol has not yet been assigned.
    What happened?
         Error in the ABAP Application Program
         The current ABAP program "SAPLSLVC" had to be terminated because it has
         come across a statement that unfortunately cannot be executed.
    What can you do?
         Note down which actions and inputs caused the error.
         To process the problem further, contact your SAP system
         administrator.
         Using Transaction ST22 for ABAP Dump Analysis, you can look
         at and manage termination messages, and you can also
         keep them for a long time.
    Error analysis
        You attempted to access an unassigned field symbol
        (data segment 32821).
        This error may occur if
        - You address a typed field symbol before it has been set with
          ASSIGN
        - You address a field symbol that pointed to the line of an
          internal table that was deleted
        - You address a field symbol that was previously reset using
          UNASSIGN or that pointed to a local field that no
          longer exists
        - You address a global function interface, although the
          respective function module is not active - that is, is
          not in the list of active calls. The list of active calls
          can be taken from this short dump.
    How to correct the error
        If the error occurs in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "GETWA_NOT_ASSIGNED" " "
        "SAPLSLVC" or "LSLVCF36"
        "FILL_DATA_TABLE"
    Here I have activated the include LSLVCF36, reactivated the transfer rules and update rules, and retriggered the data load.
    But I am still getting the same error.
    Could anyone please help me to resolve this issue?
    Thanks a lot,
    Mannu

  • Data load into SAP ECC from a non-SAP system

    Hi Experts,
    I am very new to BODS, and I want to load historical data from a non-SAP source system into SAP R/3 tables like VBAK and VBAP using BODS. Can you please provide steps, documents or guidelines on how to achieve this?
    Regards,
    Monil

    Hi
    In order to load into SAP you have the following options:
    1. Use IDocs. There are several standard IDocs in ECC for specific objects (MATMAS for materials, DEBMAS for customers, etc.). You can generate and send IDocs as messages to the SAP target using BODS.
    2. Use LSMW programs to load into the SAP target. These programs will require input files generated in specific layouts using BODS.
    3. Direct input. The direct input method is to write ABAP programs targeting specific tables. This approach is very complex, and hence a lot of thought needs to be applied.
    The OSS Notes supplied in previous messages are all excellent guidance to steer you in the right direction on the choice of load, etc.
    However, the data load into SAP needs to be object-specific. Merely targeting the sales tables will not help, as the sales document data held in the VBAK and VBAP tables you mentioned is related to Articles: these tables hold sales document data for already created Articles. So if you want to specifically target these tables, you may need to prepare an LSMW program for the purpose.
    To answer your question on whether it is possible to load objects like materials, customers, vendors, etc. using BODS: yes, you can.
    Below is a standard list of IDocs that you can use for this purpose to load into the SAP ECC system from a non-SAP system.
    Customer Master - DEBMAS
    Article Master - ARTMAS
    Material Master - MATMAS
    Vendor Master - CREMAS
    Purchase Info Records (PIR) - INFREC
    The list is endless.
    In order to achieve this, you will need the functional design consultants to provide ETL mapping from the legacy data to the IDoc target schema and fields (better to have the technical table names and fields too). You should then prepare the data, putting it through the standard check-table validations for each object, along with any business-specific conversion rules and validations. Having prepared this data, you can either generate flat-file output for loading into SAP using LSMW programs, or generate IDoc messages to the target SAP system.
    If you are going to post IDocs directly into the SAP target using BODS, you will need to create a partner profile for BODS to send IDocs, and define the IDocs you need as inbound IDocs. There are a few more settings, such as RFC connectivity and authorizations, required for BODS to successfully send IDocs into the SAP target.
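    Once IDocs start flowing, a quick sanity check on the target side is to look at the IDoc control records. A minimal sketch, assuming message type MATMAS as the example (EDIDC is the standard IDoc control table; status 53 means posted successfully, 51 means application error):

    " List today's inbound MATMAS IDocs in the target system.
    DATA lt_idocs TYPE TABLE OF edidc.

    SELECT * FROM edidc
      INTO TABLE lt_idocs
      WHERE mestyp = 'MATMAS'
        AND direct = '2'         " 2 = inbound
        AND credat = sy-datum.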
    Do let me know if you need more info on any specific queries or issues you may encounter.
    kind regards
    Raghu

  • Reloading an InfoCube from an ODS with erroneous information

    Hi:
    I have an InfoCube that is updated from an ODS. My problem is that in the R/3 system, from LO, an invalid string was entered. It passed without a problem from R/3 to the ODS, but when the request to load the InfoCube runs, it shows a red light because of this string.
    I made a selective deletion from the ODS, but when I run the process chain to load the data from the ODS to the InfoCube, the error persists.
    Is there any table that I have to delete?
    Points for helpful answers!!
    Thanks

    Hi Herr,
    The selective deletion from the ODS only removes the records from the active table of the ODS, not from the change log (from which the delta records move). If you like, you can set up error handling in the InfoPackage loading to the cube, so that this record is held back while the others go through and reporting is enabled. Or, if the record is required and removing it would change the numbers, you can try to correct this record in the PSA and then load from the PSA to the cube.
    Hope this helps...

  • Data load into 0FIAP_C03 InfoCube consuming a very long time

    Hi,
    I'm loading data into the InfoCube 0FIAP_C03 from the DSO 0FIAP_O03. On a daily basis I delete the contents of this InfoCube and reload the full data.
    In the DTP monitor I can see that the step 'Overlapping check with archived data areas for InfoProvider 0FIAP_C03', under 'Updating to InfoCube 0FIAP_C03 : 50000 -> 50000 Data Records', takes the most time for each data package. It is generally true that this overlapping check takes the most time, but is there any specific reason for it? Also, is there anything that can be done to reduce the time taken by this step?
    Thanks,
    Sri.

    Hi Sri,
    Ideally it should not take much time.
    I think the reasons could be:
    1) Volume of data in the source DSO
    2) Complex logic in the start/field/end routines
    3) Cube indexes (delete the cube indexes before the load; see the sketch below)
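    For point 3, the index deletion can be done with the standard process chain step, or programmatically. A minimal sketch, assuming the standard function module RSDU_INFOCUBE_INDEXES_DROP (parameter name taken from its usual interface):

    " Drop the cube's secondary indexes before the load; rebuild them
    " afterwards with the matching index-creation step.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = '0FIAP_C03'.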
    Also check the dimension design of your InfoCube: SE38 --> SAP_INFOCUBE_DESIGNS
    Refer to the thread below for further info:
    "density" in SAP_INFOCUBE_DESIGNS
    Thanks,
    Peddi

  • Data loads into multiple InfoObjects from the 0EHS_PHRASE_TEXT DataSource

    Dear Experts,
    I am working on a SAP HCM-BW 7.0 implementation and am trying to load data into 8 different InfoObjects (texts) through the 0EHS_PHRASE_TEXT (Phrases) extractor.
    There are many InfoPackages in place, created during the previous project, loading into a different set of standard BCT InfoObjects.
    After migrating the 0EHS_PHRASE_TEXT DataSource from the 3.x to the 7.0 version, I created different InfoObjects with transformations and DTPs, per the business requirements, to load them. The relevant data is available in 0EHS_PHRASE_TEXT when checked in the extractor checker (RSA3). However, when I create InfoPackages to load these different InfoObjects, I don't see the data targets listed, which makes sense, since the data is first loaded into the PSA and then loaded into the data targets (in this case, text InfoObjects) using DTPs.
    My concern is: when I create a process chain to load master data into these objects, the following happens:
    1. Load 10 Records into PSA and then into InfoObject A. All 10 Records are loaded into A.
    2. Load 5 Records into PSA and then into InfoObject B. Previous 10 + New 5 Records are loaded into B.
    3. and it continues... for 8 InfoObjects.
    My question is: is there a way that I can see the corresponding Data Target tab in the InfoPackages, so that the data is immediately picked from the PSA of the 0EHS_PHRASE_TEXT DataSource and loaded into the corresponding InfoObjects?
    Or should I make use of transfer rules by restoring the 0EHS_PHRASE_TEXT DataSource from the 7.0 to the 3.x version?
    Your help is much appreciated.
    Thanks,
    Chandu

    Hi Andreas
    Thanks for replying and sorry for the confusion.
    The extractor is delivering 10 and then 5 records because of different selection parameters in the InfoPackage, 'selecting' which InfoObject I wish to load the texts for.
    Your understanding of my requirements is correct.
    Could you please elaborate on Option 1 a bit more?
    Regarding your Option 2: I don't want the DTPs to extract directly in full from the DataSource, because by doing so, all the data, not just the relevant data, gets loaded into each and every InfoObject, and I don't want that to happen. I want to load only the relevant data into the respective InfoObjects from the PSA using DTPs.
    I have tried to set up filters in the DTPs with different selection parameters and tried to load the relevant InfoObjects. For example, I am trying to load the ZEHS_SUBS InfoObject with EHS_INJ_SUB_SUBSTANCE as the selection parameter. There are 54 records in both the source system and the PSA for this parameter.
    Yet the request shows 54 transferred records but only the very last 1 in added records, and when I check the target (ZEHS_SUBS) InfoObject, it shows only 1 record. I have tried many combinations:
    1. Loaded only the EHS_INJ_SUB_SUBSTANCE data from the source system into the PSA and then tried to execute the relevant DTP.
    2. Checked 'Do Not Extract from PSA but Access Data Source (for Small Amounts of Data)' and tried it.
    3. Tried with both Delta and Full extraction mode.
    4. Set a filter on the single value 'EHS_INJ_SUB_SUBSTANCE' to extract this data only.
    5. Set a filter excluding all the other single values.
    6. Checked 'Handle Duplicate Record Keys'.
    Still, only the last 1 record out of 54 shows up in the target InfoObject.
    Please let me know if you have any idea why this is happening.
    Thanks for your time.
    Chandu

  • Checking the Data Loaded in the InfoCube

    Hi all,
    With the 3.x version, and also with RSA1OLD, it is possible to see the InfoSources that can be loaded for a data target, and their status, using the InfoSources overview.
    Now, with the 2004s version, there are no InfoSources. The overview was really helpful for seeing whether all master data had been loaded.
    Is there any new functionality? Is RSA1OLD still necessary?
    Thanks
    Best regards

    Hi Andreas and Michael,
    Thanks for the replies.
    Andreas, not exactly; it's more about DTPs than the PSA.
    I know how to view all the information one by one. But for an InfoCube you can have x master data objects (texts, attributes and hierarchies).
    How can I see whether all data are loaded (date and status)?
    As Michael said, we can use the process chains to check whether all init, delta or full loads completed successfully. But we would need to check more than 5 process chains.
    With the InfoSources overview it was possible to check with just one click whether all data were loaded.
    You're right, Michael, it's clearly more about ergonomics.
    Best regards

  • Data Load for 20M records from PSA

    Hi Team,
    We need to reload a huge volume of data (around 20 million records) of billing data (2LIS_13_VDITM) from the PSA to the first-level DSO and then to the higher-level targets.
    If we run the entire load as one full request from the PSA to the DSO for 20M records, will it have any performance issues?
    Would it be a good approach to split the load based on 'Billing Document Number'?
    If we split the load by 'Billing Document Number', will it create any performance issues from the reporting perspective (if we receive the data in multiple requests)? Most of the reports are run based on date, not on 'Billing Document Number'.
    Thanks
    San

    Hi,
    A better solution is to put a filter on the year or fiscal year.
    Check how many years of data you have; based on that, you can set the filters.
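    A hedged sketch of how one slice of such a split could look as a selection routine on the fiscal year (the generated routine frame supplies the range table l_t_range and the return code p_subrc; the field name and the years are assumptions):

    " Restrict this load to fiscal years 2012-2013; run further loads
    " with other intervals to cover the full history.
    DATA l_idx LIKE sy-tabix.

    READ TABLE l_t_range WITH KEY fieldname = 'FISCYEAR'.
    l_idx = sy-tabix.

    l_t_range-sign   = 'I'.
    l_t_range-option = 'BT'.
    l_t_range-low    = '2012'.
    l_t_range-high   = '2013'.

    IF l_idx <> 0.
      MODIFY l_t_range INDEX l_idx.
    ELSE.
      APPEND l_t_range.
    ENDIF.

    p_subrc = 0.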
    Thanks,
    Phani.

  • Data load to BI (7.0 SP 9) from R/3 (ECC 6.0, SP-Basis 9)

    Dear All,
    We have a new instance of a development BW system, version 7.0, with R/3 upgraded to ECC 6.0. We connected the source system. When we extract data through a DTP, the data load is successful, with 0 records.
    This is the case with all the extractors.
    The database is Oracle 10.2.
    Observations:
    0) Source system connection check OK.
    1) When I test the same extract in RSA3, I can fetch some data there.
    2) I could transfer the global settings.
    3) I cannot see any IDocs generated in BW or received in R/3.
    4) No background job is generated in R/3 in SM37.
    5) I could extract data from another source system (SEM) instance based on 3.5 technology.
    As progress on this issue, I could load the data successfully with the 3.x methodology (using update rules), but not with the BI 7.0 methodology (using transformations).
    As a standard set by the client, we have to use the 7.0 methodology, so we still need to find a solution.
    I have no clue what is going wrong or how to solve it. Please help me resolve this issue.
    Thanks in Advance,
    PV

    I am not sure whether you have followed all the necessary steps to do a data load to the InfoCube. I also wish I had more information about your system and the error message you are getting. A data load can fail for a variety of reasons, depending on the BW version, the system settings and the procedure you followed. Please use the data load monitor (transaction RSMO), identify the error message and take the necessary action.
    If this is useful, reward points are appreciated.

  • InfoCube data loads fail with UNCAUGHT_EXCEPTION dump after BI 7.01 upgrade

    Hi folks,
    We're in the middle of upgrading our BW landscape from 3.5 to 7.01. The support package we have in the system is SAPKW70103.
    Since the upgrade, all data loads going to InfoCubes have been failing with the ABAP dump UNCAUGHT_EXCEPTION, exception class CX_RSR_COB_PRO_NOT_FOUND.
    Error analysis
        An exception occurred which is explained in detail below.
        The exception, which is assigned to class 'CX_RSR_COB_PRO_NOT_FOUND', was not caught and therefore caused a runtime error.
        The reason for the exception is:
        0REQUEST is not a valid characteristic for InfoProvider
    Has anyone come across this issue, and if so, what resolutions did you adopt to fix it? I appreciate any help in this regard.
    Thanks
    Mujeeb

    Hi experts,
    I have exactly the same problem.
    But I can't find a test for InfoObjects in RSRV. How can I repair the InfoObject 0REQUEST?
    Thanks for all answers!
    Kind regards

  • Open Hub (SAP BW) to SAP HANA data loading through a DB connection: the 'Deleting Data from Table' option is not working. Please help.

    Issue:
    I have a SAP BW system and a SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an Open Hub destination as a DB table using the DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the 'Deleting Data from Table' option.
    16 records were loaded from BW to HANA, the same on both sides.
    The second time I executed it from BW to HANA, 32 records arrived (it appends).
    Then I executed the Open Hub service with the 'Deleting Data from Table' option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    Checking from a SAP BW system to a SAP BW system, it works fine.
    Does this option work through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level: DESTINATION tab and FIELD DEFINITION). There is a check box there which I have already selected; that is exactly my issue: even though it is selected, the deletion is not performed at the target level.
    SAP BW to SAP HANA via the DB connection:
    1. First time from BW, say 16 records: DTP executed, loaded into HANA, 16 on both sides.
    2. Second time, executed again from BW: now the HANA side has appended, 16 + 16 = 32.
    3. So I selected the 'Deleting Data from Table' check box at the Open Hub level.
    4. Now, when the DTP is executed, it throws a short dump: DBIF_RSQL_TABLE_KNOWN.
    Now please tell me how to resolve this. Is this 'Deleting Data from Table' option applicable for HANA at all?
    Thanks
    Santhosh Kumar

  • I am extracting data from ECC to BW, but the data load is taking a long time

    Hi All,
    I am extracting data from ECC to the BI system, but the data load is taking a long time: the InfoPackage has been running for the last 6 hours and is still showing yellow. I manually set it to red, deleted the request, and applied a repeat of the last delta, but the same problem occurs. The status says the background job has not finished in the source system. We asked Basis, and they killed that job; then we scheduled the chain again, and the same problem came up. How can I solve this issue?
    Thanks ,
    chandu

    Hi,
    There are different places to track your job. Once your job is triggered in BW, you can track where exactly your load job is taking more time, and why. Follow the steps below:
    1) After the InfoPackage is triggered, take the request number and go to the source system to check your extraction job status.
    You can get the job status by taking the request number from BW and going to transaction SM37 in ECC. Give the request number with a beginning '*' and an ending '*', and give '*' as the user name:
    Job name:  *REQ_XXXXXX*
    User Name: *
    Check whether the job is completed, cancelled or short-dumped. If the job is still running, check in SM66 whether you can see any process; if not, check ST22 or SM21 in ECC accordingly. If the job is complete, do the same check on the BW side.
    2) Check whether the data arrived in the PSA; if not, check whether the transfer routines or start routines have bad SQL or code. Similarly for the update rules.
    3) Once it is through the source system (ECC), transfer rules and update rules, the next task, updating the data targets, may take more time; this can depend on parameters such as the number of parallel processes updating the database. Check whether updating the database is taking more time; you may also need to check with the DBA.
    At all times you should see at least one process running in SM66 until your job completes; if not, you will see a log in ST22. A hedged way to check the extraction job from ABAP instead of SM37 is sketched below.
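    A minimal sketch of checking that job from ABAP, assuming the job name contains the BW request number as above (TBTCO is the standard background job status table; status 'R' means the job is active):

    " Find active jobs whose name contains the request number.
    DATA lt_jobs TYPE TABLE OF tbtco.

    SELECT * FROM tbtco
      INTO TABLE lt_jobs
      WHERE jobname LIKE '%REQ_XXXXXX%'
        AND status = 'R'.        " R = running/active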
    Let me know if you still have questions.
    Assigning points is the only way of saying thanks in SDN.
    Thanks,
    Kumar.

  • Optimize the data load process into BPC Cubes on BW

    Hello Gurus,
    We would like to know how to optimize the data load process. Our scenario is that we have the ECC Classic Ledger, and we are looking for the best way to load data into the BW InfoCubes from an ECC source.
    To complement the question above: from which tables must the data be extracted and then passed to BW so that the consolidation is done? Also, is there anything that has to be considered from other modules, such as FI or EC-CS, for this?
    Best Regards,
    Rodrigo

    Hi Rodrigo,
    Have you looked at the BW Business Content extractors available for the classic GL? If not, I suggest you take a look. BW Business Content provides all the business logic you will normally need to get data out of ECC and into BW, for pretty much every ECC application component in existence: http://help.sap.com/saphelp_nw70/helpdata/en/17/cdfb637ca5436fa07f1fdc0123aaf8/frameset.htm
    Ethan
