How to decide target related to data source

Hi Friends,
How can I decide the data target (ODS/Cube) for a given DataSource?

Hi Vishali,
It depends on the type of data load and the volume of data.
If it is a full upload with a small volume of data, you can load the cube directly.
If it is a full upload with a large volume of data, load a DSO first and then load the delta to the cube.
If your delta load supplies only the after image, you have to load it into a DSO before loading the delta to the cube.
If your delta load supplies both the before and after image, you may or may not keep a DSO in between.
What is your scenario?
Thanks,
Krishnan

Similar Messages

  • How to find tables related to data sources

    Hi...
    Can anybody tell me how to find the database tables related to a DataSource?
    For example, I have the data source 0fi_ap_3. How can I find which tables it extracts the data from?
    Thanks & Regards
    Pankaj Angra

    Hi Pankaj,
    This has already been discussed in the following forum threads; they should be helpful:
    Re: Find R/3 tables which are in the datasource
    Tables associated with DataSource!
    Find R/3 tables which are in the datasource
    Hope this helps...
    Thanks,
    Raj

  • For a function module how can I find its assigned data source name?

    Hi BW Gurus,
    If I know the data source name, then I can find the assigned function module/table/InfoSet from RSO2. But for a function module, how do I find its assigned data source name?
    Thanks a lot for the response.
    Regards
    Ven

    Hi Ram,
    In SE16, enter the table name ROOSOURCE; in the selection screen for the table contents, choose the field EXTRACTOR and enter the name of the function module.
    This returns the list of DataSources in which the function module is used.
    Best Regards,
    Ankit Agrawal

  • How to get a group when the data source is the system instead of the UME database

    Hi guys,
    How can I get a group when the data source is the backend system instead of the UME database?
    I tried to use
    IUMPrincipal RefGroup = WPUMFactory.getGroupFactory().getGroup(groupName);
    but I was not able to get the group, even though in the "UserAdministrator" I can find this groupName.
    Which API can I use?
    Thanks in advance!
    Regards,
    Liying

    Ok,
    try this:
    // EP5-style group factory from the user management service
    com.sapportals.portal.security.usermanagement.IGroupFactory ep5GroupFactory = userManagementService.getGroupFactory();
    // look up the group via the UME group factory
    com.sap.security.api.IGroupFactory groupFactory = UMFactory.getGroupFactory();
    com.sap.security.api.IGroup group = groupFactory.getGroupByUniqueName(groupName);
    // wrap the UME group as an EP5 IUMPrincipal
    IUMPrincipal ep5Principal = ep5GroupFactory.getEP5Group(group);
    This should do the trick,
    Romano
    PS: and thanks for the stars!

  • How to populate target directory from the source XML in Receiver File Adap?

    Hi All,
    Our scenario is IDoc - XI - (receiver file adapter) file. Is it possible to populate the complete "Target Directory" from the source XML message?
    Let's say we added a field to hold the target directory in the IDoc structure and somehow populated a value into it; can we then read this target directory from the IDoc XML and pass it to the communication channel? I think this is possible through variable substitution. I just want to make sure, and if somebody has done a similar scenario, their inputs would be great.
    Thanks
    Navin

    Hi,
    Please see the links below:
    /people/jayakrishnan.nair/blog/2005/06/28/dynamic-file-namexslt-mapping-with-java-enhancement-using-xi-30-sp12-part-ii
    /people/sriram.vasudevan3/blog/2005/11/21/effective-xsl-for-multimapping-getting-source-filenames-in-legacy-legacy-scenarios
    Re: Dynamic  File Name for Receiver File Adapter
    Variable Substitution
    http://help.sap.com/saphelp_nw04/helpdata/en/bc/bb79d6061007419a081e58cbeaaf28/content.htm
    Or try adapter-specific message attributes (dynamic configuration) in a UDF.
    Example code...
    String newfilename="";
    DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File","FileName");
    // Get Sourcefilename
    String oldfilename=conf.get(key);
    //extract first 3 chars of source filename
    newfilename=oldfilename.substring(0,2);
    //get the date
    java.text.SimpleDateFormat dateformat = new java.text.SimpleDateFormat( "yyyyMMdd" );
    dateformat.format( new java.util.Date() );
    //append sourcedateL
    newfilename=newfilenamedateformat"L";
    // determine if prod/ dev / qa
    map = container.getTransformationParameters();
    senderService = (String) map.get("SenderService");
    if(senderServcie.equald("Prod"){
    newfilename=newfilename+"P";
    // change to new file name
    conf.put(key, newfilename+".tmp");
    Change it according to your requirement
    Regards
    Chilla..

  • How to change database schema of data source in OWB?

    Hi,
    Could someone give me hint about my questions:
    I can select many database schemas in TOAD, like "My Schema" and "All Schema".
    When I create an Oracle data source in OWB 10.2, I can see tables/views only from "My Schema" (I think that is the default). However, I need to connect to "All Schema", which contains lots of production tables/views. I don't know how to set/change/configure this in OWB, even after reading the guide.
    Thanks a lot
    Lan

    Lan,
    You can do ONE of the following:
    1. Double-click the source/target module, go to the Metadata Location tab, and select a location whose details point to the other schema (if that location does not exist yet, you may have to create it).
    2. Create synonyms in "My Schema" for the objects from the other schema, then try to import the tables again. This time you will be able to see the other tables (the icon is different for synonyms).
    Hope That Helps.
    Kaushik.

  • How to copy tables between different data sources?

    If I copy tables within the same data source, the problem is easy, just:
    CREATE TABLE DESTDB.TABLE as select * from SOURCEDB.TABLE
    But between different data sources, for example copying an Access table to a not-yet-existing Oracle table, I need to create the target table first. My problem is how to create the target table from the source table structure. The source table structure can be read out, but parsing it and building the SQL statement to create the table is a tough job.
    Does anyone have a good idea?
    Thanks.

    You are describing a huge topic in simple terms.
    You are trying to do a database migration from MS Access to Oracle. It's not all that easy. But if you restrict yourself to creating a few tables, do a small exercise (a sketch follows below):
    1. Use ResultSetMetaData to get the number of columns and their data types.
    2. Construct a CREATE TABLE statement from that information.
    3. Then query the source table for all its records and insert those records into the created table.
    Sudha
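    A minimal JDBC sketch of the three steps above, assuming two already-open connections (sourceConn for Access, targetConn for Oracle); the table name is hypothetical and the type mapping is deliberately simplified, so a real migration would need a fuller mapping:
    import java.sql.*;
    public class TableCopy {
        // copy the structure and rows of one table from the source to the target connection
        static void copyTable(Connection sourceConn, Connection targetConn, String table) throws SQLException {
            try (Statement src = sourceConn.createStatement();
                 ResultSet rs = src.executeQuery("SELECT * FROM " + table)) {
                ResultSetMetaData md = rs.getMetaData();
                // 1. build the CREATE TABLE statement from the source metadata (simplified type mapping)
                StringBuilder create = new StringBuilder("CREATE TABLE " + table + " (");
                StringBuilder placeholders = new StringBuilder();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    if (i > 1) { create.append(", "); placeholders.append(", "); }
                    String type = (md.getColumnType(i) == Types.INTEGER) ? "NUMBER(10)" : "VARCHAR2(255)";
                    create.append(md.getColumnName(i)).append(" ").append(type);
                    placeholders.append("?");
                }
                create.append(")");
                // 2. create the target table
                try (Statement tgt = targetConn.createStatement()) {
                    tgt.executeUpdate(create.toString());
                }
                // 3. copy the records row by row
                String insert = "INSERT INTO " + table + " VALUES (" + placeholders + ")";
                try (PreparedStatement ps = targetConn.prepareStatement(insert)) {
                    while (rs.next()) {
                        for (int i = 1; i <= md.getColumnCount(); i++) {
                            ps.setObject(i, rs.getObject(i));
                        }
                        ps.executeUpdate();
                    }
                }
            }
        }
    }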

  • How can I delete an ODBC data source after 10g client has been uninstalled?

    I uninstalled the 10g client and now, when I attempt to delete the ODBC data source, I get an error: "The setup routines for the Oracle in Oraclient10g_home2 ODBC driver could not be found. Please reinstall the driver." I re-installed the client, but I had to specify a new home and I still get the same error. Does anyone have an answer or a workaround for deleting the ODBC entry so that I can re-create it with the same name later under the 9i client?

    I have the same problem... and nobody gets the point:
    1st: the ODBC entry and the client/home exist on the machine...
    2nd: the client/home is uninstalled and so...
    3rd: tada! Now we have an impossible-to-uninstall ODBC entry!
    That is happening on Windows, at least for me...
    Please help if someone knows how to remove this ODBC entry... (permissions, system rights and all those things are already answered above).
    Best regards,
    Bruno Araujo

  • How to find all uses of data source views in an SSIS solution

    I am upgrading Visual Studio from 2008 to 2013 (with SSDT) and SQL Server 2008 R2 to 2012. I have a solution with over 30 dstx files. Each file has multiple OLE DB sources and lookup tasks. Data source views are used inconsistently throughout (as compared to SQL queries or table references). It is my understanding that all data source views need to be removed before upgrading SSIS packages from BIDS to SSDT. I tried searching the files as XML for the DSVs, but it appears the GUID reference changes per dstx file. It seems I will have to look at each source/lookup. Is there a quicker way to search for where they are used?
    Thanks
    Adding this question here; I posted it in the VSO forum by mistake.

    All right, yes, they are dropped.
    I have never upgraded a package that had them.
    What happens if you just upgrade, keeping a copy?
    Arthur My Blog
    This ended up being what I did for many of the packages. Upgrading the packages severed the data source views and left the SQL in the related tasks (e.g. the OLE DB Source task). Sorry for the delayed mark as answered.

  • How to make enhanced fields of data source delta enabled?

    Hi ,
    I have a scenario in which we are using the DataSource 0CUSTOMER_ATTR to load master data. Some fields have been added to this data source, but they are not delta-enabled. I have tested these fields on the R/3 side using the BDCP and CDPOS tables, and all of them record the changes correctly. But when extracting with delta, the changes are not captured; if we run a full load, all the changes are captured.
    Running a full load every week takes too much time.
    Can anyone suggest how to make these fields delta-enabled so that only changed records are extracted to BW?
    Any help will be greatly appreciated!
    Thanks & Regards,
    Vishnu

    Hi,
    It is not possible to load this data with delta. Your DataSource only detects changes in the customer table: when a customer is modified, the record is loaded to BW, and at that moment the DataSource fills the append structure.
    The problem is that if you modify a field that exists only in the append structure, the DataSource does not detect the change. When you do a full load, you fill the append structure for all records, which is why it works.
    If you want to catch these deltas, you could build a delta DataSource on CDPOS, but depending on the number of records in those tables that may not be feasible.
    You could also check whether there is an event that detects the modification and create a delta-queue DataSource for the append structure.
    Then you can combine both DataSources in a DSO in BW.
    You can search for a how-to guide on creating generic DataSources that use the delta queue.
    I hope it helps.

  • How to use a web service as data source for Forge or Endeca?

    hi,
    Is there any way to use a web service as a data source in the pipeline?
    I have a requirement to get data from a web service and load it into Endeca. I need some idea of how this can be achieved. Is it possible to do so?

    Crawling is part of CAS, which is the data-ingest process in Endeca, as you probably know. There are a number of out-of-the-box crawlers available, such as file-system based and XML crawlers, and there can be cases where you have to write your own; that is what I am suggesting here, because the OOTB XML crawl expects a specific format that isn't really described in the documentation. Please refer to the CASDevGuide for more information. CAS, PlatformServices and ToolsAndFrameworks should be running when you start with Endeca; you can check that with
    ps -ef | grep java
    on your machine

  • How to make an InfoCube a data source and upload data? Urgent

    Hi all,
    I have two identical InfoCubes, and I want to use the test InfoCube (ZCO_1T) as a data source and upload its data to the other cube (ZCO_1). I have now generated an export DataSource on ZCO_1T and created an update rule between ZCO_1T and ZCO_1. What is the next step? Thanks.

    Hi Delve,
    Once you generate the export DataSource from the test cube, it acts as a data source for further uploads.
    Now choose the 'Update 3.x data' option under Additional Functions in the context menu of the test cube.
    You will get an InfoPackage popup with the data target set to the cube ZCO_1 and processing set to data targets only.
    Selection criteria can be entered on the first tab.
    Under the Schedule tab you can start the extraction.
    reward points if helpful.

  • How to load a 3.x hierarchy data source from a certain level?

    Hi all,
    I have an old 3.x hierarchy data source with a structure like the one below:
       Level 0 :  root node
                  Level 1:  Product Category
                      Level 2: Product Line
                          Level 3: Product
                              Level 4 : Product Version
    Each level has its own independent InfoObject.
    The hierarchy was initially built for the InfoObject on level 4, 'Product Version'.
    Now another requirement has come up: we also want a hierarchy for 'Product' and would like to reuse the data source.
    So the question is: is it possible to extract from the same data source but update only the hierarchy for 'Product', starting from level 3 (up to level 0)?
    PS: I can't migrate the data source to 7.x, because the source system doesn't support it.
    I hope I have made my question clear.
    I would really appreciate a hint.
    Thanks,
    Amon

    Finally I solved this problem myself.
    There are many ways to meet the requirement if you change your way of thinking; don't get stuck on the idea that you have to use the original data source from the source system.
    I'd like to describe the two approaches I considered:
    1. Use the SAP standard program 'Z_SAP_HIERARCHY_DOWNLOAD' to save the hierarchy as a .csv file, then follow http://scn.sap.com/community/data-warehousing/bw/blog/2012/01/08/bw-730-hierarchy-loading-becomes-easier-with-the-help-of-new-framework
    However, this didn't work for me; the program seems to be outdated in my environment (BW 7.4), and even after some bugfixing the output file still contained errors.
    2. Since the hierarchy for the level-4 InfoObject is already updated successfully in my case, we can use it directly as the data provider. The general data flow is: level-4 InfoObject hierarchy -> DSO -> target InfoObject hierarchy (level-3 InfoObject).
    For the DSO structure, refer to the link above; the structure is just like the Excel file provided there. Be aware that you must add external characteristics to the data fields of your DSO if your hierarchy contains external characteristics.
    In the transformation from the source InfoObject hierarchy, select 'hierarchy (one segment)' as the hierarchy subtype.
    In the transformation 'DSO -> target InfoObject', map source and target carefully, especially for the 'Structure' segment; make sure the field 0H_HIERNODE and the external characteristics are filled with proper values.

  • How can I use the same data source twice?

    In our system there is already a complete transfer of calculations to BI (0CO_PC_PCP_01).
    However, I want to create a new data flow from ERP starting from the same data source.
    Is this possible?
    Thanks for the help!

    Hi!
    Welcome to the SDN forums.
    Any data source can be mapped to only one InfoSource, but that does not restrict you from feeding different data targets from that single InfoSource. So you have to use filters in the InfoPackages to select the data target and filter the data for this data source. The only thing you have to do with the old data flow is restrict the data relevant for the first data target in the old InfoPackage and re-initialize.
    with regards
    ashwin

  • How to avoid using dataset as data source

    Hi all,
    Using a dataset makes producing the report slow. Could anyone tell me how to use a stored procedure directly as the data source?
    Thank you very much
    Clara

    You may also want to determine where exactly the slow part is happening. Remember that producing a report involves a number of steps other than connecting to and retrieving data from the data source, e.g.:
    loading the runtime, loading the report, processing the report, formatting the report, etc.
    If you do find that the datasets are the issue, then Don's suggestion is excellent.
    Ludek
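    If your own code fetches the data (rather than the report engine), a stored procedure can also be queried directly over JDBC instead of filling a dataset first. This is only a generic sketch, not Crystal Reports specific; the connection URL, the procedure name GET_REPORT_DATA and its parameter are hypothetical, and it assumes a database (such as SQL Server) whose procedures return their result set directly from executeQuery:
    import java.sql.*;
    public class StoredProcSource {
        public static void main(String[] args) throws SQLException {
            // hypothetical connection details
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:sqlserver://dbhost:1433;databaseName=Reports", "user", "password");
                 // call the (hypothetical) stored procedure with one input parameter
                 CallableStatement cs = conn.prepareCall("{call GET_REPORT_DATA(?)}")) {
                cs.setInt(1, 2024);
                try (ResultSet rs = cs.executeQuery()) {
                    while (rs.next()) {
                        // stream rows straight to the consumer instead of buffering a dataset
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }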
