Loading Two DataSources into One InfoCube

Hello,
I need to load an InfoCube from two DataSources, 2LIS_11_VAHDR and 2LIS_11_VAITM, on BI 7.0. Is this possible?
I need to take 10 fields from 2LIS_11_VAHDR and 8 fields from 2LIS_11_VAITM and load them into the InfoCube.
Do I need to create two transformations under the InfoCube?
I searched SDN but couldn't find any blog on this.
If anybody has a document, please share it; I will assign full points.
Thanks

How is this possible? Let me give an example.
Data in 2LIS_11_VAHDR is as follows:
Doc no:    12345
CCode:     cp123
Data in my DataSource 2LIS_11_VAITM is as follows:
MatDesc:   Finished goods
In my first transformation, Doc no and CCode will be mapped to the InfoCube.
In the second transformation, MatDesc will be mapped to the same InfoCube.
My doubt is: how do we know that MatDesc is the correct material description for Doc no 12345?
Where is the relationship between the two DataSources?
Thanks
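As background for the question above: both 2LIS_11_VAHDR (order header) and 2LIS_11_VAITM (order item) carry the sales document number, and that shared key is what relates a header record to its item records even when they are loaded through two separate transformations into the same InfoCube. The sketch below only illustrates the idea of lining up header and item records on a common document number; it is not BW transformation code, and the class, field, and method names are invented for the example.

import java.util.HashMap;
import java.util.Map;

// Illustrative only: header and item records relate through the shared document number.
public class HeaderItemJoin {

    static class Header { String docNo; String compCode; }
    static class Item   { String docNo; String matDesc; }

    // For every item, the matching header is the one with the same document number.
    public static Map<String, String> describeByDocument(Map<String, Header> headersByDoc,
                                                         Map<String, Item> itemsByDoc) {
        Map<String, String> result = new HashMap<String, String>();
        for (Item item : itemsByDoc.values()) {
            Header header = headersByDoc.get(item.docNo);
            if (header != null) {
                // e.g. "12345" -> "cp123 / Finished goods"
                result.put(item.docNo, header.compCode + " / " + item.matDesc);
            }
        }
        return result;
    }
}

In the InfoCube itself the two loads remain separate records; they only line up in reporting through the shared characteristics, which is why the common document number matters.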

Similar Messages

  • Error while loading data from DataSource to InfoCube/DSO through DTP

    Hi Friends,
    I am trying to load data from a DataSource to an InfoCube/DSO through a DTP, and I am getting the following error:
    Exception in substep: Extraction completed
    Processing terminated
    Data package 1 / 30.04.2009 22:38:30 / Status 'Processed with Errors'
    When I look at the detailed message, it says:
    "Syntax error in routine 'convert_to_applstru', line 25 of generated program"
    I checked all my transformations and they look OK. I was able to load data earlier, and I am getting this error all of a sudden. Does someone know what this routine "convert_to_applstru" is?
    Thanks,
    Amit

    Hi Arun,
    Where do I see this generated program in RSRV? I don't see it there. Please guide me.
    Thanks,
    Amit

  • How to send data from 4 different data sources to one ODS

    Hello Gurus,
    There is a transaction called KSB1 in R/3.
    It has data related to cost centers, cost elements, and G/L accounts.
    In BI, I need to transfer the data from these DataSources to one ODS.
    Can anybody give me some ideas?
    Points will be awarded for any kind of response.
    Thanks .
    Anu

    Hi Rupa,
    In your requirement, some of the DataSources (cost center, cost element) are master data sources.
    They are available as InfoObjects in BI Content, so there is no need to create an ODS on them.
    The general scenario for creating an ODS from multiple DataSources applies to transactional documents,
    for example PO header and PO line items.
    The prerequisite is that the DataSources have common fields (e.g. the PO number) so their records can be linked.
    Hope this gives you an idea.
    Cheers,
    Varma

  • Error While Loading data from ODS to Infocube

    I am trying to load data from an ODS to an InfoCube through 'Update ODS data into data target'. My requirement is to take a small subset of fields from the ODS, design the InfoCube, and load the data.
    My load fails in the extraction process: I get 0 records out of the total number of records sent in each package. Please let me know if you need more information.
    Please advise.
    Thanks,
    RR

    In the Details tab of the monitor, in the extraction step:
    Extraction (messages): Errors occurred
    Green Light for Data Request Received
    Green Light for Data Selection Scheduled
    Yellow for 25000 Records sent(0 records received)
    Yellow for 25000 Records sent(0 records received)
    Yellow for 15000 Records sent(0 records received)
    Green Light for Data Selection Ended
    Please let me know if you need more information.
    Thanks,
    R R

  • Accessing two data sources in one method

    I wanted to access two connection pools in one EJB project. I managed to create the two connection pools in WebLogic Server and I successfully wrote the lookup code. I then wanted to use these two data sources in one method, but it only returns results from one data source connection; the other is null. I am also getting the following message:
    Connection has already been created in this tx context for pool named jdbc_informix_pool. Illegal attempt to create connection from another pool: JDBC_informix_webdata
    "jdbc_informix_pool" and "JDBC_informix_webdata" are the names of my connection pools. The two data sources work perfectly when they are accessed separately, but whenever I try to use both in one method it retrieves data from only one data source and gives me the message above.
    Does anyone know how to fix this?
    Thanks,
    Sameera

    Hi,
    Thanks, that worked. But there is another problem: I can't close my data source connection. Whenever I try to close it, it says
    Result set already closed
    I managed to run my program without closing the data source connection, but after a while it says
    No resources currently available in pool JDBC_informix_webdata to allocate to applications, please increase the size of the pool and retry
    I can increase the size of the pool and run my program, but I guess this would be resolved if I closed the data source connection. So how can I close it? I tried a finally clause, but it still gives me the "result set already closed" message.
    How can I do it?
    Thanks,
    Sameera
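    The post above runs into two classic JDBC resource problems: "Result set already closed" usually means the ResultSet is read after its Statement or Connection has been closed, and the pool running out of resources means connections are never returned. Below is a minimal sketch of the usual pattern, assuming plain JNDI lookups of the two pools named in the thread; the class name, method names, and the placeholder SQL are hypothetical and not from the original code.

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import javax.sql.DataSource;

    public class TwoPoolReader {

        // Read from both pools in one call, releasing every resource before returning.
        public void readFromBothPools() throws NamingException, SQLException {
            InitialContext ctx = new InitialContext();
            DataSource poolA = (DataSource) ctx.lookup("jdbc_informix_pool");
            DataSource poolB = (DataSource) ctx.lookup("JDBC_informix_webdata");

            readAndClose(poolA, "SELECT tabname FROM systables");   // placeholder query
            readAndClose(poolB, "SELECT tabname FROM systables");   // placeholder query
        }

        private void readAndClose(DataSource ds, String sql) throws SQLException {
            Connection con = null;
            Statement stmt = null;
            ResultSet rs = null;
            try {
                con = ds.getConnection();
                stmt = con.createStatement();
                rs = stmt.executeQuery(sql);
                while (rs.next()) {
                    // process each row while the ResultSet is still open
                }
            } finally {
                // close in reverse order; closing the Connection last returns it to the pool
                if (rs != null) try { rs.close(); } catch (SQLException ignored) { }
                if (stmt != null) try { stmt.close(); } catch (SQLException ignored) { }
                if (con != null) try { con.close(); } catch (SQLException ignored) { }
            }
        }
    }

    The key points are that all rows are consumed before anything is closed, and that each connection is closed (i.e. returned to its pool) in a finally block, so the pool does not run dry even when an exception is thrown.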

  • Two data sources to One Infosource?

    Hi Gurus,
    Can we connect two DataSources to one InfoSource?
    Regards
    Durai

    Hi Expert!
    Thanks.
    If we assign two DataSources to one InfoSource, all of their fields will be available in the transfer rule maintenance screen, where we can choose them and assign them to the InfoSource's InfoObjects.
    Am I correct?
    Regards
    Durai

  • Remove double records during data upload from one InfoCube to another

    Dear Experts
    We have transactional financial data available in an InfoCube, including cumulated values by period. Some companies have 12 reporting periods (0FISCPER3) and some companies have 16 periods (although all 16 periods are not always filled). The data must be prepared for a consolidation system which expects only 12 periods, so I built a routine with the following logic:
    If period > 12, result = 12; else, result = source field.
    But as the data target is (and must be) an InfoCube, the new values with reporting period 12 are not overwritten but summarized instead. This means both the original records with period 12 and the new records remain - see the example:
    Records before transformation:
    Period   Amount
    12          100
    13           120
    Records after transformation in the InfoCube:
    Period   Amount
    12          100
    12           120
    This would lead to the following aggregation:
    Period   Amount
    12          240
    But as the values per period are cummulated, the consolidation system only needs the last period. So there should be only  one record left in the InfoCube:
    Period   Amount
    12           120
    Is it possible to delete duplicate records, or do you have any other idea for keeping only the record with the last period (e.g. period 13) and assigning it the value 12? (An illustrative sketch of this "last period wins" logic follows at the end of this thread.)
    Thanks a lot in advance for your help!
    Regards
    Marco

    Hi,
    You have two options here. First, you can put a DSO between the DataSource and the InfoCube and load the delta using the change log, so that later periods overwrite earlier ones.
    Second, you can use deletion of overlapping requests from the InfoCube; it will delete the previous requests and load the new request.
    Check the article below:
    [Automatic Deletion of Similar or Identical Requests from InfoCube after Update|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0431c48-5ba4-2c10-eab6-fc91a5fc2719]
    Hope this helps...
    Rgs,
    Ravikanth
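    To make the intended end state concrete outside of BW: for each key, only the record with the highest original period should survive, reported under period 12, instead of the sum that an additive InfoCube produces. The small sketch below is purely illustrative of that "last period wins" rule; the Row class and its fields are invented for the example and are not transformation code.

    import java.util.HashMap;
    import java.util.Map;

    public class PeriodCollapse {

        // Illustrative record: a key (e.g. company code + account), the original
        // fiscal period (1..16), and the cumulated amount for that period.
        static class Row {
            final String key;
            final int period;
            final double amount;
            Row(String key, int period, double amount) {
                this.key = key; this.period = period; this.amount = amount;
            }
        }

        // Keep, per key, only the row with the highest original period,
        // and report it under period 12 (the consolidation system's cap).
        static Map<String, Row> keepLastPeriod(Iterable<Row> rows) {
            Map<String, Row> latest = new HashMap<String, Row>();
            for (Row r : rows) {
                Row kept = latest.get(r.key);
                if (kept == null || r.period > kept.period) {
                    latest.put(r.key, r);               // remember the later period
                }
            }
            Map<String, Row> result = new HashMap<String, Row>();
            for (Row r : latest.values()) {
                result.put(r.key, new Row(r.key, Math.min(r.period, 12), r.amount));
            }
            return result;
        }
    }

    With the example from the question, the rows (period 12, amount 100) and (period 13, amount 120) for the same key collapse to a single row (period 12, amount 120), which matches the single record the consolidation system expects.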

  • Error while loading data from DSO to Infocube

    Hi all,
    I'm loading data from a flat file to the cube via a DSO, and I get an error in the DTP.
    The data loads into the DSO; I'm able to see the contents in the active data table.
    The transformation from the DSO to the cube is activated without any errors. It's a one-to-one mapping; no routines are used.
    When I execute the DTP (I use the full load option, as it is a one-time load), I get the following errors in the DTP, all of which are red:
    Process Request
    Data Package 1: Error during processing
    Extract from DSO  XXXXX
    Processing Terminated
    The long text of the error gives the msg:  Data package 1: Processed with errors    RSBK257
    Thanks

    Hi,
    You can check the forum thread below:
    DTP with error RSBK257
    - Jaimin

  • Multiple Data Sources In One Logical Table

    I am new to OBIEE and I have come across an issue. I apologize if this information is already in the forum somewhere, but I have searched and cannot find it.
    My situation is that I would like to have one logical table that contains multiple data sources, all of which have the same columns. I already have session variables set up to handle the users' security: a row-wise variable for a specific column and a session variable for another column that determines the user's association to the data source they belong to. This security works well when the data sources are separated in the Business Model and Mapping (BM&M) layer, but the issue is that users cannot share reports when the data sources are separated in the BM&M.
    I dragged and dropped a table from the Physical model to the BM&M layer, then dragged the second data source (with the same metadata structure) into the "Sources" folder of the first data source table in the BM&M. On the Content tab of each data source table I defined the WHERE clause as follows, where the VALUEOF(NQ_SESSION."SCHOOL") session variable is my row-wise column filter and the VALUEOF(NQ_SESSION."GROUP") filter determines the data source:
    sandbox."".SANDBOX.OBIEE_CROSS_ENROLLMENTS.HOME_SCHOOL = VALUEOF(NQ_SESSION."SCHOOL") AND sandbox."".SANDBOX.OBIEE_CROSS_ENROLLMENTS.DATA_SOURCE = VALUEOF(NQ_SESSION."GROUP")
    Unfortunately this only returns values in BI Answers for the first Physical table dragged and dropped into the BM&M layer, not for the second Physical table dragged into the "Sources" folder. I have also tried creating a new logical table and dragging both tables into its "Sources" folder, to no avail. I have experimented with the Fragmentation content on the Content tab of the separate logical tables, checking "This source should be combined with other sources at this level", which gives me an error in BI Answers that a column does not exist when it really does.
    What could I be missing? Advance thanks to those who reply.
    Thank you,
    Kyle

    Stijn,
    Thank you for the article link; it was very helpful! It seems that I had a few things off, as you do need "This source should be combined with other sources at this level." checked. In the two source tables' DATA_SOURCE columns I defined a literal ('086496' for one and '085597' for the other) on the Column Mapping tab. I pasted the following into the Fragmentation content, checking the "This source..." box on the Content tab:
    eSIS.SANDBOX4_SCHOOLS.DATA_SOURCE = '086496'
    And pasted the following into the WHERE clause, checking "Select distinct values" on the Content tab:
    sandbox4."".OBIEE.NWOCA_SCHOOLS.SCHOOL_CODE = VALUEOF(NQ_SESSION."SCHOOL") AND sandbox4."".OBIEE.NWOCA_SCHOOLS.DATA_SOURCE = VALUEOF(NQ_SESSION."GROUP")
    This took care of my users' security, utilizing the session variables in the WHERE clause. I am now able to generate reports that only one user can access from one data source and share that same report with another user who can only see data from the other data source.
    Many thanks!

  • Shared-library in OC4J - load data-sources.xml throws error

    Hi,
    I want to share common libraries (framework.jar, JAXB jars, axis.jar, log4j, tibco*.jars)
    across applications (app1/app2/app3) in the OC4J container.
    Currently all these jars are loaded separately with each application, for a total of 3
    copies of these common jars.
    1 - Placed all common jars under $J2EE_HOME/shared-lib folder, updated server.xml in $J2EE_HOME/config.
    2 - Connecting Tibco queues using generic JMS resource adapter (connectors - JCA).
    3 - Connecting to databases using data-sources.xml.
    4 - With following entries in orion-application.xml
    <imported-shared-libraries>
    <import-shared-library name="my.shared.libraries" max-version="1.0"/>
    </imported-shared-libraries> ---> Entry 1
    <connectors path="oc4j-connectors.xml"/> ---> Entry 2
    <data-sources path="data-sources.xml"/> ---> Entry 3
    Getting the following error while deploying the code.
    Error occurred initializing connectors. Exception is: Exception creating Managed DataSource ConnectorPropertySet. Exception: Error setting up resource adapter for embedded resource archive inside application 'deployGood1'. ManagedConnectionFactory implementation class 'oracle.oc4j.sql.spi.ManagedConnectionFactoryImpl' cannot be set up: Error setting JavaBean property 'managedDataSourceConfigXML' for ManagedConnectionFactory class 'oracle.oc4j.sql.spi.ManagedConnectionFactoryImpl for embedded resource archive inside application 'deployGood1'. Exception is: java.lang.reflect.InvocationTargetException
    However, with only the following in orion-application.xml (Option A):
    Entry 1 (imported-shared-libraries)
    Entry 2 (connectors)
    deployment works perfectly fine; but then the service complains that it cannot find the data source JNDI name.
    5 - If all jars are loaded separately with each application,
    with the following entries in orion-application.xml:
    <library path="C:/soft/jar/framework.jar" />
    <library path="C:/soft/jar/log4j-1.2.14.jar" />
    <library path="C:/soft/jar/axis.jar"/>
    <library path="C:/soft/jar/commons-discovery-0.2.jar"/>
    <library path="C:/soft/jar/commons-logging-1.0.4.jar"/>
    <library path="C:/soft/jar/fscontext.jar"/>
    <library path="C:/soft/jar/gjra.jar"/>
    <library path="C:/soft/jar/jax-qname.jar"/>
    <library path="C:/soft/jar/jaxb-api.jar"/>
    <library path="C:/soft/jar/jaxb-impl.jar"/>
    <library path="C:/soft/jar/jaxb-libs.jar"/>
    <library path="C:/soft/jar/jaxb-xjc.jar"/>
    <library path="C:/soft/jar/jaxrpc.jar"/>
    <library path="C:/soft/jar/jmxri.jar"/>
    <library path="C:/soft/jar/jmxtools.jar"/>
    <library path="C:/soft/jar/namespace.jar"/>
    <library path="C:/soft/jar/providerutil.jar"/>
    <library path="C:/soft/jar/relaxngDatatype.jar"/>
    <library path="C:/soft/jar/saaj.jar"/>
    <library path="C:/soft/jar/tibcrypt.jar"/>
    <library path="C:/soft/jar/tibjms.jar"/>
    <library path="C:/soft/jar/tibjmsadmin.jar"/>
    <library path="C:/soft/jar/tibjmsapps.jar"/>
    <library path="C:/soft/jar/tibrvjms.jar"/>
    <library path="C:/soft/jar/wsdl4j-1.5.1.jar"/>
    <library path="C:/soft/jar/wss4j-1.5.2.jar"/>
    <library path="C:/soft/jar/xercesImpl.jar"/>
    <library path="C:/soft/jar/xmlsec-1.4.0.jar"/>
    <library path="C:/soft/jar/xsdlib.jar"/> -------> Entry 1
    <connectors path="oc4j-connectors.xml"/> ---> Entry 2
    <data-sources path="data-sources.xml"/> ---> Entry 3
    This configuration works perfectly fine as expected, but it defeats the
    purpose of the shared library.
    Thanks
    sunder

  • Equipment type master data load DataSource?

    Hi,
    How can I check whether there is an SAP-delivered DataSource for the equipment type master data load?
    If no such DataSource is available, how should I proceed?

    There is a Master Data Texts DataSource for Equipment Type in the standard content: 0EQUITYPE_TEXT, which extracts data from the T370K_T table in R/3/ECC.
    No Master Data Attributes DataSource exists for Equipment Type on the T370K table in R/3/ECC.

  • Error while loading data from DSO to Infocube for 0Fiscvarnt

    Hi All,
    Could you please help me resolve the error below?
    "Fixed value discrepancy with InfoObject 0FISCVARNT: value 'K4', expected 'K2'" (message class RSAU).
    We have two staging DSOs; up to the second-level DSO the fiscal year variant is updated with both K2 and K4, but when loading from the second-level DSO into the InfoCube we get the above problem. Everything is mapped directly, with no routines or constants at any transformation level.
    We have also checked the R/3 side, and both the K2 and K4 variants are available.
    Thanks,
    Ramesh

    Hi,
    It's a constant issue I have met before. I guess you have assigned the constant value K2 for 0FISCVARNT in your target.
    Please remove that constant: go to RSA1, change the target ODS, find 0FISCVARNT, right-click it, choose 'Provider-specific Properties', and remove K2 from 'Constant'.
    Now try the reload and I'm sure there will be no more issues. If you really want the constant, then you have to change the data or the transformation instead.
    P.S. Don't forget to re-activate the transformations and the DTP.
    Regards,
    Frank

  • Load data to a modified InfoCube and ODS used in Production

    Hi,
    I have added a few fields (InfoObjects) to cube(s) and DSOs in BI 7.0. These cubes and DSOs are BW 3.5 objects and are in use in Production.
    My question: after I transport the new/enhanced InfoCube and DSO from Dev/QA to Production, how do I retrieve the old data from the original cube/DSO and populate the new fields at the DSO and cube level?
    Please advise.
    Thanks,
    ~S

    Hi Siddhartha Nayak,
    As you enhanced the InfoCube and ODS, which are 3.x objects, I guess you haven't used the remodeling tool. In that case you have to follow the old procedure for enhancing the cube:
    create a copy cube, delete the contents of the existing cube, enhance it, and load the data back into the existing cube from the copy cube.
    You have to follow the same approach in your case.
    Regards,
    Shaiksha.

  • Slow Loading data from DB, one record at a time, HOW?

    I am currently doing a project.
    I need to display a record of data on a "form" (JFrame).
    The way I handle it is:
    1. Load the first record.
    2. Pass it to the display object.
    3. Display object: on clicking Next, call the manager object to process the request.
    4. Manager: get the next record with the help of the DBManager.
    5. Manager: set the record into the display object.
    6. Manager: tell the display object to refresh the screen.
    The manager contains the DBManager and the display instance.
    It works perfectly fine for the first 20 or so records, but after some time every click is slowed down by a factor of about 20, i.e. the first record takes 0.5 s and the 50th takes 30 s.
    I traced the memory usage and it is OK, but what could the problem be?
    Memory usage list:
    Before total:2031616 Free:810728
    Before total:2031616 Free:1055016
    Before total:2031616 Free:784272
    Before total:2031616 Free:1049648
    Before total:2031616 Free:821464
    Before total:2031616 Free:1091312
    Before total:2031616 Free:832568
    Before total:2031616 Free:1057192
    Before total:2031616 Free:793864
    Before total:2031616 Free:1009872
    Before total:2031616 Free:736552
    Before total:2031616 Free:1000048
    Before total:2031616 Free:776496
    Before total:2031616 Free:563400
    Before total:2031616 Free:834264
    Before total:2031616 Free:600392
    Before total:2031616 Free:883232
    Before total:2031616 Free:637144
    Before total:2031616 Free:877344
    Before total:2031616 Free:625808
    Before total:2031616 Free:876032
    Before total:2031616 Free:603856
    Before total:2031616 Free:440688
    Before total:2031616 Free:681816
    Before total:2031616 Free:431080
    Before total:2031616 Free:714832
    Before total:2031616 Free:465848
    Before total:2031616 Free:790152
    Before total:2031616 Free:613168
    Before total:2031616 Free:364024
    Before total:2031616 Free:677752
    Before total:2031616 Free:421064
    Before total:2031616 Free:716144
    Before total:2031616 Free:541320
    Before total:2031616 Free:372024
    Before total:2031616 Free:595424
    Before total:2031616 Free:425288
    Before total:2031616 Free:250016
    Before total:2031616 Free:576584
    Before total:2031616 Free:405744
    Before total:2031616 Free:225440
    Before total:2031616 Free:546728
    Before total:2031616 Free:370240
    Before total:2031616 Free:204136
    Before total:2031616 Free:540816
    Before total:2031616 Free:367384
    Before total:2031616 Free:196400

    Hey,
    I was wondering whether there is a compelling reason why you want to go back to the database on every click. I would fetch X rows at a time from the database and cache them as value objects; X could be 10 or even 100, depending on your requirements.
    As for the performance going down, there could be a lot of reasons. Do you open and close the connection on every click? How many connections are open at a time? Do you close the connection, or is it kept alive?
    Chidda
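    To illustrate the suggestion above, here is a minimal sketch of fetching a batch of rows in one round trip and caching them as value objects, so that most "Next" clicks are served from memory rather than from the database. The table, column, and class names are placeholders and not from the original program.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    public class RecordCache {

        // Simple value object for one displayed record.
        public static class Record {
            public final int id;
            public final String name;
            public Record(int id, String name) { this.id = id; this.name = name; }
        }

        // Fetch up to batchSize records after lastId in a single query and
        // return them as a cache the UI can page through on each click.
        public List<Record> fetchBatch(Connection con, int lastId, int batchSize) throws SQLException {
            String sql = "SELECT id, name FROM records WHERE id > ? ORDER BY id";  // placeholder schema
            List<Record> cache = new ArrayList<Record>();
            PreparedStatement ps = null;
            ResultSet rs = null;
            try {
                ps = con.prepareStatement(sql);
                ps.setInt(1, lastId);
                ps.setMaxRows(batchSize);          // let the driver stop after batchSize rows
                rs = ps.executeQuery();
                while (rs.next()) {
                    cache.add(new Record(rs.getInt("id"), rs.getString("name")));
                }
            } finally {
                if (rs != null) try { rs.close(); } catch (SQLException ignored) { }
                if (ps != null) try { ps.close(); } catch (SQLException ignored) { }
            }
            return cache;
        }
    }

    Statements and result sets are closed after every batch; whether the Connection itself is kept open or reopened per batch is a separate choice, but either way it should not be recreated on every single click, which is one common cause of the kind of slowdown described above.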

  • Two data sources in one file

    I would like to write capacitance (double) and flux value (double) as a
    function of time to a single file, preferably a spreadsheet.
    Currently, each one is written to its own file using "Write LabVIEW
    Measurement File", which causes problems with data synchronization.
    Do I need to create an array first? If so, I had difficulty feeding the correct data type into the array.
    Also, is it possible to save all of my sub-VIs into a single VI?
    Thanks for your time and consideration,
    rjhinton
    Attachments:
    capacitance and mag field.vi (255 KB)

    You can also just use Merge Signals.
    To write to a plain spreadsheet, you could double-click the LVM express VI and configure it as follows:
    No headers
    Comma separated
    Then name the file *.csv in the dialog and Excel will open it without problems later.
    Attachments:
    MergeSignals.png (4 KB)
