Data flow between 2 servers

Hi BW Experts,
Can we transport a BW InfoCube into an SCM server?
Please tell me the process.
Regards
Anjali

Hi,
If there is no transport landscape, you can ask your Basis team to create the BW system as a source system in the SCM system, so that the SCM system acts as the destination system.
Now create a copy of the cube in your SCM system.
For this cube in the SCM system, the cube in your BW system acts as the source.
In your BW system, right-click the cube and generate an export DataSource; a DataSource whose name starts with 8 will be created. Use this 8* DataSource in the SCM system to extract the data.
Please revert in case of any issues
Regards,
Pramod

Similar Messages

  • Issue with Data flow between Unicode and Non Unicode systems

    Hello,
    I have a scenario as below:
    We have a Unicode ECC 6.0 system and a UTF-7 legacy system.
    Messages flow from the legacy system to the ECC 6.0 system, and the data is about 700 KB in size.
    Will there be any issues, given that one system is Unicode and the other is non-Unicode?
    Kindly let me know.
    Thanks & Regards
    Vivek

    Hi,
    To add to Mike's post...
    You indicate that your legacy system is non-Unicode and the ERP system is Unicode. You also said that the data flows only from the legacy system to the ERP system. In this case, you should have no data issues, since the Unicode system is the receiving system. There are data issues when the data flows in the other direction: from a Unicode system to a non-Unicode system. There, the non-Unicode system can only process characters that exist in its codepage, so care must be taken on the sending side to ensure that only characters on the receiving system's codepage are sent (as Mike says above).
    Best Regards,
    Matt

  • Data flow between R/3 AND APO

    How do characteristics and navigational attributes flow between R/3 and APO?
    Why do we need to maintain characteristics and navigational attributes in APO? Any input will be highly appreciated.
    Thanks

    Hello Nick,
    Navigational attributes: Characteristic attributes can be converted into navigational attributes. They can be selected in a query in exactly the same way as the characteristics of an InfoCube. In simple terms, an attribute is a characteristic that is logically assigned and subordinate to another characteristic. Navigational attributes offer a way to plan multiple objects while achieving optimum system performance in Demand Planning.
    E.g. costs of a cost center drilled down by person responsible: you use the attribute 'Cost Center Manager' for the characteristic 'Cost Center'. If you want to navigate in the query by cost center manager, you have to create 'Cost Center Manager' as a navigational attribute and flag it as a navigation characteristic in the InfoCube.
    Characteristics: planning objects such as a product, location, brand or region.
    The master data of Demand Planning or Supply Network Planning encompasses the permitted values of the characteristics, the characteristic values. Characteristic values are discrete names or numbers. For example, the characteristic 'location' could have the values London, Delhi and New York.
    Regards
    Rahul Chitte

  • Correct tool for my purpose of data movement between servers

    Hello All,
    I am trying to copy data from a source SQL Server table to a destination SQL Server table. The requirement is that only new or updated data be migrated to the destination table, once a week. Our source table has 23 million rows and is growing.
    I researched two different solutions and would like to know if anyone has feedback on these. 
    1. Merge - used to sync data between the source and destination tables with an SSIS package. The problem, according to my research, is that with this amount of data the transaction log will grow by leaps and bounds. Not the way to go.
    2. Replication - I have started my research in this matter. Would this be an ideal solution? 
    Many thanks.

    Transactional replication is the best fit here. You should be able to get near-real-time synchronization between your source and destination servers if you have a primary key on the table you are replicating.
    If your skill set is in SSIS, the Merge component will also work.
    If you back up your transaction log every 20 minutes or so, the log will be maintained and you should not see explosive growth.
    Note that both the SSIS Merge component and transactional replication will lead to large transaction log growth unless you maintain the log.
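    As a minimal T-SQL sketch of the MERGE approach, this is roughly how the weekly upsert plus the routine log backup could look. The table names (dbo.SourceTable, dbo.DestTable), columns (Id, Payload, ModifiedDate), database name and backup path are all assumptions for illustration, not taken from this thread:
        -- Upsert new and changed rows once a week (assumed table/column names).
        MERGE dbo.DestTable AS dst
        USING dbo.SourceTable AS src
            ON dst.Id = src.Id
        WHEN MATCHED AND src.ModifiedDate > dst.ModifiedDate THEN
            UPDATE SET dst.Payload = src.Payload,
                       dst.ModifiedDate = src.ModifiedDate
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (Id, Payload, ModifiedDate)
            VALUES (src.Id, src.Payload, src.ModifiedDate);

        -- Scheduled every ~20 minutes, this keeps the transaction log in check.
        BACKUP LOG MyDatabase TO DISK = N'D:\Backups\MyDatabase_log.trn';
    With 23 million rows, filtering the source on ModifiedDate before the MERGE keeps both the scan and the log activity proportional to the week's changes rather than the whole table.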

  • Slow data transfer between servers across ACE

    Hello all,
    We have been facing slowness in SFTP file transfers. All three tiers are in different VLANs on the ACE module.
    App Server1 to DB Servers (Data Transfer is slow 32KB)
    App Server2 to DB Server (Data Transfer is OK)
    DB Servers to Web Servers (Data Transfer is OK)
    TestPC to App Server1 (Data Transfer is slow 32KB)
    TestPC to App Server2 (Data Transfer is OK)
    I have asked the customer for the results of some other test cases, but before I start on the ACE I am checking whether someone has faced a similar problem.
    Regards,
    Akhtar

    See this thread.
    http://community.bt.com/t5/Other-BB-Queries/HomeHub-3-LAN-speeds-only-10Mb/m-p/238589/highlight/true...

  • Exchange 2010 & 2013 coexistence mail flow between servers

    Hi Everyone,
    I have an Exchange 2010 server which I am trying to move away from. I have installed 2013, and it seems to be working well. I have successfully moved a mailbox over to 2013 without any issues.
    The problem I have at the moment is that a user whose mailbox is on the 2010 server can send to a user on the 2013 server; however, the 2013 user cannot reply.
    Any ideas?

    Yes, some issues with client connectivity are related to Outlook.

  • Data flow error in workflow runtime

    I have many workflows (standard and custom) that raise errors at runtime.
    The data flow between the task container and the workflow container does not work if an element is empty, even though the element is not mandatory.
    List of errors:
    - ParForEach 000000
    - Object FLOWITEM method EXECUTE cannot be executed
    - and others...
    These errors occur if some information in the workflow container that is used in some data flow (binding definition) is empty; the workflow then fails at the start.
    List of specific error:
    Error during result processing of work item 000000395235
    Error when processing node '0000000083' (ParForEach index 000000)
    Error when creating a component of type 'Etapa'
    Error when creating a work item
    Error within method CL_SWF_RUN_WIM_BATCH->_CREATE_WORKITEM_CONTAINER
    Source (expression '&STANDARDMATERIAL.MAILCENTRAL&') of binding assignment is not available
    Source (expression '&STANDARDMATERIAL.MAILCENTRAL&') of binding assignment is not available
    Error in the evaluation of expression '&STANDARDMATERIAL<???>.MAILCENTRAL&' for item '17'
    Error when determining attribute 'MAILCENTRAL' of object instance '[BO.BUS1001006.000000000010279
    Error in the evaluation of expression '&STANDARDMATERIAL<???>.MAILCENTRAL&' for item '17'
    Error when determining attribute 'MAILCENTRAL' of object instance '[BO.BUS1001006.000000000010279
    These errors did not occur before the Support Package SAPKB70012.

    Hello Arghadip,
    Yes, the attribute is empty and it is not mandatory. These errors occur in standard SAP workflows as well as in customer workflows (my own developments).
    These errors occur if some information (an attribute) of the workflow container that is used in some data flow (binding definition) is empty.
    Example: I have a SendMail step in my workflow, and the email address is an attribute of a business object. When the previous step completes and the attribute (mail address) is empty, the error occurs.
    I believe the mail should simply not be sent to anybody, rather than raising an error. I think this is a support package error.
    Thanks,
    Kleber

  • How to automate the data flow to content servers?

    We have ECC connected to a CS. Could you tell me how to automate the data flow into the CS? Thanks a lot!

    What do you use the Content Server for? If it's for archiving, you need to run the STORE job to send the data to the CS.
    I can't see a reason to automate that process
    Regards
    Juan

  • What are the differences between the old and new data flow technology?

    I have seen it mentioned in many places that the new data flow technology (in BI 7) is much different from the old one. Can anyone list a few differences?
    Thanks

    Hi Dylan,
    An InfoSource can be used in the 7.0 flow model, but its purpose is completely different from the 3.x model.
    In the 3.x model, the InfoSource consisted of two structures: transfer and communication.
    The transfer structure was where you specified which field in the DataSource corresponds to which InfoObject, i.e. the field-to-InfoObject mapping.
    You now do this in DataSource maintenance in 7.0 (the tabs called Proposal and Fields).
    The communication structure was the structure that was available for uploading.
    You can refer the link for the 3.x flow
    [http://help.sap.com/saphelp_nw70/helpdata/en/90/64553c845ba02de10000000a114084/content.htm]
    In the 7.0 flow, you generally use a DSO when you perform sequential transformations, i.e. when you want to change the data format or do some processing twice before you actually load to the final InfoProvider.
    The sequence in this case would be:
    DataSource -> Transformation 1 -> DSO -> Transformation 2 -> DSO -> Transformation 3 -> Cube
    The disadvantage of doing this is that the DSO stores the data. To avoid this, you can use an InfoSource instead of a DSO.
    You can refer the below link for using 7.0 infosources
    [http://help.sap.com/saphelp_nw70/helpdata/en/7e/001342743eda2ce10000000a1550b0/frameset.htm]
    Hope this helps.
    Regards.

  • How does data flow when SSIS packages are run on a different server than the DB server

    The scenario is that I have a dedicated SQL Server 2014 SSIS machine that executes the packages.
    The database server is a separate machine with SQL Server 2008R2.
    1) Running SSIS packages that transfer data within SQL Server 2008R2 (same machine)
    2) Running SSIS packages that transfer data between 2 separate SQL Server servers.
    How does the data flow in these two cases, and which resources (CPU, disk, RAM, network) are used where?
    Elias

    When you have a dedicated SSIS server, all data read flows to that server, is processed using the resources of that ETL server and then sent back over the network to the destination server.
    It doesn't matter if source and destination are the same server. If you use a data flow, all data flows over the network twice.
    The only exception is when you don't use a data flow, but only SQL statements. In that case, data flows only between source and destination.
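    As an illustrative sketch of that exception: if the package runs a plain SQL statement (e.g. from an Execute SQL Task) against the destination server, the rows move directly between source and destination and never touch the ETL server's pipeline. The linked server name SRC2008R2 and the table names below are assumptions, not from this thread:
        -- Executed on the destination instance; data is pulled straight from
        -- the (assumed) linked server, bypassing the SSIS server entirely.
        INSERT INTO dbo.DestTable (Id, Payload)
        SELECT Id, Payload
        FROM SRC2008R2.SourceDb.dbo.SourceTable;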

  • Data flow task error failed validation and return validation status "VS_NEEDSNEWMETADATA"

    I have an ETL with ~800 tables that I am moving from Oracle to SQL Server (Prod Oracle -> Prod SQL).
    Now a new Oracle/SQL version has come from the vendor that I need to test, so I created new DEV environments for Oracle and SQL; the update includes new columns in existing tables as well as new tables (DEV Oracle -> DEV SQL).
    So what I tried to do is take the old ETL (PROD) and change the connections to the DEV servers.
    When I execute the packages from my local laptop it works, but if I try to execute the packages from the job schedule I get the error: "Data flow task error failed validation and return validation status "VS_NEEDSNEWMETADATA"".
    I went to each table to check whether the columns differed, and I dropped some of the tables and recreated them in the destination, but the error still shows. I also tried setting "DelayValidation" to True on the package, but without success.

    I do not understand the difference between "... if I am going to change the Connection Manager to a new connection" and "didn't change the Connection Manager, only changed the server name / user / pass" for 800 tables.
    What I see is that the schema some tables have in Dev (from your laptop) is not the same schema the deployed package sees, hence the metadata error.
    Arthur

  • Data flow fails on packed decimal field moving iSeries DB2 data from one iSeries DB to another

    I'm trying to use SSIS to move table content from one iSeries DB2 database to another. I'm using the .NET Provider for OLE DB with the IBM DB2 for i5/OS IBMDA400 OLE DB provider in the connection managers for the source and destination, and the test connection works fine. When I try to run the data flow task, however, it fails on the first packed decimal field it encounters with the exceptions ...
    [select from hydro520 hydroweb2 blpmstr [16]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "component "select from hydro520 hydroweb2 blpmstr" (16)" failed because error code 0x80004002 occurred, and the error
    row disposition on "output column "MSPRIB" (55)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
    [select from hydro520 hydroweb2 blpmstr [16]] Error: The component "select from hydro520 hydroweb2 blpmstr" (16) was unable to process the data. Pipeline component has returned HRESULT error code 0xC0209029 from a method call.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "select from hydro520 hydroweb2 blpmstr" (16) returned error code 0xC02090F5.  The component returned a failure code when the pipeline
    engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    ... in the Progress tab. Can someone kindly tell me what I need to do to get the connection manager to work with DB2 packed decimal fields? Or is it a different issue altogether? Thanks tonnes for any help, Roscoe

    Hi rpfinn,
    From the data type mapping rules between SSIS and DB2, we can see that both the NUMERIC and DECIMAL data types in DB2 are mapped to the DT_NUMERIC data type in SSIS. Since the source data in your DB2 database is of the NUMERIC data type, changing DT_NUMERIC to DT_DECIMAL is invalid. Besides, if we check the data types of the target External column and Output column in the Advanced Editor for the ADO NET Source adapter, the data type should be defined as DT_NUMERIC with precision 9 and scale 2. I am not clear where you see DT_NUMERIC(9,0), i.e. DT_NUMERIC with precision 9 and scale 0, but it may be the cause of the issue. You need to make sure the DT_NUMERIC data type also has scale 2 instead of 0.
    If you don't know how to modify the data type, please describe the Data Flow Task of the package so that we can analyze further. Also, the error messages you posted are incomplete; it would be helpful if you posted the complete error message.
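    One hedged way to pin the metadata down is to cast the packed decimal column explicitly in the source query, so the provider reports a definite DECIMAL(9,2) and SSIS derives DT_NUMERIC with the right scale. The library/file names below are guesses based on the component name in your error log ("select from hydro520 hydroweb2 blpmstr"), so adjust them to your actual schema:
        -- Explicit cast so SSIS sees DECIMAL(9,2) instead of inferring scale 0.
        SELECT CAST(MSPRIB AS DECIMAL(9, 2)) AS MSPRIB
        FROM hydroweb2.blpmstr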
    Regards,
    Mike Yin

  • Why does a DB2 Data Flow Task query not accept a date coming from a string or datetime variable in SSIS?

    I am trying to compare a DB2 date to a date variable from SSIS. I have tried setting the variable as datetime and as string. I have also cast the SQL date as a date in the data flow task. I have tried a number of combinations of date formats, but no luck yet. Does anyone have any insights on how to define a date (without the time) variable and use it in the data flow task SQL? There has to be an easy way to accomplish this. I get the following error:
    An invalid datetime format was detected; that is, an invalid string representation or value was specified. SQLSTATE=22007".
    Thanks!

    Hi Marcel,
    Based on my research, in DB2, we use the following function to convert a string value to a date value:
    Date(To_Date('String', 'DD/MM/YYYY'))
    So, you can set the variable type to String in the package, and try the following query:
    ACCOUNT_DATE BETWEEN '11/30/2013' AND Date(To_Date(?, 'DD/MM/YYYY'))
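    For context, here is a minimal sketch of how the parameterized query might look end to end. The schema and table name (MYSCHEMA.SALES) are assumptions for illustration; the ? placeholder is bound to the package's String variable (e.g. '25/12/2013'):
        -- The string parameter is converted to a DATE inside DB2, so no
        -- datetime formatting is needed on the SSIS side.
        SELECT *
        FROM MYSCHEMA.SALES
        WHERE ACCOUNT_DATE BETWEEN '11/30/2013'
                               AND Date(To_Date(?, 'DD/MM/YYYY'))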
    References:
    http://stackoverflow.com/questions/4852139/converting-a-string-to-a-date-in-db2
    http://www.dbforums.com/db2/1678158-how-convert-string-time.html
    Regards,
    Mike Yin
    TechNet Community Support

  • Automatic creation of BW data flow documentation

    Dear Gurus,
    I need to write documentation of the data flow of a huge project which I haven't implemented myself.
    The documentation should contain a mapping of the objects in the DataProviders to the objects in the source system(s).
    Ideally it would also state which DataProviders the objects are included in, e.g. everywhere between the MultiProvider and the source system.
    Details of transformations can be ignored; at most, it should mention that a routine is involved.
    With the Metadata Repository, I can get the content of cubes in a graphical overview, but it doesn't really provide me with useful information.
    You can imagine I would prefer an automatic way to create this documentation.
    Does anybody know a solution, even if it only covers part of the purpose?
    Any solution via query, standard SAP or customized program, ...
    Recommendations would be very highly appreciated!
    Thx & Rgds, sam

    Documentation is written for SAP BW projects worldwide, but no reply on automatic documentation.
    A lot of time must be lost manually documenting the mapping of objects to source system fields.
    ==> SAP, please work out a solution.
    I didn't find a satisfying solution, but I've done it the following way:
    List all objects of a MultiProvider via the Metadata Repository and paste them into an Excel document.
    Then list all objects of the underlying DataProviders and paste them into separate sheets of the same Excel file.
    Compare the objects of the MP with the objects on the other sheets using Excel functions, and mark where a DataProvider contains a certain object.
    For the DataSources, I checked whether an object is present and, if so, noted the original source field.
    In summary, not an optimal or complete solution, but it prevents mistakes.
    Rgds. sam

  • Data flow in R/3 and CRM

    Hi experts
    I am a bit confused about how data flows from the source systems into the delta queue; I know bits and pieces but am not able to relate them and keep them in order.
    First scenario - R/3:
    When we save a transaction or create a sales order, it is written to the database tables, and then through the V1 update it reaches the extraction queue or the update tables, depending on the delta method we select; with direct delta, it goes directly to the delta queue, which is nothing but the TRFCQOUT and ARFCSDATA tables.
    In SM53 we can check the transfer of data records from this delta queue to BW.
    Is my understanding of R/3 correct?
    Second scenario - CRM:
    When we save a transaction, let's take the example of creating an activity, how does it get to the delta queue and the database tables, and is there an intermediate place where we can check the data between the transaction's creation and the delta queue of CRM?
    thanks and regards
    Neel

    Hi Rahul,
    Execute Transaction: CRMC_R3_ORG_GENERATE
    Select the org units you want to copy to CRM.
    Click on 'Generate Selected Lines'.
    Best Regards,
    Pratik Patel
