Migrate data flow from 3.5 to 7.3?

Dear Experts,
After the technical team upgraded SAP BW from 3.5 to 7.3, I tested migrating a data flow. I found that if I gave the "migration project" a name different from the DataStore Object name, I could not find the related objects (e.g. transformations or DTPs) under that DataStore Object. The DataStore Object also remained an inactive version, even though the migration finished without errors.
For example
- Original DSO name = AAA was shown as inactive
- Migration Project name = AAA_Migrated
- After selecting all the objects, including process chains, and clicking the 'Migration/Recovery' button, the status showed no errors (the Migration History displayed all green)
- Rechecked the objects in transaction RSA1
- DSO name = AAA was still shown as inactive
I just wonder where all the objects under DSO name = AAA have gone.
What happened to the migration project named AAA_Migrated?
How can I find the migration project named AAA_Migrated?
How can I recover all the objects under DSO name = AAA (in case the "migration project" name was misspelled)?
If you have come across a similar case, could you share your experience of how to handle it?
Thank you very much.
-WJ-

See: BW 7.30: Data Flow Migration tool: Migrating 3.x flows to 7.3 flows and also the recovery to 3.X flow
Regards,
Sushant
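
A quick way to see whether the migrated transformations were actually created for the DSO, and in which object version, is to look at the transformation metadata directly (SE16 or a small report). Below is a minimal, hypothetical ABAP sketch: RSTRAN is the standard transformation header table, but the field names TARGETNAME and OBJVERS used here are assumptions that should be verified in SE11 first, and the report name is made up.

REPORT z_check_migrated_flow.
* Hypothetical helper: list the transformations that target a given
* DataStore Object after the 3.x-to-7.x migration, with their versions.
* Assumption: table RSTRAN with fields TRANID, OBJVERS and TARGETNAME.
PARAMETERS p_dso(30) TYPE c DEFAULT 'AAA'.

TYPES: BEGIN OF ty_tran,
         tranid(25) TYPE c,
         objvers(1) TYPE c,
       END OF ty_tran.
DATA: lt_tran TYPE STANDARD TABLE OF ty_tran,
      ls_tran TYPE ty_tran.

SELECT tranid objvers
  FROM rstran
  INTO TABLE lt_tran
  WHERE targetname = p_dso.

IF lt_tran IS INITIAL.
  WRITE: / 'No transformations found for target', p_dso.
ELSE.
  LOOP AT lt_tran INTO ls_tran.
*   OBJVERS 'A' = active, 'M' = modified/inactive
    WRITE: / ls_tran-tranid, ls_tran-objvers.
  ENDLOOP.
ENDIF.

If the transformations exist only in an inactive version, reactivating the data flow or re-running the migration project is usually the next step; the document referenced above also describes the recovery back to the 3.x flow.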

Similar Messages

  • How to migrate the data flow from a DB Connect source system from 3.5 to BI

    Hi,
    Can anyone tell me how to migrate the data flow from a DB Connect source system from 3.5 to BI 7?

    Hi,
    Go to the InfoProvider that your DB Connect DataSource feeds, right-click on the DataSource, then choose Migrate -> With Export. You then have to build the new 7.0 transformations, DTPs, etc.
    ~AK

  • How to make data flow from one application to another in BPEL

    Hi All,
    I am designing the workflow of my application in BPEL (JDeveloper), and I am creating different BPEL projects for different functions. For example, the sales manager gets the order from the sales person and either approves or rejects it; if he approves it, it goes to the production manager, who ships the goods. I want to keep the sales person, sales manager, and production manager in separate BPEL files and pass the output of the sales person to the sales manager, and of the sales manager to the production manager. Please help me to do this.
    I was trying to create a partner link to the sales person in the sales manager process and get the input from there. I don't know whether this is even right, and if it is, I don't know how to make the data flow from one application to the other.
    Experienced people, please guide me.
    Sales Person -----> Sales Manager ----> Production Manager
    Thanks
    Yatan

    Yes, you can do this.
    If you want each integration point to be in a different process, you have to create three BPEL processes:
    1. Create an async BPEL process 'A' which is initiated when the sales person creates the order.
    2. From BPEL process 'A', call an async BPEL process 'B' which contains the approval flow. Based on the input from process 'A', the sales manager reviews the order in the workflow, approves or rejects it, and sends the result back to process 'A'.
    3. Based on the result from the workflow, invoke the sync BPEL process 'C', where you implement the shipping logic.
    -Ramana.

  • Data Flow from CRM to BW

    Dear SAP Experts,
    Greetings for the Day!
    I am looking for some information on the data flow from the CRM to the BW system. A few of my queries are below:
    Do we have any settings for this data flow in transaction SMOEAC?
    How does the setting shown in the attachment impact the BDoc flow to BW? Also, if we un-check "Do Not Snd", will BDocs flow to the BW system?
    PS: We are on CRM 7.0 with EHP2.
    Thanks!
    Regards,
    Kanika

    Hi Kanika,
    Data flow from CRM to BW happens via XIF using IDocs. You can check transaction WE21 for your BW RFC destination and the output parameters, which determine what data is sent to the corresponding destination.
    You can also check my blog:
    External Interface (XIF) Setup, but this covers the XIF setup in general and is not specific to BW.
    Hope this helps.
    Best Regards,
    Shanthala.

  • Need to check the data flow from R/3 to BW server.

    Hi BI experts,
    This query is about checking the data flow from R/3 to the BW server.
    I currently have a set of reports that I need to bring into BW. The requirement is to go through the list of transaction codes for the reports in R/3 and find out whether there are already any existing objects in the BW system that I can use for these reports.
    Can you please help me?

    It depends on which transaction codes or reports your users run in R/3 and want reproduced in BW. BI Content delivers out-of-the-box reports; you can activate those, load data, and use them.
    Give me the T-codes you have and I can send you the standard BI reports, or the cubes you can get them from.
    ~AK

  • Mapping data flow from R/3 to BW

    Hello,
    I am pretty new to BW and I have been tasked with creating a detailed map of the data flow from R/3 into BW.
    I need to record where the data originates in R/3 (field names/tables) and literally track the flow of that data all the way, including any InfoObjects along the way, to any cubes it may be sitting in.
    How do I track this flow? And how can I identify what a characteristic in BW is in R/3?
    Has anybody had to create a similar data flow map? If so, how did you approach it?
    Many Thanks,
    Matt

    Hi Matthew,
    From the R/3 side:
    BW treats all the data coming from R/3 as DataSources.
    From the DataSource, the upload of data to the cube is done as:
    Datasource -> Transfer Rule -> PSA/InfoSource -> communication structure -> cube
    (for a 3.5 system)
    In a 7.0 system, the data flow is as follows:
    Datasource -> InfoPackage -> PSA -> transformation/DTP -> data target (cube)
    -> Go to transaction RSA5 (for Business Content DataSources) and RSA6 (for all active DataSources) in the system.
    -> There you can find all the data that you want (for your mapping purpose this will do).
    -> You can also check from the BI side in transaction RSA1: click the Monitor button on the left (for custom objects) or the Business Content button, choose the object from the tree, then right-click and replicate to find out whether all of them were used.
    Hope this helps!!
    *Reward points if useful*
    regards,
    Naveenan.

  • Data Flow from Source system side LUWs and Extraction structures

    Hi
    Can anybody explain the data flow from the source system to the BI system? In particular, I mean the extract structure and LUWs: where do they come into the picture in the core data flow of the inbound and outbound queues? A link to a document would also be helpful.
    Regards
    Santosh

    Hi, see these articles:
    http://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
    Data Flow from LBWQ/SMQ1 to RSA7 in ECC (Records Comparison).
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/enterprise-data-warehousing/data%20flow%20from%20lbwq%20smq1%20to%20rsa7%20in%20ecc%20(Records%20Comparison).pdf
    Checking the Data using Extractor Checker (RSA3) in ECC Delta Repeat Delta etc...
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/80f4c455-1dc2-2c10-f187-d264838f21b5&overridelayout=true 
    Data Flow from LBWQ/SMQ1 to RSA7 in ECC and Delta Extraction in BI
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/d-f/data%20flow%20from%20lbwq_smq1%20to%20rsa7%20in%20ecc%20and%20delta%20extraction%20in%20bi.pdf
    Thanks
    Reddy

  • COMM_STRUCTURE is unknown when migrating data flow BW 3.x to 7.4

    Dear ALL,
    While migrating the 2LIS_13_VDHDR data flow from 3.x to 7.x, we get the ABAP syntax error "COMM_STRUCTURE is unknown" at the InfoSource transformation level. We are currently on 7.4 SP5. The ABAP code after migration:
    TYPES:
          BEGIN OF _ty_s_TG_1_full,
    *      InfoObject: 0CHNGID Change Run ID.
            CHNGID           TYPE /BI0/OICHNGID,
    *      InfoObject: 0RECORDTP Record type.
            RECORDTP           TYPE /BI0/OIRECORDTP,
    *      InfoObject: 0REQUID Request ID.
            REQUID           TYPE /BI0/OIREQUID,
    *      InfoObject: 0CALDAY Calendar Day.
            CALDAY           TYPE /BI0/OICALDAY,
    *      InfoObject: 0CALMONTH Calendar Year/Month.
            CALMONTH           TYPE /BI0/OICALMONTH,
    *      InfoObject: 0CALWEEK Calendar year / week.
            CALWEEK           TYPE /BI0/OICALWEEK,
    *      InfoObject: 0FISCPER Fiscal year / period.
            FISCPER           TYPE /BI0/OIFISCPER,
    *      InfoObject: 0FISCVARNT Fiscal year variant.
            FISCVARNT           TYPE /BI0/OIFISCVARNT,
    *      InfoObject: 0BILLTOPRTY Bill-to party.
            BILLTOPRTY           TYPE /BI0/OIBILLTOPRTY,
    *      InfoObject: 0COMP_CODE Company code.
            COMP_CODE           TYPE /BI0/OICOMP_CODE,
    *      InfoObject: 0DISTR_CHAN Distribution Channel.
            DISTR_CHAN           TYPE /BI0/OIDISTR_CHAN,
    *      InfoObject: 0DOC_CATEG Sales Document Category.
            DOC_CATEG           TYPE /BI0/OIDOC_CATEG,
    *      InfoObject: 0PLANT Plant.
            PLANT           TYPE /BI0/OIPLANT,
    *      InfoObject: 0SALESORG Sales Organization.
            SALESORG           TYPE /BI0/OISALESORG,
    *      InfoObject: 0SALES_GRP Sales group.
            SALES_GRP           TYPE /BI0/OISALES_GRP,
    *      InfoObject: 0SALES_OFF Sales Office.
            SALES_OFF           TYPE /BI0/OISALES_OFF,
    *      InfoObject: 0SHIP_TO Ship-To Party.
            SHIP_TO           TYPE /BI0/OISHIP_TO,
    *      InfoObject: 0SOLD_TO Sold-to party.
            SOLD_TO           TYPE /BI0/OISOLD_TO,
    *      InfoObject: 0VERSION Version.
            VERSION           TYPE /BI0/OIVERSION,
    *      InfoObject: 0VTYPE Value Type for Reporting.
            VTYPE           TYPE /BI0/OIVTYPE,
    *      InfoObject: 0DIVISION Division.
            DIVISION           TYPE /BI0/OIDIVISION,
    *      InfoObject: 0MATERIAL Material.
            MATERIAL           TYPE /BI0/OIMATERIAL,
    *      InfoObject: 0SHIP_POINT Shipping point.
            SHIP_POINT           TYPE /BI0/OISHIP_POINT,
    *      InfoObject: 0PAYER Payer.
            PAYER           TYPE /BI0/OIPAYER,
    *      InfoObject: 0DOC_CLASS Document category /Quotation/Order/Deliver
    *y/Invoice.
            DOC_CLASS           TYPE /BI0/OIDOC_CLASS,
    *      InfoObject: 0DEB_CRED Credit/debit posting (C/D).
            DEB_CRED           TYPE /BI0/OIDEB_CRED,
    *      InfoObject: 0SALESEMPLY Sales Representative.
            SALESEMPLY           TYPE /BI0/OISALESEMPLY,
    *      InfoObject: 0SUBTOT_1S Subtotal 1 from pricing proced. for condit
    *ion in stat. curr..
            SUBTOT_1S           TYPE /BI0/OISUBTOT_1S,
    *      InfoObject: 0SUBTOT_2S Subtotal 2 from pricing proced. for condit
    *ion in stat. curr..
            SUBTOT_2S           TYPE /BI0/OISUBTOT_2S,
    *      InfoObject: 0SUBTOT_3S Subtotal 3 from pricing proced.for conditi
    *on in stat. curr..
            SUBTOT_3S           TYPE /BI0/OISUBTOT_3S,
    *      InfoObject: 0SUBTOT_4S Subtotal 4 from pricing proced. for condit
    *ion in stat. curr..
            SUBTOT_4S           TYPE /BI0/OISUBTOT_4S,
    *      InfoObject: 0SUBTOT_5S Subtotal 5 from pricing proced. for condit
    *ion in stat. curr..
            SUBTOT_5S           TYPE /BI0/OISUBTOT_5S,
    *      InfoObject: 0SUBTOT_6S Subtotal 6 from pricing proced. for condit
    *ion in stat. curr..
            SUBTOT_6S           TYPE /BI0/OISUBTOT_6S,
    *      InfoObject: 0OPORDQTYBM Open orders quantity in base unit of meas
    *ure.
            OPORDQTYBM           TYPE /BI0/OIOPORDQTYBM,
    *      InfoObject: 0OPORDVALSC Net value of open orders in statistics cu
    *rrency.
            OPORDVALSC           TYPE /BI0/OIOPORDVALSC,
    *      InfoObject: 0QUANT_B Quantity in base units of measure.
            QUANT_B           TYPE /BI0/OIQUANT_B,
    *      InfoObject: 0DOCUMENTS No. of docs.
            DOCUMENTS           TYPE /BI0/OIDOCUMENTS,
    *      InfoObject: 0DOC_ITEMS Number of Document Items.
            DOC_ITEMS           TYPE /BI0/OIDOC_ITEMS,
    *      InfoObject: 0NET_VAL_S Net value in statistics currency.
            NET_VAL_S           TYPE /BI0/OINET_VAL_S,
    *      InfoObject: 0COST_VAL_S Cost in statistics currency.
            COST_VAL_S           TYPE /BI0/OICOST_VAL_S,
    *      InfoObject: 0GR_WT_KG Gross weight in kilograms.
            GR_WT_KG           TYPE /BI0/OIGR_WT_KG,
    *      InfoObject: 0NT_WT_KG Net weight in kilograms.
            NT_WT_KG           TYPE /BI0/OINT_WT_KG,
    *      InfoObject: 0VOLUME_CDM Volume in cubic decimeters.
            VOLUME_CDM           TYPE /BI0/OIVOLUME_CDM,
    *      InfoObject: 0HDCNT_LAST Number of Employees.
            HDCNT_LAST           TYPE /BI0/OIHDCNT_LAST,
    *      InfoObject: 0CRM_PROD Product.
            CRM_PROD           TYPE /BI0/OICRM_PROD,
    *      InfoObject: 0CP_CATEG Category.
            CP_CATEG           TYPE /BI0/OICP_CATEG,
    *      InfoObject: 0FISCYEAR Fiscal year.
            FISCYEAR           TYPE /BI0/OIFISCYEAR,
    *      InfoObject: 0BP_GRP BP: Business Partner Group (from Hierarchy).
            BP_GRP           TYPE /BI0/OIBP_GRP,
    *      InfoObject: 0STAT_CURR Statistics Currency.
            STAT_CURR           TYPE /BI0/OISTAT_CURR,
    *      InfoObject: 0BASE_UOM Base Unit of Measure.
            BASE_UOM           TYPE /BI0/OIBASE_UOM,
    *      InfoObject: 0PROD_CATEG Product Category.
            PROD_CATEG           TYPE /BI0/OIPROD_CATEG,
    *      InfoObject: 0VOLUME Volume.
            VOLUME           TYPE /BI0/OIVOLUME,
    *      InfoObject: 0VOLUMEUNIT Volume unit.
            VOLUMEUNIT           TYPE /BI0/OIVOLUMEUNIT,
    *      InfoObject: 0FISCPER3 Posting period.
            FISCPER3           TYPE /BI0/OIFISCPER3,
    *      InfoObject: 0SALES_DIST Sales District.
            SALES_DIST           TYPE /BI0/OISALES_DIST,
    *      InfoObject: 0BILL_TYPE Billing type.
            BILL_TYPE           TYPE /BI0/OIBILL_TYPE,
    *      InfoObject: 0MOVE_PLANT Receiving Plant/Issuing Plant.
            MOVE_PLANT           TYPE /BI0/OIMOVE_PLANT,
    *      InfoObject: 0SHIP_COND Shipping conditions.
            SHIP_COND           TYPE /BI0/OISHIP_COND,
    *      InfoObject: 0AB_RFBSK Status for Transfer to Accounting.
            AB_RFBSK           TYPE /BI0/OIAB_RFBSK,
    *      InfoObject: 0AB_FKSTO Indicator: Document Is Cancelled.
            AB_FKSTO           TYPE /BI0/OIAB_FKSTO,
    *      InfoObject: 0CUST_GRP5 Customer Group 5.
            CUST_GRP5           TYPE /BI0/OICUST_GRP5,
    *      InfoObject: ZCU_COND1 Customer Condition Group 1.
            /BIC/ZCU_COND1           TYPE /BIC/OIZCU_COND1,
    *      InfoObject: ZCU_COND2 Customer Condition Group 2.
            /BIC/ZCU_COND2           TYPE /BIC/OIZCU_COND2,
    *      InfoObject: ZBATCHCD Batch Code.
            /BIC/ZBATCHCD           TYPE /BIC/OIZBATCHCD,
    *      InfoObject: 0BATCH Batch number.
            BATCH           TYPE /BI0/OIBATCH,
    *      InfoObject: ZBATCH Batch number.
            /BIC/ZBATCH           TYPE /BIC/OIZBATCH,
    *      Field: RECORD Data record number.
            RECORD           TYPE RSARECORD,
          END   OF _ty_s_TG_1_full.
    * Additional declaration for update rule interface
      DATA:
        MONITOR       type standard table of rsmonitor  WITH HEADER LINE,
        MONITOR_RECNO type standard table of rsmonitors WITH HEADER LINE,
        RECORD_NO     LIKE SY-TABIX,
        RECORD_ALL    LIKE SY-TABIX,
        SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS.
    * global definitions from update rules
    * TABLES: ...
    DATA: IN    TYPE F,
          OUT   TYPE F,
          DENOM TYPE F,
          NUMER TYPE F.
    * Def. of 'credit-documents': following doc.categ. are 'credit docs'
    *   reversal invoice (N)
    *   credit memo  (O)
    *   internal credit memo (6)
    * Credit-documents are delivered with negative sign. Sign is switched
    * to positive to provide positive key-figures in the cube.
    * The combination of characteristics DE_CRED and DOC-CLASS provides
    * a comfortable way to distinguisch e.g. positive incoming orders or
    * order returns.
    * Definition of 'credit documents': the following document types are 'credit docs'
    *   reversal invoice (N)
    *   credit memo (O)
    *   internal credit memo (6)
    * Credit documents are delivered with a negative sign. To write the key
    * figures into the cube with a positive sign, the sign is switched.
    * The combination of the characteristics DEB_CRED and DOC-CLASS gives you
    * a quick way to distinguish e.g. between incoming orders and returns.
    DATA: DEB_CRED(3) TYPE C VALUE 'NO6'.
    FORM routine_0002
      TABLES
       P_MONITOR         structure rsmonitor
      CHANGING
        RESULT         TYPE _ty_s_TG_1_full-DOCUMENTS
        RETURNCODE     LIKE sy-subrc
        ABORT          LIKE sy-subrc
      RAISING
        cx_sy_arithmetic_error
        cx_sy_conversion_error.
    * init variables
    * fill the internal table "MONITOR", to make monitor entries
    CLEAR RESULT.
    RESULT = COMM_STRUCTURE-NO_INV.
    IF COMM_STRUCTURE-DOC_CATEG CA DEB_CRED.
       RESULT = RESULT * ( -1 ).
    ENDIF.
    RETURNCODE = 0.
      p_monitor[] = MONITOR[].
      CLEAR:
        MONITOR[].
    ENDFORM.                    "routine_0002
    FORM routine_0003
      TABLES
       P_MONITOR         structure rsmonitor
      CHANGING
        RESULT         TYPE _ty_s_TG_1_full-DEB_CRED
        RETURNCODE     LIKE sy-subrc
        ABORT          LIKE sy-subrc
      RAISING
        cx_sy_arithmetic_error
        cx_sy_conversion_error.
    * init variables
    * fill the internal table "MONITOR", to make monitor entries
      IF COMM_STRUCTURE-DOC_CATEG CA DEB_CRED.
        RESULT = 'C'.
      ELSE.
        RESULT = 'D'.
      ENDIF.
      RETURNCODE = 0.
      p_monitor[] = MONITOR[].
      CLEAR:
        MONITOR[].
    ENDFORM.   
    Error:
    E:Field "COMM_STRUCTURE-NO_INV" is unknown. It is neither in one of the
    specified tables nor defined by a "DATA" statement. "DATA" statement.
    I changed the communication structure to the source fields, but with no success. Please suggest how I can proceed. Thanks in advance for a quick reply.
    Thanks & Regards
    Ramesh G

    Hi Gareth,
    You have two options:
    1. Transport from BW 3.1 to BI 7.0. You'll need to create a transport route between both systems. This may cause you some trouble in the future when you want to modify the objects you transported.
    2. As there are few objects, you can use the XML export utility from the Transport Connection. There you create an XML file containing the objects you need to transport. One thing to take care of with this option is that the Business Content objects you are exporting need to be activated in the destination system. Another limitation is that queries are not exported.
    Since it's only a cube, maybe you can create the objects manually. Note that BI 7.0 has several new functionalities; I don't know how the transport or XML export would handle them.
    Hope this helps.
    Regards,
    Diego
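
    Coming back to the COMM_STRUCTURE error itself: after a 3.x-to-7.x migration the routine code no longer sees the old communication structure, so references to COMM_STRUCTURE-<field> generally have to be repointed to the source structure of the generated transformation routine. The excerpt below is only a hypothetical sketch of the adjusted calculation from routine_0002, assuming the generated routine exposes the source record as SOURCE_FIELDS; take the exact structure name from the generated routine frame in your own system.
    * Excerpt only - body of the migrated routine, with the 3.x communication
    * structure replaced by the source structure of the transformation.
      CLEAR result.
      result = source_fields-no_inv.            "key figure from the source record
      IF source_fields-doc_categ CA deb_cred.   "credit documents: N, O, 6
        result = result * ( -1 ).               "switch the sign, as in the 3.x rule
      ENDIF.
      returncode = 0.
    Note that any source field used in the routine (here NO_INV and DOC_CATEG) also has to be assigned to the rule's list of source fields, otherwise the field remains unknown to the generated code.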

  • New GL Migration - Data transfer from Classic GL to New GL

    Hi Champs!
    Our client is on ECC 6.0 with classic GL functionality. Now they want to migrate to New GL, and they have already procured the SAP New GL migration service. I wanted to know whether this migration service will take care of the data transfer from the classic GL tables to the New GL tables, with the appropriate New GL functionalities such as document splitting.
    Your feedback will help a lot.
    Thanks,
    Amish.

    Hi Amish,
    https://websmp207.sap-ag.de/GL
    This link will help you find all documentation pertaining to New GL concept from SAP.
    thanks and regards
    Praveen.J

  • Data Flow from TXT to a table error

    Hello,
    I am trying to load the data from a .txt file into a table in a database. Previously this worked fine in DTS, and I can still do it when I import the DTS command, but I want to update this to a data flow because the DTS commands need to run on 32-bit and I'm using 64-bit.
    I'm getting 3 errors:
    [OLE DB Destination [322]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
    [OLE DB Destination [322]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Destination Input" (335)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (335)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (322) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (335). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
    Before this, I changed the Flat File Source input and output properties in the advanced editor to text stream [DT_TEXT], because the table has VarChar columns; I also had another error, but that seems to be resolved. The only problem is that if I look at the mappings, the input is text stream [DT_TEXT] but the output is a string, and I am unable to change this in the advanced editor of the OLE DB Destination. I can change it, but it changes back on its own.
    Could I please get some help on these errors?
    Thanks

    Hi SQLNewbie101,
    According to your description, when you change the column data type in the advanced editor of the OLE DB Destination, it always changes back.
    Based on my research, the column data type is determined by the destination table; it depends on the columns in the table, so we cannot change it there.
    To fix this issue, one way, as you said, is to use a Data Conversion Transformation to convert the [DT_TEXT] data type to [DT_STR] after the Flat File Source. Another way is to change the column data type directly on the Advanced tab of the Flat File Connection Manager Editor, and then double-click the Flat File Source to update the columns.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Data Flow from SAP Source (ECC) system to SAP BI system

    Hi All,
    I wanted to know how data flows from the SAP source system to the SAP BI system. The explanation should cover:
    1) Is the data transferred using IDocs?
    2) Which interfaces are involved while the data is being transferred?
    3) What exactly happens when you execute (load to) the PSA?
    If you have any info on this, could you please post it here?
    Regards,
    K.Krishna Chaitanya.

    Hi Krishna,
    Please go through this article:
    "http://www.trinay.com/C6747810-561C-4ED6-B85C-8F32CF901602/FinalDownload/DownloadId-C2EB7035A229BFC0BB16C09174241DC8/C6747810-561C-4ED6-B85C-8F32CF901602/SAP%20BW%20Extraction.pdf".
    Hope this answers all the mentioned questions.
    Regards,
    Sarika

  • Migration data gathering from ConfigMgr 2007 site is failing

    I have CM12 and CM07 primary sites. In the past I have successfully used the migration features in the CM12 console to migrate data from the CM07 source hierarchy. However, to upgrade my CM12 site to R2 I had to stop the data gathering. Now, a month later, I am trying to gather data again, but it fails every time, even after cleaning up the previous migration data. This is the error I see after 30 minutes of gathering data:
    Configuration manager failed to gather data from cm07.domain.local
    The data gathering process failed. Check migmctrl.log.
    Looking at the log these are the only errors I see:
    ERROR: [Worker]: Microsoft.ConfigurationManagement.Migration.MigrationException: 1 exceptions occurred during syncing.     at Microsoft.ConfigurationManagement.MigrationManager.SyncAgentJob.<get_ExecutionPlan>d__7.MoveNext()     at Microsoft.ConfigurationManagement.MigrationManager.Job`1.ExecuteNext()    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Start processing status changed event for MIG_SiteMapping.ID=16777218    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Set the schedule item 16777218 end time    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Set the schedule item 16777218 status to Failed    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: End processing status changed event for MIG_SiteMapping.ID=16777218    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Disposing Job 16777218    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Removing Job 16777218 from job manager.    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Removing the Job with Id 16777218.    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Disposing worker    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    [Worker]: Disposing current site connection    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    ERROR: [MigMCtrl]: FAILED to EXECUTE job. error = Unknown error 0x80131500, 80131500    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    ERROR: [MigMCtrl]: FAILED to EXECUTE job. error = Unknown error 0x80131500, 80131500    SMS_MIGRATION_MANAGER    2/27/2014 8:24:46 PM    6056 (0x17A8)
    Now I do see some CM07 objects in my CM12 console (shared DP's and boundary groups) so I know part of the process was successful. Any ideas on my errors above?

    Hi,
    Just like Gerry said, please look through the log to find the problem. The following thread could be helpful.
    http://social.technet.microsoft.com/Forums/en-US/a619f187-9683-4cb2-aa5e-2f9bc619323e/error-during-data-gathering-process?forum=configmanagermigration
    Best Regards,
    Joyce Li

  • Transporting migrated data sources from BID to BIQ

    All,
    I have a 3.x DataSource which I have migrated to 7.x. Now I want to transport this from BI-D to BI-Q. When I replicate the DataSource in BI-Q, it is likely to be a 3.x DataSource. Will it be overwritten when I transport the DataSource from BI-D to BI-Q? Is there anything I have to do before moving the transport? Please advise.

    If you transport the DataSource on R3/ECC from development to QA that's connected to your BW QA, and then transport the DataSource from your BW development to QA environment, there is no need to replicate the DataSource. It may, however, come into your BW QA environment in a deactivated state. If that happens, just activate it in the BW QA environment using the SAP delivered ABAP program RSDS_DATASOURCE_ACTIVATE_ALL.
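
    If the DataSource does arrive deactivated, the activation program mentioned above can also be triggered from a small wrapper report instead of SE38, for example when you want to bundle it with your own post-transport steps. This is only a minimal sketch that relies on the program name given in the reply; it simply brings up the program's own selection screen, so no selection parameters have to be assumed here, and the wrapper name is made up.
    * Hypothetical wrapper: show the selection screen of the SAP-delivered
    * activation program and return to the caller afterwards.
    REPORT z_activate_bw_datasources.

    SUBMIT rsds_datasource_activate_all
      VIA SELECTION-SCREEN
      AND RETURN.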

  • Why do my rules fail in the data flow from connector view to Meta view?

    I have Meta Directory 5.0 installed along with iPlanet Directory Server 5.0, and it is working fine.
    I have created an instance of the NT Domain connector which retrieves entries into a connector view.
    Where do I find examples of writing the data flow rules for the NT Domain connector to flow specific entries from the CV to the MV? Basically, I do not want the NT groups in the MV. I also want to create an additional attribute, e.g. myflag, whose value will be updated manually in the CV; if myflag = 0 I don't want the entry to be moved to the MV, and if myflag = 1 the entry should be moved to the MV.
    I tried to write a few rules, but they fail in testing (Rule Tester), and I am not able to locate the exact error in my rule. Does it require any specific configuration?
    Thanks
    Amol Talap

    You should post your rule.
    But either way, have you tried this:
    (objectclass==ntuser) or
    (objectclass!=groupofuniquenames)
    The first allows only entries that are users.
    The second allows only entries that are not groups.
    As for the flags, try this:
    (myflag==1) or
    (myflag!=1)
    Same effect as above.
    Furthermore, if rule testing fails, it could be that you are not referencing the right directory when using the Rule Tester. The Rule Tester does not always point to the right location.
    J.F.

  • Loading error after migration of flow from 3.5 to BI 7.0

    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.

    Hi,
    There are basically three reasons for the error message "Caller 09 contains an error message" to occur.
    1) Timestamp error
    You need to replicate the respective DataSource from the corresponding source system, activate the transfer rules, and load again.
    Go to SE38, run the program RS_TRANSTRU_ACTIVATE_ALL, enter the InfoSource, and execute. This activates the transfer rules; then try to reload.
    2) Sometimes an authorization issue is involved.
    a. If you are doing these loads using process chains, please ensure that the chain is executed under the 'BWREMOTE'/'ALEREMOTE' user (ALEREMOTE/BWREMOTE must have all the necessary authorizations to extract the data from the source systems).
    b. If you are loading data manually, your user must have the necessary profiles that give you access to extract the data from the source system.
    3) This might be a shared memory problem in one of the server parameters. Ask your Basis team to increase the parameter abap/shared_objects_size_MB to 100-200 MB; this could solve your problem.
    Regards
    TG
